I think the question is whether a user will want to see 1M rows at a time. Probably not.
The spreadsheet uses virtualisation to display data, i.e. only the data visible in the current row x column viewport is rendered. This improves rendering performance.
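To make the virtualisation idea concrete, here's a minimal sketch of how the visible window is typically computed, assuming fixed row heights; all the names (`visibleRowRange`, `overscan`, etc.) are illustrative, not the actual internals of this spreadsheet:

```typescript
// Minimal sketch of row virtualisation with fixed row heights.
// Names and numbers are illustrative, not any library's real API.
interface Viewport {
  scrollTop: number; // current vertical scroll offset in px
  height: number;    // visible height of the grid in px
}

function visibleRowRange(
  vp: Viewport,
  rowHeight: number,
  totalRows: number,
  overscan = 5 // render a few extra rows to reduce flicker while scrolling
): { start: number; end: number } {
  const start = Math.max(0, Math.floor(vp.scrollTop / rowHeight) - overscan);
  const end = Math.min(
    totalRows,
    Math.ceil((vp.scrollTop + vp.height) / rowHeight) + overscan
  );
  return { start, end }; // only rows [start, end) get DOM nodes
}

// A 600px viewport over 1,048,576 rows of 24px each renders only
// ~35 rows, regardless of the total row count.
const range = visibleRowRange({ scrollTop: 240_000, height: 600 }, 24, 1_048_576);
```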
Data performance can be improved by:
1. Lazy loading data / infinite scrolling. We have a built-in async hook that does this.
2. Only keeping pointers to the data that is currently displayed. So if you have 1M rows, on the JavaScript side you only load, say, 100 rows in memory, and when the user scrolls you replace those rows with new data (see the sketch after this list). This will make the browser happy.
3. Streaming data from the server, similar to Google Sheets.
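As a rough sketch of point 2 (with made-up names; `fetchRows` stands in for whatever server call you have, and the window size of 100 matches the example above), a sliding row window could look like this:

```typescript
// Sketch of point 2: keep only a small window of rows in memory and
// replace it as the user scrolls. fetchRows() is a hypothetical
// server call, not part of any real API.
type Row = Record<string, unknown>;

class RowWindow {
  private rows: Row[] = [];
  private firstIndex = 0;

  constructor(
    private fetchRows: (start: number, count: number) => Promise<Row[]>,
    private windowSize = 100
  ) {}

  // Called when the user scrolls near row `index`: drop the old window
  // and load a fresh one centred on the new position.
  async scrollTo(index: number): Promise<void> {
    const start = Math.max(0, index - Math.floor(this.windowSize / 2));
    this.rows = await this.fetchRows(start, this.windowSize);
    this.firstIndex = start;
  }

  // The renderer asks for rows by absolute index; anything outside
  // the current window is simply not in memory.
  get(index: number): Row | undefined {
    return this.rows[index - this.firstIndex];
  }
}
```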
But to answer your question: we have a max row limit of 1,048,576 and a max column limit of 16,384.
I'm checking it out on my phone, and scrolling through the interface is pretty choppy, whereas it's rather fluid in Google Sheets. Maybe you could tweak the virtual row rendering further?
Props for the fact that things don't disappear when I scroll a bit faster, which I've seen happen before in SPAs with infinite scrolling.
Excel is probably the wrong tool for that job. At the scale of a million rows or more, this is probably better done with a real database and a programming language.
Actually, this is an important comment. Building a spreadsheet with web technology is not that difficult. However, building one with more than just the most basic features that still performs well once the data grows is a surprisingly hard task.
To give an example of what makes it so difficult: some browsers (e.g. Chrome) seem to have special optimizations for rendering tables, so performance-wise it makes a difference whether you use table/tr/td tags or div tags in your DOM.
A few weeks ago I found a StackExchange thread where someone had benchmarked the two approaches and found that the table tags were somehow optimized. However, this was tested with identical content and with a table in which every cell had to be rendered. In other scenarios the performance might differ, but for spreadsheets it should be applicable. Sadly I can't find the link at the moment (it was on a different PC).
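For anyone who wants to try reproducing it, a micro-benchmark along these lines should show whatever difference your browser has; the grid size is arbitrary and absolute numbers will vary by browser and machine:

```typescript
// Micro-benchmark sketch: render the same grid with <table> tags and
// with <div> tags and compare how long building + layout takes.
function renderTable(rows: number, cols: number): number {
  const t0 = performance.now();
  const table = document.createElement("table");
  for (let r = 0; r < rows; r++) {
    const tr = document.createElement("tr");
    for (let c = 0; c < cols; c++) {
      const td = document.createElement("td");
      td.textContent = `${r},${c}`;
      tr.appendChild(td);
    }
    table.appendChild(tr);
  }
  document.body.appendChild(table);
  void table.offsetHeight; // force synchronous layout
  const ms = performance.now() - t0;
  table.remove();
  return ms;
}

function renderDivs(rows: number, cols: number): number {
  const t0 = performance.now();
  const grid = document.createElement("div");
  for (let r = 0; r < rows; r++) {
    const row = document.createElement("div");
    row.style.display = "flex";
    for (let c = 0; c < cols; c++) {
      const cell = document.createElement("div");
      cell.textContent = `${r},${c}`;
      row.appendChild(cell);
    }
    grid.appendChild(row);
  }
  document.body.appendChild(grid);
  void grid.offsetHeight; // force synchronous layout
  const ms = performance.now() - t0;
  grid.remove();
  return ms;
}

console.log("table:", renderTable(5000, 10), "ms");
console.log("divs: ", renderDivs(5000, 10), "ms");
```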
I don't understand your comment. It's not far-fetched to have millions of rows to handle. I myself have about 12 million rows (mainly notes), although I don't load everything at the same time, so maybe that's the aspect you're referring to?
It's pushing the boundaries of a spreadsheet's practical usability, to be honest. I would not use it for more than a few hundred rows at most. In fact, I just checked: my largest sheet is 292 rows. Anything larger goes into something else! Usually I use SQLite for local data like that now.
A funny problem I had a few years back was a company whose users kept 200 MB+ Excel sheets on their desktops. It took them forever to log off every day because their profiles were copied back to the server.
Oh yes, I’m one of those users (and who are “those”? Lizard people?). At least once a week I need to work with datasets that have hundreds of thousands of rows, and while accessing them is not a problem, batch computations can be tricky.