Working with large amounts of data
-
We’re working on a site where we’ve been given an Excel spreadsheet to upload into a table in Ninja Tables. The sheet was consolidated from separate tabs of data into a single large sheet, then saved as a CSV file in preparation for upload. The resulting table has 46,653 rows, and while I eventually managed to get this giant thing to import, the frontend struggles to display it with acceptable performance, even with the AJAX option turned on.
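For what it’s worth, the import itself went more smoothly once I stopped feeding the whole file in at once. A minimal sketch of splitting a big CSV into smaller upload-sized files, with the header repeated in each chunk (standalone Python, nothing Ninja Tables-specific; the `split_csv` helper and chunk naming are just my own illustration):

```python
import csv
import os
from itertools import islice

def split_csv(src_path, out_dir, rows_per_chunk=5000):
    """Split a large CSV into smaller files, repeating the header row
    in each chunk so every piece can be imported independently."""
    os.makedirs(out_dir, exist_ok=True)
    chunk_paths = []
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)  # keep the header out of the data rows
        chunk_no = 0
        while True:
            rows = list(islice(reader, rows_per_chunk))
            if not rows:
                break
            chunk_no += 1
            path = os.path.join(out_dir, f"chunk_{chunk_no:03d}.csv")
            with open(path, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(header)
                writer.writerows(rows)
            chunk_paths.append(path)
    return chunk_paths
```

That avoids one enormous request during import, though it obviously doesn’t help with the frontend rendering problem.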
I’m not sure how to proceed given the above information. I could build something custom, but the original idea was that we’d just use Ninja Tables Pro. Any thoughts on possible solutions?
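If it does come to building something custom, the usual fix for tables this size is true server-side pagination: the browser requests one page at a time and the database does the limiting, so the full 46k rows are never serialized in one response. A minimal sketch of the idea (standalone Python with sqlite3, not Ninja Tables’ actual storage or API; the `entries` table and `fetch_page` helper are hypothetical):

```python
import sqlite3

def fetch_page(conn, page, per_page=50):
    """Return one page of rows plus the total count, so the frontend
    can render a pager without ever loading the whole table."""
    total = conn.execute("SELECT COUNT(*) FROM entries").fetchone()[0]
    offset = (page - 1) * per_page
    rows = conn.execute(
        "SELECT id, payload FROM entries ORDER BY id LIMIT ? OFFSET ?",
        (per_page, offset),
    ).fetchall()
    return {"total": total, "page": page, "rows": rows}
```

An AJAX endpoint wrapping something like this would keep each response small regardless of how large the table grows, which is the part the current setup seems to be missing.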
I should mention that (for now, while testing) I’ve removed all server-side throttling limits on CPU, memory, inodes, and so on, and I’ve also raised the PHP limits for this site significantly, but that doesn’t seem to solve the issue.
It doesn’t need a plugin-disabling test, because the plugin technically works fine with a lesser number of rows. So the plugin isn’t “broken”, but I still need a way to get this stuff to work well on the frontend.
- The topic ‘Working with large amounts of data’ is closed to new replies.