  • ashworth (Resolved)

    (@ashworth)


    We’re working with a site where we’ve been given an Excel spreadsheet to upload into a table in Ninja Tables. The sheet was consolidated from separate tabs of data into a single large sheet, then saved as a CSV file in preparation for upload. The table has 46,653 rows, and while I eventually managed to import this giant thing, the frontend has trouble displaying it performantly, even with the AJAX option turned on.

    I’m not sure how to proceed given the above information. I could build something custom, but the original idea was that we’d just use Ninja Tables Pro. Any thoughts on possible solutions?

    I should let you know that (for now, while testing), I’ve removed any server-side throttling limitations on CPU, memory, inode, etc. I’ve also raised PHP limits for this site significantly, but it doesn’t seem to solve the issue.
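
    For context, “raised PHP limits” here means values roughly along these lines in wp-config.php (illustrative numbers only, not the exact values on this server):

    // wp-config.php — illustrative values; these are standard WordPress constants
    define( 'WP_MEMORY_LIMIT', '512M' );      // memory for frontend requests
    define( 'WP_MAX_MEMORY_LIMIT', '1024M' ); // memory for admin-side work such as the CSV import

    // Matching php.ini directives (again, illustrative):
    // memory_limit = 1024M
    // max_execution_time = 300
    // upload_max_filesize = 64M
    // post_max_size = 64M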

    It doesn’t need a plugin-conflict test (disabling other plugins), because the plugin does work with a lesser number of rows. So the plugin isn’t “broken”, but I still need a way to get this stuff to work well on the frontend 🤷‍♂️

  • Plugin Support Syed Numan

    (@annuman)

    Hello @ashworth,

    First, for such a large amount of data, the server must have enough capacity to handle it. Also, in the Ajax rendering method we load the table data in chunks, where each chunk loads 3,000 rows. If you want the table to display faster, you can lower the chunk size with the PHP snippet below, added to the theme’s functions.php file. With a chunk size of 1,000, the first 1,000 rows are displayed on the frontend right away, and the rest of the table data continues to load in the background. Please allow time for all the data to load before taking any further action on the table frontend.

    add_filter( 'ninja_table_per_chunk', function ( $limit ) {
        return 1000; // rows per AJAX chunk (plugin default: 3,000)
    } );
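
    (Lowering the chunk size is a trade-off: each AJAX request returns a smaller payload, so the first rows appear sooner, but more background requests are needed to fetch all 46,653 rows.)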

    Thanks

  • The topic ‘Working with large amounts of data’ is closed to new replies.