• Resolved macwinson

    (@macwinson)


    Hi Roland, I sincerely hope that you and your family are doing great!

    I’m testing the “Import CSV File” functionality of PDb.

    1)
    I managed to get a clean .csv with the right headers and format (&#39 for apostrophes and single quotes, &#34 for double quotes); PDb imports all the records successfully (no errors). The fields are of the following kinds: 2 text-line fields, 2 numeric fields, 1 dropdown and 1 rich-text field (some of the rich-text values are long). Total database: 1,000 records, 9.43 MB.

    2) I can see ALL records (1,000) imported on “List Participants”, and I can see the data associated with ALL records (the data are OK).

    3)
    But when I browse the records (pdb_list), some consecutive record rows are not displaying (those rows are empty; no data is shown).
    When I go to the plug-in’s “List Participants” page and try to edit one of those non-displaying records, I get a blank page (the plug-in is not able to open that record for editing). No errors in the Debugging Log.
    For instance,
    /wp-admin/admin.php?page=participants-database-edit_participant&action=edit&id=199
    shows nothing, only the WordPress menu on the left side. No error or anything else.

    4)
    Then I delete those records in PDb “List Participants”, create a new .csv file with ONLY those “rebel” records, and import them back into PDb (Import CSV File).

    5)
    Now, when I browse the records (pdb_list), ALL the records are showing, including those that were not displaying in step 3.

    Those records don’t have any strange characters; in fact, they are imported without errors both the first time (full set) and the second time (only the selected records).

    An important fact: the issue always affects consecutive records, never records in random order:
    for instance, Record IDs 484 to 499, and never more than 10-15 records. Maybe a time-out issue?

    I tried the whole process (on both localhost and the web server) several times with different databases (originally exported from phpMyAdmin as .SQL) and I get the same result: at some point, for instance when importing a total of 500 records, 10-20 records have this issue.
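    For reference, the entity conversion described in step 1 can be sketched in Python — a minimal preprocessing pass (illustrative only; the function names are mine, and PDb itself does not require this step):

    ```python
    import csv
    import io

    # Map the troublesome characters to HTML entities, mirroring step 1
    # above (&#39; for apostrophes/single quotes, &#34; for double quotes).
    ENTITIES = {"'": "&#39;", '"': "&#34;"}

    def encode_field(value: str) -> str:
        """Replace quote characters in a field with HTML entities."""
        for char, entity in ENTITIES.items():
            value = value.replace(char, entity)
        return value

    def preprocess_csv(raw: str) -> str:
        """Return the CSV text with quote characters entity-encoded."""
        reader = csv.reader(io.StringIO(raw))
        out = io.StringIO()
        writer = csv.writer(out, quoting=csv.QUOTE_ALL)
        for row in reader:
            writer.writerow([encode_field(field) for field in row])
        return out.getvalue()
    ```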

    Any hint on what the cause could be???

    Many thanks and congratulations for an excellent plug-in with constant improvements!!!

    WP: 5.7.2
    Participants Database: 1.9.7.2

Viewing 15 replies - 1 through 15 (of 17 total)
  • Thread Starter macwinson

    (@macwinson)

    Additional info:
    – Encoding: UTF-8
    – Always starting with a fresh (truncated) wp_participants_database table.
    – “Import CSV File” parameters:
    Enclosure character: "
    Delimiter character: ,
    Duplicate Record Preference: Create new record…
    Duplicate Record Check Field: Record ID
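    A quick way to confirm a file matches these settings before importing is a small Python check (a sketch; the function name is hypothetical) that verifies the UTF-8 encoding and flags any data rows whose column count doesn’t match the header:

    ```python
    import csv
    import io

    def check_csv(raw_bytes: bytes):
        """Verify a CSV against the import settings above: UTF-8 encoding,
        comma delimiter, double-quote enclosure.
        Returns (row_count, problems), where problems lists the 1-based
        line numbers of rows that don't match the header's column count."""
        # Raises UnicodeDecodeError if the file is not valid UTF-8.
        text = raw_bytes.decode("utf-8")
        reader = csv.reader(io.StringIO(text), delimiter=",", quotechar='"')
        header = next(reader)
        problems = []
        count = 0
        for line_no, row in enumerate(reader, start=2):
            count += 1
            if len(row) != len(header):
                problems.append(line_no)
        return count, problems
    ```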

    Plugin Author xnau webdesign

    (@xnau)

    I’m investigating this, but it has nothing to do with the CSV file itself, or the importing of data. As you’ve seen, the import is completely successful.

    The problem is related to the caching of participant records that is done to minimize load on the database server.

    For reasons I have yet to determine, the cache is missing data for some records.

    Thread Starter macwinson

    (@macwinson)

    OK Roland, thanks for the feedback!!!
    Please let me know if I can be of any help.
    I’m able to send you by email (I think I have your address) the full .csv file and the .csv with the missing records (compressed, both are less than 1 MB).
    Thanks again!!

    Hello dear MacWinson, hello dear Roland,

    Many, many thanks for this thread and for your engagement in it.
    The import-CSV feature is a highly appreciated feature of WP Participants Database. I love it and it is so awesome.

    Keep up the great work – it rocks!
    And I am pretty sure that Participants Database has a global community of users and friends.

    Greetings from southern Europe

    Plugin Author xnau webdesign

    (@xnau)

    I have made some progress with this; however, I have not identified the specific problem you’re seeing. The best I can do for now is to make sure the cached data is cleared when importing a CSV.

    Thanks for the offer to send the CSV file, but I don’t think it will help.

    One thing you can do is test the import with plugin debugging on. You may see something specific for records that are imported, but end up blank in the list view.

    Thread Starter macwinson

    (@macwinson)

    Hi Roland,
    Good advice, thanks!! But unfortunately I was unable to get anything in the Participants Database Debugging Log. I tried different plug-in configurations (“plugin debug”, “all errors” and “verbose”) but I can’t catch anything while importing records (I tried with the same sets of records as before).

    Thanks and greetings also to @tangermaroc for the nice words and support!
    I will keep on running and testing!
    Cheers!

    Thread Starter macwinson

    (@macwinson)

    Hi Roland !!
    Just to let you know…
    I’m testing after the update to 1.9.7.3 and it seems to bring significant improvements!
    The Import CSV File feature is working great; for instance, the same .CSV records that previously had issues are now displaying correctly:
    * All records are imported fine, and I have no errors in the Debugging Log while importing.
    * All the records are displayed fine when I browse the records (pdb_list) or when I edit them in PDb “List Participants”.
    * The only odd thing: while browsing some records (a page with the pdb_list shortcode) I get this in the Debugging Log:

    PDb_Participant_Cache::get getting participant id 499 failed 
    PDb_Participant_Cache::get getting participant id 490 failed 
    PDb_Participant_Cache::get getting participant id 491 failed ...

    (10 or fewer records, always consecutive, never in random order)
    BUT, as I stated, the records are displayed OK in pdb_list and PDb “List Participants”.

    Does this seem to be solved? Are those Debugging Log messages irrelevant?

    Thanks!!

    Plugin Author xnau webdesign

    (@xnau)

    I had to guess what the problem was, so thanks for confirming the changes I made helped.

    Those are just debugging messages, meant to show that the cached data was missed. The data comes from the db if it is not found in the cache, so it’s not really a problem.
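    The fallback described here is the classic read-through cache pattern; a minimal Python sketch (illustrative names, not PDb’s actual code) shows why a cache miss is logged but never loses data:

    ```python
    # Read-through cache: on a miss, the record is fetched from the
    # backing database, so a miss is logged but the caller still gets
    # the data. Names are illustrative, not PDb's implementation.

    class ParticipantCache:
        def __init__(self, db):
            self._db = db      # any mapping-like backing store
            self._cache = {}
            self.log = []

        def get(self, record_id):
            if record_id in self._cache:
                return self._cache[record_id]
            # Cache miss: note it in the debug log, then fall back to the db.
            self.log.append(f"getting participant id {record_id} failed")
            record = self._db.get(record_id)
            if record is not None:
                self._cache[record_id] = record
            return record
    ```

    The second lookup of the same ID is served from the cache, so the “failed” message appears only once per record.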

    Thread Starter macwinson

    (@macwinson)

    Glad to know that my feedback was helpful! Thank YOU for your expertise and commitment to the plug-in!!!
    Was the issue related to the “Import CSV File” functionality, or to the handling of multiple records in general (or of a certain number of records)?

    A related question:
    Is it possible to use one of the imported fields as a custom ID, so I can keep URLs as they are on another site?

    Now, when importing records, by default I have:

    /single-record-page/?pdb=494 (default “Record ID” created while importing)

    but I wonder if it’s possible:

    /single-record-page/?pdb= (my custom imported ID number)

    I tried importing “id” as a field in the .CSV, but the import process resets the IDs and numbers them incrementally starting with ID=1.

    I browsed the settings and documentation but couldn’t find anything related.

    Thanks again!!!!

    Plugin Author xnau webdesign

    (@xnau)

    I decided to post a tutorial that explains how to set up an alternate record ID:

    Using a Custom Identifier Value for Participants Database Records

    Side note: a new question should be posted as a new topic; it makes the answer easier for other users to find.

    Thread Starter macwinson

    (@macwinson)

    Hi Roland, thanks for everything, great work!!!
    Should I mark this topic as Resolved?
    I will try the “Using a Custom Identifier…” tutorial and plug-in ASAP, and I will start a new topic with a question about it, so we can help spread the news.

    Cheers!

    Thread Starter macwinson

    (@macwinson)

    Hi Roland!

    I’m having some issues importing .CSV files in chunks of 1,000 records each.
    Sometimes the file import process exceeds the max_execution_time of 120 seconds (the default on my server, which I can’t change).
    And sometimes, when I start again, the same file that failed previously now loads (all records) in 94 seconds without errors.

    I tried chunks of 500 records per .CSV with similar results: at some point the import exceeds the max_execution_time of 120 seconds. Later, the very same file is imported in less than 60 seconds!
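    Splitting a large export into fixed-size chunks can be automated; here is a minimal Python sketch (illustrative, not part of PDb) that repeats the header in every chunk so each file can be imported on its own:

    ```python
    import csv
    import io

    def split_csv(raw: str, chunk_size: int):
        """Split CSV text into chunks of at most chunk_size data rows,
        repeating the header in every chunk so each chunk is a complete,
        independently importable CSV file."""
        reader = csv.reader(io.StringIO(raw))
        header = next(reader)
        rows = list(reader)
        chunks = []
        for start in range(0, len(rows), chunk_size):
            out = io.StringIO()
            writer = csv.writer(out)
            writer.writerow(header)
            writer.writerows(rows[start:start + chunk_size])
            chunks.append(out.getvalue())
        return chunks
    ```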

    Any suggestions for optimizing the Import CSV File process? Some cache to clear? Something to do between each file import, for instance? Or anything else I can do?

    Is this normal or expected with many records?

    As usual, many thanks!!

    Plugin Author xnau webdesign

    (@xnau)

    I am aware of the inefficiencies of the CSV import process; it’s something I am working on. There isn’t much you can do about it. If you have a lot of columns, the import will be slower, so avoiding importing unneeded columns (if there are any) will help.

    It is normal for the import time to vary quite a bit; several things can affect it, such as the server load at the time.

    This may be something you want to take up with your hosting provider; allowing a longer execution time is probably the simplest solution.

    If you know what you’re doing and know exactly the data format needed for each column, you can try importing the data directly from a CSV with phpMyAdmin. That will be a lot faster.
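    One way to prepare data for a direct import is to convert the CSV rows into INSERT statements that can be run from phpMyAdmin’s SQL tab — a Python sketch (the table and column names are examples and must match your actual schema; the quoting here is deliberately minimal):

    ```python
    import csv
    import io

    def csv_to_inserts(raw: str, table: str):
        """Turn CSV rows into INSERT statements for a direct import.
        The CSV header must use the table's actual column names.
        Escaping is minimal (doubled single quotes only) — a sketch,
        not a production-grade SQL generator."""
        reader = csv.reader(io.StringIO(raw))
        columns = next(reader)
        col_list = ", ".join(f"`{c}`" for c in columns)
        statements = []
        for row in reader:
            values = ", ".join("'" + v.replace("'", "''") + "'" for v in row)
            statements.append(
                f"INSERT INTO `{table}` ({col_list}) VALUES ({values});"
            )
        return statements
    ```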

    Thread Starter macwinson

    (@macwinson)

    Hi Roland!

    Thanks for the info!
    I think it is easier to work a miracle than to convince hosting providers to change shared-server settings (max_execution_time, or others).

    The good thing is that PDb now handles browsing the records fine (I mean, the issue that originated this topic is gone). I think the CSV import issue can be worked around, in the worst case with many chunks of 100 or 200 records per CSV file.

    So far, this is the only workflow that works for me:

    1) On localhost (XAMPP) with this custom setup:
    max_execution_time = 6000
    max_input_time = 6000
    memory_limit = 3072M
    I was able to import nearly 4,000 records using 3 .CSV files
    (no time-out or memory issues) using the PDb “Import CSV File” feature.

    2) Export an SQL file from the localhost phpMyAdmin including the tables:
    wp_participants_database
    wp_participants_database_fields
    wp_participants_database_groups

    3) Go to the live site, deactivate the PDb plug-in,
    and with phpMyAdmin delete the three tables with the prefix wp_participants_
    (the same tables as in step 2).

    4) Import the SQL file into the live database (phpMyAdmin), with default settings but disabling “Allow the interruption…”.

    5) On a web server with max_execution_time=90 the import took nearly 5 minutes, but it was successful (no interruptions).

    Cheers!

    Thread Starter macwinson

    (@macwinson)

    Hi Roland,
    Regarding plugin optimization for DB export/import: can I safely delete the “private_id” field? (I don’t need it.) Is there any other admin field that I can delete without compromising the operation of PDb?
    Thanks!!

  • The topic ‘Import CSV File – missing records – Rows not showing’ is closed to new replies.