I am trying to import a huge .csv file into the database. It is a Drupal database; I am importing products into the node, content_type_ and uc_products tables.
But this is not so relevant, because for performance reasons I stopped using the Drupal functions node_load and node_save and I insert/update records directly via SQL commands.

Scenario:

1. Read the whole CSV file from the local server (approx. 40MB) into a variable (to avoid accessing the file for each record)
2. Parse it into an array of lines
3. Parse every line into fields
4. Check whether the record exists in the database (1 SQL SELECT, comparing a particular field)
5. If it exists, update it (1 SQL UPDATE)
6. If it does not exist, insert it (3 INSERTs, 1 UPDATE)
   6.1. If the record has an image code and it differs from the one in the database, download (curl) the new image file from the external server and save it on the local server
7. One additional delayed INSERT into the log table

When I run this script, I get a 502 Proxy error (Reason: Error reading from remote server) after approx. 10k imported records, and the page stops loading. However, the import continues (new records are still being added to the tables).
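The read-and-parse steps (1–3) of the scenario can be sketched in PHP. This mirrors the question's load-everything-into-a-variable approach; the function name is illustrative, not from the original code, and for a 40MB file a streaming fgetcsv() loop would use far less memory.

```php
<?php
// Steps 1-3: read the whole CSV into a string, split it into lines,
// then parse each line into fields.
function csv_to_rows(string $csvText): array
{
    $rows = [];
    // Step 2: split into lines; handle \r\n, \r, and \n endings.
    foreach (preg_split('/\r\n|\r|\n/', $csvText) as $line) {
        if ($line === '') {
            continue; // skip empty trailing lines
        }
        // Step 3: parse one line into fields.
        $rows[] = str_getcsv($line);
    }
    return $rows;
}
```

Usage (step 1 is the file_get_contents() call): `$rows = csv_to_rows(file_get_contents('products.csv'));`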
This stops after 20k-26k records (tried several times). Two questions: What is that Proxy error and why am I getting it? How can I optimize my script even more?
Is there some other, usual way to import a huge amount of data into MySQL? Thanks a lot.

php mysql drupal csv import – asked Jan 11 at 14:08 by petiar (14% accept rate)
Possible duplicate of Import tool Excel CSV – Gordon Jan 11 at 14:10

For just the SQL stuff, I had to do something similar two days ago: over 600,000 records from a CSV file. What I did was write a script to convert the CSV to INSERT statements (in your case, you could use conditional inserts, either with a WHERE clause or EXISTS). I ended up with a 47MB SQL text file that I then fed to mysql -uuser -ppass dbname – Matt H Jan 11 at 14:17
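Matt H's approach (converting the CSV to one big SQL file and feeding it to the mysql client) could be sketched like this. The table and column names (products, sku, title, price) are assumptions for illustration, and the conditional part uses MySQL's INSERT ... ON DUPLICATE KEY UPDATE on a unique key rather than a WHERE/EXISTS clause:

```php
<?php
// Convert CSV rows into conditional INSERT statements, one per line,
// suitable for: mysql -uuser -ppass dbname < import.sql
// Table/column names here are hypothetical, not from the question.
function rows_to_sql(array $rows): string
{
    $sql = '';
    foreach ($rows as [$sku, $title, $price]) {
        $sku   = addslashes($sku);
        $title = addslashes($title);
        $price = (float) $price;
        // ON DUPLICATE KEY UPDATE makes the insert conditional on a
        // unique key (sku), replacing the per-record SELECT + UPDATE.
        $sql .= "INSERT INTO products (sku, title, price) "
              . "VALUES ('$sku', '$title', $price) "
              . "ON DUPLICATE KEY UPDATE title = VALUES(title), price = VALUES(price);\n";
    }
    return $sql;
}
```

Generating the file once and letting the mysql client replay it avoids PHP's per-request limits entirely, which is why it scales to hundreds of thousands of rows.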
I think you have a max_request_body_size limit on your Squid proxy (or whatever proxy you use). You should contact the proxy administrator to allow a bigger POST request body, or split the CSV into smaller parts and repeat the step for each one. Or maybe you can check this post: serverfault.com/questions/185894/proxy-e....
This sounds like a proxy server between you and the PHP code (very possibly the reverse proxy on the server, if that is your setup) giving up on waiting for a response from your script. While you can probably adjust those timeouts, personally I would try decoupling very long actions like this from the HTTP lifecycle. I'm not a PHP developer, so I don't know what the common PHP approaches are.
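One common PHP way to decouple the import from the HTTP lifecycle is to run it as a CLI script (e.g. via cron or drush), where no proxy or web-server timeout applies. A minimal sketch, with the batch size and the per-record callback as assumptions:

```php
<?php
// Run from the command line (php import.php), not through the web server,
// so no proxy/gateway timeout can interrupt it.
set_time_limit(0);               // no PHP execution time limit in CLI
ini_set('memory_limit', '256M'); // headroom for a large in-memory CSV

// Process records in batches so progress can be logged as the import runs.
function import_in_batches(array $records, int $batchSize, callable $handle): int
{
    $done = 0;
    foreach (array_chunk($records, $batchSize) as $i => $batch) {
        foreach ($batch as $record) {
            $handle($record); // insert/update one record (hypothetical callback)
            $done++;
        }
        fwrite(STDERR, "batch $i done, $done records total\n");
    }
    return $done;
}
```

Because the script talks directly to the database from the shell, the browser (and any proxy in front of it) never has to hold a connection open for the full run.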