Upload a 95 GB CSV file into a MySQL MyISAM table via LOAD DATA INFILE: is the CSV engine an alternative?

You should disable all constraints while you are importing. Apart from that, it should work; just be aware that it is going to take a while, probably hours.
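A minimal sketch of what "disable the constraints" could look like for this MyISAM load (my interpretation of the advice, not the answerer's exact statements; unique_checks and foreign_key_checks are standard MySQL session variables, and foreign-key checks are a no-op for MyISAM anyway):

    ALTER TABLE MOD13Q1 DISABLE KEYS;     -- stop maintaining non-unique indexes during the load
    SET SESSION unique_checks = 0;        -- skip uniqueness checks on insert
    SET SESSION foreign_key_checks = 0;   -- harmless for MyISAM, matters for InnoDB

    -- ... run the LOAD DATA INFILE statement from the question here ...

    SET SESSION foreign_key_checks = 1;
    SET SESSION unique_checks = 1;
    ALTER TABLE MOD13Q1 ENABLE KEYS;      -- rebuild the non-unique indexes in one pass at the end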


I'm trying to upload a 95 GB CSV file into a MySQL database (MySQL 5.1.36) via the following commands:

    CREATE TABLE MOD13Q1 (
        rid INT UNSIGNED NOT NULL AUTO_INCREMENT,
        gid MEDIUMINT(6) UNSIGNED NOT NULL,
        yr SMALLINT(4) UNSIGNED NOT NULL,
        dyyr SMALLINT(4) UNSIGNED NOT NULL,
        ndvi DECIMAL(7,4) NOT NULL COMMENT 'NA value is 9',
        reliability TINYINT(4) NOT NULL COMMENT 'NA value is 9',
        ndviquality1 TINYINT(1) NOT NULL,
        ndviquality2 TINYINT(1) NOT NULL,
        PRIMARY KEY (rid),
        KEY (gid)
    ) ENGINE = MyISAM;

    LOAD DATA INFILE 'datafile.csv'
        INTO TABLE MOD13Q1
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\r\n'
        IGNORE 1 LINES
        (gid, yr, dyyr, ndvi, reliability, ndviquality1, ndviquality2);

I'm running this script via DOS at the moment, but the database is not responding. It works fine for smaller CSV files (1.5 GB).

Would it work for this file size? Do you have any recommendation on how to do this more efficiently/faster? Would ENGINE = CSV be an alternative (indexing is not available, so queries might run very slowly)?
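For reference, a rough sketch of what the CSV-engine variant would have to look like (the table name MOD13Q1_csv is made up for illustration; the CSV engine supports neither indexes nor AUTO_INCREMENT, so rid and the keys from the MyISAM definition have to go, and every column must be declared NOT NULL):

    CREATE TABLE MOD13Q1_csv (
        gid MEDIUMINT(6) UNSIGNED NOT NULL,
        yr SMALLINT(4) UNSIGNED NOT NULL,
        dyyr SMALLINT(4) UNSIGNED NOT NULL,
        ndvi DECIMAL(7,4) NOT NULL,
        reliability TINYINT(4) NOT NULL,
        ndviquality1 TINYINT(1) NOT NULL,
        ndviquality2 TINYINT(1) NOT NULL
    ) ENGINE = CSV;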

Update: Thanks for the tips, it worked!

    mysql> LOAD DATA INFILE 'E:\\AAJan\\data\\data.csv'
        -> INTO TABLE MOD13Q1
        -> FIELDS TERMINATED BY ','
        -> LINES TERMINATED BY '\r\n'
        -> IGNORE 1 LINES
        -> (gid, yr, dyyr, ndvi, reliability,
        -> ndviquality1, ndviquality2
        -> );
    Query OK, -1923241485 rows affected (18 hours 28 min 51.26 sec)
    Records: -1923241485  Deleted: 0  Skipped: 0  Warnings: 0

    mysql>

Hope this is helpful for others, so they can avoid splitting their data up into chunks.
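As an aside, the negative row count is most likely just the old client printing the counter through a signed 32-bit integer rather than rows actually being lost; a quick count on the table (my suggestion, not part of the original update) gives the real figure:

    SELECT COUNT(*) FROM MOD13Q1;   -- reports the true number of loaded rows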

If not, you should try Tokyo Cabinet – 1978th.net/tokyocabinet – it's famous for its speed. – YOU Nov 20 '09 at 7:45

@Mark: Tokyo Cabinet is a key/value DBM, so nothing to do with this IMHO. – RageZ Nov 20 '09 at 7:56

There is no easy way; you will have to split your data into chunks and then import those...

Oh wait, it does not matter anyway: it will be one big bulk transaction either way. You need chunks.

You need them to avoid overfilling your log segment space and hitting lock count limits. Anything greater than about a million rows at a time is too much.

For comparison, the best-known batch size for BCP is 10,000 records!
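A rough sketch of what the chunked approach could look like, assuming the 95 GB file has already been split externally into files named data_part01.csv, data_part02.csv, and so on (those file names are made up for illustration; only the first part would still carry the header line):

    -- Load the first chunk; it is the only one with a header row to skip.
    LOAD DATA INFILE 'E:\\AAJan\\data\\data_part01.csv'
        INTO TABLE MOD13Q1
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\r\n'
        IGNORE 1 LINES
        (gid, yr, dyyr, ndvi, reliability, ndviquality1, ndviquality2);

    -- Subsequent chunks have no header, so drop IGNORE 1 LINES.
    LOAD DATA INFILE 'E:\\AAJan\\data\\data_part02.csv'
        INTO TABLE MOD13Q1
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\r\n'
        (gid, yr, dyyr, ndvi, reliability, ndviquality1, ndviquality2);

    -- ...and so on for the remaining parts.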

