Question
What is the fastest way to dump a large (> 1GB) XML file into a MySQL database?
Data
The data in question is the StackOverflow Creative Commons Data Dump.
Purpose
This will be used in an offline StackOverflow viewer I am building, since I am looking to do some studying/coding in places where I will not have access to the internet.
I would like to release this to the rest of the StackOverflow membership for their own use when the project is finished.
Problem
Originally, I was reading from XML/writing to DB one record at a time. This took about 10 hours to run on my machine. The hacktastic code I'm using now throws 500 records into an array, then creates an insertion query to load all 500 at once (e.g. "INSERT INTO posts VALUES (...), (...), (...) ... ;"). While this is faster, it still takes hours to run. Clearly this is not the best way to go about it, so I'm hoping the big brains on this site will know of a better way.
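For reference, the batching code boils down to something like this sketch (the table and column names are simplified stand-ins, not the real dump schema):

```csharp
// Sketch of the current approach: accumulate rows, then fire one
// multi-row INSERT per batch. Column names are illustrative.
using System.Collections.Generic;
using System.Text;
using MySql.Data.MySqlClient;

static void InsertBatch(MySqlConnection conn, List<string[]> rows)
{
    var sql = new StringBuilder("INSERT INTO posts (id, title, body) VALUES ");
    for (int i = 0; i < rows.Count; i++)
    {
        if (i > 0) sql.Append(",");
        sql.AppendFormat("({0},'{1}','{2}')",
            rows[i][0], // numeric id, inserted as-is
            MySqlHelper.EscapeString(rows[i][1]),
            MySqlHelper.EscapeString(rows[i][2]));
    }
    using (var cmd = new MySqlCommand(sql.ToString(), conn))
        cmd.ExecuteNonQuery();
}
```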
Constraints
- I am building the application using C# as a desktop application (i.e. WinForms).
- I am using MySQL 5.1 as my database. This means that features such as "LOAD XML INFILE filename.xml" are not usable in this project, as this feature is only available in MySQL 5.4 and above. This constraint is largely due to my hope that the project would be useful to people other than myself, and I'd rather not force people to use beta versions of MySQL.
- I'd like the data load to be built into my application (i.e. no instructions to "Load the dump into MySQL using 'foo' before running this application.").
- I'm using MySQL Connector/Net, so anything in the MySql.Data namespace is acceptable.
Thanks for any pointers you can provide!
Ideas so far
- A stored procedure that loads an entire XML file into a column, then parses it using XPath. This didn't work since the file size is subject to the limitations of the max_allowed_packet variable, which is set to 1 MB by default. This is far below the size of the data dump files.
There are 2 parts to this:
- reading the XML file
- writing to the database
For reading the XML file, this link http://csharptutorial.blogspot.com/2006/10/reading-xml-fast.html shows that 1 MB can be read in about 2.4 seconds using a stream reader; that would be 2,400 seconds, or 40 minutes (if my maths is working this late), for a 1 GB file.
From what I have read, the fastest way to get data into MySQL is to use LOAD DATA.
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Therefore: read the XML data, write it to files that can be used by LOAD DATA, then run LOAD DATA. The total time may be less than the hours that you are experiencing.
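A rough sketch of that pipeline, assuming the dump's layout of one <row> element per record; the attribute names, column names, and escaping here are stand-ins to adjust to the real schema:

```csharp
// Sketch: stream the XML with XmlReader, write a tab-separated file,
// then hand the whole file to LOAD DATA LOCAL INFILE in one statement.
// Assumes one <row> element per record; attribute/column names are examples.
using System.IO;
using System.Xml;
using MySql.Data.MySqlClient;

static void XmlToTsv(string xmlPath, string tsvPath)
{
    using (var reader = XmlReader.Create(xmlPath))
    using (var writer = new StreamWriter(tsvPath))
    {
        while (reader.Read())
        {
            if (reader.NodeType == XmlNodeType.Element && reader.Name == "row")
            {
                string id = reader.GetAttribute("Id");
                string body = reader.GetAttribute("Body") ?? "";
                // Escape the characters LOAD DATA treats specially.
                body = body.Replace("\\", "\\\\").Replace("\t", "\\t")
                           .Replace("\n", "\\n").Replace("\r", "\\r");
                writer.WriteLine(id + "\t" + body);
            }
        }
    }
}

static void BulkLoad(MySqlConnection conn, string tsvPath)
{
    // LOAD DATA LOCAL requires local-infile to be permitted by both
    // the server and the connection settings.
    string sql = "LOAD DATA LOCAL INFILE '" + tsvPath.Replace('\\', '/') +
                 "' INTO TABLE posts (id, body)";
    using (var cmd = new MySqlCommand(sql, conn))
        cmd.ExecuteNonQuery();
}
```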
Ok, I'm going to be an idiot here and answer your question with a question.
Why put it in a database?
What if ... just a what-if ... you wrote the XML to files on the local drive and, if needed, wrote some indexing information to the database? This should perform significantly faster than trying to load a database and would be much more portable. All you would need on top of it is a way to search and a way to index the relational references. There should be plenty of help with searching, and the relational aspect should be easy enough to build. You might even consider rewriting the information so that each file contains a single post with all the answers and comments right there.
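A rough sketch of the split-into-files idea, assuming one <row> element per post with an Id attribute:

```csharp
// Sketch: split the dump into one file per post instead of loading a
// database. Assumes one <row> element per post with an Id attribute.
using System.IO;
using System.Xml;

static void SplitPosts(string xmlPath, string outDir)
{
    Directory.CreateDirectory(outDir);
    using (var reader = XmlReader.Create(xmlPath))
    {
        while (!reader.EOF)
        {
            if (reader.NodeType == XmlNodeType.Element && reader.Name == "row")
            {
                string id = reader.GetAttribute("Id");
                // ReadOuterXml also advances the reader past this element,
                // so no extra Read() is needed in this branch.
                File.WriteAllText(Path.Combine(outDir, id + ".xml"),
                                  reader.ReadOuterXml());
            }
            else
            {
                reader.Read();
            }
        }
    }
}
```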
Anyway, just my two-cents (and that is not worth a dime).
I have a few thoughts to help speed this up...
The size of the query may need to be tweaked; there's often a point where a big statement costs more in parsing time and so becomes slower. 500 may be optimal, but perhaps it is not, and you could tweak that a little (it could be more, it could be less).
Go multithreaded. Assuming your system isn't already flatlined on the processing, you could make some gains by breaking the data up into chunks and having threads process them. Again, it's an experimentation thing to find the optimal number of threads, but a lot of people are using multicore machines and have CPU cycles to spare.
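A minimal sketch of one way to do that, with one worker thread per dump file (posts.xml, comments.xml, and so on); LoadFile is a hypothetical stand-in for the existing per-file parse-and-insert routine:

```csharp
// Sketch: one worker thread per dump file, each with its own connection,
// so parsing and inserting can overlap across cores.
using System.Threading;

static void LoadFile(string path, string connectionString)
{
    // Hypothetical stand-in for the existing parse-and-batch-insert logic.
}

static void LoadAllFiles(string[] files, string connectionString)
{
    var threads = new Thread[files.Length];
    for (int i = 0; i < files.Length; i++)
    {
        string file = files[i]; // copy for the closure
        threads[i] = new Thread(() => LoadFile(file, connectionString));
        threads[i].Start();
    }
    foreach (var t in threads)
        t.Join();
}
```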
On the database front, make sure that the table is as bare as it can be. Turn off any indexes and load the data before indexing it.
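For example, something along these lines around the load (note that DISABLE KEYS only affects non-unique indexes on MyISAM tables; for InnoDB the usual equivalent is dropping the indexes and re-creating them afterwards):

```csharp
// Sketch: relax integrity checks and disable secondary indexes for the
// duration of the bulk load, then restore everything afterwards.
using MySql.Data.MySqlClient;

static void Exec(MySqlConnection conn, string sql)
{
    using (var cmd = new MySqlCommand(sql, conn))
        cmd.ExecuteNonQuery();
}

static void LoadWithBareTable(MySqlConnection conn)
{
    Exec(conn, "SET unique_checks = 0");
    Exec(conn, "SET foreign_key_checks = 0");
    Exec(conn, "ALTER TABLE posts DISABLE KEYS"); // MyISAM only

    // ... run the batched inserts / LOAD DATA here ...

    Exec(conn, "ALTER TABLE posts ENABLE KEYS"); // rebuilds the indexes
    Exec(conn, "SET foreign_key_checks = 1");
    Exec(conn, "SET unique_checks = 1");
}
```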
SqlBulkCopy ROCKS. I used it to turn a 30-minute function into 4 seconds. However, this is applicable only to MS SQL Server.
Might I suggest you look at the constraints on the table you've created? If you drop all keys, constraints, etc. during the load, the database will do less work on your insertions and less cascading work.
Secondly, set up the tables with big initial sizes to prevent resizes if you are inserting into a blank database.
Finally, see if there is a bulk-copy-style API for MySQL. SQL Server basically formats the data as it would go down to disk, links the stream up to the disk, and you pump in data. It then performs one consistency check for all the data instead of one per insert, dramatically improving your performance. Good luck ;)
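For what it's worth, Connector/Net does ship a bulk-copy-style class: MySqlBulkLoader, a wrapper around LOAD DATA INFILE. A minimal sketch, with illustrative table, column, and file names:

```csharp
// Sketch: MySql.Data's MySqlBulkLoader, which wraps LOAD DATA INFILE.
// Table, column, and file names here are illustrative.
using MySql.Data.MySqlClient;

static int BulkLoadPosts(MySqlConnection conn, string tsvPath)
{
    var loader = new MySqlBulkLoader(conn)
    {
        TableName = "posts",
        FileName = tsvPath,
        FieldTerminator = "\t",
        LineTerminator = "\n",
        Local = true // read the file from the client machine
    };
    loader.Columns.Add("id");
    loader.Columns.Add("body");
    return loader.Load(); // returns the number of rows inserted
}
```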
Do you need MySQL? SQL Server makes your life easier if you are using Visual Studio and your database is low performance/size.
Does this help at all? It's a stored procedure that loads an entire XML file into a column, then parses it using XPath and creates a table / inserts the data from there. Seems kind of crazy, but it might work.
Not the answer you want, but the MySQL C API has the mysql_stmt_send_long_data function.
I noticed in one of your comments above that you are considering MSSQL, so I thought I'd post this. SQL Server has a utility called SQLXML Bulk Load (SQLXMLBulkLoad) which is designed to import large amounts of XML data into a SQL Server database. Here is the documentation for the SQL Server 2008 version:
http://msdn.microsoft.com/en-us/library/ms171993.aspx
Earlier versions of SQL Server also have this utility.
In PostgreSQL, the absolute fastest way to get bulk data in is to drop all indexes and triggers, use the equivalent of MySQL's LOAD DATA and then recreate your indexes/triggers. I use this technique to pull 5 GB of forum data into a PostgreSQL database in roughly 10 minutes.
Granted, this may not apply to MySQL, but it's worth a shot. Also, this SO question's answer suggests that this is in fact a viable strategy for MySQL.
A quick google turned up some tips on increasing the performance of MySQL's LOAD DATA.
Source: https://stackoverflow.com/questions/1456086/what-is-the-fastest-way-to-load-an-xml-file-into-mysql-using-c