Handling of huge Blobs in MySQL?

Submitted by 社会主义新天地 on 2019-12-10 13:37:00

Question


How can I insert huge BLOBs into a MySQL database (InnoDB)?

Fields of type LONGBLOB support data sizes of up to 4 GB according to the MySQL manual. But how does data of such a huge size get into the database?

I tried to use

INSERT INTO table (bindata) VALUES ( LOAD_FILE('c:/tmp/hugefile') );

which fails if the size of hugefile is bigger than about 500 MB. I have set max_allowed_packet to an appropriate size; the value of innodb_buffer_pool_size doesn't seem to have an influence.
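
For reference, the variable can be checked and raised roughly like this (a sketch only; the 1 GB value below is simply the documented upper limit of max_allowed_packet, and SET GLOBAL only affects connections opened afterwards):

SHOW VARIABLES LIKE 'max_allowed_packet';
SET GLOBAL max_allowed_packet = 1073741824; -- 1 GB, the documented maximum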

My server machine runs Windows Server 2003 and has 2 GB RAM. I'm using MySQL 5.0.74-enterprise-nt.


Answer 1:


BLOBs are buffered in memory, which is why you end up with three copies of a BLOB while inserting it into the database.

Your 500 MB BLOB therefore occupies about 1,500 MB in RAM, which runs into your server's 2 GB memory limit.
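
One mitigation that is sometimes suggested (it is not part of this answer, so treat it as a rough sketch) is to split the file into smaller pieces on disk beforehand and append them one statement at a time, so that no single LOAD_FILE() call has to materialize the whole file at once. Its limits should be clear: the server still builds the full concatenated value, and both LOAD_FILE() and CONCAT() results are capped by max_allowed_packet, so this only helps while the accumulated BLOB stays below that limit. The id column and the chunk file names below are made up for illustration:

-- `table` is the question's placeholder table name; id and the chunk files are hypothetical.
-- The chunks are assumed to have been created by splitting c:/tmp/hugefile beforehand.
INSERT INTO `table` (id, bindata) VALUES (1, LOAD_FILE('c:/tmp/hugefile.part000'));
UPDATE `table` SET bindata = CONCAT(bindata, LOAD_FILE('c:/tmp/hugefile.part001')) WHERE id = 1;
UPDATE `table` SET bindata = CONCAT(bindata, LOAD_FILE('c:/tmp/hugefile.part002')) WHERE id = 1;
-- ...and so on; if LOAD_FILE cannot read a chunk it returns NULL, and CONCAT then yields NULL.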




Answer 2:


I do not know which client/API you use, but when I tried to use BLOBs from my own Java and Objective-C clients, it seemed that MySQL does not really support streaming of BLOBs. You need enough memory to hold the whole BLOB as a byte array in RAM (on both the server and the client side), more than once! Moving to 64-bit Linux helps, but is not the desired solution.

MySQL is not made for BLOB handling (it is OK for small BLOBs :-). It uses two or three times the BLOB's size in RAM whenever the BLOB is stored or read.

You have to use another database, such as PostgreSQL, to get real BLOB support, sorry.



Source: https://stackoverflow.com/questions/945471/handling-of-huge-blobs-in-mysql
