Zend_Db: what's the right strategy to export large amounts of data to CSV? – chunks – fetchAll – fetchRow [duplicate]

Posted by 我只是一个虾纸丫 on 2019-12-13 04:44:40

Question


I have to export a huge amount of data, and I also have to transform every record slightly in PHP. What's the right strategy for exporting large amounts of data?

  • Should I split the Zend_Db request into multiple chunked queries with limit(1000, x)?
  • Should I use fetchAll or fetchRow?
  • Which of fetchRow or fetchAll performs better when performance matters?

I cannot use SQL's SELECT ... INTO OUTFILE because I have to interpret the xml/html coming from one column, and as far as I know MySQL cannot do that. This means I have to process the data in PHP using either fetchRow or fetchAll rather than on the MySQL server. Since I'm fetching a huge amount of data, fetchAll may lead to an out-of-memory error in PHP. So I'm not sure whether I can avoid this by using fetchRow, or whether I have to use chunks anyway. Is fetchRow slower than fetchAll?
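For reference, a minimal sketch of the chunked approach might look like the following, assuming a configured Zend_Db adapter in $db, a table named your_table, and a hypothetical transformRecord() helper standing in for the per-record xml/html processing mentioned above:

<?php
// Sketch only: $db is assumed to be a configured Zend_Db_Adapter instance,
// and transformRecord() is a hypothetical callback for the per-record transform.
$chunkSize = 1000;
$offset    = 0;
$out       = fopen('/tmp/export.csv', 'w');

do {
    $select = $db->select()
                 ->from('your_table', array('field', 'field1', 'field2'))
                 ->limit($chunkSize, $offset);

    $stmt = $select->query();
    $rows = 0;

    while ($row = $stmt->fetch()) {      // one row at a time, memory stays bounded
        $row = transformRecord($row);    // hypothetical per-record PHP transform
        fputcsv($out, $row);
        $rows++;
    }

    $offset += $chunkSize;
} while ($rows === $chunkSize);          // last chunk returned fewer rows: done

fclose($out);

Paging with limit() keeps each query's result set small, and iterating the statement with fetch() avoids holding an entire chunk in a PHP array the way fetchAll() would.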


Answer 1:


SELECT field, field1, field2 
INTO OUTFILE '/home/user/out.csv' 
FIELDS TERMINATED BY ',' 
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM your_table WHERE 1=1
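
For completeness, issuing this statement through Zend_Db might look like the sketch below, assuming $db is a configured adapter and the MySQL user has the FILE privilege. Note that INTO OUTFILE writes the file on the database server's filesystem and cannot apply the per-record PHP transformation the question asks about:

<?php
// Sketch only: $db is assumed to be a configured Zend_Db_Adapter instance.
// The nowdoc keeps the backslash escapes exactly as MySQL expects them.
$sql = <<<'SQL'
SELECT field, field1, field2
INTO OUTFILE '/home/user/out.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM your_table WHERE 1=1
SQL;
$db->query($sql);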


Source: https://stackoverflow.com/questions/11198008/zend-db-whats-the-right-strategy-to-export-large-amounts-of-data-to-csv-chunk
