I'm using Pandas' to_sql function to write to MySQL, which is timing out due to large frame size (1M rows, 20 columns).
http://pandas.pydata.org/panda
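Until a version with built-in chunking is available, one workaround is to split the frame yourself and append it chunk by chunk. A minimal sketch (the engine URL, table name, and frame are made up for illustration; an in-memory SQLite engine stands in for the real MySQL connection):

```python
import numpy as np
import pandas as pd
from sqlalchemy import create_engine

# In-memory SQLite stands in for the real database here; for MySQL you
# would use something like create_engine('mysql+pymysql://user:pass@host/db').
engine = create_engine('sqlite://')

df = pd.DataFrame(np.random.randn(50000, 20),
                  columns=['c%d' % i for i in range(20)])

chunksize = 20000  # rows per INSERT batch
for i, start in enumerate(range(0, len(df), chunksize)):
    # Replace the table on the first chunk, append on the rest.
    df.iloc[start:start + chunksize].to_sql(
        'mytable', engine, index=False,
        if_exists='append' if i else 'replace')
```

Each batch issues a smaller INSERT, so no single statement runs long enough to hit the timeout.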
Update: this functionality has been merged into pandas master and will be released in 0.15 (probably end of September), thanks to @artemyk! See https://github.com/pydata/pandas/pull/8062
So starting from 0.15, you can specify the chunksize argument and simply do, e.g.:
df.to_sql('table', engine, chunksize=20000)
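A runnable sketch of the call above (using an in-memory SQLite engine in place of MySQL, and an invented frame; only the chunksize usage comes from the answer itself):

```python
import numpy as np
import pandas as pd
from sqlalchemy import create_engine

# SQLite in-memory engine as a stand-in; for MySQL, something like
# create_engine('mysql+pymysql://user:pass@host/dbname') would be used.
engine = create_engine('sqlite://')

df = pd.DataFrame(np.random.randn(100000, 20),
                  columns=['col%d' % i for i in range(20)])

# With chunksize, the rows are written in batches of 20000 instead of
# one huge INSERT, which avoids the timeout on large frames.
df.to_sql('table', engine, chunksize=20000, index=False)
```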