Question
I have a SqlDataReader reading a large record set (approximately 1M records), and I'm trying to export it to a PDF document using iTextSharp.
This is my code:
if (reader.HasRows)
{
    int rowNum = 0;
    while (reader.Read())
    {
        // Alternate the row shading
        if (rowNum % 2 == 1)
            datatable.DefaultCell.GrayFill = 0.8f;
        else
            datatable.DefaultCell.GrayFill = 0.95f;

        if (meRes.Trans(Lang, "Dir", CompanyID).ToUpper() == "RTL")
        {
            // Right-to-left layout: add the columns in reverse order
            for (int i = reader.FieldCount - 1; i >= 0; i--)
            {
                object o = reader[i];
                datatable.AddCell(new Phrase(o.ToString(), fntList));
            }
        }
        else
        {
            for (int i = 0; i < reader.FieldCount; i++)
            {
                object o = reader[i];
                datatable.AddCell(new Phrase(o.ToString(), fntList));
            }
        }

        rowNum++;
    }
    myDocument.Add(datatable);
}
When I run this, it causes a terrible memory leak, because every row is buffered in the table before anything is written. What can I do differently to improve this?
Answer 1:
You can cap the number of rows per page and flush the table each time the cap is reached, deleting the rows that have already been written, so the whole table is never held in memory at once:

if (rowNum > 0 && table1.Rows.Count % 7 == 0) // 7 = number of rows per page
{
    pdfDoc.Add(table1);       // render the buffered rows into the document
    table1.DeleteBodyRows();  // free the rows that were just rendered
    pdfDoc.NewPage();
}
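
For completeness, here is a minimal sketch of how that pattern could be folded into the loop from the question. It assumes datatable is a PdfPTable (DeleteBodyRows is a PdfPTable method) and reuses the question's variable names; the BufferRows constant and the final guarded flush are illustrative additions rather than part of the original answer, and the RTL branch is omitted for brevity.

// Minimal sketch, assuming datatable is a PdfPTable;
// BufferRows is an arbitrary illustrative buffer size.
const int BufferRows = 1000;
int rowNum = 0;

while (reader.Read())
{
    datatable.DefaultCell.GrayFill = (rowNum % 2 == 1) ? 0.8f : 0.95f;

    for (int i = 0; i < reader.FieldCount; i++)
        datatable.AddCell(new Phrase(reader[i].ToString(), fntList));

    rowNum++;

    // Periodically write the buffered rows and release them
    if (rowNum % BufferRows == 0)
    {
        myDocument.Add(datatable);
        datatable.DeleteBodyRows();
    }
}

// Flush whatever remains in the last partial buffer
if (datatable.Rows.Count > 0)
    myDocument.Add(datatable);

The key point is that Document.Add renders the buffered rows into the PDF output, after which DeleteBodyRows lets them be garbage-collected, so memory use stays proportional to the buffer size rather than to the full 1M-row result set.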
Source: https://stackoverflow.com/questions/14101142/itextsharp-very-large-table-memory-leak