I have two Excel files.
The first Excel file contains a Person Name and a Total Days Present column, e.g.:

PersonName   TotalDays
xyz
ab

The second Excel file contains each person's name together with a status (Present/Absent) for each day. I want to fill the TotalDays column in the first file by counting each person's "Present" entries in the second file.
An array of 20,000 elements containing 4 Variants each will take up less than 2 MB of RAM. I don't think memory has anything to do with your problem -- unless you happen to be using an old computer with 2 MB of RAM or something like that.
A more likely reason why your code is so slow is that you are looping through cells. Each communication between VBA and Excel sheet data carries significant overhead, and this adds up when you refer to many cells one at a time. In your case, your loop does up to 200,000 separate cell references.
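To see where the time goes, the slow version presumably looks something like this (a hypothetical sketch; colStatus and the loop bounds are placeholders, not your actual names):

    ' Each .Cells(...).Value below is a separate round trip between VBA
    ' and the sheet; a few of these per row, over 20,000 rows, quickly
    ' approaches the 200,000 references mentioned above.
    For J = 1 To 20000
        If lookUpRange.Cells(J, colStatus).Value = "Fail" Then
            Slno = lookUpRange.Cells(J, 1).Value
            AuditType = lookUpRange.Cells(J, 2).Value
            ' ... more per-cell reads ...
        End If
    Next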
Instead, you should load all your data at once into a Variant array, and then loop through that array, as shown below. This is significantly faster (even though it uses more memory, not less; but again, I don't think memory is your issue).
lblStatus.Caption = "Loading to memory"

Dim ArrAuditData() As AData
Dim varTemp As Variant
Dim TotalLookUpCount As Long   ' Long, not Integer: row counts can exceed 32,767
Dim J As Long

' Load everything into a Variant array in one shot.
varTemp = lookUpRange

' Size the array up front for the largest conceivable number of elements;
' it gets trimmed once, after the loop.
ReDim ArrAuditData(1 To UBound(varTemp, 1)) As AData

For J = 1 To UBound(varTemp, 1)
    If varTemp(J, cmbChoice.ListIndex) = "Fail" Then
        TotalLookUpCount = TotalLookUpCount + 1   ' increment first: the array is 1-based
        ArrAuditData(TotalLookUpCount).AuditType = varTemp(J, cmdAudit2.ListIndex)
        ArrAuditData(TotalLookUpCount).TransTime = varTemp(J, cmbChoice.ListIndex - 1)
        ArrAuditData(TotalLookUpCount).AuditValue = varTemp(J, cmbChoice.ListIndex)
        ArrAuditData(TotalLookUpCount).Slno = varTemp(J, 1)   ' arrays read from a Range are 1-based, so column 1, not 0
    ElseIf varTemp(J, cmbChoice.ListIndex) = "" And J > 4 Then
        Exit For
    End If
    DoEvents   ' keeps the UI responsive, but it isn't free; consider calling it only every few hundred rows
Next

' Trim off the unused elements with a single ReDim Preserve, outside the loop.
ReDim Preserve ArrAuditData(1 To TotalLookUpCount) As AData
For further reading, have a look at this old but still relevant article: http://www.avdf.com/apr98/art_ot003.html
If you still think RAM is the issue, then please show us the AData type declaration.
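For what it's worth, given the four fields used above, the declaration presumably looks something like this sketch (field names inferred from the loop, not from your actual code):

Private Type AData
    Slno As Variant
    AuditType As Variant
    TransTime As Variant
    AuditValue As Variant
End Type

Four Variants at 16 bytes each is 64 bytes per element, so 20,000 elements come to roughly 1.3 MB, consistent with the estimate above.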
EDIT: Also, never ReDim Preserve inside a loop like that! ReDim Preserve is a very expensive operation: each call allocates a new array and copies every element across, and it rarely needs to be done more than once on any given array. Doing it 20,000 times will slow down your code. Here I have taken it out of the loop and used it just once, at the end, to trim off the unused elements. (Notice how I initially ReDim'ed the array to fit the largest conceivable number of elements.)
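For contrast, this is the pattern to avoid (hypothetical, mirroring the loop above):

For J = 1 To UBound(varTemp, 1)
    ' ...
    ReDim Preserve ArrAuditData(1 To J) As AData   ' allocates a new array and copies every element, on every pass
Next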
I would suggest a different approach.
If I interpret the question correctly, you want to fill in, for each person in file1, the number of days file2 marks them as Present (and, likewise, Absent).
The approach I would use is the COUNTIF function (or, if you have Excel 2007 or later, COUNTIFS).
Assume file2 contains a table named StatusData with Name and Status columns, and file1 contains a table named StatusReport with Name, Present and Absent columns.
Solution for Excel 2007 or 2010

Present count, in the StatusReport table in file1 (cell B2):

=COUNTIFS(file2.xlsx!StatusData[Name],[Name],file2.xlsx!StatusData[Status],StatusReport[[#Headers],[Present]])

Absent count (cell C2):

=COUNTIFS(file2.xlsx!StatusData[Name],[Name],file2.xlsx!StatusData[Status],StatusReport[[#Headers],[Absent]])
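As a hypothetical illustration of what those formulas count, given this sample data:

StatusData (file2)        StatusReport (file1, after the formulas)
Name   Status             Name   Present   Absent
xyz    Present            xyz    2         1
xyz    Absent             ab     1         0
xyz    Present
ab     Present

The Present formula for xyz counts the two rows where Name is xyz and Status matches the Present header.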
Solution for Excel 2003
Add an extra column D to the file2 StatusData table (call it Code) and fill it with a concatenated Name_Status key, in file2 cell D2:

=Sheet1!$A2&"_"&Sheet1!$C2

Then in file1, cell B2 (Present) and cell C2 (Absent):

=COUNTIF([file2.xls]Sheet1!$D:$D,Sheet2!$A2&"_"&Sheet2!$B$1)
=COUNTIF([file2.xls]Sheet1!$D:$D,Sheet2!$A2&"_"&Sheet2!$C$1)
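The helper column simply concatenates Name and Status into one key (a row with Name xyz and Status Present gets the Code xyz_Present), so the single-criterion COUNTIF of Excel 2003 can emulate the two-criterion COUNTIFS: the lookup key $A2&"_"&$B$1 builds the same xyz_Present string from the report side.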
Note: while these formulas give the same result, the COUNTIFS + Table references version in 2010 is so much faster it's not funny (my test on about 300,000 rows updates in a few seconds).