How to sort 30 million CSV records in PowerShell

Submitted by 此生再无相见时 on 2021-02-11 12:50:03

Question


I am using an OleDbConnection to sort the first column of a CSV file. The OleDb query completes successfully for up to 9 million records within about 6 minutes, but when I run it against 10 million records I get the following error message.

Exception calling "ExecuteReader" with "0" argument(s): "The query cannot be completed. Either the size of the query result is larger than the maximum size of a database (2 GB), or there is not enough temporary storage space on the disk to store the query result."

Is there any other solution to sort 30 million records using PowerShell?

Here is my script:

$OutputFile = "D:\Performance_test_data\output1.csv"
$stream = [System.IO.StreamWriter]::new( $OutputFile )

$sb = [System.Text.StringBuilder]::new()
$sw = [Diagnostics.Stopwatch]::StartNew()

$conn = New-Object System.Data.OleDb.OleDbConnection("Provider=Microsoft.ACE.OLEDB.12.0;Data Source='D:\Performance_test_data\';Extended Properties='Text;HDR=Yes;CharacterSet=65001;FMT=Delimited';")
$cmd=$conn.CreateCommand()
$cmd.CommandText="Select * from 1crores.csv order by col6"

$conn.open()

$data = $cmd.ExecuteReader()

echo "Query has been completed!"
$stream.WriteLine( "col1,col2,col3,col4,col5,col6")

while ($data.read()) 
{ 
  $stream.WriteLine( $data.GetValue(0) +',' + $data.GetValue(1)+',' + $data.GetValue(2)+',' + $data.GetValue(3)+',' + $data.GetValue(4)+',' + $data.GetValue(5))

}
echo "data written successfully!!!"

$stream.close()
$sw.Stop()
$sw.Elapsed

$cmd.Dispose()
$conn.Dispose()

Answer 1:


Putting performance aside and at least coming to a solution that works (meaning one that doesn't hang due to memory shortage), I would rely on the PowerShell pipeline. The issue, though, is that to sort objects you have to stall the pipeline, since the last object could potentially end up being the first one.
To work around this, I would first do a coarse division on the first character(s) of the property concerned. Once that is done, fine-sort each coarse division and append the results:

Function Sort-BigObject {
    [CmdletBinding()][OutputType([scriptblock])] param(
        [Parameter(ValueFromPipeLine = $True)]$InputObject,
        [Parameter(Position = 0)][String]$Property,
        [ValidateRange(1,9)]$Coarse = 1,
        [System.Text.Encoding]$Encoding = [System.Text.Encoding]::Default
    )
    Begin {
        # One temporary file per coarse bucket, keyed (and kept sorted) by the
        # first $Coarse character(s) of the sort property
        $TemporaryFiles = [System.Collections.SortedList]::new()
    }
    Process {
        if ($InputObject.$Property) {
            $Grain = $InputObject.$Property.SubString(0, $Coarse)
            if (!$TemporaryFiles.Contains($Grain)) { $TemporaryFiles[$Grain] = New-TemporaryFile }
            $InputObject | Export-Csv $TemporaryFiles[$Grain] -Encoding $Encoding -Append
        } else { $InputObject } # objects with an empty sort property are passed straight through
    }
    End {
        # Fine-sort each coarse bucket in key order and emit the results
        Foreach ($TemporaryFile in $TemporaryFiles.Values) {
            Import-Csv $TemporaryFile -Encoding $Encoding | Sort-Object $Property
            Remove-Item -LiteralPath $TemporaryFile
        }
    }
}

Usage
(Don't assign the output to a variable and don't use parentheses, as either would stall the pipeline and pull everything into memory.)

Import-Csv .\1crores.csv | Sort-BigObject <PropertyName> | Export-Csv .\output1.csv
  • If the temporary files still get too big to handle, you might need to increase the -Coarse parameter

Caveats (improvement considerations)

  • Objects with an empty sort property are output immediately
  • The sort column is presumed to be a (single) string column
  • I presume the performance is poor (I didn't do a full test on 30 million records, but 10,000 records take about 8 seconds, which works out to roughly 8 hours). Consider replacing the native PowerShell cmdlets with .NET streaming methods, buffering/caching file input and output, or parallel processing (a rough sketch of the streaming idea follows below).
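
As a rough illustration of that last point, here is a minimal sketch of the coarse-split phase done with .NET StreamReader/StreamWriter instead of Export-Csv. It assumes a plain comma-delimited file with a header row and no quoted fields containing commas; the paths, the zero-based column index, and the bucket file naming are all assumptions, not part of the answer above.

# Split the input into one bucket file per first character of the sort key,
# streaming line by line so the full file is never held in memory.
$InputFile       = 'D:\Performance_test_data\1crores.csv'
$OutputFile      = 'D:\Performance_test_data\output1.csv'
$SortColumnIndex = 5                       # zero-based index of the sort column (assumed)
$TempDir         = Join-Path $env:TEMP 'csv-buckets'
New-Item -ItemType Directory -Path $TempDir -Force | Out-Null

$reader  = [System.IO.StreamReader]::new($InputFile)
$header  = $reader.ReadLine()
$writers = @{}                             # one StreamWriter per coarse bucket

while (-not $reader.EndOfStream) {
    $line  = $reader.ReadLine()
    $key   = $line.Split(',')[$SortColumnIndex]
    $grain = if ($key) { $key.Substring(0, 1) } else { '_' }   # assumes keys start with filename-safe characters
    if (-not $writers.ContainsKey($grain)) {
        $writers[$grain] = [System.IO.StreamWriter]::new((Join-Path $TempDir "$grain.txt"))
    }
    $writers[$grain].WriteLine($line)
}
$reader.Close()
$writers.Values | ForEach-Object { $_.Close() }

# Fine-sort each (now small) bucket in key order and append it to the final output.
$out = [System.IO.StreamWriter]::new($OutputFile)
$out.WriteLine($header)
foreach ($bucket in (Get-ChildItem $TempDir -Filter '*.txt' | Sort-Object Name)) {
    Get-Content $bucket.FullName |
        Sort-Object { $_.Split(',')[$SortColumnIndex] } |
        ForEach-Object { $out.WriteLine($_) }
    Remove-Item $bucket.FullName
}
$out.Close()

Because each bucket only holds rows whose keys share the same leading character, each per-bucket sort stays small enough to fit comfortably in memory.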



Answer 2:


You can try using this:

$CSVPath = 'C:\test\CSVTest.csv'
$Delimiter = ';'

# list we use to hold the results
$ResultList = [System.Collections.Generic.List[Object]]::new()

# Create a stream (I use OpenText because it returns a streamreader)
$File = [System.IO.File]::OpenText($CSVPath)

# Read and parse the header
$HeaderString = $File.ReadLine()

# Get the properties from the string, replace quotes
$Properties = $HeaderString.Split($Delimiter).Replace('"',$null)
$PropertyCount = $Properties.Count

# now read the rest of the data, parse it, build an object and add it to a list
while ($File.EndOfStream -ne $true)
{
    # Read the line
    $Line = $File.ReadLine()
    # split the fields and replace the quotes
    $LineData = $Line.Split($Delimiter).Replace('"',$null)
    # Create a hashtable with the properties (we convert this to a PSCustomObject later on). I use an ordered hashtable to keep the order
    $PropHash = [System.Collections.Specialized.OrderedDictionary]@{}
    # for loop to add the properties and values
    for ($i = 0; $i -lt $PropertyCount; $i++)
    { 
        $PropHash.Add($Properties[$i],$LineData[$i])
    }
    # Now convert the data to a PSCustomObject and add it to the list
    $ResultList.Add($([PSCustomObject]$PropHash))
}

# Close the file handle now that all lines have been read
$File.Close()

# Now you can sort this list using Linq:
Add-Type -AssemblyName System.Linq
# Sort using propertyname (my sample data had a prop called "Name")
$Sorted = [Linq.Enumerable]::OrderBy($ResultList, [Func[object,string]] { $args[0].Name })

Instead of using Import-Csv I've written a quick parser that uses a StreamReader, parses the CSV data on the fly, and turns each line into a PSCustomObject, which is then added to a list.
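
One thing the snippet above stops short of is writing the sorted result back to disk. As a small, hedged follow-up (the output path below is an assumption), the sorted sequence can simply be piped to Export-Csv, which also forces the lazy OrderBy to enumerate:

# Write the sorted objects out, reusing the delimiter from the parsing step above.
$Sorted | Export-Csv -Path 'C:\test\CSVTest_sorted.csv' -Delimiter $Delimiter -NoTypeInformation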

edit: fixed the linq sample




Answer 3:


You could try SQLite:

$OutputFile = "D:\Performance_test_data\output1.csv"

$sw = [Diagnostics.Stopwatch]::StartNew()

sqlite3 output1.db '.mode csv' '.import 1crores.csv 1crores' '.headers on' ".output $OutputFile" 'Select * from 1crores order by 最終アクセス日時'

echo "data written successfully!!!"

$sw.Stop()
$sw.Elapsed



Answer 4:


I downloaded GNU sort.exe from here: http://gnuwin32.sourceforge.net/packages/coreutils.htm It also requires libiconv2.dll and libintl3.dll from the dependencies zip. I basically ran this within cmd.exe; it used a little less than a gig of RAM and took about 5 minutes on a 500 MB file of about 30 million random numbers. sort can also merge already-sorted files with --merge, and you can specify the begin and end key positions to sort on with --key (a field-based example is sketched after the command below). It automatically spills to temp files as needed.

.\sort.exe < file1.csv > file2.csv
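
To sort on a particular column rather than on whole lines, here is a minimal sketch using sort's field options; the column number and file names are assumptions, and quoted fields that themselves contain commas are not handled:

# Sort file1.csv on its 6th comma-separated field (sort key numbers are 1-based).
# The header row gets sorted along with the data; strip it off first and re-attach
# it afterwards if that matters for your file.
.\sort.exe '--field-separator=,' '--key=6,6' '--output=file2.csv' file1.csv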

The built-in Windows sort from the cmd prompt actually works in a similar way. Windows sort also has a /+n option to specify the character column the sort should start at.

sort.exe < file1.csv > file2.csv


Source: https://stackoverflow.com/questions/66057891/how-to-sort-30million-csv-records-in-powershell
