You could try a radix sort. Because radix sort scans the whole list sequentially and only a few times, it can help here by avoiding a huge number of scans and seeks over your 100 GB file.
Radix sort buckets your records on each pass by one part of the key. That part can be a digit, or a date/time component such as year, month, or day. In that case you don't even need to convert the whole string to a DateTime; you only convert the specific part to an int.
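
Here is a minimal in-memory sketch of that idea, assuming each line starts with a timestamp in `yyyy-MM-dd ...` format (the substring offsets and the year range are assumptions; adjust them to your log format). On a 100 GB file each bucket would be a temporary file on disk rather than a `List<string>`, but the passes work the same way:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class RadixByDatePart
{
    // LSD radix sort over log lines: bucket by the least significant date part
    // first (day), then month, then year. Each pass is stable, so the order
    // established by earlier passes is preserved.
    public static List<string> Sort(List<string> lines)
    {
        lines = BucketPass(lines, l => int.Parse(l.Substring(8, 2)), 32);          // day   01-31
        lines = BucketPass(lines, l => int.Parse(l.Substring(5, 2)), 13);          // month 01-12
        lines = BucketPass(lines, l => int.Parse(l.Substring(0, 4)) - 1970, 200);  // year  1970-2169 (assumption)
        return lines;
    }

    // One stable bucket-sort pass keyed by a single integer component.
    static List<string> BucketPass(List<string> lines, Func<string, int> key, int bucketCount)
    {
        var buckets = new List<string>[bucketCount];
        for (int i = 0; i < bucketCount; i++) buckets[i] = new List<string>();
        foreach (var line in lines) buckets[key(line)].Add(line);
        return buckets.SelectMany(b => b).ToList();
    }
}
```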
Edit:
For sorting purposes, you can create a temporary binary file with only 2 columns: the DateTime (DateTime.ToBinary() as an Int64) and the line's byte offset in the source file (as an Int64).
That gives you a much smaller file with fixed-size records, only 16 bytes each, which you can sort much faster (the I/O operations will be faster, at least).
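
A sketch of building that index file, assuming the timestamp is the first 19 characters of each line in `yyyy-MM-dd HH:mm:ss` format and that lines end with a single `\n` (both assumptions; adjust to your log):

```csharp
using System;
using System.Globalization;
using System.IO;
using System.Text;

static class IndexBuilder
{
    // Writes one fixed-size 16-byte record per source line:
    //   Int64 DateTime.ToBinary() | Int64 byte offset of the line in the source file.
    public static void Build(string sourcePath, string indexPath)
    {
        using var reader = new StreamReader(File.OpenRead(sourcePath));
        using var writer = new BinaryWriter(File.Create(indexPath));

        long offset = 0;
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            // Parse only the timestamp prefix; the rest of the line is never touched.
            var ts = DateTime.ParseExact(line.Substring(0, 19),
                                         "yyyy-MM-dd HH:mm:ss",
                                         CultureInfo.InvariantCulture);
            writer.Write(ts.ToBinary());  // 8 bytes
            writer.Write(offset);         // 8 bytes

            // Track offsets manually, since StreamReader buffers ahead.
            // Assumes "\n" line endings and no BOM; use +2 for "\r\n".
            offset += Encoding.UTF8.GetByteCount(line) + 1;
        }
    }
}
```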
Once the temp file is sorted, you can rebuild the full sorted 100 GB log file from it.
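
A sketch of that rebuild step, matching the 16-byte record layout above: walk the sorted index, seek to each stored offset in the source log, and copy that line to the output. Note this costs one random read per output line, so it benefits from an SSD or from batching nearby offsets:

```csharp
using System.IO;

static class Rebuilder
{
    // Reads the sorted 16-byte records, seeks to each stored offset in the
    // source log, and appends that line to the output file.
    public static void Rebuild(string sourcePath, string sortedIndexPath, string outputPath)
    {
        using var index = new BinaryReader(File.OpenRead(sortedIndexPath));
        using var source = File.OpenRead(sourcePath);
        using var lines = new StreamReader(source);
        using var output = new StreamWriter(File.Create(outputPath));

        long records = index.BaseStream.Length / 16;
        for (long i = 0; i < records; i++)
        {
            index.ReadInt64();                // the timestamp; not needed here
            long offset = index.ReadInt64();  // where the line lives in the source

            source.Seek(offset, SeekOrigin.Begin);
            lines.DiscardBufferedData();      // drop stale buffered data after seeking
            output.WriteLine(lines.ReadLine());
        }
    }
}
```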