Ask HN: Sorting massive text files?
44 points by JBerlinsky 2008 days ago | 68 comments
I've got a ~35GB text file full of data, and I want to parse it so I only have unique results in the end file. In the past, I've had no problem with cat FILE | sort | uniq, but I've never worked with anything of this magnitude. Even running a line count takes an extraordinary amount of time:
time cat FILE | wc -l
Any suggestions on how I can go about getting unique records from this type of file?
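One common answer, sketched below: GNU sort already handles files larger than RAM by doing an external merge sort, spilling sorted runs to temporary files and merging them, so a single `sort -u` can replace the `cat | sort | uniq` pipeline. The file name `FILE`, the output name `unique.txt`, the temp directory `/big/tmp`, and the buffer/thread sizes are all placeholders to adapt to the actual machine.

```shell
# LC_ALL=C       - byte-wise comparison, typically much faster than locale-aware sorting
# -u             - keep only unique lines (replaces the trailing "| uniq")
# -S 4G          - in-memory buffer size before spilling sorted runs to disk
# -T /big/tmp    - temp directory; needs roughly the input file's size in free space
# --parallel=4   - GNU extension: sort chunks on 4 threads
LC_ALL=C sort -u -S 4G -T /big/tmp --parallel=4 -o unique.txt FILE
```

Note that `LC_ALL=C` changes the sort order (raw bytes rather than locale collation); that is fine when the goal is only deduplication, since equal lines are still adjacent after sorting.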