I have a file (testfile) on HDFS and I want to know how many lines it contains.
In Linux, I can do:
wc -l testfile
Can I do something similar with a hadoop fs command?
There is no hadoop fs command that does this directly. Either write a MapReduce job with the logic explained in this post, or use a Pig script like the following:
A = LOAD 'file' USING PigStorage() AS (...);
B = GROUP A ALL;
cnt = FOREACH B GENERATE COUNT(A);
Make sure your Snappy file has the correct extension so that Pig can detect and read it.
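If you go the MapReduce route instead, the logic is the same as the Pig script: each mapper emits a partial line count for its input split, and a single reducer sums them. Here is a minimal Hadoop Streaming-style sketch in Python (an illustration of the counting pattern, not the exact code from the linked post; the function names and the local simulation are my own):

```python
#!/usr/bin/env python3
# Sketch of a line-counting MapReduce job in the Hadoop Streaming style.
# Each mapper counts the lines in its own input split and emits one
# ("lines", count) pair; the reducer sums the partial counts.

def mapper(split_lines):
    # Emit a single partial count for this mapper's split.
    count = sum(1 for _ in split_lines)
    return [("lines", count)]

def reducer(pairs):
    # Sum the partial counts emitted by all mappers.
    return sum(count for _, count in pairs)

if __name__ == "__main__":
    # Local simulation: pretend each list is one mapper's input split.
    splits = [["a", "b"], ["c"], ["d", "e", "f"]]
    pairs = [pair for split in splits for pair in mapper(split)]
    print(reducer(pairs))  # prints 6
```

On a real cluster you would wire the mapper and reducer up as separate scripts reading stdin and writing tab-separated key/value pairs, and submit them with the Hadoop Streaming jar.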