"skip lines" feature #336
You can do this with
I just ran into the same issue with school data in Massachusetts.
I know about tail. It is, however, only applicable for command-line usage.
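For completeness, the tail-style workaround can also be done in plain Python before the data ever reaches a CSV parser. This is a minimal sketch, not csvkit's API: it drops a fixed number of leading junk lines with `itertools.islice` and parses the rest with the standard library's `csv.reader` (the sample data is invented for illustration).

```python
import csv
import io
from itertools import islice

# Invented sample: two junk lines precede the real header.
data = "junk line 1\njunk line 2\nname,score\nalice,90\nbob,85\n"

with io.StringIO(data) as f:
    # Skip the first two lines, then hand the remainder to csv.reader.
    rows = list(csv.reader(islice(f, 2, None)))

print(rows)  # header row followed by the data rows
```

The same pattern works on a real file handle opened with `open(...)`, which is roughly what `tail -n +3 file.csv | csvlook` does at the shell.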
There are many kinds of junk lines that can appear in a CSV / text file, especially in fixed-width reports. They generally appear as report headers/footers, page headers/footers, and blank or junk rows. I handle such reports regularly. Some old reports cause a lot of trouble because they have junk rows after each data row. I think there should be a way to identify all those junk rows in the schema file while running "in2csv" and eliminate them during import, as they serve no purpose.
Hi, I'd like to weigh in on this issue as well. I work with tabular files almost 24/7 and have found csvkit very powerful for looking at and selecting data from a file in the terminal before continuing to work with it. I habitually add important information about file generation as commented lines at the top of the file, which is very important for backtracking issues. Below is an example. I am circumventing these things with e.g. What are your thoughts?
@mpschr I have a similar problem when working with VCF files, which have comments prefixed with
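The comment-line cases above (generation metadata at the top of a file, VCF-style prefixed comments) can be handled by filtering lines on a prefix before parsing. This is an illustrative sketch only; `skip_comments` and the `"#"` prefix are assumptions, not part of csvkit:

```python
import csv
import io

def skip_comments(lines, prefix="#"):
    """Yield only lines that do not start with the comment prefix."""
    for line in lines:
        if not line.startswith(prefix):
            yield line

# Invented sample: two commented metadata lines before the header.
data = "# generated 2014-09-18\n# source: pipeline v2\nname,score\nalice,90\n"

with io.StringIO(data) as f:
    rows = list(csv.reader(skip_comments(f)))

print(rows)
```

A generator keeps this streaming-friendly: nothing is read into memory beyond the line being inspected, which matters for large files.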
Related: #669
Noting that
Sometimes the first few lines of a CSV need to be skipped (header comments, copyright lines). It would be nice to have this capability in csvkit.
I think it belongs in the reader's constructor.
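A constructor-based design could look like the following. This is a hypothetical sketch of the idea, not csvkit's actual reader; `SkippingReader` and its `skip_lines` parameter are invented names:

```python
import csv
import io

class SkippingReader:
    """Hypothetical wrapper: the constructor discards a number of
    leading lines before normal CSV parsing begins."""

    def __init__(self, f, skip_lines=0, **kwargs):
        for _ in range(skip_lines):
            next(f, None)  # tolerate files shorter than skip_lines
        self._reader = csv.reader(f, **kwargs)

    def __iter__(self):
        return iter(self._reader)

# Invented sample: one copyright line before the header.
data = "copyright 2014\nname,score\nalice,90\n"
reader = SkippingReader(io.StringIO(data), skip_lines=1)
result = list(reader)
print(result)
```

Putting the option in the constructor means every tool built on the reader (in2csv, csvlook, etc.) inherits it for free, rather than each utility reimplementing its own skip logic.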