not able to read in vsd_normalized files #3
Comments
I should add that this is on Mac OS X 10.11.6.
OK, so in the Rembrandts.sh script on lines 38-41, adding a newline (\n) to the echo command seems to fix the warning, but the other error remains. This is an example of what I have changed in the Rembrandts.sh script:
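Roughly along these lines (a sketch with placeholder file and variable names, not the actual lines 38-41):

```bash
# Illustrative sketch only -- placeholder paths and variables, not the real
# lines 38-41 of Rembrandts.sh. The change is simply appending "\n" to the
# string that echo writes, so the output file ends with a newline.
tmpFolder=./tmp/miR29_analysis
mkdir -p "$tmpFolder"
header="Gene\tSample1\tSample2\tSample3\tSample4"

# before: the last write left the file without a terminating newline
# echo -ne "$header" > "$tmpFolder/vsd_normalized.intronic.all.mx.txt"

# after: "\n" appended to the echo argument
echo -ne "$header\n" > "$tmpFolder/vsd_normalized.intronic.all.mx.txt"
```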
Ah, now I see that creates a problem later on too. I think it's when it's trying to write new columns to the file and close it. Back to the drawing board.
I have it working now. The only thing I did was delete the newline '\n' I had added and then run the script over the top of the existing files. Somehow this has fixed it. Maybe it had something to do with needing content in those files to bypass the error?
Hi
I'm getting an error when trying to run the script. It is related to the creation of the vsd_normalized files, such that R cannot load them at line 15:
exon <- read.csv(paste(tempFolder,"/vsd_normalized.exonic.all.centered.mx.txt",sep=""),sep="\t")
The error itself comes when I'm loading 4 samples (8 files in total: 4 exonic, 4 intronic):
Warning message: In read.table(file = file, header = header, sep = sep, quote = quote, : incomplete final line found by readTableHeader on './tmp/miR29_analysis/vsd_normalized.intronic.all.mx.txt'
It seems that the file is created without the correct line ending, i.e. no final newline. If I manually add the line ending to the file, R can load it, but the next time I run the script the file is regenerated and it fails again.
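By "manually add the line ending" I mean something like the following sketch, which appends a newline only if the last byte of the file isn't one (using the path from the warning above):

```bash
# Sketch of the manual workaround: append a final newline to the generated
# matrix only if the last byte is not already a newline.
f=./tmp/miR29_analysis/vsd_normalized.intronic.all.mx.txt
if [ -n "$(tail -c 1 "$f")" ]; then
    printf '\n' >> "$f"
fi
```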
I'll try to have a go at a fix, but I would appreciate it if you have any advice.
Thanks
Sam