csvRead fails to load a long CSV file (more than 10000 lines)
Reported by Antoine Monmayrant
Attachment: long_csv (1.18 MB, text/plain): a CSV file too long to be loaded in Scilab
BUG DESCRIPTION:
----------------
csvRead fails to load a long CSV file (more than 10000 lines): Scilab hangs (i.e. the spinning icon of death stays forever and Scilab has to be killed).
Splitting the file into smaller CSV files and loading them one at a time works perfectly.
ERROR LOG:
----------
None; Scilab hangs.
HOW TO REPRODUCE THE BUG:
-------------------------
// csvRead_bug.sci : use the long_csv file attached to this bug report
// csvRead fails to load a long CSV file (more than 10000 lines)
// splitting the file into smaller CSV files and loading them one at a time works
// In a linux terminal:
// split -l10000 long_csv shorter_csv_
//
// long_csv is 20002 lines long
// shorter_csv_aa is 10000 lines long
// shorter_csv_ab is 10000 lines long
// shorter_csv_ac is 2 lines long
// This works like a charm
fname="shorter_csv_aa";
[Ma, comments]=csvRead(fname, ',', '.', 'double', []);
fname="shorter_csv_ab";
[Mb, comments]=csvRead(fname, ',', '.', 'double', []);
fname="shorter_csv_ac";
[Mc, comments]=csvRead(fname, ',', '.', 'double', []);
M=[Ma;Mb;Mc];
//// FAILS : Scilab keeps working forever and cannot be interrupted with Ctrl+C
fname="long_csv";
[M, comments]=csvRead(fname, ',', '.', 'double', []);
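A possible workaround, sketched below and untested against this bug: Scilab's csvRead accepts a range argument of the form [first_row first_col last_row last_col], so the long file could be read in 10000-line blocks (the size known to work) and concatenated, without splitting it on disk. The column count ncols is an assumption about long_csv and would need to be adjusted to the real file.

```scilab
// Sketch: chunked read via csvRead's range argument.
// Assumptions: nlines matches long_csv (20002 lines per this report),
// ncols matches its actual column count, and blocks of 10000 lines
// load fine (as the split-file test above shows).
fname  = "long_csv";
nlines = 20002;  // total line count (e.g. wc -l long_csv)
ncols  = 2;      // assumed number of columns; adjust to the real file
blk    = 10000;  // block size known to load without hanging
M = [];
for r1 = 1:blk:nlines
    r2 = min(r1 + blk - 1, nlines);
    // range = [first_row first_col last_row last_col]
    M = [M; csvRead(fname, ',', '.', 'double', [], [], [r1 1 r2 ncols])];
end
```

This avoids the shell-side split entirely, at the cost of re-opening the file once per block.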
OTHER INFORMATION:
------------------
Bug tested on 64-bit Linux with scilab-branch-6.0-1534494461
Could this be related to http://bugzilla.scilab.org/show_bug.cgi?id=15437 or http://bugzilla.scilab.org/show_bug.cgi?id=15445 ?