Altera_Forum
Honored Contributor
11 years ago

Problem reading a big txt file using System Console and loading it to SDRAM
Hi forum,
I'm trying to load a txt file containing 1,350,000 32-bit words into the SDRAM of a DE0-Nano dev board, but System Console keeps freezing or crashing.

First I tried reading the whole file into a variable (set data [split $file_data "\n"]) and writing it to the SDRAM in a single command (master_write_32 $m $sdram $data), but that crashes System Console instantly.

Then I tried reading the file line by line and writing each word to the SDRAM as I went:

--- Quote Start ---
while { [gets $fp line] >= 0 } {
    foreach word [split $line] {
        master_write_32 $m [expr {$SDRAM + ($i * 4)}] $word
        incr i
    }
}
--- Quote End ---

I left the code running for a whole day, only to find that System Console had frozen :( The same code works perfectly on a file with only 10k lines.

So I came here to ask for help: is there any other way to do this? Is there a way to break the big file into smaller files using Tcl? Thanks for the help.
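A middle ground between the two attempts above might be to write the data in fixed-size chunks: master_write_32 accepts a list of values, so instead of one 1.35M-word call or 1.35M single-word calls, the list can be split into moderate sublists and written batch by batch. This is a sketch, not a tested fix; $m and $SDRAM are assumed to be the open master service path and base address from the original post, and the chunk size of 4096 is an arbitrary guess.

```tcl
# Split a Tcl list into sublists of at most $size elements each.
proc chunk_list {data size} {
    set chunks {}
    set n [llength $data]
    for {set i 0} {$i < $n} {incr i $size} {
        lappend chunks [lrange $data $i [expr {$i + $size - 1}]]
    }
    return $chunks
}

# Usage sketch (hardware-dependent, so commented out; assumes $m and
# $SDRAM are set up as in the original post):
# set addr $SDRAM
# foreach chunk [chunk_list $data 4096] {
#     master_write_32 $m $addr $chunk
#     set addr [expr {$addr + 4 * [llength $chunk]}]
# }
```

Each chunk advances the target address by 4 bytes per 32-bit word written, so the batches land back-to-back in SDRAM.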
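On the last question, splitting a big file into smaller ones is straightforward in plain Tcl. A minimal sketch, assuming one word per line and placeholder output names of the form part_NNN.txt:

```tcl
# Split a large text file into smaller files of $lines_per_file lines
# each; returns the number of part files written. Output file names
# (part_000.txt, part_001.txt, ...) are placeholders.
proc split_file {path lines_per_file} {
    set in [open $path r]
    set part 0
    set out ""
    set count 0
    while {[gets $in line] >= 0} {
        if {$count % $lines_per_file == 0} {
            if {$out ne ""} { close $out }
            set out [open [format "part_%03d.txt" $part] w]
            incr part
        }
        puts $out $line
        incr count
    }
    if {$out ne ""} { close $out }
    close $in
    return $part
}
```

The resulting part files could then be loaded one at a time, so no single read or write has to hold the whole 1.35M-word data set.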