Unexpected behaviour of DDR3 SDRAM ready and data_valid signals
Hi,
I am trying to configure the DDR3 SDRAM controller with the UniPhy IP and a Nios II processor, together with my custom RTL. The custom RTL generates read/write requests on an interface with a data width of 128 bits.
I have attached all the settings I selected in the SDRAM IP in the snapshots (Capture 1, 2, 3, 4).
Problem I am facing: when I issue a read request with some burst size (say 10), I was expecting the SDRAM to return the data (a total of 128 x 10 = 1280 bits) in 10 consecutive clock cycles. Instead, the data arrives in separate chunks (128 bits every 3 or 4 clock cycles) rather than as a single burst (128 bits on each of 10 consecutive clock cycles). This has degraded the performance of my design drastically.
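To put a rough number on the degradation (these figures are only an estimate based on the behaviour described above, assuming one 128-bit beat every 3.5 cycles on average):

```python
# Rough throughput comparison for a 10-beat burst of 128-bit words.
# The "observed" spacing of one beat every ~3.5 cycles is an assumption
# taken from the behaviour described in this post, not a measured value.

BEATS = 10   # burst length
WIDTH = 128  # bits per beat

ideal_cycles = BEATS           # one beat per clock cycle
observed_cycles = BEATS * 3.5  # one beat every ~3-4 cycles

ideal_bw = BEATS * WIDTH / ideal_cycles       # bits per cycle, ideal
observed_bw = BEATS * WIDTH / observed_cycles # bits per cycle, observed

print(f"ideal:    {ideal_bw:.1f} bits/cycle")     # 128.0 bits/cycle
print(f"observed: {observed_bw:.1f} bits/cycle")  # ~36.6 bits/cycle
print(f"slowdown: {observed_cycles / ideal_cycles:.1f}x")
```

So the gaps between data_valid pulses cost roughly a 3.5x loss in effective read bandwidth for this burst size.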
I have also attached a SignalTap (.stp) screenshot (result.png) which illustrates the problem.
Please let me know what mistake I might be making, and how to resolve it so that I receive 128 bits of data on every clock cycle.
regards,
Yogesh