Altera_Forum
Honored Contributor
15 years ago

PLL Tuning
Hello.
I am slightly confused about tuning a PLL output for an SDRAM clock and hope someone here can answer a couple of questions. I am following this example [http://www.altera.co.kr/_altera/html/_excalibur/nios-sdram-tuning/sdram_pll_tuning.pdf], but some of the language confuses me. The PDF states:

SDRAM Clock Can Lead System Clock by:
Minimum of:
tco_min(FPGA) - th(SDRAM) = 2 ns - 1 ns = 1 ns
tclk - thz(SDRAM) - tsu(FPGA) = 10 ns - 5.5 ns - 2.4 ns = 2.1 ns

(and a similar set for lag.)

What are the system clock and the SDRAM clock? I take it one of them is the actual PLL output with the phase shift applied (I'm thinking that one is the SDRAM clock). And the system clock: does it refer to the data, i.e. the hypothetical 'clock' that would capture the data correctly if it existed?

Also, the SDRAM core documentation says I should read the appropriate FPGA delay values from the timing analysis done at compilation, but I can't find these values in either the Classic Timing Analyzer or TimeQuest reports that the examples mention. Do I have to specifically tell the analyzer to report on the pins connected to the memory?
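For reference, here is a minimal sketch of the arithmetic in the quoted PDF, using the example's numbers (the variable names are my own labels for the quantities, not identifiers from any Altera tool):

```python
# Margins from the quoted example: how far the SDRAM clock may lead
# the system clock, given FPGA and SDRAM timing parameters.
t_co_min_fpga = 2.0   # ns, minimum clock-to-output delay of the FPGA
t_h_sdram     = 1.0   # ns, SDRAM input hold requirement
t_clk         = 10.0  # ns, clock period (100 MHz in the example)
t_hz_sdram    = 5.5   # ns, SDRAM output hold/high-Z time
t_su_fpga     = 2.4   # ns, FPGA input setup requirement

# First constraint: leading too much would violate SDRAM hold time.
lead_limit_1 = t_co_min_fpga - t_h_sdram        # 2 - 1 = 1 ns

# Second constraint: leading too much would violate FPGA setup time
# on the read path within one clock period.
lead_limit_2 = t_clk - t_hz_sdram - t_su_fpga   # 10 - 5.5 - 2.4 = 2.1 ns

print(round(lead_limit_1, 2))  # 1.0
print(round(lead_limit_2, 2))  # 2.1
```

The same structure would apply to the lag limits mentioned in the PDF, just with the corresponding maximum/minimum parameters swapped in.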