Hello.
Yes, I am now using the TimeQuest Timing Analyzer to constrain my clock. What I am doing is (after going through the TimeQuest quick-start tutorial and the TimeQuest cookbook):
- I run the compilation up to (but not including) the fitter step so I can create a post-map netlist (what's the difference between post-fit and post-map netlists, by the way? The quick-start guides and other documentation usually create a timing netlist from the post-map snapshot. Any specific reason?)
- using
derive_pll_clocks -create_base_clocks
to inform TimeQuest about the presence of the PLL and to automatically constrain all the clocks it generates, updating the timing netlist
- writing the resulting constraints out to an SDC file
- adding the SDC file to my project
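To make the flow above concrete, here is roughly what I run in the TimeQuest Tcl console (the project/revision and file names below are placeholders for my actual ones):

```tcl
# Run after Analysis & Synthesis, before the fitter
# (project/revision/file names are placeholders)
project_open my_project -revision my_project

# Build the timing netlist from the post-map snapshot
create_timing_netlist -post_map

# Read any SDC constraints already in the project
read_sdc

# Create base clocks and generated clocks for the PLL outputs
derive_pll_clocks -create_base_clocks

# Apply the newly created constraints to the netlist
update_timing_netlist

# Write the derived constraints out to an SDC file
write_sdc my_project.sdc
```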
Now I see that there is already an SDC file called cpu_0.sdc or something in my project directory. I believe this is a default setting, and I wonder what it does.
Nevertheless, after doing that I rerun the entire compilation, and this time the timing requirements are met (though I have not yet added the DMA controller, which was one of the components causing most, but not all, of the timing failures). Since I cannot get the USB-Blaster II driver to work in Linux (that's fodder for another thread), I recreated the entire design in the Windows version of Quartus, following exactly the same steps as on Linux, but on Windows the timing requirements are not met (despite not adding the DMA controller). This is puzzling me a lot. One thing I notice is that on Windows, Quartus reports (in the critical warnings section) that the PLL compensation mode has been automatically changed to 'no compensation'. Perhaps that has something to do with the timing requirements not being met.
FYI, I have tri-stated all unused pins with a weak pull-up (at the project level) to avoid accidentally asserting the reset pin (which is active low), so (I hope) there are no issues there. Additionally, daixiwen suggested 'watching' the clocked output. Do you mean I should actually watch the PLL's input and output waveforms using ModelSim? But that would be purely simulation-based watching, right? And since I'm using the Web Edition, I cannot use SignalTap either. Could you please elaborate on this?
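In case it matters, the unused-pins setting I am referring to is the project-level assignment in the .qsf, which (if I remember the assignment name correctly) looks like this:

```tcl
# Global assignment in the project's .qsf
# (assignment name quoted from memory; in the GUI it is set via
#  Device > Device and Pin Options > Unused Pins)
set_global_assignment -name RESERVE_ALL_UNUSED_PINS "AS INPUT TRI-STATED WITH WEAK PULL-UP"
```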
I know I have posted a lot, but I believe it is necessary, since suggesting a way forward may require as much input, if not more.
To summarize I want to know the following:
- Why does a design that meets timing on Linux fail to do so on Windows?
- Why does the DMA controller cause timing analysis to fail (even after constraining the PLL clock in the above-mentioned fashion)? How do I go about fixing that? (I know this is easier said than done, but if pointed in the right direction, I'd take it as an exercise and pursue it relentlessly, keeping you posted here.)
- It was mentioned in the last post that I should 'watch the output of the PLL'. I'd be glad if that could be elaborated on further (perhaps this question will be covered under the answer to the second point, which would make it redundant, but in case it isn't, here it is :) )
I am keen to hear from you folks.
Regards,
Aijaz