Forum Discussion
Altera_Forum
Honored Contributor
15 years ago

The more I see TimeQuest examples or documentation, the more I feel they contain errors and editorial bugs.

I just came across this example: http://www.altera.com/support/examples/timequest/exm-tq-basic-source-sync.html

According to this example, set_output_delay corresponds to the traditional setting of tCO at the pins (the data transition with respect to the output clock pin). That would actually be easier, since I could calculate the max/min values from the tSU/tH of the external device, as we did in the Classic Timing Analyzer for years.

The controversy is this: how on earth can the delay be tCO in the example, yet tSU (for max) or -tH (for min) in the equations? Surely that is impossible to comprehend.

For example, in this post tSU is 1.2 ns and the clock period is 8 ns. Data can therefore be delayed by a tCO of at most 8 - 1.2 = 6.8 ns, and this should be the maximum delay between the output clock and its data transition (assuming zero board delay difference between clock and data) so that tSU is not violated at the latching edge. Likewise, data can arrive as early as a tCO of 0.2 ns without violating tH at the external device, and this should be the minimum.

Accordingly, the equations should be:

max = UI - tSU
min = +tH

instead of:

max = tSU
min = -tH

Any comments welcome.
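To make the two readings concrete, here is a hedged SDC sketch using the numbers from this post (8 ns clock period, external tSU = 1.2 ns, external tH = 0.2 ns, zero board skew). The port names clk_in, clk_out, and data_out are assumptions for illustration, not taken from the Altera example:

```tcl
# Base clock driving the source-synchronous interface (port name assumed)
create_clock -name clk_in -period 8.0 [get_ports clk_in]

# Clock forwarded to the external device; a real design would typically
# source this from a PLL output rather than directly from the input port.
create_generated_clock -name clk_out -source [get_ports clk_in] [get_ports clk_out]

# Constraints as the documentation writes them: the delay values are the
# external device's tSU and -tH.
set_output_delay -clock clk_out -max 1.2 [get_ports data_out]
set_output_delay -clock clk_out -min -0.2 [get_ports data_out]

# Constraints as this post argues they should read if the delay value
# really represented tCO (max = period - tSU, min = +tH):
# set_output_delay -clock clk_out -max 6.8 [get_ports data_out]
# set_output_delay -clock clk_out -min 0.2 [get_ports data_out]
```

The first pair matches the linked example; the commented-out pair is the tCO interpretation questioned above.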
I just came across this example: http://www.altera.com/support/examples/timequest/exm-tq-basic-source-sync.html According to this example: set_output_delay is the same as the traditional setting of tCO at pins (data transition with respect to output clock pin). In fact, this is much easier since I can calculate max/min from tSU/tH of device as we did in the classic timing analyser for years The controversy is this: how on earth can delay be tCO in this example but tSU (for max) or -tH (for min) in the equations. Surely that is impossible to comprehend. For example, in this post, tSU is 1.2n, clk period is 8 ns, hence data can be delayed as much as tCO of 8 - 1.2 = 6.8 ns and this should be maximum delay between output clk and its data transition (assuming zero board delay diff of clk and data) so that it does not violate tSU at latching edge. while data can be as early as tCO of .2ns to avoid tH violation at external device and this should the minimum. Accordingly, the equations should be (max = UI - tSU, min = +tH) instead of: max = tSU min = - tH Any comments welcome.