Forum Discussion
Altera_Forum
Honored Contributor
12 years ago

--- Quote Start ---

The short answer first: given the tCO of the external input device, then (ignoring board delays) set the delays as follows:

set_input_delay -max tCO ...
set_input_delay -min tCO ...

If you want to include board delays (and I bet that is horribly difficult, but you can estimate using trace length, speed of light, capacitance ...) then:

change tCO to: tCO + data delay(max) - clk delay(min) for -max
change tCO to: tCO + data delay(min) - clk delay(max) for -min

Now the long story: my personal verdict on the TimeQuest (TQ) documentation package is that it is the worst I have ever seen. I don't care whether pins are called ports or ports are called pins; the trouble is they have done their best to confuse and wipe out common sense. From my struggle I see that TimeQuest allows three ways to determine input delays:

1) given tCO of the external input device
2) given tSU/tH of the external device
3) given skew

The crux of the matter is that TimeQuest needs to know the clock/data relationship. That's all. From that it inserts delays to move the timing window to a location at the pins such that the internal registers' timing window is optimum. The relation of these delays to the timing-window shift is common sense:

tSU(at pin) = data delay - clk delay + tSU(at register)
tH(at pin)  = clk delay - data delay + tH(at register)

The full timing window at the pins is tSU(at pin) + tH(at pin). Thus, to get the optimum window at the pins, TQ needs to know the three variables in the equations above. The register tSU/tH are related to the device type, etc.; the delays are inserted by the fitter, so it knows their values. To make life easy (!) for us, TQ lets you enter the tCO of the external device and works out the timing window at the pins.

The main problem to me is the meaning of "delay". Does it mean the delays in the equations above? I doubt it.
As an example of the confusion over the meaning of "delay", you have the option to use one of two sets of commands:

set_input_delay -min (called system-centric), as in the short reply above
set_min_delay (called FPGA-centric)

The strange thing is that the meaning of "delay" flips over and does not seem to mean the delays in the equations above, or anything consistent, since the two statements below are equivalent (????):

set_input_delay -min <tH>   -- i.e. positive tH
set_min_delay <-tH>         -- i.e. negative tH

Now let us pretend we understood the tCO case (1). What about case (2): how do we relate the tSU/tH of the external device to the data/clock relation? A complete mystery to me. The skew case (3) is not difficult to understand, but I haven't yet seen a device giving skew instead of tCO.

--- Quote End ---

Hi, kaz. "Case (2): how do we relate tSU/tH of the external device to the data/clock relation? A complete mystery to me." I'm confused too. Almost 3 years have passed since you asked this, so I think you must have understood it by now. Could you show me how to relate the tSU/tH of the external device to the data/clock relation?

According to the Altera document "AN433: Constraining and Analyzing Source-Synchronous Interfaces":

set_input_delay -max UI - tSU_ext
set_input_delay -min tH_ext

That is difficult for me to understand. Instead, case (1) you mentioned, using tCO for set_input_delay, is much easier. The question can then be changed to finding the relationship between tCO and the tSU/tH of the external device's output port.

In my opinion, for the output port of the external device, tSU/tH means the output data will be stable during the {tSU, tH} window, right? So, after the launch clock edge, the data must reach the output port before the tH of the external device, otherwise the data will be unstable. So we get tCO_max = tH. By similar reasoning, tCO_min = -tSU. Then we should get:

set_input_delay -max tH_ext
set_input_delay -min -tSU_ext

What do you think? Thanks.

Andrew
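For concreteness, here is a minimal SDC sketch of the two constraint styles discussed in this thread. All names and numbers (clk_in, data_in, the 10 ns period, and every timing value) are made-up assumptions for illustration, not values from the thread or from any datasheet; apply one method or the other for a given port, not both:

```tcl
# Assumed example values (ns) -- not from the thread or any real device
set UI       10.0  ;# clock period / unit interval
set tCO_max   5.5  ;# external device max clock-to-out
set tCO_min   1.0  ;# external device min clock-to-out
set tSU_ext   2.0  ;# external device output setup spec
set tH_ext    1.5  ;# external device output hold spec

# Assumed port names: clk_in (input clock), data_in (input data)
create_clock -name ext_clk -period $UI [get_ports clk_in]

# Method (1): tCO-based, as in kaz's short answer (board delays ignored);
# datasheets usually give separate max/min tCO, used here for -max/-min.
set_input_delay -clock ext_clk -max $tCO_max [get_ports data_in]
set_input_delay -clock ext_clk -min $tCO_min [get_ports data_in]

# Method (2): tSU/tH-based, in the style quoted from AN433:
#   -max = UI - tSU_ext : latest moment the data may still be settling
#   -min = tH_ext       : earliest moment the data may start changing
set_input_delay -clock ext_clk -max [expr {$UI - $tSU_ext}] [get_ports data_in]
set_input_delay -clock ext_clk -min $tH_ext [get_ports data_in]
```

Under this reading, the two methods describe the same data-valid window when tCO_max = UI - tSU_ext and tCO_min = tH_ext, which is one way to relate an external device's tSU/tH spec back to an equivalent tCO.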