Problem using set_min_delay between Quartus and TimeQuest: results between the two tools are inconsistent.
I'm using a test design to examine why my set_max_delay/set_min_delay constraints are not working. It appears that Quartus applies the set_min_delay constraint incorrectly during compilation.
Setup: The design has a few outputs with varying levels of combinational logic between the clock and the output pins.
Constraints: The clock is attached to the output pin of the global clock driver, giving ~1.7 to 1.9 ns of clock delay over the network. It is unrelated to any other clock. These are the pertinent constraints:
create_clock -name clk_25 -period 40 -waveform {0 20} [get_pins {U0_pll_50_25_400|pll_50_25_400_inst|altera_pll_i|outclk_wire[0]~CLKENA0|outclk}]
set out_clk clk_25
set_max_delay -from $out_clk -to [get_keepers {OUTA[*]}] 14.000
set_min_delay -from $out_clk -to [get_ports {OUTA[*]}] 5.00
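As a sanity check in the TimeQuest Tcl console (a sketch; note the max constraint uses get_keepers while the min uses get_ports, so it's worth confirming both resolve to the same endpoints, and "test_design" is a placeholder for the real revision name):

project_open test_design
create_timing_netlist
read_sdc
update_timing_netlist
# List every constraint that was read and what it matched
report_sdc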
We can ignore the set_max_delay as it appears to work normally.
First off, TimeQuest appears to work normally. When I vary the minimum constraint (without recompiling), it flags a violation whenever min_delay exceeds the clock path + data path delay, exactly as a minimum-delay check should. All is good.
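For reference, this is roughly what that no-recompile experiment looks like in the TimeQuest console (a sketch; re-issuing set_min_delay on the same from/to pair overrides the previous value, so no netlist rebuild is needed):

foreach min {4.0 5.0 6.0 10.0} {
    set_min_delay -from [get_clocks clk_25] -to [get_ports {OUTA[*]}] $min
    update_timing_netlist
    # Minimum-delay checks show up in the hold (minimum) analysis
    report_timing -to [get_ports {OUTA[*]}] -hold -npaths 1 -panel_name "min_delay = $min"
}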
The problem is that the constraint does not behave the same way in Quartus. I recompiled the design with varying min_delay values and these are the results I found (a script for reproducing the sweep is sketched after the table):
min_delay (ns)   data_path (ns)   clock_path (ns)
4.0              7.6-8.2          1.8
4.5              8.4-9.4          1.8
5.0              9.4-10.2         1.8
5.5              10.3-11.4        1.8
6.0              11.2-12.2        1.8    (exceeds max_delay; timing fails)
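If anyone wants to reproduce the sweep, it can be scripted along these lines with quartus_sh (a sketch; "test_design" and "min_delay.sdc" are placeholder names, and it assumes min_delay.sdc is listed as an SDC_FILE in the .qsf alongside the main constraints):

foreach min {4.0 4.5 5.0 5.5 6.0} {
    # Regenerate the one min-delay line; everything else stays in the main SDC
    set fh [open min_delay.sdc w]
    puts $fh "set_min_delay -from clk_25 -to \[get_ports {OUTA\[*\]}\] $min"
    close $fh
    # Full recompile so the Fitter sees the new constraint
    if {[catch {exec quartus_sh --flow compile test_design} out]} {
        puts "compile with min_delay=$min failed"
        continue
    }
    puts "min_delay=$min compiled; data-path delays are in the STA report"
}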
These results are repeatable. I can get the results I want, but I have to use different constraints for Quartus than for TimeQuest. As an example, if I want a 10-12 ns clock-to-output delay, I would have to do the following:
Quartus: set_min_delay = 4.5 (mystery math; see the table above)
TimeQuest: set_min_delay = 10 (normal math: 1.8 ns clock + 8.2 ns data)
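Spelled out, the workaround is two different SDC lines, one per tool (values straight from the table above; obviously not a real fix):

# quartus.sdc - used for compilation ("mystery math")
set_min_delay -from clk_25 -to [get_ports {OUTA[*]}] 4.5

# timequest.sdc - used for sign-off analysis (1.8 ns clock + 8.2 ns data)
set_min_delay -from clk_25 -to [get_ports {OUTA[*]}] 10.0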
I'm attaching the project because I figure people will assume I've screwed up. If I have, great, let me know where I've gone astray. Otherwise I really need a solution. The FAA doesn't really like the "just because" engineering solution.
Thanks in advance,
Colin