Altera_Forum
Honored Contributor
17 years ago
tco question on VHDL
I have this VHDL code:
ENTITY count_tia IS
    PORT (
        clock  : IN  STD_LOGIC;
        sload  : IN  STD_LOGIC;
        data   : IN  integer RANGE 0 TO 127;
        result : OUT integer RANGE 0 TO 127
    );
END count_tia;

ARCHITECTURE rtl OF count_tia IS
    SIGNAL result_reg : integer RANGE 0 TO 127;
BEGIN
    PROCESS (clock)
    BEGIN
        IF (clock'event AND clock = '1') THEN
            IF (sload = '1') THEN
                result_reg <= data;
            ELSE
                result_reg <= result_reg + 1;
            END IF;
        END IF;
    END PROCESS;

    -- Note: VHDL requires a space between the value and the unit,
    -- so "100ns" must be written "100 ns".
    result <= result_reg AFTER 100 ns;
END rtl;
The Classic Timing Analyzer says the maximum tco (from clock to result) is 5 ns, but this doesn't seem possible, because the code specifies an output delay greater than 5 ns. What's wrong?
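For context, a likely source of the discrepancy (stated here as an assumption, not a definitive diagnosis): an AFTER clause describes a simulation-only delay and is ignored by synthesis, so the timing analyzer only sees the physical register-to-pin path. A minimal sketch of the distinction, reusing result and result_reg from the code above:

-- Simulation view: the new value of result_reg is scheduled onto
-- result with a 100 ns delay; a simulator honors this.
result <= result_reg AFTER 100 ns;

-- Synthesis view: the AFTER clause is discarded, leaving a plain
-- connection from the register to the output port, so the analyzer
-- reports only the routing/pin delay (here, about 5 ns).
result <= result_reg;

In other words, both assignments produce identical hardware; only simulation behavior differs.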