Altera_Forum
Honored Contributor
8 years ago

Setting bits in a std_logic_vector based on a run time pattern
I am trying to set bits in a std_logic_vector according to a number taken from another vector, hit_vector.

The hit vector is a combination of three detector hits, and I am trying to decode it back into the hit pattern of the single detectors. The decode works fine on clk'event, but when nothing changes in hit_vector, the LSB of every decoded value flips to 1 on each rising clock edge. I am probably doing something stupid, but I cannot find the problem.

At 100 ns the hit vector changes and quadrant gets decoded correctly, i.e. "11" -> bit 3, but after that quadrant toggles back and forth between 1 and 0.

Thanks for any help.

andi
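Since the original code is not shown, here is a minimal sketch of the kind of clocked one-hot decoder described above. The entity, port names, and vector widths (hit_decoder, a 2-bit hit_vector selecting one of 4 quadrant bits) are assumptions for illustration. Clearing the whole output and then setting exactly one bit on every rising edge rules out stale bits as a cause of the toggling:

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

-- Hypothetical decoder: names and widths are assumptions, not the poster's code.
entity hit_decoder is
  port (
    clk        : in  std_logic;
    hit_vector : in  std_logic_vector(1 downto 0);  -- encoded quadrant number
    quadrant   : out std_logic_vector(3 downto 0)   -- one-hot decoded pattern
  );
end entity hit_decoder;

architecture rtl of hit_decoder is
begin
  process (clk)
    variable q : std_logic_vector(3 downto 0);
  begin
    if rising_edge(clk) then
      -- Assign every bit each cycle: clear first, then set the addressed bit.
      -- Reading back the previous output value is what typically produces
      -- the kind of flip-flopping described in the question.
      q := (others => '0');
      q(to_integer(unsigned(hit_vector))) := '1';
      quadrant <= q;  -- e.g. hit_vector = "11" -> quadrant = "1000" (bit 3)
    end if;
  end process;
end architecture rtl;
```

With a structure like this, an unchanged hit_vector re-decodes to the same one-hot value on every edge, so the output stays stable between input changes.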