--- Quote Start ---
So it is per definition that during the change from one state to next the asynchronous decoder can have only the previous or the next state. If the decoder output decodes the same value in both states, then the outputs must stay stable even if the decoder input switches randomly between the two states. Correct?
--- Quote End ---
No. You don't get glitches in the gray-encoded output, but pinsCypressData is
not gray encoded. If more than one input term to the combinational logic for an output bit changes, you can get glitches.
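To illustrate the point (a sketch, not from the original design, with hypothetical helper names): a gray-encoded counter changes exactly one bit per transition, so each combinational output sees at most one switching input. A plain binary value can flip several bits at once, so a decoder fed from it has multiple inputs switching and can glitch.

```python
def gray(n: int) -> int:
    """Binary-reflected gray code of n."""
    return n ^ (n >> 1)

def popcount(x: int) -> int:
    """Number of set bits in x."""
    return bin(x).count("1")

WIDTH = 4

for n in range(2 ** WIDTH - 1):
    # Between consecutive states, the gray code changes exactly one bit.
    gray_flips = popcount(gray(n) ^ gray(n + 1))
    assert gray_flips == 1
    # The binary value can change several bits at once (e.g. 7 -> 8
    # flips all four), so combinational logic decoding it can glitch.
    binary_flips = popcount(n ^ (n + 1))
    print(f"{n:2d} -> {n + 1:2d}: gray flips {gray_flips}, "
          f"binary flips {binary_flips}")
```

Running this shows gray flips staying at 1 for every transition, while binary flips reach 4 at the 7 to 8 boundary.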
By the way, I got glitches in the simulation of your AHDL design, too.
P.S.: It's not exactly clear what can be regarded as a glitch in this design. In the AHDL simulation, the decoded output pins change state at different times, so the output code as a whole already has glitches. With the VHDL design, you often see glitches on a single output bit. If you look at the technology map, you realize that each output bit is generated by two cascaded LUTs. At the second LUT stage, multiple input bits can change simultaneously; in that case, glitches are normal behaviour.
The AHDL design also has cascaded LUTs, but the structure is different. To avoid glitches on individual output bits, the outputs must be registered. But then you have the problem of delay skew between the outputs, so the code as a whole can still glitch.
There has been a discussion about whether a LUT output can always be expected to be glitch-free when only one input bit changes. The empirical results seem to suggest so, but apparently it can't be guaranteed.