Intel LVDS Serdes Receiver removed input pins after compilation
Hi,
I'm trying to read out data from a 12-bit LVDS ADC using the Intel LVDS SERDES IP.
The IP is configured as a 9-channel RX DPA-FIFO with a deserialization (SERDES) factor of 6. I'm using bit slip for word alignment.
I can't get it working because of optimizations Quartus performs during compilation. Many of the registers are removed due to "stuck at" datain conditions, and after Analysis & Synthesis these messages appear:
Warning (21074): Design contains 9 input pin(s) that do not drive logic
Warning (15610): No output dependent on input pin "adc_fclk_0_i"
Warning (15610): No output dependent on input pin "adc_rxin_0_i[0]"
Warning (15610): No output dependent on input pin "adc_rxin_0_i[3]"
Warning (15610): No output dependent on input pin "adc_rxin_0_i[5]"
Warning (15610): No output dependent on input pin "adc_rxin_0_i[2]"
Warning (15610): No output dependent on input pin "adc_rxin_0_i[7]"
Warning (15610): No output dependent on input pin "adc_rxin_0_i[1]"
Warning (15610): No output dependent on input pin "adc_rxin_0_i[4]"
Warning (15610): No output dependent on input pin "adc_rxin_0_i[6]"
The board's input pins are connected to the LVDS SERDES rx_in ports, and the external clock from the ADC is connected to the IP's inclock.
The IP is instantiated in a wrapper component (ip/adc_out), which controls the bit slip and assembles 12-bit signals from two consecutive 6-bit words received from the IP.
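For clarity, this is the word-assembly step the wrapper is supposed to perform, sketched as a small Python model (illustrative only, not the actual HDL; it assumes the first 6-bit word holds the most significant bits of the sample):

```python
def assemble_sample(word_hi: int, word_lo: int) -> int:
    """Combine two consecutive 6-bit SERDES output words
    (deserialization factor 6) into one 12-bit ADC sample.
    Assumes word_hi arrived first and carries the MSBs."""
    assert 0 <= word_hi < 64 and 0 <= word_lo < 64, "words must be 6 bits"
    return (word_hi << 6) | word_lo

# Two 6-bit words from the IP form one 12-bit sample:
sample = assemble_sample(0b101010, 0b110011)
print(f"{sample:012b}")  # 101010110011
```

If the bit slip hasn't locked yet, the word boundary (and therefore which word is the MSB half) can be off by one, which is exactly what the bit-slip control in the wrapper is meant to correct.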
I've been working on this problem for a few days now and can't find a solution. I've attached the project (Quartus Prime Standard 21.1). Do you have any clue what causes this optimization?
Thanks & Best,
Thomas