Altera_Forum
Honored Contributor
16 years ago

Timing problems, and calculating max and min input delays...
Hi all...
Another day, another problem... Before I start I should say that I am using the FTDI Morph-IC-II board, which has a Cyclone II EP2C5F256C8 FPGA and an FT2232H USB bridge, together with Quartus II 10.0 SP1 Web Edition, the TimeQuest timing analyser and Altera-ModelSim for my project.

I am currently trying to read a 24-bit deserialised LVDS signal and its pixel clock into my FPGA design. The 24-bit LVDS signal is made up of 18 bits of pixel RGB colour data, an Hsync and a Vsync signal, and 4 reserved channels. This data is then multiplexed into an 8-bit-wide FIFO. When required, the data is sent a byte at a time over an FT245 synchronous interface to the TX buffer on the FTDI FT2232H chip. When the FT2232H receives the request from the drivers, it sends the data to the application on my PC over USB.

My problem is that I am currently losing bytes of data, and I am pretty sure this is due to timing problems with the inputs to my design. Before using the actual streamed LVDS data from the module, I created a design that used the real pixel clock, Hsync and Vsync, but used a counter to provide dummy data instead of the 18-bit pixel data, and padded the 4 reserved bits with null values. That design is fully compiled and timing-constrained and works, since the application note for the FTDI 245 synchronous interface gave me the required setup and hold times.

However, my actual LVDS data is generated by a module and is passed through the National Semiconductor DS90C124 Rx deserializer board: http://www.national.com/appinfo/interface/files/national_serdes24-35usb.pdf I also have a short ~10-15 cm loom connecting the deserializer outputs to my FPGA inputs. I don't know what setup and hold times I need in order to constrain these inputs, and I am not sure how to work any of it out. I could attempt trial and error, but I was hoping to put more thought behind it and use a more intelligent way of determining these input delays...
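For reference, the FT245-side constraints I already have look roughly like the sketch below. The clock period, port names (CLKOUT, DATA[*], RXF_n) and delay values are placeholders of my own, not figures from the FTDI app note; substitute the setup/hold numbers the app note actually gives:

```tcl
# Sketch only -- clock period, port names and delay values are placeholders.
# CLKOUT is the clock driven out of the FT2232H in synchronous FIFO mode
# (assumed 60 MHz here).
create_clock -name ft_clk -period 16.666 [get_ports CLKOUT]

# Data/flags launched by the FT2232H relative to CLKOUT.
# -max corresponds to the device's max clock-to-out plus max board delay,
# -min to its min clock-to-out plus min board delay.
set_input_delay -clock ft_clk -max 9.0 [get_ports {DATA[*] RXF_n}]
set_input_delay -clock ft_clk -min 1.0 [get_ports {DATA[*] RXF_n}]
```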
Any help, insight or guidance is much appreciated... I have looked at Rysc's TimeQuest guide, but I don't know where to find the setup and hold times of "the external device" needed to calculate my max and min input delays: the National Semiconductor documentation for the deserializer board has no timing information, and the documentation I have about the LVDS only briefly discusses timing, and then only with respect to the serialised LVDS datastream...

--- Quote Start ---
For input constraints, they look like so:

External device parameters:
Tco_ext = Tsu of external device
minTco_ext = Th of external device

Data delays on board:
Max_ext2fpga = Max board delay from external device to FPGA
min_ext2fpga = min board delay from external device to FPGA

Clock delays on board:
Max_clk2fpga = Max delay from board clock to FPGA
min_clk2fpga = min board delay from clock to FPGA
Max_clk2ext = Max delay from board clock to external device
min_clk2ext = min board delay from clock to external device

set_input_delay -max = Tco_ext + Max_ext2fpga
set_input_delay -min = minTco_ext + min_ext2fpga
--- Quote End ---

As I said, any help or insight is very much appreciated. If you need any more information, let me know!

Cheers,
Lee H
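If it helps to make the question concrete: since the deserializer is "the external device" here, its clock-to-out figures plus an estimate of the loom delay would slot straight into those formulas. The sketch below shows how I understand that would look in SDC. All numbers are placeholders (I don't have the DS90C124 timing either, which is exactly my problem), and the port names (PIXCLK, RX_DATA[*]) are made up; the only estimate I can make is the loom delay, using the common rule of thumb of roughly 5-7 ps per mm of trace/cable, so ~0.5-1.0 ns for 10-15 cm:

```tcl
# Illustrative only -- Tco values below are placeholders, not DS90C124 data.
# Pixel clock from the deserializer (25 MHz assumed here).
create_clock -name pix_clk -period 40.0 [get_ports PIXCLK]

# External device parameters (placeholders):
set Tco_ext    4.0   ;# max clock-to-out of the deserializer (unknown)
set minTco_ext 1.0   ;# min clock-to-out of the deserializer (unknown)

# Loom delay: 10-15 cm at ~5-7 ps/mm gives roughly 0.5-1.0 ns.
set Max_ext2fpga 1.0
set min_ext2fpga 0.5

# Plugging straight into the quoted formulas:
set_input_delay -clock pix_clk -max [expr {$Tco_ext + $Max_ext2fpga}] \
    [get_ports {RX_DATA[*]}]
set_input_delay -clock pix_clk -min [expr {$minTco_ext + $min_ext2fpga}] \
    [get_ports {RX_DATA[*]}]
```

So really what I am missing are the two Tco numbers for the deserializer board.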