Questions about BER Data Rate.
Hi,
I’m currently carrying out an evaluation in which I measure the bit error rate (BER) of an RG174 cable using a Stratix device, and then use that data to generate an eye diagram for signal-integrity verification.
Recently, I discovered that the data rate for BER measurements can be adjusted in the “low_latency_10g_1ch.Qsys” file, and I have tested configurations at 1.25 Gbps, 5.00 Gbps, and 6.25 Gbps.
While experimenting with these data rates, I noticed something that puzzled me.
When I examined the actual transmitted bit rate, the measured number of transmitted bits per second was only about 5% of the configured data rate for all three settings.
(For example, the measured throughput for the 1.25 Gbps configuration was 62,395,687 bit/s; for 5 Gbps, it was 261,911,177 bit/s; and for 6.25 Gbps, it was 325,677,826 bit/s.)
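For reference, here is a short sketch of how I computed those ratios from the configured rates and the throughput values quoted above (the numbers are my own measurements, not official figures):

```python
# Configured line rates (bit/s) paired with the throughput I actually measured (bit/s).
configured = [1.25e9, 5.00e9, 6.25e9]
measured = [62_395_687, 261_911_177, 325_677_826]

for cfg, meas in zip(configured, measured):
    ratio_pct = meas / cfg * 100  # measured throughput as a fraction of the set rate
    print(f"{cfg / 1e9:.2f} Gbps configured -> {meas:,} bit/s measured ({ratio_pct:.1f}%)")
```

In each case the result comes out at roughly 5% of the configured rate, which is what prompted my questions below.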
The datasheet for the RG174 cable in use specifies a maximum frequency of 3 GHz.
Given these observations, I have the following questions:
- Could the data rate specified in Qsys (e.g., 6 Gbps) be something other than the actual bit rate on the cable? Might it account for all internal data handling, including the BER pre-calculation? (I ask because I’m curious whether applying a 6 Gbps rate to a cable with a 3 GHz maximum frequency would technically be an over-spec situation.)
- If so, could you explain what role the additional bits play in the data rate, aside from the bits actually transmitted through the cable?
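To make the over-spec concern in my first question concrete: my reasoning assumes simple NRZ signaling, where the highest fundamental frequency of the serial bit stream (a 0101... pattern) is half the line rate. A quick sketch of that arithmetic (the function name is just mine for illustration):

```python
# Assuming NRZ signaling: the fastest-toggling pattern (0101...) completes one
# full cycle every two bits, so its fundamental frequency is half the line rate.
def nrz_fundamental_hz(line_rate_bps: float) -> float:
    return line_rate_bps / 2.0

CABLE_MAX_HZ = 3.0e9  # RG174 datasheet maximum frequency

for rate in (1.25e9, 5.00e9, 6.25e9):
    f0 = nrz_fundamental_hz(rate)
    verdict = "within" if f0 <= CABLE_MAX_HZ else "exceeds"
    print(f"{rate / 1e9:.2f} Gbps -> fundamental {f0 / 1e9:.3f} GHz ({verdict} cable spec)")
```

By this reasoning, 6.25 Gbps would put a 3.125 GHz fundamental on a cable rated to 3 GHz, which is why I’m unsure whether the configured rate really corresponds to the bits on the wire.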
Your support and advice would mean a great deal to me. If you have any thoughts, or suggestions for related areas I could look into further, I’d be very grateful.
Likewise, if any more detailed scenarios or lower-level information are needed to address my question, please don’t hesitate to ask.