Altera_Forum
Honored Contributor
10 years ago

Odd ADC issue
Okay, to start off, I'm not sure whether this is an FPGA issue or a more fundamental sampling issue, but I figured the folks here might be able to give me a pointer or two.
I have a custom data acquisition board/controller with a Cyclone IV (EP4C15) and a pair of ADAS3022 ADCs running in parallel. Both ADCs are operated at 1 MSps, for an effective rate of 125 kSps per channel. The channels are scanned sequentially by the driver, with the data being dumped into a dual-port RAM where the channel number forms part of the address.

That part appears to be working flawlessly: I can use SignalTap to recreate perfect waveforms by tapping into the write side of the memory interface between the driver and the controller. I checked several channels, and the signals are replicated just as they appear on the scope (albeit with slightly lower resolution).

Where things get hinky is on the read side. The controller reads some parameters at 100 Hz and others at 25 kHz. The 100 Hz process works fine: it pulls four samples from the buffer, computes the differential voltages, does some error calculations and adjusts the system, then repeats until the differential error is within tolerable limits. This has been highly reliable so far, with no known instances of failure.

The 25 kHz process, on the other hand, has been having issues. Some of the signals we are measuring are nearly square, and when the data is pushed out for collection, we see a lot of junk in it: specifically, repeated edges, dropouts, etc. I initially thought it might be a memory access problem, so I rewrote the controller to bypass the DMA mechanism and copy the data with plain CPU instructions, only to see identical garbage. Now I am beginning to think there is some more fundamental issue.

My original thought was that I could subsample the output of the ADC driver, which runs at 125 kSps per channel, and the worst that would happen is that I might miss fast edges (which is fine, since we are measuring fairly long, square-wave-like signals) or be off a bit in time.
Instead of just missing edges, it looks like random noise is being generated around every step transition of the signal being measured. Once the signal is stable, the effect goes away, which leads me to suspect an aliasing issue.

As an aside, the memory interface between the ADC firmware and the controller CPU is double-buffered, so the ADC process is never writing to the same page the controller is reading. The ADC firmware verifies that the controller isn't reading its side before flipping the page. Also, both sides run at the same clock frequency (100 MHz), and the entire design is making timing; I verified this by setting up an SDC file and running TimeQuest. So this doesn't appear to be a low-level timing or RAM setup-and-hold issue.

My current plan is to simply pace the ADC subsystem with the 25 kHz process, rather than letting it collect samples at the ADC's fastest data rate, which should resolve the issue. However, I'm still not sure why the original design has this problem, and I'd like to understand the underlying cause. Thanks!