Integrated ADC Accuracy on the C10GX
While working with a Cyclone 10 GX (C10GX), I'm having trouble getting accurate voltage readings from the ADC. The readings are only accurate to within roughly +/- 10%, but I would expect better results, since I'm using an external 1.25 V reference with +/- 0.2% tolerance. I understand the ADC itself is only accurate to ~20 mV, but, for example, I'm reading 950 mV where my oscilloscope measures 900 mV (a controlled voltage from a voltage divider circuit).
I'm using the Voltage Sensor Intel FPGA IP and reading the samples directly from sample_store_csr. I initialize the voltage sensor IP by first setting the bottom bits of controller_csr to 0x102 and then setting the bottom bit to start ADC sampling. However, when I read controller_csr back, the CAL bits are still 0, which according to the documentation means my ADC is not calibrated. How can I calibrate the voltage sensor IP, or the ADCs themselves?
Could it also be that my ADCs are somehow using the internal reference voltage (+/- 10%) instead of my external reference (+/- 0.2%)?
Note: my results are still not close enough to the actual value even when averaging multiple samples...