Thank you for looking into this.
The device we are using is the single-supply variant. Our design powers it from a 3.3 V supply, and we use a 3.0 V external reference.
I believe the pre-scaler is only applicable to channel 8 on this device, whereas our issue affects all channels. We have confirmed that the pre-scaler is not enabled.
Our issue is not one of conversion accuracy: the value we read from the ADC agrees with the voltage present on the input. The issue is that the ADC inputs are disturbing the measured voltage. It looks as though weak internal pull-ups are enabled, which would explain why we measure 330 mV on an input where we expect 300 mV. The effect is emphasized in our application because our source impedance is higher than the value recommended in the datasheet.
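To show the arithmetic behind this: if a pull-up Rp to the 3.3 V supply is present on the pin, it forms a divider with the source impedance Rs, and the observed 30 mV offset lets us back out what Rs would have to be. The pull-up value used here (13 k) is taken from the follow-up test described later and is an assumption, not a datasheet figure:

```python
# Back-calculate the source impedance implied by the offset,
# assuming an internal pull-up Rp from the pin to the 3.3 V supply.
VDD = 3.3      # supply voltage, volts
RP = 13e3     # assumed internal pull-up, ohms (from the 22 k test)
V_SRC = 0.300  # open-circuit source voltage, volts
V_MEAS = 0.330  # voltage actually measured at the pin, volts

# Superposition on the divider Rs (to V_SRC) vs Rp (to VDD):
#   V_MEAS = (V_SRC * Rp + VDD * Rs) / (Rp + Rs)
# Solving for Rs:
rs = RP * (V_MEAS - V_SRC) / (VDD - V_MEAS)
print(f"Implied source impedance: {rs:.0f} ohms")
```

So even a source impedance in the low hundreds of ohms is enough to produce the 30 mV error we see, which is consistent with a pull-up of this magnitude.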
I have done a further test since my initial post. I disconnected the signal source from the ADC by removing the resistor in series with the input, then connected a 22 k resistor between ADCIN1 and ground. The 22 k resistor should pull the input to ground, yet when I measure the voltage on ADCIN1 I get 2.06 V. That is consistent with a roughly 13 k pull-up internal to the ADC input.