I am not familiar with the implementation of the analog input on the ADC pin. However, I do know that if an analog signal is sampled digitally by any converter, it MUST be filtered such that the bandwidth of the analog signal input to the converter is at MOST half of the sample rate (the Nyquist limit). This is to prevent aliasing of the signal into the converter. If the analog input signal to an ADC is not filtered in any way, then it is very reasonable to see "spurious" values, as these can come from content at any frequency above half the sample frequency being aliased back in-band by the ADC during its sample period. No amount of software processing will remove these spurious values; they must be filtered out prior to digital sampling. Perhaps this is the source of the frustration Rob is experiencing, because if the input signal to the ADC contains out-of-band noise, it will show up on the ADC as 'spurious' values and cannot be removed in software.
The only way to prevent this source of input degradation is to filter the incoming analog signal to a bandwidth of less than half of the sampling frequency of the ADC. Calling it "noise" is somewhat misleading: the problem is not so much in-band noise, which tends to be random and can be averaged down, but out-of-band content, which aliases into the band and shows up as spurious readings. It is this out-of-band noise that requires the filtering. You could easily test this, if there is no filtering: apply a signal at, for example, 30 Hz while sampling at 50 Hz, and the system will read it in and tell you there is a signal at 20 Hz. That is the out-of-band signal (30 Hz) aliasing, wrapping back into the 25 Hz bandwidth set by the 50 Hz sample rate of the ADC.
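Just to make the wrap-around concrete, here's a quick host-side C sketch (nothing board-specific, and it assumes ideal sampling instants) showing that a 30 Hz sine sampled at 50 Hz lands on exactly the same values as a 20 Hz sine, apart from a sign flip, which is why the converter cannot tell them apart:

/* Aliasing demo: a 30 Hz sine sampled at 50 Hz produces the same
   sample values as a 20 Hz sine (with the sign flipped), so after
   sampling the two are indistinguishable.
   Build: cc alias_demo.c -lm && ./a.out */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double PI = 3.14159265358979323846;
    const double fs = 50.0;            /* sample rate, Hz        */
    const double f_in = 30.0;          /* out-of-band input, Hz  */
    const double f_alias = fs - f_in;  /* wraps back to 20 Hz    */

    for (int n = 0; n < 10; n++) {
        double t = n / fs;
        double s30 = sin(2.0 * PI * f_in * t);
        double s20 = sin(2.0 * PI * f_alias * t);
        /* s30 == -s20 at every sample instant */
        printf("n=%2d   30 Hz: %+.4f   20 Hz: %+.4f\n", n, s30, s20);
    }
    return 0;
}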
I just got my boards yesterday, so I have not played around with much of this yet, but I wanted to chime in because this information might be helpful: it seems like there is an attempt to apply a digital solution (software) to an analog problem. Analog design can be tricky, as some have pointed out, referring to separate isolated analog power supplies, extra ground planes, etc., but a 10-bit A/D is low resolution and shouldn't require a significant amount of analog design effort. However, the input to an ADC should ALWAYS be buffered and filtered so that its bandwidth is at most half the ADC sample rate. One other note: if you just hook up wire leads directly to a high-impedance input with no filtering, you have an antenna.
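For the filtering itself, even a single-pole RC low-pass in front of the pin helps; the cutoff is f_c = 1/(2*pi*R*C). Here's a little sketch checking a candidate RC against the Nyquist limit. The component values are arbitrary examples, not a recommendation for any particular board, and keep in mind a single pole only rolls off at 20 dB/decade, so you want the cutoff well below fs/2, not right at it:

/* Sizing a single-pole RC anti-alias filter (sketch only).
   Cutoff: f_c = 1 / (2 * pi * R * C). Component values below are
   arbitrary examples.
   Build: cc rc_filter.c -lm && ./a.out */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double PI = 3.14159265358979323846;
    const double fs = 50.0;            /* ADC sample rate, Hz       */
    const double nyquist = fs / 2.0;   /* highest allowable input   */

    const double R = 10e3;             /* 10 kohm (example value)   */
    const double C = 1e-6;             /* 1 uF    (example value)   */
    const double fc = 1.0 / (2.0 * PI * R * C);

    printf("Nyquist limit : %.1f Hz\n", nyquist);
    printf("RC cutoff     : %.1f Hz\n", fc);   /* ~15.9 Hz here */
    printf("%s\n", fc < nyquist ? "cutoff below Nyquist: OK as a first pass"
                                : "cutoff too high: increase R or C");
    return 0;
}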
Looking at the stats and thinking about this, there isn't much difference at the low deviations for different clock speeds; the main difference is at the high deviations. What I suspect is that the lower clock simply means the ADC is missing more of the glitches occurring at the analog inputs, rather than converting any better. If that is the case, then it's probably better to use the higher rate and do an average that discards outliers, ideally implemented as an interrupt-driven process happening in the background (rough sketch below).
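Something like this is the averaging I have in mind. It's only a host-side sketch with made-up data and illustrative names: in real firmware the buffer would be filled by the ADC's end-of-conversion interrupt, and the trimmed mean computed once the buffer is full:

/* Trimmed-mean of ADC samples: drop the NDISCARD lowest and highest
   readings, average the rest. In firmware the buffer would be filled
   by an ADC end-of-conversion interrupt; here it is faked so the
   sketch runs on a host PC. All names are illustrative.
   Build: cc trimmed_mean.c && ./a.out */
#include <stdio.h>
#include <stdlib.h>

#define NSAMPLES 16   /* samples per reading                */
#define NDISCARD 2    /* outliers dropped from each end     */

static int cmp_u16(const void *a, const void *b)
{
    return (int)*(const unsigned short *)a - (int)*(const unsigned short *)b;
}

/* Sort, drop NDISCARD samples from each end, average the middle. */
static unsigned short trimmed_mean(unsigned short *buf, int n)
{
    unsigned long sum = 0;
    qsort(buf, n, sizeof buf[0], cmp_u16);
    for (int i = NDISCARD; i < n - NDISCARD; i++)
        sum += buf[i];
    return (unsigned short)(sum / (n - 2 * NDISCARD));
}

int main(void)
{
    /* Fake a burst of 10-bit readings around 512, with two glitches
       (0 and 1023) of the kind the higher clock rate would catch.   */
    unsigned short buf[NSAMPLES] = { 510, 514, 511, 513, 1023, 512,
                                     509, 515, 512, 511, 0,    513,
                                     512, 510, 514, 511 };
    printf("trimmed mean = %u\n", trimmed_mean(buf, NSAMPLES));
    return 0;
}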
Dave