lib_startkit_support ADC accuracy

aclassifier
XCore Expert
Posts: 512
Joined: Wed Apr 25, 2012 8:52 pm

lib_startkit_support ADC accuracy

Post by aclassifier »

I will show some simple code running on the startKIT which seems to show that single ADC readings are only accurate to about 8 bits, while the mean of a series of 1000 readings is good to about 10-11 bits. I had expected better from the XS1-U16A-128-FB217 Datasheet at http://www.xmos.com/published/xs1-u16a- ... -datasheet.

I have made an R-R (resistor ladder) board that connects to J2 and supplies stable 750, 760, 770 and 780 mV derived from 3V3A (diagram attached).

The code uses lib_startkit_support with ADC_TRIG_DELAY of 40 (default) and 100 (changed). ADC_PERIOD_TIME_USEC is 1000.
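
For reference, here is what codes those four voltages should give, assuming the readings are left-justified 16-bit values against the 3.3 V reference (this little check is mine, not part of lib_startkit_support):

Code: Select all

	#include <stdio.h>
	 
	// Sanity check, not library code: expected left-justified 16-bit code
	// for a given input voltage, assuming a 12-bit ADC scaled 0..0xFFF0
	// against a 3.3 V reference.
	static unsigned expected_code_mv (unsigned mv) {
	    return (mv * 65536u) / 3300u; // ~19.86 counts per mV
	}
	 
	int main (void) {
	    for (unsigned mv = 750; mv <= 780; mv += 10) {
	        printf ("%u mV -> code %u\n", mv, expected_code_mv (mv));
	    }
	    // Prints roughly 14894, 15093, 15291 and 15490: about 199 counts
	    // per 10 mV step, one 12-bit LSB being ~0.81 mV (16 counts), which
	    // is close to the diff values of 195-209 in the ADC_TRIG_DELAY 100
	    // logs further down.
	    return 0;
	}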

Code: Select all

	/*
	 * startKIT-adc-test.xc
	 *
	 *  Created on: 29. sep. 2015
	 *      Author: Øyvind Teig
	 *      http://www.teigfam.net/oyvind/home/technology/098-my-xmos-notes
	 */
	 
	#include <platform.h>
	#include <stdio.h>
	#include "startkit_adc.h"
	 
	#define NUM_STARTKIT_ADC_VALS 4
	 
	typedef struct tag_startkitd_adc_vals {
	    unsigned short x[NUM_STARTKIT_ADC_VALS];
	    unsigned short max[NUM_STARTKIT_ADC_VALS];
	    unsigned int mean_sum[NUM_STARTKIT_ADC_VALS];
	    unsigned int mean_cnt;
	    unsigned short min[NUM_STARTKIT_ADC_VALS];
	} t_startkit_adc_vals;
	 
	#define ADC_PERIOD_TIME_USEC 1000 // 1000, 2000 ok
	#define ADC_NUM_SAMPLES 1000
	 
	void test_adc (
	    client interface startkit_adc_if i_analogue)
	{
	    unsigned wait_for_adc = 0;
	    unsigned int loop_cnt = 0;
	    unsigned int adc_cnt = 0;
	    unsigned int no_adc_cnt = 0;
	    t_startkit_adc_vals adc_vals;
	 
	    for (int i=0; i<NUM_STARTKIT_ADC_VALS; i++) {
	        adc_vals.min[i] = 0xffff;
	        adc_vals.max[i] = 0;
	        adc_vals.mean_sum[i] = 0;
	        adc_vals.mean_cnt = 0;
	    }
	 
	    while(loop_cnt < ADC_NUM_SAMPLES) {
	        loop_cnt++;
	        i_analogue.trigger();
	        wait_for_adc = 1;
	        select
	        {
	            case wait_for_adc => i_analogue.complete():
	            {
	                wait_for_adc = 0;
	                if (i_analogue.read (adc_vals.x)) {
	                    for (int i=0; i<NUM_STARTKIT_ADC_VALS; i++) {
	                        if (adc_vals.x[i] > adc_vals.max[i]) {
	                            adc_vals.max[i] = adc_vals.x[i];
	                        }
	 
	                        if (adc_vals.x[i] < adc_vals.min[i]) {
	                           adc_vals.min[i] = adc_vals.x[i];
	                        }
	 
	                        adc_vals.mean_sum[i] += adc_vals.x[i];
	                    }
	 
	                    adc_vals.mean_cnt++; // Equally many for each
	                    adc_cnt++;
	                } else {
	                    no_adc_cnt++;
	                }
	                break;
	            }
	        }
	    }
	 
	    // Internal A/D-converter
	    // 0 to 65520 (0xFFF0)
	    // Div 16 = 0 to 4095, ref = 3.3V
	    // 3300 mV / 4096 = 0.8056640625 mV
	    // 750 mV = 25 DegC then 10 mV/DegC = 8.05 mV/DegC
	    // See XS1-A8A-64-FB96-Datasheet(1.3).pdf
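	    //
	    // The conversion below works in tenths of a DegC:
	    //   mV     = code * 3300 / 65536      (about code * 100 / 1986)
	    //   tenths = ((mV - 750) / 10 + 25) * 10 = mV - 500
	    //   hence  ((code*100 - 198545) / 1985) - 400  ~  code*100/1985 - 500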
	 
	    printf ("Summary of ADC: %u trials %u readings (%u lost)\n",
	            ADC_NUM_SAMPLES, adc_cnt++, no_adc_cnt++);
	 
	    for (int i=0; i<NUM_STARTKIT_ADC_VALS; i++) {
	 
	        unsigned int adc_val_mean_i =
	                adc_vals.mean_sum[i]/adc_vals.mean_cnt;
	        int  DegC_OneTenth_Parts =
	                (((adc_val_mean_i*100) - 198545) / 1985) - 400;
	        int  DegC_Unary_Part =
	                DegC_OneTenth_Parts/10;
	        int  DegC_Decimal_Part =
	                DegC_OneTenth_Parts - (DegC_Unary_Part*10);
	 
	        printf ("i:%u DegC=%u.%u ",
	                i, DegC_Unary_Part, DegC_Decimal_Part);
	 
	        if (i < (NUM_STARTKIT_ADC_VALS-1)) {
	 
	            unsigned overlapping =
	                    (adc_vals.max[i] >= adc_vals.min[i+1]);
	            unsigned int adc_val_mean_ip1 =
	                    adc_vals.mean_sum[i+1]/adc_vals.mean_cnt;
	 
	            printf("min=%u mean=%u max=%u diff=%d X=%u\n",
	                    adc_vals.min[i],
	                    adc_val_mean_i,
	                    adc_vals.max[i],
	                    adc_val_mean_ip1-adc_val_mean_i,
	                    overlapping);
	        } else {
	            printf("min=%u mean=%u max=%u\n",
	                    adc_vals.min[i],
	                    adc_val_mean_i,
	                    adc_vals.max[i]);
	        }
	    }
	}
	 
	int main () {
	    chan      c_analogue;
	    interface startkit_adc_if i_analogue;
	 
	    par {
	        on tile[0]:
	            test_adc (i_analogue);
	        on tile[0].core[0]:
	            adc_task (i_analogue, c_analogue, ADC_PERIOD_TIME_USEC);
	            startkit_adc (c_analogue); // Declare the ADC service,
	            //  startkit_adc is the ADC hardware, not a task
	    }
	    return 0;
	}
 
 
I use xTIMEcomposer 14.1.0, but I have to restart both the board and xTIMEcomposer after each run. This also goes for my startKIT board #2. Version 14.0.4 seems to behave a little better. Some typical logs are shown below:

Code: Select all

	= ADC_PERIOD_TIME_USEC 1000 for all
	= ADC_TRIG_DELAY 100
	Summary of ADC: 1000 trials 1000 readings (0 lost)
	i:0 DegC=25.7 min=14880 mean=15030 max=15120 diff=209 X=1
	i:1 DegC=26.7 min=15104 mean=15239 max=15376 diff=195 X=1
	i:2 DegC=27.7 min=15312 mean=15434 max=15536 diff=198 X=1
	i:3 DegC=28.7 min=15536 mean=15632 max=15760
	= ADC_TRIG_DELAY 100
	Summary of ADC: 1000 trials 1000 readings (0 lost)
	i:0 DegC=25.6 min=14928 mean=15025 max=15120 diff=208 X=1
	i:1 DegC=26.7 min=15088 mean=15233 max=15344 diff=197 X=1
	i:2 DegC=27.7 min=15344 mean=15430 max=15536 diff=199 X=1
	i:3 DegC=28.7 min=15472 mean=15629 max=15728
	= ADC_TRIG_DELAY 40
	Summary of ADC: 1000 trials 992 readings (8 lost)
	i:0 DegC=25.6 min=14848 mean=15019 max=15120 diff=130 X=1
	i:1 DegC=26.3 min=15008 mean=15149 max=15248 diff=197 X=1
	i:2 DegC=27.3 min=15216 mean=15346 max=15440 diff=200 X=1
	i:3 DegC=28.3 min=15360 mean=15546 max=15648
	= ADC_TRIG_DELAY 40
	Summary of ADC: 1000 trials 992 readings (8 lost)
	i:0 DegC=25.7 min=14848 mean=15037 max=15136 diff=130 X=1
	i:1 DegC=26.4 min=15040 mean=15167 max=15312 diff=196 X=1
	i:2 DegC=27.3 min=15248 mean=15363 max=15456 diff=200 X=1
	i:3 DegC=28.4 min=15424 mean=15563 max=15664
  1. Within one run, the spread between individual samples of the same input corresponds to only about 8 bits of effective resolution
  2. The mean of 1000 measurements is repeatable to around 10-11 bits (see the sketch after this list)
  3. Why are some samples lost, or not, depending on the value of ADC_TRIG_DELAY?
  4. Why doesn't the XMOS default ADC_TRIG_DELAY of 40 give zero lost readings?
  5. Is there some timing-related issue in the adc_task software or the startkit_adc service?
  6. I have tried different cores, with no change
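
To show how I turn the logged spreads into the bit figures in points 1 and 2, here is a back-of-envelope sketch (my own, nothing from lib_startkit_support; it simply treats an observed spread as peak-to-peak noise on the 16-bit left-justified code):

Code: Select all

	#include <math.h>
	#include <stdio.h>
	 
	// Treat an observed spread (in 16-bit left-justified counts) as
	// peak-to-peak noise and estimate how many bits are stable.
	static double stable_bits (unsigned spread) {
	    return log2 (65536.0 / (double)spread);
	}
	 
	int main (void) {
	    // Single samples, input i:0, ADC_TRIG_DELAY 100: min 14880, max 15120
	    printf ("single samples: ~%.1f bits\n", stable_bits (15120 - 14880)); // ~8.1
	    // Means of 1000 for input i:0 across the four runs: 15019..15037
	    printf ("mean of 1000:   ~%.1f bits\n", stable_bits (15037 - 15019)); // ~11.8
	    // The other inputs land a bit lower when the two ADC_TRIG_DELAY
	    // settings are mixed, which is where the 10-11 bit estimate comes from.
	    return 0;
	}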

I would certainly like to get to the bottom of this, and would appreciate help. If not, I will attempt to report this as a bug.

 
All of this is also on http://www.teigfam.net/oyvind/home/tech ... dc_problem (standard disclaimer about no money or gifts etc.)
aclassifier
XCore Expert
Posts: 512
Joined: Wed Apr 25, 2012 8:52 pm

Post by aclassifier »

Thanks! The problem is not how to read a temperature in, per se - as you mention, that may be done in many better ways than 10 mV/DegC, and I work with this kind of thing daily. The problem is why I need a series of samples to get the accuracy that the datasheet says I should get from repeatable single measurements: the same value +/- a little every time, not +/- a lot. That's why I do some simple statistics: min, mean and max. The overlap (by which I mean that the max of one input reaches past the min of the adjacent one) spills outside the 10 mV window, and that is too much.

The error in the R-R ladder is also not a big issue: I have measured the voltages, and by Ohm's law they do step up by some 10 mV per input. The resistors don't drift or fluctuate by more than standard Johnson–Nyquist noise (which admittedly increases with R), and I wouldn't think the error is in that range. What started all of this was readings I did from lower-impedance sources. Also, I didn't want to draw too much current from 3V3A.
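
As a rough check of that Johnson–Nyquist argument, here is a back-of-envelope calculation (my own sketch; the 10 kOhm source resistance and 1 MHz bandwidth are assumed values, the real divider resistances are on the attached diagram):

Code: Select all

	#include <math.h>
	#include <stdio.h>
	 
	// Johnson-Nyquist noise of the divider: v_rms = sqrt(4*k*T*R*B).
	// R and B are assumed values, just to get the order of magnitude.
	int main (void) {
	    const double k = 1.380649e-23; // Boltzmann constant [J/K]
	    const double T = 300.0;        // room temperature [K]
	    const double R = 10e3;         // assumed source resistance [Ohm]
	    const double B = 1e6;          // generous noise bandwidth [Hz]
	    double v_rms = sqrt (4.0 * k * T * R * B);
	    printf ("Johnson noise: ~%.1f uV RMS\n", v_rms * 1e6); // ~12.9 uV
	    // One 12-bit LSB at 3.3 V is ~806 uV, so thermal noise from the
	    // divider is far below the spread seen in the logs.
	    return 0;
	}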

My hypothesis is that in adc_task or startkit_adc there is some timing that influences the converted value in such a way that the mean (of 1000) evens out to be rather good. Since it is a successive-approximation converter it might need a longer sample-and-hold time, so maybe that's part of it. What counts against this is history: why should my two startKITs be the first to show it?

The questions in my initial post are also rather important, like why I lose any readings at all when everything is synchronised.

Or, where is this case flawed?

--
Øyvind Teig
Trondheim (Norway)
https://www.teigfam.net/oyvind/home/