Dynamically Configured IO (Topic is solved)

kyle123
Member++
Posts: 25
Joined: Tue May 19, 2015 8:17 pm

Dynamically Configured IO

Post by kyle123 »

High Speed Data Acquisition using the THS-1206 ADC

Overview: The THS-1206 is a 12-bit high-speed data acquisition device made by Texas Instruments. I am using the XS1-U16A-128 to communicate with the device and process the incoming data.
  • Device XS1-U16A-128
  • THS-1206 ADC
Description of Problem: The THS1206 uses a bidirectional data bus for programming the chip and reading the data. Communication with the ADC is initiated using a combination of !WR (write/read select), !RD (read), CS0 (chip select 0), and CS1 (chip select 1) to indicate that data is being transferred to or from the ADC.

I have successfully programmed the configuration into the ADC and verified both the timing and ADC response (after configuration) with an oscilloscope. Upon successfully configuring the device, I start the conversion clock (CONVST_CLK). The ADC starts acquiring data and sets the DATA_AV flag when the FIFO has reached the trigger level set during configuration.

Once the DATA_AV flag is set, data may be read by toggling the !RD pin. The issue I am running into is that I cannot start a read event after DATA_AV is asserted. I can clearly see DATA_AV assert on the oscilloscope; however, I cannot start RD_CLK. The program appears to skip this block of code and move on. I have also tried, unsuccessfully, a variant where I toggle the N_RD pin manually. In this case:

Code: Select all

while(1)
{
    select
    {
        case DATA_AV when pinsneq(0) :> void:
            N_RD <: 0;
            Delay(1);
            N_RD <: 1;
            break;
        case HS_ADC when pinsneq(1) :> void:
            WR <: 1;
            start_clock(CONVST_CLK);
            break;
    }
}
My hypothesis is that I am somehow violating the parallel-usage rules for the N_RD pin. However, xTIMEcomposer Studio does not throw any warnings.

I would really appreciate any and all insight that people may have on this topic. I have hunted on the forums for similar issues to no avail. If it helps I can definitely include a snapshot of the oscilloscope capture to show the problem.

--Thanks!

Kyle

Code: Select all

void GET_HS_ADC_PACKET(streaming chanend c)
{
    // This function initializes the high-speed ADC on startup and acquires data when the DATA_AV pin is active.

    configure_clock_rate(CONVST_CLK, 100, 50);
    configure_port_clock_output(HS_ADC_TRIGGER, CONVST_CLK);

    configure_clock_rate(RD_CLK, 100, 5);
    configure_port_clock_output(N_RD, RD_CLK);

    N_RD  <: 1;   // RD input has to be tied high when using R/!W, !CS0-controlled mode.
    WR    <: 0;   // WR input is tied low while writing to the ADC
    CS1   <: 1;   // CS1 input is tied high to set the ADC to enable Data_I/O
    N_CS0 <: 0;   // N_CS0 input is always 0.

    // Configure the THS1206 ADC. Two write sequences are required:
    HS_ADC <: RST_ADC;       // Reset ADC
    HS_ADC <: CLR_RST_ADC;   // Clear ADC reset
    HS_ADC <: CNTRL_REG0;    // Set control register 0: need to implement hs_adc_config_reg0 structs later...
    HS_ADC <: CNTRL_REG1;    // Set control register 1: need to implement hs_adc_config_reg1 structs later...

    HS_ADC <: RST_ADC;       // Reset ADC
    HS_ADC <: CLR_RST_ADC;   // Clear ADC reset
    HS_ADC <: CNTRL_REG0;    // Set control register 0
    HS_ADC <: CNTRL_REG1;    // Set control register 1

    HS_ADC :> void;

    // Use hardware trigger to start acquiring data.
    select
    {
        case HS_ADC when pinsneq(1) :> void:
            WR <: 1;
            start_clock(CONVST_CLK);
            break;
    }

    // Use hardware trigger to start reading data.
    select
    {
        case DATA_AV when pinseq(1) :> void:
            start_clock(RD_CLK);
            break;
    }

    select
    {
        case DATA_AV when pinsneq(1) :> void:
            stop_clock(RD_CLK);
            break;
    }
}

xsamc
Active Member
Posts: 55
Joined: Fri Mar 04, 2011 3:38 pm

Post by xsamc »

Hi kyle123,
kyle123 wrote: My hypothesis is that I am somehow violating the parallel-usage rules of the N_RD pin. However, XTime Studio does not throw any warnings.
I can't see anything in the code snippets you've posted that would result in a parallel-usage violation. Are you calling GET_HS_ADC_PACKET() from more than one task?
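For reference, a parallel-usage violation in XC normally means the same port is used from two tasks running in a par. A minimal (hypothetical) sketch of what the compiler rejects:

```xc
#include <xs1.h>

out port p = XS1_PORT_1A;

void task_a(void) { p <: 1; }
void task_b(void) { p <: 0; }

int main(void) {
    par {
        task_a();
        task_b();   // compile error: port `p` used in parallel by both tasks
    }
    return 0;
}
```

If N_RD were driven like this from two tasks, the tools would flag it at build time rather than staying silent, which is consistent with your code not actually violating the rules.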

Cheers,
Sam
srinie
XCore Addict
Posts: 158
Joined: Thu Mar 20, 2014 8:04 am

Post by srinie »

Hi,
I am not clear on why there are multiple select blocks in the second part of your code.
* If N_RD is pulled up by default, that would serve as a sufficient initial condition, and the code could just start/stop RD_CLK.

In the first code segment, I would guess the N_RD toggle would still work if it is not configured onto a clock; can you try this?
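To illustrate, the three separate selects could collapse into one loop that just starts and stops RD_CLK on the DATA_AV level. This is only a sketch, reusing the DATA_AV port and RD_CLK clock names from the code above and assuming DATA_AV is configured in level mode:

```xc
// Sketch: wait for DATA_AV level transitions and gate RD_CLK accordingly.
// Assumes DATA_AV stays high until the FIFO is drained (level mode).
while (1) {
    DATA_AV when pinseq(1) :> void;   // FIFO reached trigger level
    start_clock(RD_CLK);              // begin clocking reads out on N_RD
    DATA_AV when pinseq(0) :> void;   // FIFO drained
    stop_clock(RD_CLK);
}
```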

Regards,
srinie
srinie
XCore Addict
Posts: 158
Joined: Thu Mar 20, 2014 8:04 am

Post by srinie »

Just curious to know the elapsed time between DATA_AV assertions. How long is it typically?
kyle123
Member++
Posts: 25
Joined: Tue May 19, 2015 8:17 pm

Post by kyle123 »

I have run a few more experiments to test the behavior of the clock pin mode and the when trigger. But first, to address the points above:
  • I broke the select statement into individual pieces to test my understanding of select. Prior to posting, I used a single select with multiple case statements inside a while(1) loop.
  • I have discovered how to change the pin mode on the fly using set_port_mode_data(pin_name) and set_port_mode_clock(pin_name).
  • I am not calling this function from more than one task.
  • On the elapsed-time question: the DATA_AV pin may be set to level or pulsed mode. I believe I have measured pulse widths on the order of 100 ns. In level mode it stays set until the FIFO buffer is cleared.
However, I have still been unable to generate a hardware trigger using the when pinsneq condition. My observation on the scope, in combination with the debugger, is that any pin changes after this trigger are ignored.

Code: Select all

DATA_AV when pinsneq(0) :> void;

set_port_mode_clock(N_RD);
configure_port_clock_output(N_RD, RD_CLK);

The scope_1 screenshot clearly shows the DATA_AV transition from 0 to 1. On the other hand, if I change the trigger to pinsneq(1), the rest of the code executes. It is my understanding that the program should wait until the DATA_AV condition is satisfied before continuing, but it appears I am mistaken in this regard.

Code: Select all

DATA_AV when pinsneq(1) :> void;

set_port_mode_clock(N_RD);
configure_port_clock_output(N_RD, RD_CLK);

infiniteimprobability
Verified
XCore Legend
Posts: 1164
Joined: Thu May 27, 2010 10:08 am

Post by infiniteimprobability »

Hi,
did you get this going? Your syntax is correct, and the code should pause until that condition is true.

We know the pinseq/pinsneq condition logic works (a large amount of IP depends on it), so I can only conclude the condition isn't being satisfied. What width is the DATA_AV port? If it is wider than 1 bit, are there other lines on it that are not at the expected level?

Here's an example of pinseq, waiting for the value 11 on the port. You can see from the simulator trace that, as soon as the condition is met, the dut() function drops through and asserts the debug I/O line.

Code: Select all

#include <xs1.h>
#include <timer.h>   // for delay_microseconds()

out port p_stimulus = XS1_PORT_4A;      //This is looped back to p_pinseq_test
in port p_pinseq_test = XS1_PORT_4B;
out port p_debug    = XS1_PORT_1C;


void stimulus(void){
    unsigned port_val = 0;
    while(port_val < 16){
        p_stimulus <: port_val;
        port_val++;
        delay_microseconds(1);
    }
}

void dut(void){
    p_debug <: 0;
    p_pinseq_test when pinseq(11) :> void;
    p_debug <: 1;
}

int main(void){
    par{
        dut();
        stimulus();
    }
    return 0;
}
kyle123
Member++
Posts: 25
Joined: Tue May 19, 2015 8:17 pm

Post by kyle123 »

Hello:) Thanks for taking a look!

Yes, I did get it working once I realized that the debugger was too slow to keep up. The delay would cause the ADC to lose its mind and corrupt the configuration. I started this project without access to a scope and just used printf for everything. In retrospect it all makes perfect sense; there is no reason I should be printing to the screen at 2 MHz :)
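For anyone hitting the same wall: a much lighter-weight alternative to printf is xscope tracing, which streams values over the debug link instead of blocking on formatted output. A rough sketch (the probe name ADC_VALUE is illustrative, and a matching probe declaration in a config.xscope file is also required):

```xc
#include <xscope.h>

// Sketch: trace one ADC sample per call via xscope instead of printf.
// ADC_VALUE is a hypothetical probe id declared in config.xscope.
void log_sample(int sample) {
    xscope_int(ADC_VALUE, sample);   // far cheaper than formatting a string
}
```

At 2 MHz even this is pushing it, but it intrudes on real-time behavior far less than printing.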

Kyle
User avatar
infiniteimprobability
Verified
XCore Legend
Posts: 1164
Joined: Thu May 27, 2010 10:08 am

Post by infiniteimprobability »

Good news! Thanks for letting us know.
kyle123 wrote: There is no reason why I should print to the screen at 2 MHz :)
Yes, that is asking quite a lot. If it were a single putchar() via xscope you might just manage 2 MHz, but a formatted string, definitely not!