I'm implementing an interface to a not-quite-SPI chip. Since there's no MISO line in the interface, I didn't want to use the standard SPI module, as it requires all four ports. But I've been looking at the 'sc_spi' implementation as a reference.
In spi_master.xc::spi_master_out_byte_internal(), there's the following code:
Code:
// handle first bit
asm("setc res[%0], 8" :: "r"(spi_if.mosi)); // reset port
spi_if.mosi <: x; // output first bit
asm("setc res[%0], 8" :: "r"(spi_if.mosi)); // reset port
asm("setc res[%0], 0x200f" :: "r"(spi_if.mosi)); // set to buffering
asm("settw res[%0], %1" :: "r"(spi_if.mosi), "r"(32)); // set transfer width to 32
stop_clock(spi_if.blk2);
configure_clock_src(spi_if.blk2, spi_if.sclk);
configure_out_port(spi_if.mosi, spi_if.blk2, x);
start_clock(spi_if.blk2);
The purpose of this code is fairly clear: driving the first bit's value onto MOSI before the clock starts, to accommodate CPHA=0. This requires unbuffering the port, then restoring the buffering options to return to normal operation.
But, what confuses me is the line
Code:
asm("settw res[%0], %1" :: "r"(spi_if.mosi), "r"(32)); // set transfer width to 32
That line appears to set the buffer transfer width of a single-bit port to 32 bits. But if I try to declare "buffered out port:32 if_mosi = XS1_PORT_1E;", the XC compiler tells me that a 1-bit port can only be buffered up to 8 bits. I assume this isn't an error in the SPI code. So what explains the discrepancy between the settw instruction setting a 32-bit transfer width and the XC declaration that won't accept more than 8 bits? What does the settw actually accomplish in this situation?
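For completeness, here is a minimal sketch of the two declarations I'm comparing (the port choice XS1_PORT_1E is just my test case, and the rejected line's error wording is paraphrased from memory):

Code:

#include <xs1.h>

buffered out port:8  if_mosi     = XS1_PORT_1E; // accepted: 1-bit port, 8-bit transfer width
// buffered out port:32 if_mosi2 = XS1_PORT_1E; // rejected: compiler limits a 1-bit port to 8 bits

So at the XC language level 8 bits is the ceiling, yet the sc_spi code applies settw with 32 after the port has already been configured via asm.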