why is there a delay between (DR register written) and (data really showed) in UART on STM32F103CB?

Posted by 时光怂恿深爱的人放手 on 2021-02-11 14:32:29

Question


I'm curious about the delay mentioned in the title. I toggle an I/O pin just before writing data into UART->DR, and the measured delay between the write and the data actually appearing on the wire varies from about 3 microseconds to several tens of microseconds.

int main(void)
{
  /* initial code generated by STMCubeMX */
  HAL_Init();
  SystemClock_Config();
  MX_GPIO_Init();
  MX_USART1_UART_Init();

  while (1)
  {
    HAL_Delay(50);

    if (USART_GetFlagStatus(&huart1, USART_SR_TXE) == SET)
    {
      USART_SendData(&huart1, 'F');
    }
  }
}


void USART_SendData(UART_HandleTypeDef *huart, uint16_t Data)
{
  assert_param(IS_USART_ALL_PERIPH(huart->Instance));

  assert_param(IS_USART_DATA(Data));

  GPIOB->BSRR = GPIO_PIN_1;                   // set debug pin (marks the moment of the write)
  GPIOB->BSRR = (uint32_t)GPIO_PIN_1 << 16u;  // reset debug pin

  huart->Instance->DR = (uint8_t)(Data & (uint8_t)0x00FF);      // send data (write DR)
}

I'm not sure whether the timing jitter is related to the baud rate of 9600 (about 104 microseconds per bit).

Shouldn't the data appear immediately once the DR register is written?

And why isn't the delay time always the same (or at least close)?


Answer 1:


Shouldn't the data appear immediately once the DR register is written?

Not necessarily.
You are only showing us high-level language source code.
Have you looked at the actual instruction trace to determine the instruction time between these operations?
How do you ensure that no interrupt is serviced between these operations?

And why isn't the delay time always the same (or at least close)?

Apparently that depends on the design of the UART.
You report that the baud rate is 9600, and (as expected) the intervals for each bit appear to be slightly longer than 100 microseconds.

The fact that the observed latency is less than one bit interval is significant.
The typical UART uses a clock (aka the baudrate generator) that is 16 times faster than the configured baudrate.
This faster-than-necessary clock is needed to oversample the receiving signal, which can arrive at any time; it is asynchronous communication, after all.

For the transmit clock, the baudrate generator is divided down to the nominal baudrate.
So for transmission, that clock quantizes in time when each bit (of the frame) will start (and end) its transmission.
The write to the UART TxD data register is performed by the CPU, and that operation is not synchronized with the transmit clock. You should therefore expect a random delay of up to one bit interval before the start bit of the frame appears on the wire.



Source: https://stackoverflow.com/questions/64040912/why-is-there-a-delay-between-dr-register-written-and-data-really-showed-in-u
