
TCAN1042GV-Q1: Basic questions on transceiver behavior

Part Number: TCAN1042GV-Q1

Background: We are developing a product that will likely include the TCAN1042GV-Q1. We wanted to run some EMI tests, so we set up a simple application SW that loops an 8-byte CAN message with every byte set to 0x55 to get plenty of toggling. The test setup has only one transmitting node and one receiving node. Looking at the resulting traffic on the oscilloscope, I was surprised to see much more traffic when the receiving node was disconnected (leaving only a terminated cable attached to the transmitter). On closer inspection, this was because the frames repeated more often.

1. The SW engineer says the SW is simply a while loop that repeats the 8-byte message without any intentional delays. On the oscilloscope I see the message sent every ~360 us when there is a receiving node setting the ACK bit dominant. When that node is disconnected (so that no node sets the ACK bit), the message is instead sent every ~125 us. The frame itself is 108 bits long (Start of Frame bit to last End of Frame bit), all transmitted at 1 Mbps. Could it be that 360 us is the time it takes the while loop to repeat, while the more frequent resend without a receiver is explained by the transmitter itself deciding (independently of the application SW loop) to resend the message within ~20 us after noting that no ACK bit was set?

Questions 2 and 3 are closely related; both are about understanding how the bit rate is set and known. I assume they might be basic for anyone more familiar with the SW/driver side.

2. The transceiver is CAN FD compliant, with data bit rates up to 5 Mbps, yet my complete frame is sent at 1 Mbps. Is this a parameter in the API/driver, so that the SW engineer would set some parameter (through the TXD pin) along with the message in order to transmit a CAN FD frame instead?

3. How does the transceiver know what receive bit rate to expect? I assume it can listen for the FDF and BRS bits to learn that it is a CAN FD frame with a higher bit rate for the data phase. But how does it know whether to expect, say, 2 or 5 Mbps? Would we need to know ahead of time and tell it from the application SW side?

Thanks!

  • Hello,

    You're on the right track with your debugging efforts. It sounds like the missing piece is just understanding the function of the CAN controller, the IP block in your MCU that is responsible for the data link layer (framing, error detection, etc.) of the CAN interface.

    By default, CAN controllers typically treat the lack of an acknowledgement as an indicator of a transmission problem and automatically retransmit the message after some minimum hold-off time. That is why you see more traffic with one node than with two. This automatic retransmission can usually be disabled.
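    The ~125 us spacing you measured actually lines up with this. A back-of-the-envelope sketch, assuming classical CAN error handling and assuming the controller has gone error-passive after repeated ACK errors (error-passive transmitters must wait an extra 8-bit "suspend transmission" time before retrying):

    ```python
    # Rough arithmetic check of the observed retransmission spacing.
    # Assumes 1 Mbps and classical CAN error handling; the bit counts
    # below are standard CAN field lengths, not measured values.
    US_PER_BIT = 1.0                 # 1 Mbps -> 1 us per bit

    bits_to_ack_slot = 100           # 108-bit frame minus ACK delimiter and 7 EOF bits
    error_flag = 6                   # error flag sent when the ACK slot stays recessive
    error_delimiter = 8
    intermission = 3
    suspend_transmission = 8         # extra wait required of error-passive transmitters

    retry_period = (bits_to_ack_slot + error_flag + error_delimiter
                    + intermission + suspend_transmission) * US_PER_BIT
    print(retry_period)              # 125.0 -> consistent with the ~125 us scope measurement
    ```

    So yes: the ~360 us is your while-loop period, and the ~125 us is the controller retrying on its own, with no involvement from the application SW.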

    Similarly, it is up to the controller to set the bit rate. Most transceivers, the TCAN1042 included, are simple analog devices with no internal clock and no real concept of bit rate or duration. The 5 Mbps rating just means the device can toggle its outputs fast enough, and with low enough jitter (specified as bit timing symmetry), to work with a 5 Mbps controller.
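    To make that concrete: the bit rate lives entirely in the controller's configuration. The controller divides its own clock into time quanta, and both nodes must be pre-configured with matching nominal and data rates; there is no per-frame auto-baud. The BRS bit only tells receivers when to switch to the data rate they were already configured for. A minimal sketch with hypothetical numbers (actual register names and values vary by MCU):

    ```python
    # How a CAN controller derives its bit rate from its own clock.
    # The prescaler and segment values here are illustrative, not from
    # any particular MCU.
    def bit_rate(clock_hz, prescaler, tseg1, tseg2):
        """One bit = sync segment (1 tq) + tseg1 + tseg2 time quanta."""
        tq_per_bit = 1 + tseg1 + tseg2
        return clock_hz / (prescaler * tq_per_bit)

    # 40 MHz controller clock, 20 time quanta per bit -> 1 Mbps arbitration phase
    print(bit_rate(40_000_000, 2, 15, 4))      # 1000000.0
    # Same clock, shorter bit -> 5 Mbps data phase after the BRS bit
    print(bit_rate(40_000_000, 1, 5, 2))       # 5000000.0
    ```

    The transceiver never sees any of this; it just mirrors TXD onto the bus and the bus onto RXD at whatever rate the controller drives.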

    Regards,

    Max

  • I guess I should have realized the transceiver is just that limited, PHY-level part of the chain by looking at the functional block diagram in the datasheet.

    Thanks!