Background: We are developing a product that will likely include the TCAN1042GV-Q1. We wanted to run some EMI tests, so we set up a simple application SW that loops an 8-byte CAN message, each byte being 0x55, to get a lot of toggling. In the test setup there is only one transmitting node and one receiving node. Looking at the resulting traffic on the oscilloscope, I was surprised to see that when disconnecting the receiving node (leaving only a terminated cable attached to the transmitter) I got a lot more traffic. Looking further, I could see it was due to the frames repeating more often.
1. The SW engineer says the SW is simply a while loop that repeats the 8-byte message without any intentional delays. Looking at the result on the oscilloscope, I see the message sent every ~360 us when there is a receiving node setting the ACK bit dominant. Disconnecting this node (so that no node sets the ACK bit), the message is instead sent every ~125 us. The message as such is 108 bits long (Start of Frame bit to last End of Frame bit), all transmitted at 1 Mbps. Could it be that 360 us is the time it takes for the while loop to repeat, while the explanation for the more frequent resend without a receiver is that the transmitter itself decides (with no involvement of the application SW while loop) to resend the message within ~20 us after noting that no ACK bit was set?
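For reference, a quick back-of-the-envelope check of these numbers (pure arithmetic, assuming the 108-bit frame and 1 Mbps rate stated above; the interpretation of the 125 us case is my guess, not a confirmed mechanism):

```python
# Sanity check of the scope timings, assuming a 108-bit frame
# (SOF through last EOF bit) transmitted at 1 Mbps throughout.

BIT_RATE = 1_000_000   # bits per second (1 Mbps)
FRAME_BITS = 108       # SOF to last EOF bit, per the measurement

frame_time_us = FRAME_BITS / BIT_RATE * 1e6
print(frame_time_us)   # 108.0 -> each frame occupies 108 us on the bus

# With a receiver ACKing, the repeat period is ~360 us, so the
# software while loop would account for roughly this much overhead:
loop_overhead_us = 360 - frame_time_us
print(loop_overhead_us)  # 252.0 us of loop/driver overhead per frame

# Without any ACK, a ~125 us period is close to the frame time plus
# a short gap. If the CAN controller aborts at the ACK error, emits
# an error frame, and retransmits automatically (with no round trip
# through the application loop), a period in this range would follow.
```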
Questions 2 and 3 are closely related and are about understanding how the bit rate is set / known. I assume they might be basic for anyone more familiar with the SW / driver side.
2. The transceiver is CAN FD compliant, with data bit rates up to 5 Mbps. Yet my complete frame is sent at 1 Mbps. Is this a parameter in the API / driver, i.e. would the SW engineer pass some setting (reaching the transceiver through the TXD pin) along with the message to send in order to transmit a CAN FD frame instead?
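To make the question concrete, here is a hypothetical sketch of how I imagine the bit rates being configured on the driver side. I am using python-can (a common Python CAN library) purely as an illustration; the interface and channel names are assumptions about a setup we do not actually have:

```python
# Illustrative driver configuration for a CAN FD bus (python-can
# style). My understanding to be confirmed: both bit rates are
# controller/driver settings decided up front, not something the
# transceiver itself selects per frame.
config = {
    "interface": "socketcan",    # hypothetical; depends on the HW/OS
    "channel": "can0",           # hypothetical channel name
    "fd": True,                  # enable CAN FD framing
    "bitrate": 1_000_000,        # nominal (arbitration) phase: 1 Mbps
    "data_bitrate": 5_000_000,   # data phase after the BRS bit: 5 Mbps
}
# On real hardware the bus would then be opened with:
#   bus = can.Bus(**config)
```

Is this roughly the right mental model, i.e. the application/driver fixes both rates ahead of time and the transceiver just passes the resulting TXD waveform through?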
3. How does the receiving side know what bit rate to expect? I assume it can listen for the FDF bit and the BRS bit to learn that it is a CAN FD frame with a higher bit rate for the data phase. But how does it know whether to expect 2 or 5 Mbps, for instance? Would we need to know this ahead of time and configure it from the application SW side?
Thanks!