I'm running on a Tiva C TM4C1290NCPDT at 120 MHz, using the Bluetopia FP library.
I have measured the number of CPU cycles / time it takes for the SBC codec to process a frame.
At a minimum, with interrupts turned off, encoding a frame to send out takes around 118k cycles (0.987 ms), and decoding an incoming frame takes around 142k cycles (1.184 ms).
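In case it helps, this is roughly how I'm taking the measurements: the Cortex-M4's standard DWT cycle counter, with a conversion helper that assumes the 120 MHz core clock. (The register addresses are the standard ARMv7-M DWT/CoreDebug ones; the SBC call in the usage comment is just a placeholder, not the actual Bluetopia API.)

```c
#include <stdint.h>

/* ARMv7-M DWT cycle counter registers (standard addresses on Cortex-M4). */
#define DEMCR      (*(volatile uint32_t *)0xE000EDFCu)
#define DWT_CTRL   (*(volatile uint32_t *)0xE0001000u)
#define DWT_CYCCNT (*(volatile uint32_t *)0xE0001004u)

#define CPU_HZ 120000000u   /* TM4C1290NCPDT running at 120 MHz */

void cycle_counter_init(void)
{
    DEMCR      |= (1u << 24);  /* TRCENA: enable the DWT unit */
    DWT_CYCCNT  = 0u;
    DWT_CTRL   |= 1u;          /* CYCCNTENA: start counting   */
}

uint32_t cycle_counter_read(void)
{
    return DWT_CYCCNT;
}

/* Convert a cycle count to microseconds at the 120 MHz core clock. */
uint32_t cycles_to_us(uint32_t cycles)
{
    return cycles / (CPU_HZ / 1000000u);
}

/* Usage on target (with interrupts disabled around the timed region):
 *
 *   cycle_counter_init();
 *   uint32_t start = cycle_counter_read();
 *   ... encode or decode one SBC frame here ...
 *   uint32_t us = cycles_to_us(cycle_counter_read() - start);
 */
```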
During this test, about 371 frames were decoded and about 328 frames were encoded every second, which works out to roughly 44% of CPU time spent decoding and 32% spent encoding. That is after dropping whatever could not be processed for lack of available processor time (if I am not doing both at once, I can get more data in/out). Meanwhile the processor also has to handle other things, such as the user interface and dual-SPI driving to generate the I2S. I am not sure how to measure exactly how much time is spent transmitting the data via UART, but I'm running the UART at 3 Mbit/s, if that gives you any idea how much time the processor is spending just moving data to/from the module.
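For reference, the percentages above are just straightforward arithmetic from the measured numbers; a quick sketch of the calculation (nothing here is library-specific):

```c
#include <stdint.h>

/* Percentage of CPU time consumed by a codec running at a given
 * frame rate, given the measured cycles per frame. */
double load_percent(uint32_t frames_per_sec, uint32_t cycles_per_frame,
                    uint32_t cpu_hz)
{
    return 100.0 * (double)frames_per_sec * (double)cycles_per_frame
                 / (double)cpu_hz;
}

/* With the numbers measured above, at a 120 MHz core clock:
 *   decode: load_percent(371, 142000, 120000000)  ~ 43.9 %
 *   encode: load_percent(328, 118000, 120000000)  ~ 32.3 %
 */
```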
It seems that I must encode or decode frame by frame; I cannot hand it a large chunk of data to amortize the overhead of setting up the call (I receive 8 frames at a time). I don't know how much this overhead costs, but at this many frames per second, I'm sure that every bit counts.
Is this the expected performance of the SBC encoder/decoder included with the TI/Bluetopia Bluetooth stack?
Is there anything I can do to decrease the time it takes to encode and decode? Decreasing the bitpool did not seem to have any effect.