CC2564C: CVSD data path in HFP non-assisted mode

Part Number: CC2564C
Other Parts Discussed in Thread: CC2564

Hi,

We are using the CC2564C with an STM32 controller and the Bluetopia stack.

Our device needs to support A2DP, HFP, and BLE, so we are not using any of the assisted modes.

Our audio data path is: BT chip -> HCI UART -> STM32 (decoder + mixer) -> I2S -> codec IC -> speaker. We are not using any of the audio pins (I2S/PCM).
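
To make that concrete, here is a minimal sketch of the STM32 output side of the path, assuming an STM32F4-class part with the HAL I2S driver and a DMA stream configured in circular mode; the I2S handle name, buffer size, and the pcm_mixer_pull() hook are placeholders rather than our actual firmware:

    /* Decoded/mixed PCM -> I2S -> codec IC, using a double buffer refilled from
     * the HAL DMA callbacks.  Assumes the I2S peripheral is set up for 16-bit
     * data and its DMA stream runs in circular mode. */
    #include "stm32f4xx_hal.h"   /* adjust to the actual STM32 family in use */
    #include <stddef.h>
    #include <stdint.h>

    #define PCM_FRAME_SAMPLES 240          /* placeholder block size */

    extern I2S_HandleTypeDef hi2s3;        /* I2S instance wired to the codec IC */
    static int16_t i2s_buf[2][PCM_FRAME_SAMPLES];

    /* Hypothetical decoder+mixer hook: fills 'dst' with the next block of PCM. */
    extern size_t pcm_mixer_pull(int16_t *dst, size_t max_samples);

    void audio_out_start(void)
    {
        /* Prime both halves, then start one circular DMA transfer to the codec. */
        pcm_mixer_pull(i2s_buf[0], PCM_FRAME_SAMPLES);
        pcm_mixer_pull(i2s_buf[1], PCM_FRAME_SAMPLES);
        HAL_I2S_Transmit_DMA(&hi2s3, (uint16_t *)i2s_buf, 2 * PCM_FRAME_SAMPLES);
    }

    /* DMA has finished sending the first half: refill it while the second plays. */
    void HAL_I2S_TxHalfCpltCallback(I2S_HandleTypeDef *hi2s)
    {
        if (hi2s == &hi2s3)
            pcm_mixer_pull(i2s_buf[0], PCM_FRAME_SAMPLES);
    }

    /* DMA has wrapped around: refill the second half while the first plays. */
    void HAL_I2S_TxCpltCallback(I2S_HandleTypeDef *hi2s)
    {
        if (hi2s == &hi2s3)
            pcm_mixer_pull(i2s_buf[1], PCM_FRAME_SAMPLES);
    }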

In the datasheet I was able to find details about the assisted mSBC and assisted A2DP configurations, but I didn't find anything about the CVSD data path.

We were able to use the stack-provided mSBC and SBC APIs to decode and encode mSBC (WBS) and SBC (A2DP) data.
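
For illustration, the WBS receive path on our side looks roughly like the sketch below. msbc_decode_frame() and pcm_mixer_push() are hypothetical placeholders for the stack-provided mSBC decode call and our mixer (the exact codec API names differ between Bluetopia releases), and the H2 sync header and packet-loss concealment are left out:

    /* Unassisted WBS receive path: mSBC frames arriving over the HCI UART are
     * decoded to 16 kHz PCM and handed to the mixer feeding the I2S output. */
    #include <stddef.h>
    #include <stdint.h>

    #define MSBC_FRAME_BYTES 57   /* one mSBC frame, H2 header not included */
    #define MSBC_PCM_SAMPLES 120  /* 7.5 ms of 16-bit mono PCM at 16 kHz    */

    /* Hypothetical wrappers; map these onto the codec calls your SDK provides. */
    extern int  msbc_decode_frame(const uint8_t *frame, size_t frame_len,
                                  int16_t *pcm_out, size_t max_samples);
    extern void pcm_mixer_push(const int16_t *pcm, size_t samples);

    /* Called with the payload of each incoming audio indication during a WBS call. */
    void wbs_audio_rx(const uint8_t *data, size_t len)
    {
        int16_t pcm[MSBC_PCM_SAMPLES];

        /* A payload may carry several back-to-back frames; walk through them. */
        while (len >= MSBC_FRAME_BYTES) {
            if (msbc_decode_frame(data, MSBC_FRAME_BYTES, pcm, MSBC_PCM_SAMPLES) > 0)
                pcm_mixer_push(pcm, MSBC_PCM_SAMPLES);

            data += MSBC_FRAME_BYTES;
            len  -= MSBC_FRAME_BYTES;
        }
    }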

My question is this: suppose we are not using the audio pins of the BT controller, and we are using the coprocessor to support the BLE feature.

Now, if I choose CVSD for HFP, what will the data flow be?

In that case, what data will I get in the etHFRE_Audio_Data_Indication event: CVSD data or PCM data?

Thanks & Regards

Vishnuprasad V

  • Hi Vishnuprasad,

    - NB and WB HFP audio is synchronous and should use the PCM interface, not HCI. SCO/eSCO (synchronous data) over HCI (a serial, asynchronous transport) is known to have stability and voice-quality issues. PCM is a synchronous interface, and we have a stable, working solution for both NBS and WBS over PCM.

    - NB HFP over PCM is allowed and stable, and it is very easy to demonstrate using our SDK examples. It is unassisted and does not actually use the co-processor. NB HFP uses CVSD over the air and can be used simultaneously with BLE.

    - WB HFP uses the co-processor and cannot be used simultaneously with BLE. WB HFP uses the mSBC codec over the air interface.

    - PCM data is simply defined by sample rate and width.

    - A2DP is not synchronous data, since it is buffered, so it should use the HCI-UART interface. We have SDK sample applications demonstrating this as well.

    Regards,

    Travis

  • Thanks, Travis, for the reply.

    I agree with your statements.

    For certain reasons we have to use it in the way I described, and I accept that there will be voice-quality issues and challenges.

    We are relying only on the data coming over the HCI UART lines.

    Based on what you said, may I draw the two conclusions below?

    1- When we are using NBS, the etHFRE_Audio_Data_Indication event will give us PCM data (via HCI UART)? (The CVSD-to-PCM conversion happens at a lower level, inside the chip itself.)

    2- When we are using WBS, the etHFRE_Audio_Data_Indication event will give us mSBC-encoded data (via HCI UART), and we then have to decode it with the stack-provided mSBC decode functions to get the PCM data (roughly as in the sketch below)?
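
    In other words, I expect the handling on our side to look roughly like this; the handler below is hypothetical, wbs_audio_rx() and pcm_mixer_push() are the placeholder helpers from my earlier post, and the wideband flag would come from the codec negotiated for the current audio connection rather than from the event itself:

        /* Hypothetical host-side handling of the HFP audio indication when all
         * audio flows over the HCI UART. */
        #include <stddef.h>
        #include <stdint.h>

        extern void pcm_mixer_push(const int16_t *pcm, size_t samples);
        extern void wbs_audio_rx(const uint8_t *data, size_t len);

        void hfp_handle_audio_indication(const uint8_t *audio_data, size_t len,
                                         int wideband_active)
        {
            if (!wideband_active) {
                /* NBS: the controller has already converted CVSD to linear PCM,
                 * so the payload can be mixed directly. */
                pcm_mixer_push((const int16_t *)audio_data, len / sizeof(int16_t));
            } else {
                /* WBS, unassisted: the payload is mSBC and must be decoded first. */
                wbs_audio_rx(audio_data, len);
            }
        }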

    Thanks and regards.

  • Hi,

    Correct. If I'm not mistaken, for both NBS and WBS the host receives an etHFRE_Audio_Data_Indication event in the HFP callback, letting the host know there is audio data.

    For NBS, the CVSD-to-PCM conversion does happen at the lower level - you are correct.

    For WBS, if you are operating in assisted mode, the CC2564 handles the mSBC encoding and decoding to offload it from the host and sends the audio data over PCM. If you use unassisted mode, the host is expected to handle the mSBC encoding/decoding.
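
    For the unassisted case, the transmit direction mirrors this: the host encodes microphone PCM to mSBC before handing the frames back to the stack for the SCO/eSCO link. A minimal sketch, with msbc_encode_frame() and hfp_send_audio() as hypothetical stand-ins for the stack's encoder and audio-send calls:

        /* Unassisted WBS transmit path: microphone PCM is encoded to mSBC on the
         * host and passed back to the stack for transport to the remote side. */
        #include <stddef.h>
        #include <stdint.h>

        #define MSBC_PCM_SAMPLES 120  /* 7.5 ms of 16-bit mono PCM at 16 kHz */
        #define MSBC_FRAME_BYTES 57   /* encoded size of one mSBC frame      */

        /* Hypothetical wrappers around the stack-provided encode and send calls. */
        extern size_t msbc_encode_frame(const int16_t *pcm, size_t samples,
                                        uint8_t *frame_out, size_t max_bytes);
        extern int    hfp_send_audio(const uint8_t *data, size_t len);

        /* Called for every block of microphone PCM while a WBS call is active. */
        void wbs_audio_tx(const int16_t *mic_pcm, size_t samples)
        {
            uint8_t frame[MSBC_FRAME_BYTES];

            while (samples >= MSBC_PCM_SAMPLES) {
                size_t n = msbc_encode_frame(mic_pcm, MSBC_PCM_SAMPLES,
                                             frame, sizeof(frame));
                if (n > 0)
                    hfp_send_audio(frame, n);

                mic_pcm += MSBC_PCM_SAMPLES;
                samples -= MSBC_PCM_SAMPLES;
            }
        }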

    -Jesus

  • Thanks Jesus,

    This is exactly the answer I was looking for.

    Thanks