EDGE-AI-STUDIO: How to deploy the model with CCS and how to live preview the results

Part Number: EDGE-AI-STUDIO
Other Parts Discussed in Thread: C2000WARE, LAUNCHXL-F28P55X

Hello,

We are creating a demo of Edge AI motor fault detection. 

Here are some questions.

1. What is the function of the file "test_vector.c" located at C2000Ware_MotorControl_SDK_5_03_00_00\solutions\edge_ai_fault_detection_with_mc\motor_fault_live_preview_validation_f28p55x? Is its role to provide input data to the model deployed on the MCU and to verify the results without real sensor data input?

2. Can Live Preview be demonstrated without vibration sensor input? What preparation is required to implement Live Preview? We have the LAUNCHXL-F28P55X. Do we need to configure any other peripherals?

3. If we want to collect data, what type of vibration sensor should be used, and what physical quantity should be collected?

Best Regards.

  • Hello,

    I've brought this thread to the attention of the Edge AI Studio experts for further review.

    Thanks

    ki

  • Thanks. Regarding question 3, we just learned that we can use the ADXL355. However, we still have a question: the voltage signal collected by the ADXL is proportional to acceleration, but your data does not look like raw voltage data.

  • Hi,

    1. What is the function of the file "test_vector.c" located at C2000Ware_MotorControl_SDK_5_03_00_00\solutions\edge_ai_fault_detection_with_mc\motor_fault_live_preview_validation_f28p55x? Is its role to provide input data to the model deployed on the MCU and to verify the results without real sensor data input?

    The role of this code is to verify that the same data pre-processing is performed on the target as was done during model training.
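    As a rough illustration of what such a check can look like, here is a minimal sketch. All identifiers and sizes below (TEST_VECTOR_LEN, testInput, expectedOutput, preprocess_frame, model_run) are placeholders and assumptions, not the actual names used in test_vector.c or the SDK:

    /* Hypothetical test-vector validation sketch (placeholder names). */
    #include <stdint.h>
    #include <math.h>

    #define TEST_VECTOR_LEN  256U    /* samples per frame (assumed)       */
    #define NUM_CLASSES      4U      /* number of fault classes (assumed) */
    #define TOLERANCE        1e-3f   /* allowed deviation from reference  */

    extern const float testInput[TEST_VECTOR_LEN];   /* stored raw samples      */
    extern const float expectedOutput[NUM_CLASSES];  /* reference model outputs */

    void preprocess_frame(const float *raw, float *features); /* same steps as training */
    void model_run(const float *features, float *scores);     /* compiled NN inference  */

    int validate_test_vector(void)
    {
        float features[TEST_VECTOR_LEN];
        float scores[NUM_CLASSES];

        /* Run the on-target pre-processing on the stored input ...            */
        preprocess_frame(testInput, features);

        /* ... run inference, then compare against the PC-side reference data. */
        model_run(features, scores);

        for (uint16_t i = 0U; i < NUM_CLASSES; i++)
        {
            if (fabsf(scores[i] - expectedOutput[i]) > TOLERANCE)
            {
                return 0;   /* mismatch: target pipeline differs from training  */
            }
        }
        return 1;           /* on-target pipeline matches the training pipeline */
    }

    If a check like this passes on the LaunchPad, the on-target feature extraction and model produce the same result as the offline pipeline, independent of any live sensor.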

    2. Can Live Preview be demonstrated without vibration sensor input? What preparation is required to implement Live Preview? We have the LAUNCHXL-F28P55X. Do we need to configure any other peripherals?

    Currently we do not support live on-target inference (Live Preview) without sensor data. However, it is a feature that is on our roadmap for a future release.

     

    3. If we want to collect data, what type of vibration sensor should be used, and what physical quantity should be collected?

    For motor control sensor requirements, please refer to https://dev.ti.com/tirex/explore/node?node=A__AYjCIAmJIjRiZJ7OuRmv0w__motor_control_c2000ware_sdk_software_package__0.jXikd__LATEST
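    If you collect your own data with an accelerometer such as the ADXL355, the physical quantity to log is acceleration per axis, sampled at a fixed rate. A minimal sketch of the conversion from raw counts to g is shown below; adxl355_read_axis() is a placeholder for your own SPI/I2C driver, and the ~3.9 µg/LSB scale factor assumes the ±2 g range from the ADXL355 data sheet:

    /* Hypothetical data-collection sketch: convert raw ADXL355 counts to
     * acceleration in g and buffer them at a fixed sample rate.          */
    #include <stdint.h>

    #define SAMPLES_PER_FRAME  512U       /* frame length (assumed)          */
    #define LSB_TO_G           (3.9e-6f)  /* ADXL355, +/-2 g range (assumed) */

    extern int32_t adxl355_read_axis(uint8_t axis);  /* placeholder driver call,
                                                        returns signed 20-bit counts */

    float frame_x_g[SAMPLES_PER_FRAME];
    float frame_y_g[SAMPLES_PER_FRAME];
    float frame_z_g[SAMPLES_PER_FRAME];

    /* Call from a periodic timer ISR at the chosen sample rate. */
    void collect_sample(uint16_t n)
    {
        frame_x_g[n] = (float)adxl355_read_axis(0U) * LSB_TO_G;
        frame_y_g[n] = (float)adxl355_read_axis(1U) * LSB_TO_G;
        frame_z_g[n] = (float)adxl355_read_axis(2U) * LSB_TO_G;
    }

    Whatever unit the training data uses (acceleration in g, in this sketch) must match what is fed to the deployed model, which is the same point the test-vector check above is meant to verify.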

    Let me know if this addresses your questions.