Hi, our application is a wearable device that delivers vibration alarms to the user; it is mounted on the arm with a special adhesive tape. The unit contains a 1.8 Vrms, 235 Hz LRA. Currently, the device runs LRA auto-calibration every time it is plugged in for recharging. However, we suspect this sometimes produces sub-optimal parameters for when the device is later unplugged and attached to the arm, since the mechanical conditions during charging differ from those on the arm.
We have been exploring two options for changing how our software uses auto-calibration:
1. Find optimal parameters based on externally measured vibration amplitude, hard-code those calibration parameters into every device at the factory, and stop using auto-calibration after that. We assume all units are built very similarly to each other.
2. Detect when the device is attached to the arm and run auto-calibration each time that happens, so the parameters are always optimized for the current mounting. The problem with this is that we cannot be sure auto-calibration will run under the expected conditions every time.
In our testing, we found that hard-coded calibration parameters consistently give slightly higher vibration amplitude (+6~7% on average) than auto-calibration. From reading the datasheet, there seems to be no clear answer as to which approach is better.
So I was wondering whether there is any additional data supporting one approach over the other. Any recommendations based on your experience?