Hello!
We are developing an IoT system that uses ultrasonic flowmeters among its sensors.
I need to understand the algorithms involved so that I can debug problems as they arise.
For this reason, I connected the FR6047 eval board to my computer and ran water through a tube fitted with two ultrasonic transducers.
I am currently facing a behavior that I cannot explain and which potentially points to my lack of understanding of the algorithms involved.
I was under the impression that

Volume flow rate = Meter constant * dTOF / ( absTOF1 * absTOF2 )

and that dTOF is calculated by different means than absTOF1/2, perhaps to improve noise immunity (even though, mathematically speaking, one could simply calculate the two absolute times of flight and subtract one from the other).
This agrees with the information presented here:
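To make sure I am reading the formula correctly, here is a minimal Python sketch of how I understand it. The function name and all the example numbers are made up by me; the real meter constant depends on the pipe geometry.

```python
def volume_flow_rate(meter_constant, d_tof, abs_tof_up, abs_tof_dn):
    """Volume flow rate = K * dTOF / (absTOF1 * absTOF2),
    as I currently understand the relation."""
    return meter_constant * d_tof / (abs_tof_up * abs_tof_dn)

# Illustrative values only: K = 1e-3, dTOF = 50 ns,
# absTOF roughly 70 us in each direction.
q = volume_flow_rate(1e-3, 50e-9, 70e-6, 70e-6)
print(q)  # flow rate in whatever units K is calibrated for
```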
So I lowered the "envelope crossing threshold" until it was buried in the noise. This causes the absTOF values to jump, sometimes resulting in the "DToF - Shift value was greater than maxSampleShift" error.
This is to be expected. What I did NOT expect is that it also changes the result of the dToF calculation. Shouldn't this parameter affect only the absolute times of flight?
Why am I observing this behavior? Is the dToF not calculated using correlation? That would not make sense either, since I sometimes get correlation errors when I change the settings.
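For context, my mental model of correlation-based dTOF estimation is roughly the following Python sketch: cross-correlate the upstream and downstream captures, take the peak lag, and refine it with parabolic sub-sample interpolation. The sampling rate, burst shape, and interpolation choice are my assumptions, not necessarily what the FR6047 library actually does.

```python
import numpy as np

def dtof_by_correlation(up, dn, fs):
    """Estimate the delay of `dn` relative to `up` (seconds) via
    cross-correlation plus parabolic sub-sample interpolation."""
    corr = np.correlate(dn, up, mode="full")
    lags = np.arange(-(len(up) - 1), len(dn))
    k = int(np.argmax(corr))                 # integer-sample peak lag
    # Parabolic fit through the peak and its neighbors for sub-sample lag
    if 0 < k < len(corr) - 1:
        y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]
        denom = y0 - 2 * y1 + y2
        delta = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
    else:
        delta = 0.0
    return (lags[k] + delta) / fs

# Self-check with a synthetic windowed burst shifted by a known amount
fs = 8e6                                     # assumed 8 MHz sampling
t = np.arange(256) / fs
burst = np.sin(2 * np.pi * 1e6 * t) * np.hanning(256)
shift = 5                                    # samples
dn = np.roll(burst, shift)
est = dtof_by_correlation(burst, dn, fs) * fs
print(est)  # should recover a lag close to `shift`
```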
Also, what's the difference between USS_Alg_dToF_Calculation_Option_water and USS_Alg_dToF_Calculation_Option_estimate?
Thank you in advance for the clarification!
Best regards,
David Šibrava