Dear people of the TI E2E community,
everyone can do regression analysis in Excel,
and a few can implement such an algorithm on a microcontroller.
However, doing it the straightforward way becomes bulky and slow once you have lots of data.
I am not well informed about the state of the art in fast regression analysis algorithms —
are you?
To be more specific:
There are two ways regression analysis can be too slow:
- Lots and lots of data, and you don't want to spend days just calculating.
- You are doing regression analysis on a live/real-time signal where time is critical and your sensor has a high sample rate (you can't afford to lose too much time to bulky algorithms).
I am confronted with problem 2, but I believe that solutions to problem 1 could already help with problem 2.
My question:
There should be a commonly used fast version of regression analysis (similar to how the fast Fourier transform speeds up the Fourier transform).
Would you be so kind as to point me in the right direction?
(Approved references, or any literature you happen to know of, would be great as well.)
I tried to ask this question in the microcontroller forum,
but was told to ask in a more general forum.
I hope I am in the right place this time.
I am looking forward to your answers!
Also please let me know if this is the wrong place for such a question.
Best regards
Merlin