Hi
We are designing a system that drives LEDs using PWM. The system is supplied with 5V. Several MOSFETs drive the LEDs, and the MOSFET gates are driven directly by the microcontroller.
To power the microcontroller, we use the TPS76333DBVR, which regulates the 5V down to 3.3V. The PWM frequency is 200kHz, with the duty cycle (on time) varying from 10% to 100% depending on the selection. Between 25% and 70% on time, the 3.3V rail rises to 3.52V or even 3.6V. We suspect this is due to the high-frequency switching coupling into the supply voltage.
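For reference, here is our rough back-of-envelope sketch of one mechanism we are considering: if the LDO can only source current, charge coupled back into the 3.3V rail through the MOSFETs' gate-drain (Miller) capacitance on each switching edge cannot be sunk by the regulator and could pump the rail up. The gate-drain capacitance and drain voltage swing below are assumed values for illustration only, not measured from our board:

```python
# Back-of-envelope estimate of charge injected into the 3.3V rail per PWM
# edge through a MOSFET's gate-drain (Miller) capacitance.
# C_GD and V_DS_SWING are ASSUMED values for illustration, not from our
# schematic; F_SW and C_RAIL come from our setup.

F_SW = 200e3        # PWM switching frequency (Hz), from our setup
C_GD = 100e-12      # assumed gate-drain (Miller) capacitance (F)
V_DS_SWING = 5.0    # drain swings across the 5 V LED supply (V)
C_RAIL = 2 * 10e-6  # C120 + C121, 10 uF each (F)

# Average current pushed back toward the 3.3 V rail by Miller coupling
i_inject = C_GD * V_DS_SWING * F_SW          # amps

# Rail voltage step per switching edge if the LDO cannot sink the charge
dv_per_edge = (C_GD * V_DS_SWING) / C_RAIL   # volts

print(f"injected current ~ {i_inject * 1e3:.2f} mA")
print(f"per-edge rail step ~ {dv_per_edge * 1e6:.1f} uV")
```

Even a small injected current like this would accumulate on the rail if the LDO has no sink capability and the 3.3V load is light, which seems consistent with the rise we observe only at certain duty cycles.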
Can you please guide us on how to control this, or suggest another TI part we could use? We looked at the TPS7533, but its footprint is too big for our application.
Attached are DSO screenshots, zoomed out and zoomed in, showing the PWM output from the microcontroller. The waveforms were captured with C120 and C121 populated as 10uF, 25V, GRM21BC71E106KE11L in the attached schematic snippet.
Filenames:
1) 1.BMP (25% On Time)
2) 2.BMP (25% On Time, Zoomed In Image)
3) 3.BMP (70% On Time)
Thanks