I need to measure the current from a photodiode that has 80 pF of capacitance. The diode's current can be as low as 0.1 µA. I only care about the static brightness of the light, not an AC signal. Because my signal is DC, I don't think noise really affects my application, so I'm fine with a two-stage design (although fewer components and lower cost are always preferred).
The light is only turned on for 10 microseconds to be measured, so the amplifier needs to settle in roughly 5 µs to leave time for the A/D readings. I don't have a bandwidth requirement as such, just this settling-time requirement. My understanding is that a 5 µs settling time requires an effective bandwidth somewhere between 600 kHz and 2 MHz.
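For what it's worth, here is the back-of-envelope reasoning behind that range (a rough sketch: the k = 3 to 10 rule-of-thumb constants and the single-pole settling model are my own assumptions, not from a datasheet):

```python
import math

t_settle = 5e-6  # seconds available for the output to settle

# Common rule of thumb: closed-loop bandwidth ~ k / t_settle with k ~ 3..10
bw_lo = 3 / t_settle    # 600 kHz
bw_hi = 10 / t_settle   # 2 MHz

# Cross-check with a single-pole model: settling to within 1 LSB of an
# N-bit ADC takes ln(2^N) time constants, where tau = 1/(2*pi*f3dB)
def single_pole_bw(bits, t_s=t_settle):
    return math.log(2 ** bits) / (2 * math.pi * t_s)

print(f"rule of thumb: {bw_lo/1e3:.0f} kHz to {bw_hi/1e6:.0f} MHz")
print(f"single-pole, 12-bit settling: {single_pole_bw(12)/1e3:.0f} kHz")
```

The pure single-pole estimate comes out lower (around 265 kHz for 12-bit settling), so I take the 600 kHz to 2 MHz range as including margin for second-order peaking and ringing in a real TIA.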
Obtaining a 100 mV output from 0.1 µA would require a 1 MΩ feedback resistor. According to equation 4 of AN-1803, reaching 600 kHz would then require a GBWP of about 180 MHz, and 2 MHz would require about 2000 MHz. Using a smaller feedback resistor might allow a slower op amp, but I worry about how small an output voltage is really measurable.
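To show where those GBWP numbers come from, here is the arithmetic as I understand it (a sketch using the usual Butterworth-compensated TIA relation f3dB = sqrt(GBW / (2π·Rf·C)); I'm neglecting the feedback capacitor and the op amp's own input capacitance):

```python
import math

C_in = 80e-12   # photodiode capacitance, farads
R_f  = 1e6      # feedback resistor: 0.1 uA -> 100 mV

def required_gbw(f3dB, Rf=R_f, C=C_in):
    """GBW needed so that f3dB = sqrt(GBW / (2*pi*Rf*C)) for a Butterworth TIA."""
    return (f3dB ** 2) * 2 * math.pi * Rf * C

print(f"600 kHz -> GBW {required_gbw(600e3)/1e6:.0f} MHz")
print(f"2 MHz   -> GBW {required_gbw(2e6)/1e6:.0f} MHz")
print(f"600 kHz with Rf = 100k -> GBW {required_gbw(600e3, Rf=100e3)/1e6:.0f} MHz")
```

Note the trade-off: dropping Rf from 1 MΩ to 100 kΩ cuts the required GBW tenfold, but also cuts the output to 10 mV, which is my concern about measurability.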
I'm still trying to understand some of the basics of transimpedance amplifier design. Obviously there are some very fast op amps out there, but there are other parameters to consider: input bias current, input offset current/voltage, and voltage/current noise versus sqrt(frequency). I'm trying to understand how these factors will impact the accuracy of the amplifier. How can I calculate how much each of these parameters matters?
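To make the question concrete, this is the kind of DC error budget I think applies (a sketch only: the Ib and Vos values below are placeholders rather than any particular part's specs, and I'm assuming the DC noise gain seen by Vos is roughly 1 because the photodiode's shunt resistance is very high):

```python
R_f   = 1e6     # feedback resistor, ohms
I_sig = 0.1e-6  # photocurrent of interest, amps

I_b  = 1e-9     # example input bias current (typical of a FET-input op amp)
V_os = 0.5e-3   # example input offset voltage

V_signal = I_sig * R_f   # 100 mV full output
V_err_ib = I_b * R_f     # bias current flows through Rf: 1 mV of output error
V_err_os = V_os * 1.0    # offset voltage at ~unity DC noise gain: 0.5 mV

err_fraction = (V_err_ib + V_err_os) / V_signal
print(f"signal {V_signal*1e3:.0f} mV, DC error "
      f"{(V_err_ib + V_err_os)*1e3:.2f} mV ({err_fraction:.1%})")
```

If this framing is right, each DC error term simply converts to an output voltage (Ib through Rf, Vos times the DC noise gain) and gets compared against the 100 mV signal; the noise-density terms would be integrated over the closed-loop bandwidth separately.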
I was looking at another post, JFET + transimpedance amplifier, where Julien TAIEB was trying to measure a 0.1 µA signal with an OPA847IDR op amp. However, that part lists -19 µA typical input bias current and 0.1 µA typical input offset current. I would have thought that either of these input-current errors makes it impossible to measure a signal as small as 0.1 µA.
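Putting numbers on that concern (using the typical values quoted above and my 1 MΩ resistor; whether a matched resistor on the noninverting input really cancels Ib down to the offset current depends on how well the part's bias currents track, which I'm taking on faith here):

```python
R_f   = 1e6      # 1 Mohm feedback resistor from earlier
I_b   = 19e-6    # OPA847 typical input bias current (magnitude)
I_os  = 0.1e-6   # OPA847 typical input offset current
I_sig = 0.1e-6   # photocurrent to be measured

V_err_raw = I_b * R_f    # 19 V of output offset -- saturates the amplifier outright
# With a resistor equal to R_f on the noninverting input, matched bias
# currents cancel and only the offset current remains:
V_err_cancelled = I_os * R_f   # 0.1 V -- still the same size as the signal
V_sig = I_sig * R_f            # 0.1 V

print(V_err_raw, V_err_cancelled, V_sig)
```

So even with bias-current cancellation, the residual offset-current error equals the signal itself, which I assume is why that post put a JFET buffer in front of the OPA847.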