How can I maximize or optimize the range of voltages the ADC can sample for a highly sensitive system?
Let me start by providing a little background on my project. I'm working on developing an avalanche transceiver, a search and rescue device used to help locate buried avalanche victims. A transceiver in transmit mode sends a 457 kHz pulse for ~70 ms every 400-800 ms (depending on the model). This generates an electromagnetic flux pattern that can be detected by a transceiver in receive mode. Modern transceivers are equipped with 2 or more ferrite rod loopstick antennas.
Currently my design consists of a loopstick antenna feeding an AD8310 logarithmic amplifier. The log amp is needed because the received signal strength falls off steeply (roughly with the cube of distance in the near field) as the receiver moves away from the transmitter. The output voltage (VOUT) of the AD8310 ranges from 0.4 V to ~2.6 V. From what I can tell, this output range pairs nicely with the input range of the STM32's ADC. However, ferrite rod loopstick antennas are highly susceptible to noise from cellphones, radios, etc. So my plan was to feed the AD8310's VOUT through an active low-pass filter with a gain of ~2 to clean up the signal as much as possible and stretch VOUT's range to 0.8 V-~5.2 V. My thought was that the wider output range would help the firmware differentiate changes in true signal strength from noise. But if the STM32's ADC input only accepts 0 V to 3.6 V, then the gain from the low-pass filter is useless, right? Is there any way to extend the input range of the ADC? Or is my thought process way off? Are there any other options?
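To show where my confusion comes from, here's the back-of-envelope check I did on how many ADC counts each option actually uses. The 12-bit resolution and 3.3 V reference are assumptions on my part (not measured from my board), just to make the arithmetic concrete:

```python
# Rough sanity check: ADC counts used by the raw AD8310 output vs. the x2 version.
# ASSUMPTIONS (not from my actual hardware): 12-bit ADC, 3.3 V reference.
VREF = 3.3              # ADC reference voltage in volts (assumed)
BITS = 12               # ADC resolution in bits (assumed)
LSB = VREF / (2 ** BITS)  # volts per ADC count

def counts(v_lo, v_hi):
    """ADC counts spanned by a signal between v_lo and v_hi volts,
    after clipping to the ADC's 0..VREF input range."""
    lo = min(max(v_lo, 0.0), VREF)
    hi = min(max(v_hi, 0.0), VREF)
    return round((hi - lo) / LSB)

raw = counts(0.4, 2.6)        # AD8310 VOUT fed straight to the ADC
scaled_x2 = counts(0.8, 5.2)  # after a gain of ~2: top of the range clips at VREF

print(f"LSB = {LSB * 1e3:.3f} mV")
print(f"raw   0.4-2.6 V spans {raw} counts")
print(f"x2    0.8-5.2 V spans {scaled_x2} counts, but everything above {VREF} V clips")
```

By this math the gain of 2 gains me a few hundred counts at the bottom of the range while throwing away the entire top of the log amp's output, which is what makes me suspect the filter gain is counterproductive as-is.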
Thanks for taking the time to read this and happy holidays!
