parth kothiya
Associate III
February 15, 2022
Solved

0-10V (input) DC to 0-3.3V (output) DC analogue linear conversion circuit?

  • February 15, 2022
  • 1 reply
  • 5713 views

I want to convert 0-10V into 0-3.3V linearly for an analogue input of an STM32 MCU. Is there an op-amp circuit that can do this for this application?

    This topic has been closed for replies.
    Best answer by Peter BENSCH

    It depends on what you want to do with the output voltage: should it be measured or is it used for further processing with an internal comparator or opamp?

    If it is to be measured: does it necessarily have to be converted to 3.3V, or can you also accept a lower output voltage?

    Staying with the measurement: both options have advantages and disadvantages:

    • If the maximum output voltage is the same as the reference voltage, you use the full range of the ADC, resulting in maximum resolution.
    • If the maximum output voltage is smaller than the reference voltage, you can correct the measurement result with a fixed factor, but this reduces the usable resolution (see the sketch below).
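
    For illustration only - a minimal sketch (my numbers, not from the original answer) of what the second case costs in resolution, assuming a 12-bit ADC, VREF = 3.3V and a hypothetical divider whose full-scale output is 2.5V:

        /* Sketch only: effective resolution when the divider's full-scale output
         * (here an assumed 2.5 V) stays below VREF = 3.3 V on a 12-bit ADC. */
        #include <stdio.h>

        int main(void)
        {
            const float vref   = 3.3f;           /* ADC reference voltage [V]     */
            const float v_full = 2.5f;           /* assumed full-scale at the pin */
            const float codes  = 4096.0f;        /* 12-bit converter              */

            float used = codes * v_full / vref;  /* codes actually exercised      */
            printf("usable steps: %.0f of %.0f\n", used, codes);  /* ~3103 of 4096 */
            return 0;
        }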

    Let's assume for the sake of simplicity that you want to measure the signal converted from 10V to 3.3V. Then you can, for example, reduce the 10V to 3.3V with a voltage divider of about 67:33 (e.g. 68k + 33k, or a better optimized ratio), then buffer it with a unity-gain buffer amplifier as described in Wikipedia (fig. 3), and then connect it to the ADC input.
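
    The numbers for that divider work out like this (again just a sketch of mine, assuming a 12-bit ADC with VREF = 3.3V): 10V x 33k/(68k + 33k) ≈ 3.27V at the ADC pin, which the firmware can scale back to the 0-10V range:

        /* Sketch only: reading the 0-10 V input back through a 68k (top) / 33k
         * (bottom) divider and a unity-gain buffer, 12-bit ADC, VREF = 3.3 V. */
        #include <stdint.h>

        #define R_TOP    68000.0f
        #define R_BOTTOM 33000.0f
        #define VREF     3.3f
        #define ADC_MAX  4095.0f

        static float input_volts_from_adc(uint16_t raw)
        {
            float gain  = R_BOTTOM / (R_TOP + R_BOTTOM); /* ~0.327: 10 V -> ~3.27 V  */
            float v_adc = ((float)raw / ADC_MAX) * VREF; /* voltage at the ADC pin   */
            return v_adc / gain;                         /* back to the 0-10 V scale */
        }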

    If the opamp is supplied by VDD = 3.3V, you have to make sure that it is an RRIO amplifier, i.e. rail-to-rail input and output, like e.g. the TSZ121 or TSZ181. Of course you can also use other types with a larger offset, but don't forget that the offset is an error in the output signal.
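
    To put the offset into perspective (my illustration with assumed numbers - please check the actual datasheets): one LSB of a 12-bit ADC at VREF = 3.3V is about 0.8mV, so a zero-drift RRIO part like the TSZ121 (offset in the microvolt range) contributes practically nothing, while a generic op-amp with a few millivolts of offset already costs several counts:

        /* Sketch only: ADC error caused by the op-amp input offset voltage.
         * The offset values below are assumptions - use the datasheet figures. */
        #include <stdio.h>

        int main(void)
        {
            const float lsb    = 3.3f / 4096.0f; /* ~0.81 mV per 12-bit step                */
            const float vos_zd = 5e-6f;          /* assumed: zero-drift RRIO (TSZ121 class) */
            const float vos_gp = 3e-3f;          /* assumed: generic op-amp, a few mV       */

            printf("offset error: %.3f LSB (zero-drift), %.1f LSB (generic)\n",
                   vos_zd / lsb, vos_gp / lsb);  /* ~0.006 LSB vs ~3.7 LSB */
            return 0;
        }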

    Good luck!

    Regards

    /Peter


    Peter BENSCH
    Technical Moderator
    February 15, 2022

    If the problem is resolved, please mark this topic as answered by selecting Select as best. This will help other users find that answer faster.

    /Peter

    parth kothiya
    Associate III
    February 16, 2022

    Can you please suggest a circuit schematic for this?

    Voltage divider > voltage follower for impedance matching.

    If I want to add protection here, the output voltage must stay between 0 and 3.3V, because anything above 3.3V will damage the STM32 ADC pin. How can I add this protection? Is there any circuit schematic for it?