Stanfi
Electrical
- Oct 11, 2004
- 71
I have an application where a 0-36 mV DC signal goes into an instrumentation amplifier, and I set the gain to give roughly 0 to 10 VDC out.
In my application the lowest my mV signal may go is 20 mV, so my usable signal is only 20 to 36 mV, a span of 16 mV. I would like to be able to ignore this 20 mV and scale my input so that I amplify 0 to 16 mV and get 0 to 10 V out. I am limited on my output, and I cannot get enough output swing to offset the signal there, so I would like to find a way to do it on the input, but without introducing unnecessary distortion.
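For concreteness, here is the target transfer function worked out from the numbers above (a minimal sketch, assuming an ideal input-referred offset subtraction; the gain follows as 10 V / 16 mV = 625):

    # Sketch of the intended scaling: V_out = GAIN * (V_in - V_OFFSET)
    V_OFFSET = 0.020          # 20 mV signal floor to be removed at the input
    V_SPAN = 0.036 - 0.020    # 16 mV usable span
    GAIN = 10.0 / V_SPAN      # 625 V/V for a 10 V full-scale output

    def v_out(v_in):
        """Ideal transfer function with the 20 mV floor removed at the input."""
        return GAIN * (v_in - V_OFFSET)

    print(v_out(0.020))  # -> 0.0 V at the 20 mV floor
    print(v_out(0.036))  # -> 10.0 V at full scale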
Any suggestions would be appreciated.