
Instrument low range accuracy

(OP)
Folks,

First, excuse me if I'm asking something stupid; in my defense, I'm more of a software guy.

So I have a test stand with a 1000A and a 50A current transducer (I'll refer to them as CTs; both are Hall-effect sensors). I need to measure currents up to 900A, but I also need to measure low currents below 10A accurately. So the plan was to measure the high currents with the 1000A CT (0.5% accuracy) and the low currents with the 50A CT (1% accuracy).
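In code terms, the plan was roughly this (the 10A crossover threshold and the names here are just placeholders I made up, not my actual DAQ API):

    def pick_reading(i_1000a, i_50a, threshold=10.0):
        """Original two-CT plan: use the 50A CT below the crossover,
        the 1000A CT above it."""
        return i_50a if abs(i_50a) < threshold else i_1000a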

What I have noticed, though: if I take a couple hundred readings with my data acquisition device, average these samples, and then subtract this average from each reading I take during my process (so basically a software offset), the low-current readings from the 1000A CT and the 50A CT show extremely strong correlation. Each of my processes takes less than 10s, so I can easily do this offset before every run.
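For clarity, here is a minimal sketch of what I mean by the software offset; read_ct is a placeholder for my actual DAQ read call:

    N_TARE = 200  # "a couple hundred" samples, taken while the circuit is de-energized

    def measure_offset(read_ct, n=N_TARE):
        """Average n readings at a known 0A to estimate the CT's zero offset."""
        return sum(read_ct() for _ in range(n)) / n

    def read_corrected(read_ct, offset):
        """Subtract the tare offset from each live reading."""
        return read_ct() - offset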

I have verified the correlation across several CTs, and it holds for all of them. I am confused, because generally speaking an instrument should not be accurate in the bottom 1% of its range.

I have tried to understand what I am seeing, and my explanation is this: I take my offset reading in a state where I know exactly how much current is in the system (0A, since the circuit is not energized). By offsetting by this value I can achieve pretty much 0.00% error at 0A. The raw reading combines all the error terms (e.g. offset, linearity, gain, temperature, etc.). As the readings move away from this point those errors start to change, but the change across 1% of the instrument's range (0-10A on a 1000A unit) is so small that I can practically achieve very good accuracy in this region.
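To put illustrative numbers on this (the split of the 0.5% spec into offset and gain terms below is purely an assumption on my part, not a datasheet figure):

    FULL_SCALE = 1000.0               # A
    OFFSET_ERR = 0.003 * FULL_SCALE   # assumed 0.3% FS zero-offset error -> 3A
    GAIN_ERR = 0.002                  # assumed 0.2%-of-reading gain error

    def worst_case_error(i_amps, tared):
        """Worst-case error at i_amps; the software tare removes the offset term."""
        offset = 0.0 if tared else OFFSET_ERR
        return offset + GAIN_ERR * i_amps

    print(worst_case_error(10.0, tared=False))  # 3.02A, i.e. 30% of reading
    print(worst_case_error(10.0, tared=True))   # 0.02A, i.e. 0.2% of reading

Of course the gain, linearity, and temperature terms can still drift between the tare and the measurement, which is presumably why this only works for me because each run is under 10s.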

If this is true, then I can drop the 50A CT from the system and just use the 1000A one with a software offset.

Again, my theory may not be accurate.

Can you explain to me what's going on? Your help is really appreciated!

Thanks,
Istvan
