Full Scale (FS) error is the specification typically used on pressure gauges.
A 0-100 psi gauge would have a ±1 psi error (1% FS), which is a huge error as a percentage of the reading at the lower end of the scale: at 3 psi, ±1 psi is 33% of the reading. But at 85 psi, ±1 psi is only about 1.2% of the reading.
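The arithmetic is simple enough to script. Here's a minimal sketch; the 0-100 psi range, 1% FS spec, and the 3 psi / 85 psi readings are just the example numbers from above:

```python
def fs_error_as_pct_of_reading(reading, full_scale=100.0, fs_error_pct=1.0):
    """Convert a full-scale (FS) error spec into percent-of-reading error.

    The absolute error is fixed (fs_error_pct of full scale), so its
    relative size grows as the reading drops toward the bottom of the scale.
    """
    abs_error = full_scale * fs_error_pct / 100.0  # +/-1 psi for a 0-100 psi, 1% FS gauge
    return 100.0 * abs_error / reading

for psi in (3, 85):
    print(f"{psi:>3} psi reading -> +/-{fs_error_as_pct_of_reading(psi):.1f}% of reading")
# 3 psi -> +/-33.3% of reading;  85 psi -> +/-1.2% of reading
```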
Temperature transmitters, on the other hand, are frequently specified for error as a percentage of span, where span is the Upper Range Value (URV, the engineering value at 20 mA) minus the Lower Range Value (LRV, the engineering value at 4 mA). Percent of span is used because the LRV is frequently not at zero, unlike the typical pressure gauge whose bottom of range is zero.
A 50° to 100° measurement output range is a 50° span (100° - 50°). If you have 2° of error over a 50° span, then you do indeed have a 4% error (2/50).
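The same calculation as a quick sketch in code; the 50° to 100° range and 2° error are the figures from the example above:

```python
def pct_of_span_error(abs_error, lrv, urv):
    """Express an absolute error as a percentage of transmitter span.

    span = URV (engineering value at 20 mA) - LRV (engineering value at 4 mA)
    """
    span = urv - lrv
    return 100.0 * abs_error / span

# 50 deg to 100 deg range with a 2 deg error -> 4% of span
print(pct_of_span_error(2.0, lrv=50.0, urv=100.0))  # 4.0
```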
If your comparison is to a traceable standard, I'd say something is out of whack.
FYI, thermocouples themselves have a fairly wide range of uncertainty out of the box, per the ANSI standard that defines standard and special limits of error. For example, a Type K thermocouple with standard limits of error has an acceptable uncertainty of ±2.2°C over the 0 to 293°C range; with special limits of error it is ±1.1°C over the 0 to 275°C range. (Above those temperatures the tolerance becomes a percentage of the reading, which is why the ranges end where they do.)
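A small sketch of how those limits work out, using the commonly published Type K tolerances above 0°C (standard limits: ±2.2°C or ±0.75% of reading, whichever is greater; special limits: ±1.1°C or ±0.4%); verify the exact figures against your copy of the standard:

```python
def type_k_limit_of_error(temp_c, special=False):
    """Type K thermocouple limit of error above 0 degC (whichever term is greater).

    Standard limits: +/-2.2 degC or +/-0.75% of reading
    Special limits:  +/-1.1 degC or +/-0.4%  of reading
    These are the commonly published tolerances; check the standard for the
    exact edition and applicable temperature ranges.
    """
    if special:
        return max(1.1, 0.004 * temp_c)
    return max(2.2, 0.0075 * temp_c)

# Below ~293 degC (standard) / ~275 degC (special), the fixed term dominates
print(type_k_limit_of_error(100))                # 2.2
print(type_k_limit_of_error(400))                # 3.0
print(type_k_limit_of_error(100, special=True))  # 1.1
```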