The ability to instrument equipment is a manufacturer responsibility; what that means is figuring out how to do it in a safe, cost-effective, reliable fashion. The specific items that require instrumenting are usually defined by the equipment user, since they are often a function of how critical the machinery is to the process (or the financial bottom line). Equipment that is more critical to process success, or more costly and time-consuming to fix, usually gets a higher degree of instrumentation. A "protect your investment" philosophy in action, if you will.
Instrumentation can be used to determine several parameters that tell you about the health of the machine being monitored. These include thermal measurements (winding and/or bearing temperatures, coolant temperatures, etc.), vibration measurements (bearings, drive train), power quality measurements (drive input/output harmonics, signal distortion, etc.), flow or pressure measurements (usually related to coolant, sometimes to lubricant), and so on.
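To make that concrete, here is a minimal sketch of how such monitored parameters might be checked against user-defined alarm limits. The parameter names and limit values are purely illustrative assumptions, not figures from any particular motor specification.

```python
# Minimal sketch: comparing monitored machine parameters against
# user-defined alarm limits. All names and limits here are illustrative.

ALARM_LIMITS = {
    "winding_temp_C": 155.0,           # illustrative upper temperature limit
    "bearing_temp_C": 95.0,
    "bearing_vibration_mm_s": 7.1,     # illustrative RMS vibration velocity limit
    "coolant_flow_lpm": 20.0,          # minimum acceptable coolant flow
}

def check_health(readings: dict) -> list:
    """Return a list of alarm messages for readings outside their limits."""
    alarms = []
    for name, value in readings.items():
        limit = ALARM_LIMITS.get(name)
        if limit is None:
            continue
        # Coolant flow is a "must stay above" limit; the others are "must stay below".
        if name == "coolant_flow_lpm":
            if value < limit:
                alarms.append(f"{name} low: {value} < {limit}")
        elif value > limit:
            alarms.append(f"{name} high: {value} > {limit}")
    return alarms

if __name__ == "__main__":
    print(check_health({
        "winding_temp_C": 162.0,
        "bearing_temp_C": 80.0,
        "bearing_vibration_mm_s": 3.2,
        "coolant_flow_lpm": 15.0,
    }))
```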
The TYPE of instrument used to obtain a certain measurement is also usually defined by the equipment user - often because part of the control system already exists, or because there is an over-arching philosophy behind it. For your specific example, there are multiple sensor types: resistance temperature detectors (RTDs), thermistors, thermocouples, infrared sensors, thermometers, bimetallic switches, and change-of-state (piezoelectric) devices. What you use is really up to you - and to device availability, in some cases.
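As a concrete example of one of those sensor types, here is a small sketch of converting a PT100 RTD resistance reading into a temperature using the standard IEC 60751 Callendar-Van Dusen coefficients (valid at or above 0 degC); the resistance value in the usage example is made up for illustration.

```python
import math

# IEC 60751 Callendar-Van Dusen coefficients for a standard platinum RTD,
# valid for temperatures at or above 0 degC:
#   R(T) = R0 * (1 + A*T + B*T**2)
A = 3.9083e-3
B = -5.775e-7

def pt100_temperature(resistance_ohm: float, r0: float = 100.0) -> float:
    """Convert a PT100 resistance reading to temperature in degC (T >= 0 degC)."""
    # Solve the quadratic R = R0*(1 + A*T + B*T^2) for T and keep the
    # physically meaningful root.
    return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - resistance_ohm / r0))) / (2.0 * B)

if __name__ == "__main__":
    # 138.51 ohm corresponds to roughly 100 degC for a PT100 (illustrative reading).
    print(f"{pt100_temperature(138.51):.1f} degC")
```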
In some cases, the user relies on the system designer to figure things out and create the specification. From the manufacturer's perspective, it makes no difference: it always boils down to "us" (the equipment - in this instance motor - manufacturer) or "them" (everyone else).