As a first-order approximation, reactive power correlates with voltage magnitude and real power correlates with voltage phase angle. On a typical 3-phase distribution circuit, the voltage drop from 1500 kVAR is significant, whereas the voltage impact of 1500 kW is minor.
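To put rough numbers on that, here is a minimal sketch using the common approximation ΔV ≈ (R·P + X·Q)/V² for a radial feeder. The feeder impedance values are illustrative assumptions (an X/R of about 3 is plausible for overhead distribution), not data from any particular system:

```python
import math

# Assumed per-phase source impedance of a 12.5 kV feeder (illustrative only).
R = 1.0   # ohms, feeder resistance
X = 3.0   # ohms, feeder reactance (X/R ~ 3 assumed)

V_LL = 12_500.0             # line-to-line volts
V_LN = V_LL / math.sqrt(3)  # line-to-neutral volts

def drop_percent(p_kw: float, q_kvar: float) -> float:
    """Approximate 3-phase voltage drop in percent: (R*P + X*Q) / V_LL^2."""
    p = p_kw * 1e3
    q = q_kvar * 1e3
    return (R * p + X * q) / (3 * V_LN ** 2) * 100.0

print(f"1500 kW alone:   {drop_percent(1500, 0):.2f} % drop")
print(f"1500 kVAR alone: {drop_percent(0, 1500):.2f} % drop")
```

With these assumed impedances the 1500 kVAR drop comes out roughly three times the 1500 kW drop (the ratio is just X/R), which is the point of the approximation above.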
Is it a fair assumption that your motor would be running at no load at approximately 1500 kVA @ 0.1 pf? And that reduced-voltage starting would produce an inrush spike of roughly 200%?
I would care about the no-load testing because of the possible impacts on voltage. Adding 1500 kVA @ 0.1 pf causes almost as much voltage change as I would expect from turning off two switched pole-mounted capacitor banks. Even if the utility had installed two spare switched capacitor banks close to you, it would be challenging to align the operation of utility-owned capacitors with the timing of your motor testing. For certain loads with low power factor, I have required customers to install switched capacitors so that the point of connection is maintained at a reasonable power factor.
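The reason a 0.1 pf load looks like capacitor banks to the feeder is that nearly all of its 1500 kVA is reactive. A quick check (the per-bank kVAR figure is an illustrative assumption, not from this discussion):

```python
import math

s_kva = 1500.0
pf = 0.1

p_kw = s_kva * pf                         # real power
q_kvar = s_kva * math.sin(math.acos(pf))  # reactive power, ~1492 kVAR

print(f"P = {p_kw:.0f} kW, Q = {q_kvar:.0f} kVAR")

# Pole-mounted switched banks are often in the 600-1200 kVAR range
# (assumed typical size), so ~1492 kVAR is on the order of two banks.
```

So switching this load on or off moves almost 1.5 MVAR at once, which is why the comparison to switching two capacitor banks holds.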
In many places on my 12.5 kV system, just turning off the motor would cause flicker above the visibility curve in the old GE flicker curve. For nearly all locations in my system, turning off the motor would cause visible flicker when being served from an alternate substation. Although dated, the flicker curve is a reasonable baseline for starting a more in-depth analysis.
To control the substation transformer LTC or substation voltage regulator, the designer needs to consider the power factor of the load in order to determine voltage regulator setpoints (band center, R, and X) that work well. Large swings in load power factor make it much harder to find a single set of regulator settings that is optimal across all loading conditions.
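The power-factor sensitivity comes from line-drop compensation: the regulator taps to hold a computed downstream voltage, V_comp = |V_reg − (R_set + jX_set)·I|, so the same current magnitude at a different power factor produces a different compensated voltage. A minimal sketch, with assumed LDC settings on the 120 V base (all values illustrative, not from this post):

```python
import math

def compensated_voltage(v_reg_120: float, i_load_pu: float,
                        pf: float, lagging: bool,
                        r_set: float, x_set: float) -> float:
    """Line-drop-compensated voltage |V_reg - (R_set + jX_set)*I|, 120 V base."""
    theta = math.acos(pf) * (-1.0 if lagging else 1.0)
    i = i_load_pu * complex(math.cos(theta), math.sin(theta))
    return abs(complex(v_reg_120, 0.0) - complex(r_set, x_set) * i)

# Same 1.0 pu current, two different power factors -> different V_comp,
# so the regulator would tap to different positions for the "same" load.
v_high_pf = compensated_voltage(122.0, 1.0, 0.95, True, r_set=3.0, x_set=6.0)
v_low_pf  = compensated_voltage(122.0, 1.0, 0.10, True, r_set=3.0, x_set=6.0)
print(f"V_comp at 0.95 pf: {v_high_pf:.2f} V")
print(f"V_comp at 0.10 pf: {v_low_pf:.2f} V")
```

With these assumed settings, the computed feeder-end voltage shifts by over a volt on the 120 V base purely from the power-factor change, which is why large pf swings defeat a single R/X setpoint.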