No, because it is almost totally load dependent and based on the selection criteria for the motor in the first place. In other words, the motor can be selected to tolerate a severe voltage drop without stalling if the designer anticipated that as a potential problem, so without knowing that, you are left with assumptions.
But if you know the load torque requirements in detail, you can read the breakdown torque of the motor off its torque-speed curve; once the required torque approaches that value, you are beginning to risk stalling. Available motor torque falls off with voltage (roughly with the square of the applied voltage for an induction motor), so you can plot the point at which voltage drop will stall THAT load. This will not be the same for every load in your facility, so to be of value in predicting the overall effect of starting a large motor, the only safe bet is to work them all out and look at the worst case, because anything that tolerates the VD better than that will be fine.
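As a rough illustration of that per-load check, here is a minimal sketch; all motor and load numbers are hypothetical, and available torque is assumed to scale with the square of applied voltage, the usual induction-motor rule of thumb:

    # Per-load stall check: find the lowest voltage each load can tolerate,
    # then let the worst case set the limit for the whole facility.
    # (breakdown torque at rated voltage [N*m], load torque it must carry [N*m])
    loads = {
        "conveyor":   (120.0,  60.0),
        "fan":        (300.0, 210.0),
        "compressor": ( 90.0,  75.0),
    }

    def min_voltage_pu(breakdown_torque, load_torque):
        """Per-unit voltage at which available torque just equals the load torque,
        assuming available torque falls off with the square of voltage."""
        return (load_torque / breakdown_torque) ** 0.5

    for name, (t_bd, t_load) in loads.items():
        v = min_voltage_pu(t_bd, t_load)
        print(f"{name:12s} stalls below ~{v*100:.0f}% voltage "
              f"({(1 - v)*100:.0f}% drop tolerated)")

    worst = max(loads, key=lambda name: min_voltage_pu(*loads[name]))
    print(f"Worst case: {worst} sets the limit for the facility")

In this made-up example the compressor only tolerates about a 9% drop, so it, not the most heavily loaded motor, is the one that decides how much sag the facility can live with.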
All that said, the generally accepted value is -10% voltage (NEMA standards). In other words, motors are supposed to be designed to provide rated torque at ±10% of nominal (design) voltage, and utilities are supposed to maintain ±5%, which leaves some fudge factor for line-to-load voltage drop.
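To put rough numbers on that fudge factor, here is a quick check assuming the common case of a 460 V rated motor fed from a nominal 480 V system; the specific voltages are only illustrative:

    # How much line-to-load voltage drop is left over, given the
    # -10% motor tolerance and the -5% utility tolerance quoted above.
    system_nominal = 480.0              # V, nominal system voltage (assumed)
    motor_rated    = 460.0              # V, motor nameplate voltage (assumed)

    motor_min   = motor_rated * 0.90    # NEMA -10% -> 414 V at the terminals
    utility_min = system_nominal * 0.95 # utility -5% -> 456 V at the service

    margin_v   = utility_min - motor_min
    margin_pct = 100.0 * margin_v / system_nominal
    print(f"Allowable in-plant drop: {margin_v:.0f} V ({margin_pct:.1f}% of nominal)")
    # -> Allowable in-plant drop: 42 V (8.8% of nominal)

That margin shrinks fast if the utility happens to be sitting at the bottom of its band when the big motor starts, which is why the worst-case check above is worth doing.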
"You measure the size of the accomplishment by the obstacles you had to overcome to reach your goals" -- Booker T. Washington