It may be somewhat academic, but I would like to give a better discussion of why acceleration time is the same for motors of different speeds (same horsepower), neglecting differences in motor inertia and the relative shapes of the motor torque-speed curves.
Let the two speeds of interest be speed 1 (880 rpm) and speed 2 (770 rpm).
N = speed
J = inertia
Trated = torque at rated power
Taverage = average torque over the speed range 0 to N
t = acceleration time
Tload = load torque, assumed zero during start for simplicity
t1 = J1*N1/(Taverage1)
(The constant converting rpm to rad/s is omitted here since it cancels out of the comparison below.)
Since both motors have the same rated power (P = T*N):
Trated2 = Trated1*(N1/N2)
Assuming similar torque vs slip curves:
Taverage2 = Taverage1 * (N1/N2)
J2=J1*(N1/N2)^2
t2 = J2*N2/(Taverage2)
t2 = [J1*(N1/N2)^2] * [N2] / [Taverage1*(N1/N2)]
t2 = J1 * N1/Taverage1
t2 = t1
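The algebra above can be checked numerically. The inertia and average torque values below are illustrative assumptions, not figures from the motors being discussed; only the two speeds come from the example above.

```python
import math

N1, N2 = 880.0, 770.0        # rpm, the two speeds of interest
J1 = 10.0                    # kg*m^2, assumed inertia seen by motor 1
T_avg1 = 500.0               # N*m, assumed average accelerating torque, motor 1

# Scaling assumptions from the derivation:
T_avg2 = T_avg1 * (N1 / N2)       # same power: torque scales inversely with speed
J2 = J1 * (N1 / N2) ** 2          # inertia referred to the lower-speed motor

# Convert rpm to rad/s so t = J*omega/T is dimensionally consistent.
omega1 = N1 * 2 * math.pi / 60
omega2 = N2 * 2 * math.pi / 60

t1 = J1 * omega1 / T_avg1
t2 = J2 * omega2 / T_avg2
print(t1, t2)   # the two acceleration times come out equal
```

The rpm-to-rad/s factor appears in both t1 and t2 and cancels, which is why it could be left out of the algebra above.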
The quantity of interest is likely not the starting time itself, but the ability to start the motor safely. As noted before, lower-speed motors can start larger inertias, BUT, as jomega pointed out, the inertia will also look bigger to the lower-speed motor. I'm not sure which effect is more important in general.