My understanding is that it is caused by a shift in the voltage of the neutral with respect to remote earth.
If a bolted fault occurs on one phase of a distribution feeder, the voltage across the fault is essentially zero, but the line-to-neutral voltage at the substation is still pretty close to normal.
The path of fault current is from the source, along the phase conductor, through the fault, and back to the source through the neutral/earth return path.
If the line-to-neutral voltage is 7.2 kV, and the phase conductor has twice the impedance of the earth/neutral return path, the source voltage divides 2:1 between them, so the voltage from neutral to remote earth at the fault location is about 2.4 kV (one third of 7.2 kV). This raises the line-to-neutral voltage on the unfaulted phases near the fault location.
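The voltage-divider arithmetic above can be sketched as a few lines of Python. The impedance values are illustrative magnitudes only (treated as simple scalars, not complex impedances), chosen to match the 2:1 ratio in the example:

```python
# Sketch of the neutral-shift voltage divider for a bolted fault.
# Assumed/illustrative values: 7.2 kV line-to-neutral source, phase
# conductor impedance = 2x the neutral/earth return path impedance.

def neutral_shift(v_source_ln, z_phase, z_return):
    """Neutral-to-remote-earth voltage at the fault location.

    With the fault voltage essentially zero, the source line-to-neutral
    voltage divides across the phase conductor and the return path in
    proportion to their impedances.
    """
    return v_source_ln * z_return / (z_phase + z_return)

v = neutral_shift(7200.0, z_phase=2.0, z_return=1.0)
print(round(v))  # 2400 V, matching the ~2.4 kV figure above
```

In a real study the impedances would be complex (R + jX) and the earth return path would be modeled per Carson's equations, but the proportional split is the same idea.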