Why do some utility simulation tools ignore the transmission impedance in their network models? I understand that including it increases the computed voltage drop, and that leaving it out would not allow proper sizing of conductors. But it is also necessary for sizing the breakers at substations and other devices...
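For what it's worth, the effect on breaker sizing is easy to quantify. Here is a minimal sketch comparing the bolted three-phase fault current at the end of a line with and without the line impedance in the model; all impedance values are hypothetical, in per unit:

```python
# Hypothetical per-unit impedances for a source and a downstream line
Zsource = 0.05j          # impedance behind the substation bus
Zline = 0.02 + 0.08j     # transmission/feeder impedance to the fault point
E = 1.0                  # prefault voltage, 1.0 pu

I_with = abs(E / (Zsource + Zline))  # fault current with the line impedance modeled
I_without = abs(E / Zsource)         # fault current if the line impedance is ignored

print(f"with line impedance:     {I_with:.2f} pu")
print(f"ignoring line impedance: {I_without:.2f} pu (overstated for downstream faults)")
```

Ignoring the line impedance overstates the fault current for faults out on the feeder, which is conservative for the interrupting rating of substation breakers but misleading for devices farther downstream.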
I was wondering whether the healthy-phase voltage during a double line-to-ground fault can be higher than the healthy-phase voltages during a single line-to-ground fault. The fault is assumed to occur on the low-voltage side of a transformer feeding a grid.
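One way to explore this is with the standard symmetrical-component formulas for bolted faults. Below is a minimal sketch that computes the healthy-phase voltages for both fault types from the positive-, negative-, and zero-sequence impedances Z1, Z2, Z0 seen from the fault point; the numeric values are hypothetical, with a large Z0/Z1 ratio such as a weakly grounded LV system might show:

```python
import numpy as np

a = np.exp(2j * np.pi / 3)  # 120-degree rotation operator of symmetrical components

def slg_healthy_voltages(E, Z1, Z2, Z0):
    """Bolted single line-to-ground fault on phase a: voltages of healthy phases b, c."""
    I1 = E / (Z1 + Z2 + Z0)       # I1 = I2 = I0 (sequence networks in series)
    V1 = E - Z1 * I1
    V2 = -Z2 * I1
    V0 = -Z0 * I1
    Vb = a**2 * V1 + a * V2 + V0  # inverse symmetrical-component transform
    Vc = a * V1 + a**2 * V2 + V0
    return Vb, Vc

def dlg_healthy_voltage(E, Z1, Z2, Z0):
    """Bolted double line-to-ground fault on phases b, c: voltage of healthy phase a."""
    Zpar = Z2 * Z0 / (Z2 + Z0)    # negative and zero sequence networks in parallel
    I1 = E / (Z1 + Zpar)
    V1 = E - Z1 * I1              # V1 = V2 = V0 at the fault point for a bolted fault
    return 3 * V1                 # Va = V1 + V2 + V0 = 3 * V1

# Hypothetical per-unit sequence impedances seen from the LV fault point
E, Z1, Z2, Z0 = 1.0, 0.1j, 0.1j, 0.5j
Vb, Vc = slg_healthy_voltages(E, Z1, Z2, Z0)
Va = dlg_healthy_voltage(E, Z1, Z2, Z0)
print(f"SLG fault, healthy phases: |Vb| = {abs(Vb):.3f} pu, |Vc| = {abs(Vc):.3f} pu")
print(f"DLG fault, healthy phase:  |Va| = {abs(Va):.3f} pu")
```

Sweeping Z0 (and adding fault resistance if desired) lets you compare the two cases directly for your particular transformer grounding arrangement rather than relying on a general rule.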