I'm trying to find a way to calculate the voltage dip when a transformer is connected to a weak network.
The idea is that when a transformer is energized, its inrush current may reach up to ten times the nominal current. This causes a voltage dip in the network because the generator reactances are fairly high...
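A minimal first-order sketch of what I have in mind: treat the source reactance and the energizing transformer as a voltage divider, so the dip is roughly the ratio of the inrush apparent power to the sum of inrush power and the network short-circuit capacity. The function name and the example numbers below are mine, and this ignores phase angles, saturation, and the decay of the inrush current over time:

```python
def voltage_dip_pu(s_tx_mva: float, inrush_multiple: float, s_sc_mva: float) -> float:
    """Rough per-unit voltage dip when energizing a transformer.

    Voltage-divider approximation:
        dV/V ~= S_inrush / (S_inrush + S_sc)
    where S_inrush = inrush_multiple * transformer rating and
    S_sc is the short-circuit capacity at the connection point.
    """
    s_inrush = inrush_multiple * s_tx_mva
    return s_inrush / (s_inrush + s_sc_mva)

# Hypothetical example: 10 MVA transformer, 10x inrush,
# 200 MVA short-circuit level at the bus
dip = voltage_dip_pu(10.0, 10.0, 200.0)
print(f"{dip:.1%}")  # prints "33.3%"
```

Is a simple magnitude estimate like this acceptable, or do I need to model the transient (DC offset and decay of the inrush) to get a meaningful dip figure?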