You can't.
Then, there are several ways out. It depends on the power rating of your transformer. If it is a small unit (a few hundred VA), you can use a voltage divider made from a couple of resistors. Use a low resistance to avoid capacitive influence and resistive loading by the voltmeter. Make sure that you understand what you are doing, especially when it comes to power dissipation in the voltage divider and the flash-over voltage of the resistors, which incidentally (no pun intended) happens to be around 250 V.
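The divider arithmetic can be sketched like this. All the numbers (230 V input, 9 k and 1 k resistors) are assumptions for illustration, not values from the post; the point is the dissipation and flash-over checks mentioned above.

```python
# Voltage divider sketch for scaling a transformer output down to a
# voltmeter range. All component values are illustrative assumptions.

V_in = 230.0    # transformer output to be measured (V, assumed)
R_top = 9e3     # upper divider resistor (ohms, assumed)
R_bot = 1e3     # lower resistor, which the meter reads across (ohms, assumed)

V_meter = V_in * R_bot / (R_top + R_bot)   # voltage seen by the meter
I = V_in / (R_top + R_bot)                 # divider current

# Power dissipation in each resistor -- the first thing the post warns about
P_top = I**2 * R_top
P_bot = I**2 * R_bot

# Voltage across each resistor, to check against the ~250 V
# flash-over limit of ordinary resistors mentioned above
V_top = I * R_top
V_bot = I * R_bot

print(f"meter reads {V_meter:.1f} V")
print(f"R_top: {V_top:.0f} V, {P_top:.2f} W")
print(f"R_bot: {V_bot:.0f} V, {P_bot:.2f} W")
```

With these (low-resistance) values the upper resistor dissipates several watts, which is exactly why an ordinary quarter-watt part won't do here.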
If you are in doubt regarding the resistors' tolerance, you can easily check it by swapping the two resistors. If you get the same reading, then your resistors are matched closely enough.
If your voltmeter is an analogue one with a typical sensitivity of 1000 ohms/volt, then you can simply double a 300 V range by adding 300 kohms in series between the hot terminal and the transformer terminal. If the sensitivity or range is something else, use the corresponding resistance.
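The series-multiplier arithmetic works out like this (the 1000 ohms/volt sensitivity and 300 V range follow the example above; the rest is just Ohm's law):

```python
# Extending an analogue voltmeter's range with a series multiplier resistor.
# Sensitivity and range follow the example in the post above.

sensitivity = 1000.0   # meter sensitivity in ohms per volt
full_scale = 300.0     # existing full-scale range of the meter (V)

# Internal resistance on that range: 1000 ohms/V * 300 V = 300 kohm
R_meter = sensitivity * full_scale

# Adding an equal resistance in series halves the voltage the meter
# sees, so the full-scale range doubles
R_series = 300e3
new_full_scale = full_scale * (R_meter + R_series) / R_meter

print(f"internal resistance: {R_meter / 1e3:.0f} kohm")
print(f"new full-scale range: {new_full_scale:.0f} V")
```

The same formula gives the series resistor for any other multiplication factor: to triple the range, add twice the meter's internal resistance, and so on.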
If the transformer is a power unit, then just don't do it. The risks are too great: arc flash and all that.
Don't even measure the output of such a transformer with anything that isn't fused, either through a fused measurement lead or on the downstream side of a low-amp fuse.
Gunnar Englund
--------------------------------------
Half full - Half empty? I don't mind. It's what's in it that counts.