Why is it that, in some cases, the Neutral Grounding Resistor (NGR) in an LRG (Low Resistance Ground) system is sized to limit the ground-fault current to the full-load current of the transformer, rather than the typical 400 amperes?
There is no logical reason - it's just an outdated rule of thumb. I've seen a wide variety of LRG limits over the years. With LRG, tripping is always required, and ground faults are nearly always detected by ground overcurrent relays. With today's digital relays, the maximum fault current could be limited to much lower values and still allow selective coordination, while also reducing ground-fault damage. 100 A to 200 A is a reasonable range to consider.
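For a sense of the resistor values involved, here is a minimal sketch of the sizing arithmetic in Python. The 13.8 kV system voltage and 20 MVA transformer rating are assumed purely for illustration (they are not from the post), and source and cable impedances are ignored:

```python
import math

def ngr_resistance(v_ll, i_limit):
    """Resistance (ohms) needed to limit a bolted single-line-to-ground
    fault to i_limit amps, neglecting source and cable impedance."""
    v_ln = v_ll / math.sqrt(3)          # line-to-neutral voltage
    return v_ln / i_limit

def full_load_current(s_va, v_ll):
    """Transformer full-load line current in amps."""
    return s_va / (math.sqrt(3) * v_ll)

v_ll = 13_800                            # assumed 13.8 kV system
i_fl = full_load_current(20e6, v_ll)     # assumed 20 MVA transformer, ~837 A

for label, i_limit in [("full load", i_fl), ("400 A", 400),
                       ("200 A", 200), ("100 A", 100)]:
    r = ngr_resistance(v_ll, i_limit)
    print(f"{label:>10}: limit {i_limit:6.0f} A -> NGR = {r:5.1f} ohm")
```

The lower the chosen limit, the larger the resistance, but the less energy is released at the fault point - which is the main argument for 100 A to 200 A once modern relays can still coordinate at those levels.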
Limiting the LRG current to the transformer full-load current has one advantage: high-impedance restricted earth fault protection (87N-HiZ) can be implemented with a neutral CT specified identically to the phase CTs. I suppose this was the practice in the old days.
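As a rough sketch of why matched CTs work out when the earth-fault current is held to full-load current, assume a 1000/5 phase CT, 0.5 ohm CT winding resistance and 1 ohm one-way lead resistance (all illustrative figures, not from the post), and the classic high-impedance stability criterion Vs >= I_fault(sec) x (R_ct + 2 x R_lead):

```python
import math

# Illustrative figures only -- none of these come from the original post.
v_ll = 13_800          # system line-to-line voltage, volts
s_tx = 20e6            # transformer rating, VA
ct_ratio = 1000 / 5    # phase CT ratio, reused for the neutral CT
r_ct = 0.5             # CT secondary winding resistance, ohms
r_lead = 1.0           # one-way lead resistance, CT to relay, ohms

i_full_load = s_tx / (math.sqrt(3) * v_ll)   # ~837 A primary
i_fault_sec = i_full_load / ct_ratio         # NGR limits the fault to full load

# High-impedance REF stability criterion: the relay setting voltage must
# exceed the voltage developed when one CT saturates fully during a
# maximum through fault.
v_stab = i_fault_sec * (r_ct + 2 * r_lead)
print(f"Secondary fault current = {i_fault_sec:.1f} A")
print(f"Required stabilizing voltage >= {v_stab:.1f} V")
```

With the fault current capped at roughly full-load current, the required setting voltage stays small, so CTs already specified for phase-fault duty can serve the 87N-HiZ scheme without a special neutral CT.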
For the last few decades, I have been seeing the LRG standardised at 300 A or 400 A throughout the plant.
The IEEE Red Book has a good discussion of the subject.