Dewpoint matters if you care about moisture in your lines or devices. If the ambient temperature anywhere in your distribution system can fall to 30 F or below, a 30 F dewpoint isn't low enough: the air will be at or past saturation at the coldest spot, and liquid water can condense there.
Unless you need the air dry for process reasons (e.g. to feed an ozone generator or some other water-sensitive process), what you're trying to do is have air that is "superheated" with respect to water vapour, i.e. not saturated, at the highest pressure and lowest temperature that will be encountered anywhere in your system, with some additional margin so that there is no risk of condensate forming. You can achieve that by two means: by compressing to a higher pressure, drying, and then throttling and reheating to ambient conditions at your lower operating pressure, or by doing a very scrupulous job of drying the air at your operating pressure. Both are big users of energy.
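To put rough numbers on the first of those routes, here is a minimal sketch of the arithmetic. It is my own illustration, not anything from the original post: the pressures, temperatures and function names are assumptions picked only to show the effect, and the Magnus correlation used for water's saturation vapour pressure is only approximate, particularly below freezing.

```python
# Rough sketch: how drying at a higher pressure and then throttling down to line
# pressure lowers the dewpoint you see at line pressure.
import math

def psat_hpa(t_c):
    """Saturation vapour pressure of water, hPa, Magnus approximation (t in deg C)."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def dewpoint_c(pw_hpa):
    """Dewpoint (deg C) for a given water partial pressure (hPa); inverse of psat_hpa."""
    x = math.log(pw_hpa / 6.112)
    return 243.12 * x / (17.62 - x)

def dewpoint_after_throttle(td1_c, p1_abs, p2_abs):
    """Pressure dewpoint after expanding from p1 to p2 (absolute pressures, same units).
    The water mole fraction is unchanged, so its partial pressure scales with total pressure."""
    return dewpoint_c(psat_hpa(td1_c) * p2_abs / p1_abs)

# Hypothetical numbers: dry to a +2 deg C (about 36 F) pressure dewpoint at 21 bar(a),
# then let down to 8 bar(a): the pressure dewpoint drops to roughly -11 deg C (about 12 F).
print(dewpoint_after_throttle(2.0, 21.0, 8.0))
```

The design choice is the same either way: what matters is the water partial pressure relative to saturation at the worst (highest pressure, coldest) point in the system, not the dewpoint number on the dryer nameplate by itself.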
A -40 F dewpoint is very dry air. It is beyond the needs of most instrumentation, and it can only be achieved by methods that use a great deal of energy to reach that level of dryness.
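For a sense of scale: air saturated at a -40 F dewpoint holds only on the order of 100-130 ppmv of water vapour at atmospheric pressure, versus several thousand ppmv at the 35-40 F pressure dewpoint a typical refrigerated dryer delivers.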
Why is it harder, and why does it take more energy, to dry air very thoroughly than to dry it a little? I don't think you need to have that question answered for you!
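For anyone who does want a rough number anyway: the equilibrium water content of air falls off roughly exponentially with dewpoint, so each further step of dewpoint depression removes less and less water, while the dryer still has to pay for it (regeneration heat and/or dry purge air on a desiccant unit, deeper refrigeration otherwise). A quick sketch, again my own illustration using the Magnus approximation and only indicative below freezing:

```python
# Rough sketch: water removed per successive 10 deg C step of dewpoint depression,
# per 1000 m3 of air at atmospheric pressure. Magnus approximation over liquid water,
# so the sub-freezing figures are only indicative.
import math

def water_g_per_m3(td_c, t_air_k=293.15):
    """Approximate water content (g/m3 at 1 atm) of air saturated at dewpoint td_c."""
    psat_pa = 611.2 * math.exp(17.62 * td_c / (243.12 + td_c))
    return psat_pa * 0.018 / (8.314 * t_air_k) * 1000.0   # ideal-gas mass of water vapour

steps = [10, 0, -10, -20, -30, -40]
for hi, lo in zip(steps, steps[1:]):
    removed = (water_g_per_m3(hi) - water_g_per_m3(lo)) * 1000.0  # g per 1000 m3 of air
    print(f"{hi:>4} C -> {lo:>4} C dewpoint: roughly {removed:,.0f} g of water removed per 1000 m3")
```

The first 10-degree step takes out kilograms of water per 1000 m3; the last one takes out a few hundred grams, and the dryer is working against an ever smaller driving force to get it.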