No, I probably do not have to immersion cool this, but the components will be running full time in the upper range of their specs, so the cooler the better. The current setup (which dissipates about 200W less in total than this one) has overheated several times with active air cooling, luckily with no major damage or downtime. The hotspots run about 10C over their recommended maximum, still about 30C under the absolute maximum, and I expect the new components to run hotter given the increase in total output.
The problem is that there are several compact heat sources: 4 at 20+ W/cm^2, roughly 6 cm^2 apiece, ~500W total. We have looked at heat pipes to spread that out a little, too. Ten other sources produce ~5 W/cm^2 and cover roughly 60 cm^2 between them. Those lower-output components could be actively air cooled, but I think it is really asking too much to air cool the whole thing, and that is where the liquid cooling comes in. When it's all said and done, I need to remove ~800W total from ~2130 cm^2 of board area in a 9000 cm^3 volume (30cm x 20cm x 15cm). There are multiple boards, so each would need its own cold plate or heatsink, and with the limited space it will be hard to fit them in. The 4 100+W sources are the real concern, but I would like to keep the whole thing as cool as possible, so I thought it might be easier, not to mention just better overall, to use immersion cooling rather than cold plates. Another application we have right now uses 2 liquid-cooled heatsinks on 2 150W sources, which works fine; some of the ~5 W/cm^2 components there run a little hotter than they could be, but stay within spec.
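For what it's worth, here is the back-of-the-envelope arithmetic behind those numbers as a small Python sketch. The coolant properties (nominal water, and roughly FC-72-like values for a dielectric fluid) and the 10C fluid temperature rise are assumptions for illustration only, not from any datasheet:

```python
# Rough heat budget for the numbers above. Coolant properties and the
# assumed 10 C fluid temperature rise are illustrative, not from a datasheet.

hot_sources_w = 4 * 20 * 6      # 4 devices, ~20 W/cm^2 over ~6 cm^2 each -> ~480 W
low_sources_w = 5 * 60          # ~5 W/cm^2 over ~60 cm^2 total           -> ~300 W
total_w = hot_sources_w + low_sources_w
print(f"Total load: ~{total_w} W (call it 800 W)")

board_area_cm2 = 2130
print(f"Average board flux: {total_w / board_area_cm2:.2f} W/cm^2 "
      f"vs ~20 W/cm^2 at the hotspots")

def flow_lpm(q_w, cp_j_per_kg_k, rho_kg_per_m3, dt_k=10.0):
    """Volumetric coolant flow (L/min) needed to absorb q_w with a dt_k rise."""
    mdot_kg_s = q_w / (cp_j_per_kg_k * dt_k)
    return mdot_kg_s / rho_kg_per_m3 * 1000.0 * 60.0

print(f"Water loop:      ~{flow_lpm(800, 4186, 1000):.1f} L/min")   # ~1.1 L/min
print(f"Dielectric bath: ~{flow_lpm(800, 1100, 1680):.1f} L/min")   # ~2.6 L/min, FC-72-ish props
```

The bulk flow needed to carry ~800W is small either way; the sticking point is the ~20 W/cm^2 local flux at the four big parts, which is what pushes me toward immersion or cold plates in the first place.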
Indirect liquid cooling is not unfamiliar territory and would likely work fine here too if we can cram it in, but immersion seemed ideal. My lack of knowledge about immersion is why I'm here, though. Should we hold off on direct liquid cooling except as a last resort? Is it that much more expensive or difficult? I thought it might save some time and agony to try it first, and if it worked, great.