The steam rate needed to suppress smoke at a flare is a function of the type of gas you are burning and the design of the flare tip itself.
In general, the steam works in three different ways.
First, it cools the core of the flame so that the "cracking" reaction is suppressed and delayed, and the carbon doesn't form until the outer edges of the flame, where the particles can't grow to unburnable proportions.
Second, there is the mechanical mixing and turbulence it introduces into the gas/steam/air mixture, which promotes more rapid combustion by bringing the oxygen into more intimate contact with the "cracking" gas.
Finally, there is some benefit from the addition of the OH radical into the mix, which sweeps up some of the carbon into CHO, a gaseous radical, rather than the C-C-C-C chains, which are solid.
Having said all this, the range of requirements starts at roughly 0.2 wt/wt (steam to gas) for light paraffinic materials, runs 0.35 - 0.45 wt/wt for refinery-type mixtures, and goes beyond 0.6 wt/wt for aromatic and olefinic mixtures. These latter gases all have heats of formation which are intrinsically positive and therefore do not benefit from the cooling as much as the aliphatics do.
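If it helps, here is a minimal sketch of the arithmetic: pick the ratio band above for the gas family and multiply by the mass flow of gas being flared. The gas categories, the representative ratio values, and the example flow are just taken from the numbers in this post (not vendor data), so treat them as placeholders.

```python
# Rough smokeless-steam estimate from the wt/wt ratios quoted above.
# These ratio bands are the ones in this post; real tips vary, so treat
# them as placeholders rather than design values.

STEAM_RATIO = {                  # kg steam per kg gas (wt/wt)
    "light_paraffinic": 0.20,
    "refinery_mix":     0.40,    # middle of the 0.35 - 0.45 band
    "aromatic_olefinic": 0.60,   # lower bound; actual demand can be higher
}

def steam_demand(gas_flow_kg_h: float, gas_family: str) -> float:
    """Return the approximate smokeless-steam demand in kg/h."""
    return gas_flow_kg_h * STEAM_RATIO[gas_family]

# Example: 10,000 kg/h of a refinery-type mixture
print(steam_demand(10_000, "refinery_mix"))   # -> 4000.0 kg/h of steam
```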
Another factor is the flare tip diameter, because it influences the flame diameter and the dwell time before the gas meets the air. Bigger flare tips use more steam than smaller tips for the same gas.
Hope this helps you.
If you're really stuck let me know.