Do the math. A typical microwave oven putting out 1000 W over about 20 in^2 of area works out to 50 W/in^2.
A 1-W transmitter, even if all of its output were pointed at you, is something like 0.01 W/in^2. Additionally, because the power level is so low, you have plenty of time to dissipate the additional heat load: a 10-hr exposure might raise the temperature of 10 kg of water by about 0.8ºC, and that's completely neglecting the inevitable heat losses to the environment. Since the typical wireless antenna is omnidirectional, the actual exposure is even less.
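A quick sanity check of those numbers, sketched in Python. The 100 in^2 body intercept area is my assumption to reproduce the 0.01 W/in^2 figure; the other values are the ones given above.

```python
# Back-of-envelope check of the power densities and heating claim above.

# Microwave oven: ~1000 W deposited over roughly 20 in^2 of food surface
microwave_density = 1000 / 20       # ~50 W/in^2

# 1-W transmitter: assume (generously) all of it lands on ~100 in^2 of you
transmitter_density = 1 / 100       # ~0.01 W/in^2

# Heat 10 kg of water with 1 W for 10 hours, ignoring all losses
energy_J = 1 * 10 * 3600            # 36,000 J
delta_T = energy_J / (10 * 4186)    # specific heat of water ~4186 J/(kg*K)

print(f"Microwave:   {microwave_density:.0f} W/in^2")
print(f"Transmitter: {transmitter_density:.2f} W/in^2")
print(f"Temperature rise after 10 h: {delta_T:.2f} C")  # ~0.86 C
```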
This is all high-school-level math, so it's unfortunate that most people can't be bothered to do the calculations themselves and would rather be swayed by rumor mongers and the like.
TTFN
FAQ731-376