Truly, thank you so much for responding. I love learning from experts.
This is usually combined with a very well insulated cage to prevent that energy from escaping
Faraday cage. Please. I’m a fool, not an imbecile. And to be clear, I’m well-aware of ionizing radiation bands.
However, my concerns lie in extended exposure. I’ll relate this to an analogue: sound. Regardless of frequency, sound as quiet as 70 decibels can cause hearing loss after extended exposure, in the territory of 24 hours and longer, mind you. That is as quiet as, say, a hearty conversation, or a washing machine.
And this is a sliding scale in both directions: the louder the sound, the less exposure it takes. Hearing-loss zones:
| Level | Duration before hearing loss | Notes |
| --- | --- | --- |
| 70 dB | 24 h | As quiet as a clothes washer can cause hearing loss. Really. |
| 75 dB | 8 h | |
| 80 dB | 2 h | |
| 90 dB | 1 h | |
| 95 dB | 50 min | |
| 100 dB | 15 min | |
| 105 dB | <5 min | |
| 110 dB | <2 min | |
| 120+ dB | Instantaneous | |
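For what it’s worth, here’s a minimal sketch of that trade-off in code, assuming a simple 3 dB exchange-rate model (every extra 3 dB halves the allowable time) anchored at an assumed 85 dB / 8 h reference point. Real damage-risk criteria vary between standards, and the table above doesn’t follow any single one exactly, so treat the constants as illustrative:

```python
# A sketch of the "louder means less time" trade-off, assuming a simple exchange-rate
# model: every +3 dB halves the permissible exposure time, anchored at an assumed
# 85 dB / 8 h reference. Real damage-risk criteria differ between standards, and the
# table above does not follow any single one exactly, so the constants are illustrative.
REF_LEVEL_DB = 85.0
REF_HOURS = 8.0
EXCHANGE_RATE_DB = 3.0

def permissible_hours(level_db: float) -> float:
    """Hours of exposure at level_db before the assumed daily noise dose is used up."""
    return REF_HOURS * 2 ** ((REF_LEVEL_DB - level_db) / EXCHANGE_RATE_DB)

for level in (70, 75, 80, 85, 90, 95, 100, 110, 120):
    print(f"{level:>3} dB -> ~{permissible_hours(level):10.3f} h")
```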
I’d like to know where the scales for EM radiation amplitudes are. I’ve read a few studies but most of them focus on bursts or separated exposures. Very few of them observe sustained continuous exposure.
Also, I’m aware sound and radiation are not apples to apples, but my point of relating energy input and exposure duration is the same. If you ask anyone if 70 dB is safe, everyone will tell you, “Yes. Of course.” Which is not correct. Even 60 dB can do you further harm, if your ears have not healed from damage sustained immediately prior.
Some small levels of UV 3 might get through (hello skin cancer).
Now you’re getting into much more familiar territory. UV-A, the lowest band of UV, is entirely capable of causing sunburn even at a UV index of 1. It just depends on exposure time and pigmentation. Any sunburn has the potential to cause cancer. The more intense the burn and the larger the affected area, the higher the chance more cells mutate, the higher the chance one of those cells becomes an unstable cancer cell, the higher the chance one of those cells becomes stable, and the higher the chance of metastasis… From non-ionizing, low-flux UV-A. Possible, but unlikely. Though increasingly likely as duration increases.
These transmitters can be legally and safely placed in urban areas provided adequate separation between the antenna and the public, usually 30-40 meters.
The EIRP drops off quickly in the first few meters after the antenna, as the signal expands outwards towards the service area; so even being within 15m is generally safe.
Inverse square, I’m familiar. And now we’re cooking with fire. Let’s elide frequency for a moment, pretending it’s irrelevant in the same way mechanical waves’ frequencies are.
Let’s assume 30-40 meters for a 100 kW antenna and 15 meters for a ~20 W macro cell is the threshold for instantaneous minor damage. With each meter you distance yourself, the power density decreases. What do you suppose the limit of power density is for immediate damage in direct contact? What do you suppose the limit is for sustained direct exposure on a scale of 24 hours, that is, the equivalent of 70 dB for intensity? What about sustained exposure over several years?
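To make the inverse-square part of that concrete, here is a minimal sketch in Python. The far-field power density of an isotropic-equivalent source is S = EIRP / (4πr²); the EIRP figures and the 10 W/m² comparison limit below are illustrative assumptions (roughly the order of the FCC occupational exposure limit in the microwave range), not established damage thresholds, and the formula itself breaks down in the near field right at the antenna:

```python
import math

def power_density_w_per_m2(eirp_watts: float, distance_m: float) -> float:
    """Far-field power density of an isotropic-equivalent source: S = EIRP / (4*pi*r^2)."""
    return eirp_watts / (4 * math.pi * distance_m ** 2)

# Illustrative figures only: a 100 kW EIRP broadcast antenna and a ~20 W small cell.
# 10 W/m^2 (1 mW/cm^2) is roughly the order of the FCC occupational limit in the
# microwave range; the general-population limit is lower. Treat it as a placeholder,
# and remember the inverse-square assumption breaks down in the near field.
ASSUMED_LIMIT_W_PER_M2 = 10.0

for label, eirp_w in [("100 kW broadcast", 100_000.0), ("20 W macro cell", 20.0)]:
    for d_m in (1, 5, 15, 40, 100):
        s = power_density_w_per_m2(eirp_w, d_m)
        flag = "over" if s > ASSUMED_LIMIT_W_PER_M2 else "under"
        print(f"{label}: {d_m:>3} m -> {s:10.4f} W/m^2 ({flag} assumed limit)")
```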
I don’t fear the effects it will have. After all, death will come to us all at some point. However, that doesn’t mean I’ll be reckless with the time I have left.
Also, I just started studying to get my Ham and haven’t quite wrapped my brain around a lot of the implications, so your input is very much appreciated. Thanks.
I apologize for not getting to this in a timely fashion. Good luck with your Ham studies. Always nice to see others getting their license.
It’s… Complicated. My understanding of the physics of the problem is limited to say the least, but what I comprehend of it is that frequency and EIRP are parts of a whole. I’m sure there’s more to it, but the simplest way I can explain it is that higher frequencies have higher energy.
In electricity, watts are king. Watts directly represent the amount of power in the mix. You can have all the volts or all the amps, but if you have no watts, you have nothing. Not quite the case with EM. The actual energy in an EM wave is some product of the power output, proximity, wavelength… More than a few things. The watts are in there, they’re just not the only thing.
Since higher frequency means more energy, it is reasonable to conclude that 100 W of 10 MHz EM is less dangerous than 100 W of 10 THz EM. How much more dangerous? I don't know.
How much safer is it to be exposed to the same EIRP at the same distance at 1 MHz vs 10 MHz? I have no idea where to even start to calculate it.
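One place to start, at least for the per-photon side of it, is Planck's relation E = h·f. The sketch below just compares photon energies at a few frequencies against the few-eV scale it takes to ionize atoms or break chemical bonds; the frequencies chosen are arbitrary examples:

```python
# Planck's relation gives the energy carried by a single photon: E = h * f.
h = 6.626e-34   # Planck constant, joule-seconds
eV = 1.602e-19  # joules per electron-volt

for label, f_hz in [("10 MHz (HF radio)", 10e6),
                    ("10 THz (far infrared)", 10e12),
                    ("1,000 THz (ultraviolet, roughly)", 1e15)]:
    print(f"{label:>32}: photon energy ~ {h * f_hz / eV:.2e} eV")

# For scale: ionizing atoms or breaking chemical bonds takes on the order of a few eV,
# which is why the ionizing boundary sits up in the UV. Below that, more watts means
# more photons rather than more energetic ones, and the damage mechanism is heating.
```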
The question goes so far into physics that I don’t think anyone who knows how to figure it out has bothered to do so. The numbers would just be crazy. And looking at all the problems with generating a “safe” frequency at an unsafe power level, it’s not shocking to see why.
What I know is that as you get up into the high UV-B and into the UV-C ranges, exposure from the sun can cause some (relatively minor) issues, and with prolonged exposure you can develop skin cancers and such. I don’t know how this relates directly to EM power, but I’ve heard the sun has a warming capacity at the Earth's surface of about 1,000 W per square meter. I don’t think we’re being blasted with 1,000 W of UV in total from the sun; even knocking an order of magnitude off that and saying 100 W is probably still far higher than what we actually get, but I have no basis for making any such assumption. Still, if we assume for a moment that 100 W/m² of UV-C can cause cancer, and given that lower frequency has less energy, and that UV-C sits up in the high hundreds of THz (for the sake of argument, call it roughly 1,000 THz and up), then does something well over 1,000 times lower in frequency, at 600 GHz, cause any issues at 100 W? 1,000 W? 50,000 W?
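As a rough sanity check on how much of that 1,000 W/m² is actually UV, here's a tiny sketch using assumed ballpark fractions for the surface solar spectrum (roughly half infrared, most of the rest visible, a few percent UV, and essentially no UV-C because the ozone layer absorbs it); the fractions are illustrative, not measured values:

```python
# Rough sanity check: of ~1,000 W/m^2 of surface sunlight, how much is UV?
# The fractions below are assumed ballpark values for the surface solar spectrum
# (roughly half infrared, most of the rest visible, a few percent UV, essentially
# no UV-C because the ozone layer absorbs it). Illustrative, not measured.
TOTAL_IRRADIANCE_W_M2 = 1000.0
ASSUMED_FRACTIONS = {"infrared": 0.52, "visible": 0.43, "UV-A/UV-B": 0.05, "UV-C": 0.0}

for band, frac in ASSUMED_FRACTIONS.items():
    print(f"{band:>10}: ~{TOTAL_IRRADIANCE_W_M2 * frac:6.1f} W/m^2")
```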
Apart from that, broadcast FM, which is even lower in frequency at roughly 88-108 MHz, is legally transmitted at tens of thousands of watts, and has been continuously for decades, with no apparent issues for a public that has been bombarded with that EM 24/7 for their entire lives.
Bluntly, to do harm at frequencies below visible light, the power levels required would need to be significantly above what would be legal for anyone to emit intentionally. I’m not even sure you would legally be allowed to buy anything that could come close, if such an amplifier even existed.
To that end, I would propose that any harm EM might cause at high enough power levels in the “lower bands” (under the THz range) is basically an impossibility.
Personal safety would be more of a concern at those higher levels anyway, since generating a broadcast signal at thousands of watts is simply a dangerous amount of power to handle; anything beyond that would be insane to even attempt, and there would be no purpose in doing it. Any band high enough in frequency to be incapable of ionospheric reflection, and thus limited to the horizon, will easily reach that distance well under 1,000 W, albeit with some caveats about obstacles (absorption, reflection, diffraction, etc.), and any band low enough for ionospheric reflection doesn’t need that much power to bounce around the globe. I’ve known people to get global reach with little more than 15-20 W… More power is simply not required.
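To put a rough number on why modest power goes so far, here is a minimal link-budget sketch. The free-space path loss formula it uses is standard; the transmit power, antenna gains (assumed unity), and the -100 dBm receiver sensitivity are illustrative assumptions, and it ignores both ionospheric losses on HF and the line-of-sight horizon that actually limits VHF:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (standard approximation; distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def watts_to_dbm(watts: float) -> float:
    return 10 * math.log10(watts * 1000)

# Assumed, illustrative values: 100 W transmitter, unity-gain antennas on both ends,
# and a receiver that can still decode around -100 dBm. Real HF paths add ionospheric
# losses, and VHF is normally limited by the horizon long before the signal fades out.
TX_POWER_W = 100.0
RX_SENSITIVITY_DBM = -100.0

for freq_mhz in (14.0, 144.0):          # 20 m HF band vs 2 m VHF band
    for dist_km in (50, 500, 5000):
        rx_dbm = watts_to_dbm(TX_POWER_W) - fspl_db(dist_km, freq_mhz)
        verdict = "workable" if rx_dbm > RX_SENSITIVITY_DBM else "too weak"
        print(f"{freq_mhz:6.1f} MHz over {dist_km:5d} km: ~{rx_dbm:7.1f} dBm ({verdict})")
```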
Which leads me to my point. Having that information wouldn’t necessarily be useful. The power levels would, at best, be illegal and unsafe to generate, and at worst, it would be impossible to construct an amplifier capable of creating that powerful a signal.