BLOG: March 2009

Electromagnetic danger - "No such thing in our view"


"Sticks and stones may break my bones..." - we all know that, but what about those weak non-ionizing EMF that are everywhere? Can these tiny, invisible, imperceptible waves hurt our bodies? Nah, say the officials in charge of EMF (electromagnetic field) safety. Only if they are strong enough to raise your tissue temperature by more than 1°C, or to directly stimulate your nerves. And that's way above your average exposure. So - nothing to worry about.

Here's the problem: this public assurance is

not in agreement with a great many studies on this subject,

spanning the last few decades. Not a problem, say the officials: those studies are not really good; we don't find them reliable, or sufficient. The minority of studies that do not find it likely that non-ionizing EMF can be, or are, health threatening - sure, you can trust those.

This is not a joke. In its 1998 guidelines for limiting exposure to non-ionizing fields up to 300GHz, ICNIRP (International Commission on Non-Ionizing Radiation Protection) decided that there is no reason to change safety limits based exclusively on radiation levels capable of causing immediate biological effects, either neuromuscular or thermally related. Nearly all governments in the world went along (or was it actually ICNIRP who did the following?).

This despite the fact that of all the studies - nearly 200 - cited in ICNIRP's own official paper, those that did associate much lower EMF exposure levels with either adverse health effects or various alterations in cellular pathology

handily outnumber those that did not.

The Commission took the stand that nothing short of complete proof of the adverse effect - including an explanation of the mechanism - will do. If studies on a particular subject are not all in agreement, the Commission justifies inaction by "inconsistency", despite the majority of the studies indicating a causative link (for instance, the link between low-level non-ionizing EMF exposure and childhood leukemia).

When nearly all epidemiological studies are consistent in linking low-level EMF exposure with adverse health effects - such as the occupational exposure studies linking it to a higher incidence of cancer and other diseases - then they won't consider strengthening safety limits, because the type of cancer reported is not identical in all studies, or laboratory (in vitro) studies are demanded, or replication studies by "independent" laboratories, and so on, and on...

In short, the tone in the ICNIRP Guidelines is:

"We won't drastically lower the official safe EMF exposure limit
just because we think it's the right thing to do."

Their bias is rather obvious: pointing out design limitations and deficiencies is reserved almost exclusively for studies linking non-ionizing EMF radiation with adverse effects. The "more research is needed" label applies only to the studies that do find an association with adverse health effects, never to those that don't. As if there were already solid existing evidence of non-ionizing fields being harmless below levels causing direct neuromuscular stimulation or excessive tissue heating.

Of course, there is none. To the contrary: the available evidence clearly points to the opposite. All that ICNIRP can use as the basis for keeping the current safety limits amounts to a

mere assumption of weak non-ionizing radiation being
benign or biologically inactive.

The Commission's status quo bias cries out loud from the language used when commenting on the consistency of studies linking low-level power-frequency field exposure (standard 50/60Hz electricity from power lines) to childhood leukemia. In the Guidelines summary, it states, word for word:

"Data on cancer risk associated with exposure to ELF fields among individuals living close to power lines are apparently consistent in indicating a slightly higher risk of leukemia among children, although more recent studies question the previously observed weak association."

"Slightly" higher risk refers to 1.5 to 3 times higher risk of childhood leukemia established in eight studies. With its current incidence of around 3,000 a year in the U.S. alone, it translates into

1,000 to 2,000 children possibly falling prey to leukemia

due to their power field exposure each and every year.
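The arithmetic behind that estimate can be checked with a quick sketch. It assumes, as the text implies, that the 1.5-3x relative risk applies across the roughly 3,000 annual U.S. cases; the function name is mine, for illustration only:

```python
# Rough attributable-case estimate: with relative risk RR, the fraction
# of cases attributable to the exposure is (RR - 1) / RR.
def attributable_cases(relative_risk, annual_cases=3000):
    """Estimated annual cases attributable to exposure, given a relative risk."""
    return round(annual_cases * (relative_risk - 1) / relative_risk)

low = attributable_cases(1.5)   # lower bound of the 1.5-3x risk range
high = attributable_cases(3.0)  # upper bound
print(low, high)  # 1000 2000
```

A 1.5x risk means a third of cases would be exposure-related; a 3x risk means two thirds - hence the 1,000 to 2,000 range quoted above.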

"Weak association" refers to children exposed to about 4mG (milligauss, 1mG=0.1μT), or somewhat more, of magnetic field created by power lines. This is an exposure level

hundreds of times below the 100μT official "safe limit"
for the power-frequency field.
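The gap between that exposure and the official limit is simple unit arithmetic (1 mG = 0.1 μT), sketched here for illustration:

```python
# Convert the ~4 mG exposure linked to leukemia into microtesla and
# compare it to the 100 uT official power-frequency "safe limit".
MG_PER_UT = 10                          # 1 uT = 10 mG, i.e. 1 mG = 0.1 uT
exposure_mG = 4
exposure_uT = exposure_mG / MG_PER_UT   # 0.4 uT
safe_limit_uT = 100
factor = safe_limit_uT / exposure_uT    # how far below the limit
print(exposure_uT, factor)  # 0.4 250.0
```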

Even if the association were weak - and that attribute does not agree with the facts - at a level so much lower than the official "safe" limit, that should be more than enough of a reason to have the official exposure limit drastically reduced, at least for children (the fact that there are no separate official EMF exposure limits for children and pregnant women indicates just how uninformed the official basis is).

The Commission - also aware of and citing a German study (Michaelis et al. 1997) which found 3.2 times higher risk of childhood leukemia for exposures over 2mG - wasn't "convinced" the risk increase was caused by the power-frequency field exposure. In part, as it puts it, because short-term measurements in some other studies "provided no evidence of an association".

In other words, unspecified "short term measurements" somehow have more weight for them than

a series of nine studies with consistent results.

As for the "more recent" studies "questioning" the "weak" association the Commission also points to, it is quite illuminating to take a closer look at what the principal study of the three (Residential exposure to magnetic fields and acute lymphoblastic leukemia in children, Linet et al., National Cancer Institute, 1997) actually came up with.

In the study, six groups of children exposed to increasing levels of power-frequency field - starting with 0.06-0.09μT (or 0.6-0.9mG) and going all the way up to an excess of 0.5μT - were compared to a similarly sized group of 552 children exposed to less than 0.06μT. The two lowest exposure groups turned in a 10% increase in the rate of leukemia; the 0.2-0.3μT group had a 25% higher incidence rate, while the 0.3-0.4μT, 0.4-0.5μT and >0.5μT groups had 40%, 328% and 40% higher rates, respectively.

The corresponding 95% confidence intervals - giving a range within which the true incidence rate can be expected to lie with 95% probability - are 0.63-1.6, 0.6-1.58, 0.5-1.58, 0.7-2.6, 1.42-9.39 and 0.58-4.02, starting with the lowest exposure group and ending with the highest.

Taking mid-values for each group gives a rate of incidence of about one - i.e. unchanged vs. the control group - for the three lower exposure groups. For the three higher exposure groups - 0.3-0.4μT, 0.4-0.5μT and >0.5μT exposures - the mid-values are 1.65, 5.4 and 2.3, respectively.
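Those mid-values can be reproduced directly from the confidence intervals listed above (a simple interval midpoint, as the text uses it - not the study's own point estimates):

```python
# 95% confidence intervals for the incidence rate ratio,
# lowest exposure group first (from the text above)
intervals = [(0.63, 1.6), (0.6, 1.58), (0.5, 1.58),
             (0.7, 2.6), (1.42, 9.39), (0.58, 4.02)]

# Midpoint of each interval
mids = [(lo + hi) / 2 for lo, hi in intervals]
print(mids)  # lower three near 1; upper three about 1.65, 5.4 and 2.3
```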

The figures clearly indicate that higher exposure levels - despite still being hundreds of times below the official safety limit - are associated with significantly higher incidence of leukemia.

Not to the study authors, though. Their conclusion is:

"Our results provide little evidence that living in homes characterized by high measured time-weighted average magnetic-field levels or by the highest wire-code category increases the risk of ALL (acute lymphoblastic leukemia, the most common form of childhood leukemia) in children."

The reason? Formally, all but one group have the lower leg of their confidence interval below 1, which can be morphed into the conclusion that the difference in rates between the groups is small to negligible. Of course, only if we decide to neglect that one group with the lower leg at 1.42, and don't bother with what the numbers actually show. Even a glance at the figures above clearly demonstrates such formalism to be unfounded.

But the study's actual presentation figures don't show any significant effect from the exposure. What happened?

Where did that big spike in the data disappear?

Well, the final presentation figures were constructed on the premise that the two highest exposure groups, with significantly higher rates of leukemia, are statistically unreliable due to their small size.

Indeed, one of these two groups had only 19 children, and the other one, with the highest exposure, only 15. So, in order to make the result "statistically reliable", the authors merged the two highest exposure groups, numbering 34 participants, with the two mid-exposure groups, numbering 119 participants. That nearly smoothed out the unsightly bump at the high-exposure end, allowing the authors to proclaim that the increase in the odds ratio (of suffering leukemia) for this whole group - i.e. for exposures over 0.2μT - was insignificant, at 1.24.
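The dilution effect of such merging can be illustrated with hypothetical 2x2 counts. The numbers below are mine, chosen only to mimic the relative group sizes (a small elevated-odds stratum folded into a larger near-null one); they are not the study's actual data:

```python
# Odds ratio from a 2x2 table: (exposed cases * reference controls) /
#                              (exposed controls * reference cases)
def odds_ratio(cases, controls, ref_cases, ref_controls):
    return (cases * ref_controls) / (controls * ref_cases)

REF_CASES, REF_CONTROLS = 260, 292   # hypothetical low-exposure reference group

# Hypothetical strata sized like the study's: mid-exposure groups (~119
# children) near the null, small high-exposure groups (34) with elevated odds
mid_cases, mid_controls = 60, 59     # odds ratio close to 1
high_cases, high_controls = 24, 10   # odds ratio well above 1

print(odds_ratio(high_cases, high_controls, REF_CASES, REF_CONTROLS))

# Merging the small elevated stratum into the larger near-null one
# pulls the combined odds ratio back toward 1
merged = odds_ratio(mid_cases + high_cases, mid_controls + high_controls,
                    REF_CASES, REF_CONTROLS)
print(merged)
```

With these made-up counts the high-exposure odds ratio of roughly 2.7 shrinks to about 1.4 after merging - the same kind of smoothing the text describes.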

It was so convenient to have such a small sample size for the high exposure levels (relatively speaking, since they are still hundreds of times below the government's "safe limit"), wasn't it? Out of the total of 1,258 participants enrolled in the study (which included both children with leukemia and controls, split nearly in half), the two highest exposure level groups totaled 34 participants.

Wasn't it obvious before they even started that this small sample size, as well as the grossly imbalanced study group structure,

will make study results
- whatever they may happen to be -
easy to dismiss as statistically unreliable?
Of course it was. But the way the results were presented suggests that it may have been more important to have a seemingly meticulous, large study thrown into the arena than to really go after the facts. A study designed so that its results - if needed - can be "controlled" through the "appropriate" form of presentation.

Was this 1997 National Cancer Institute study

just another "damage control study"?

We may never know for sure, so - draw your own conclusions.

It is interesting to note that even ICNIRP, in its 1998 review of the study results, had more regard for the facts than the study authors. It states that the study's results are "suggestive of a positive association between magnetic fields and leukemia risk". This, in fact, adds the study to the long string of other studies establishing such an association - but ICNIRP wouldn't go so far as to admit to this, and act accordingly.

In the language of its Guidelines, "suggestive" to ICNIRP equals "insufficient", since it doesn't warrant any action. Just how much more evidence do they need?

You may find it hard to believe how this study was further manipulated, but it illustrates well the power and determination of the proponents of the status quo.

Proclaiming that this "large" study did not turn in evidence in support of an association between power-line EMF exposure and childhood leukemia, they used it to publicize the final, entirely baseless official view that no such association exists. In the aftermath of the study, the US Department of Energy quickly

disbanded the EMF Research and Public Information Dissemination (RAPID) Program,

formed under intense pressure of public concern for children's safety after a string of studies consistently linking power-field exposure with increased incidence of leukemia. It was declared no longer needed. There's no such thing as electromagnetic danger, period.

This agenda of official denial stood against public concerns, against the steadily increasing incidence of health problems blamed on EMF (generally termed EMF hypersensitivity), and against the scientific evidence that kept piling up - not only in the domain of power-frequency fields, but across the entire non-ionizing radiation exposure range.

How did this firm opposition to doing the right thing - i.e. drastically lowering the official safe EMF exposure limit, at least for children - come about? We don't have to look hard to find a probable explanation. In human affairs, the truth is not the only thing that matters, and sometimes it matters less than other "factors".

More about the factors behind the EMF safety status quo politics on the following page.