sam on July 08, 2012, 08:45:35 am
You seem very confident about what data you haven't seen would show. 

For two reasons:

1.  I expect it to show what data I have seen shows.

2.  If they don't show it, it is data that would get them in trouble.

There have been plenty of studies on radon showing a link with lung cancer, both occupational (e.g. in miners) and residential (living in high-radon areas). 

No.  There have been plenty of people who loudly and confidently drew the conclusion that living in high radon areas increases lung cancer, which conclusions were allegedly based on studies that they declined to show us, but no studies showing that living in high radon areas increases lung cancer.

But really, what you should do is look at those studies by Cohen in their historical context, as summarised in this article: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3073196/?tool=pubmed

Which tells us that after torturing the data, they somehow managed to draw the required conclusion, despite the fact that no particular specific identifiable study supported the required conclusion.

They pooled data from different studies operating on different principles, with the unsurprising result that the pooled data showed something different from what any particular study showed.

When you pool data from different studies operating on different principles, the number of ways of "analyzing" the data increases combinatorially, guaranteeing that you can find an analysis that shows whatever you want.
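As a back-of-the-envelope sketch of that combinatorial explosion (every analysis choice listed below is invented for illustration, not taken from any actual pooled study):

Code:

import itertools

# Hypothetical analysis choices that open up once heterogeneous studies
# are pooled.  Every option listed here is invented for illustration.
inclusion_rules = ["all studies", "case-control only", "cohort only", "post-1990 only"]
dose_metrics    = ["measured radon", "modelled radon", "area average"]
adjustment_sets = ["none", "age+sex", "age+sex+smoking", "age+sex+smoking+SES"]
endpoints       = ["all lung cancer", "small-cell only", "non-smokers only"]

choices = list(itertools.product(inclusion_rules, dose_metrics, adjustment_sets, endpoints))
print("distinct analyses available:", len(choices))   # 4 * 3 * 4 * 3 = 144

# If each analysis were an independent 5%-level test on pure noise, the
# chance of at least one "significant" result would be 1 - 0.95**144,
# about 99.94%.  Real analyses overlap and are not independent, but the
# qualitative point stands: more researcher degrees of freedom, more
# false positives.
print("P(at least one false positive, if independent):", round(1 - 0.95 ** len(choices), 4))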

Quote
On average, we each get about 3 mSv each year, and some people live fine in areas of the world where the background is rather higher.

I guess "higher by a factor of 17" (3 mSv vs 50 mSv) counts as "rather" higher.

Some substantial populations get 250 mSv per year.

paddyfool on July 09, 2012, 03:59:36 am
You seem very confident about what data you haven't seen would show. 

For two reasons:

1.  I expect it to show what data I have seen shows.

What data have you seen from the JSS on leukaemia, then? 

Quote
2.  If they don't show it, it is data that would get them in trouble.

They're reviewing the research done on the leukaemia data over a few different prior studies.  It's not exactly commonplace to reproduce the raw data from said studies when you're doing this.

Meanwhile, what do you make of this, the most thorough review I've yet found of the other data on low-dose (specifically, 100 mSv) effects on leukaemia risk? http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3095477/?tool=pubmed

There have been plenty of studies on radon showing a link with lung cancer, both occupational (e.g. in miners) and residential (living in high-radon areas). 

No.  There have been plenty of people who loudly and confidently drew the conclusion that living in high radon areas increases lung cancer, which conclusions were allegedly based on studies that they declined to show us, but no studies showing that living in high radon areas increases lung cancer.

Meh.  You're leaving aside the evidence linked to in that review.

See here for a meta-analysis and summary of 13 European studies (particularly figure 2): http://www.ncbi.nlm.nih.gov/pmc/articles/PMC546066/?tool=pmcentrez

And here for a meta-analysis and summary of 7 North American studies: http://www.ncbi.nlm.nih.gov/pubmed/15703527/

But really, what you should do is look at those studies by Cohen in their historical context, as summarised in this article: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3073196/?tool=pubmed

Which tells us that after torturing the data, they somehow managed to draw the required conclusion, despite the fact that no particular specific identifiable study supported the required conclusion.

Required by whom for what purpose?  This is a conspiracy theory without any obvious benefit for the conspirators.

They pooled data from different studies operating on different principles, with the unsurprising result that the pooled data showed something different from what any particular study showed.

I've looked into the methodology of the two main meta-analyses behind the conclusions reached in that review on household exposure and no, they really didn't.  The criteria for inclusion were clear, and the mode of analysis consistent.  Also, they included extensive sensitivity analyses to ensure that they weren't biasing the outcome by any particular methodology.

Quote
When you pool data from different studies operating on different principles, the number of ways of "analyzing" the data increases combinatorially, guaranteeing that you can find an analysis that shows whatever you want.

The risk is not so high as when you don't pool the data from all the valid studies or consider prior reviews of the evidence, but instead cherry-pick the studies you like, based on the conclusions you want to find.

Some substantial populations get 250 mSv per year.

Source, please.

sam on July 10, 2012, 06:24:10 am
2.  If they don't show it, it is data that would get them in trouble.

They're reviewing the research done on the leukaemia data over a few different prior studies.  It's not exactly commonplace to reproduce the raw data from said studies when you're doing this.

You, and they, are not producing evidence, but making claims about what authority claims, authority telling us that authority tells us.  That is not science, and on a question as politically dangerous as the effects of low-dose radiation, authority is not evidence.  If Authority had evidence for the politically correct view, they would show us that evidence, instead of assuring us that it exists somewhere, in some unspecified place, in some form we are insufficiently learned to understand.

Radon gas is extremely patchy, and the patches have no correlation with other population characteristics that might affect cancer.  If radon gas caused cancer, the correct approach would be to look at populations exposed to high levels of radon gas because of the rocks that happen to be underneath them, not because of lifestyle or employment characteristics.  That is the only kind of survey that counts, and all such surveys show no harmful radiation effects, and arguably beneficial radiation effects.  Using any other kind of data is deliberately looking for bad data to grind your axe - just as looking at responders rather than evacuees is deliberately looking for bad data to grind your axe.

And if you are going to use bad data, show us the bad data.  Don't show us what authority says about what authority says about what authority says about the bad data.  Bad data is bad science, but what authority says about bad data is not even science.

Meanwhile, what do you make of this, the most thorough review I've yet found of the other data on low-dose (specifically, 100 mSv) effects on leukaemia risk? http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3095477/?tool=pubmed

This is not a review.  It is a meta-analysis.  Meta-analyses are notoriously worthless even on entirely uncontroversial topics.  On a controversial topic, everything that makes meta-analyses worthless makes them not merely worthless, but of negative value.

Even in a non-political meta-analysis of non-political data, publication bias in favor of significant results is going to produce bogus results.

Here is one problem with even non-political meta-analysis:  Suppose a penny coming up ten heads in a row is statistically significant, and therefore publishable, but random behavior is unpublishable.  You get one hundred published papers each reporting a penny coming up ten heads in a row, and no published papers showing undramatic penny behavior.  You do a meta-analysis, and now you have a published paper showing a penny coming up one thousand heads in a row!  Someone else does a meta-analysis, including that previous meta-analysis as original independent data, when in fact it is re-analyzing the same data, and now you have a published paper showing a penny coming up two thousand heads in a row!  Rinse and repeat!
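Here is a toy simulation of that penny dynamic (all numbers are arbitrary and nothing here models any real literature): many studies of a genuinely null effect are run, only the significant positive ones get "published", and a standard fixed-effect inverse-variance meta-analysis is then run over the survivors.

Code:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
TRUE_EFFECT = 0.0                 # the "penny" is fair: no real effect at all
n_studies, n_per_study = 500, 50

published_effects, published_ses = [], []
for _ in range(n_studies):
    sample = rng.normal(TRUE_EFFECT, 1.0, n_per_study)
    est = sample.mean()
    se = sample.std(ddof=1) / np.sqrt(n_per_study)
    p = 2 * stats.norm.sf(abs(est / se))
    if p < 0.05 and est > 0:      # journals only print the dramatic pennies
        published_effects.append(est)
        published_ses.append(se)

# Fixed-effect inverse-variance pooling of the published studies only.
w = 1.0 / np.array(published_ses) ** 2
pooled = np.sum(w * np.array(published_effects)) / w.sum()
z = pooled / np.sqrt(1.0 / w.sum())
print("published:", len(published_effects), "of", n_studies, "studies")
print("pooled effect:", round(pooled, 3), "(true effect is 0), z =", round(z, 1))

The pooled estimate comes out strongly "significant" even though the true effect is exactly zero; selective publication compounds under pooling rather than cancelling out.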

A meta-analysis generates bogus statistical significance, fabricating a strong result from a multitude of analyses each with weak significance.  People use meta-analyses when they cannot get analyses that show strong significance - but the usual reason that they can never get strong significance is that there is really no correlation at all.  If low-level radiation actually caused leukemia, no one would be reduced to doing meta-analyses on each other's analyses, because they would be able to attain statistical significance without being reduced to such extremes.

Thus the very existence of meta-analysis papers showing a correlation between x and y is evidence for the complete lack of any correlation between x and y.  When people use statistical methods that are well known to be bad, it is evidence that they are hard up for evidence that shows what they want to show.

As with evidence for flying saucers, the abundance of fishy-smelling evidence is itself evidence for the absence of non-fishy evidence.

If people provide meta-analysis as evidence for x, that is itself evidence against x.  Because of various forms of publication bias, meta-analysis is naughty, and well known to be naughty.  If people do naughty things to prove x is true, x is probably untrue.

the most thorough review I've yet found of the other data on low-dose (specifically, 100 mSv) effects on leukaemia risk? http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3095477/?tool=pubmed

Another meta-analysis.  Funny thing, that.

I repeat:  meta-analyses are well known to be bad evidence.  Reliance on bad evidence is itself evidence for the absence of good evidence.  If low doses caused leukemia, there would be good evidence.

Meh.  You're leaving aside the evidence linked to in that review.

See here for a meta-analysis and summary of 13 European studies

Again with the meta-analysis.

If anyone had any good evidence that radon causes harmful effects at low doses, they would present that survey.  Since they don't present any one specific survey that shows what they want to show, we may conclude that no such survey exists.

If there were non-fishy, statistically significant surveys showing what they wanted, they would not be reduced to meta-analysis.

(particularly figure 2): http://www.ncbi.nlm.nih.gov/pmc/articles/PMC546066/?tool=pmcentrez

Which tells us:

Quote
Collectively, though not separately, these studies show appreciable hazards from residential radon, particularly for smokers and recent ex-smokers, and indicate that it is responsible for about 2% of all deaths from cancer in Europe.

Firstly.  Meta-analysis is bad.

Secondly.  Ill-ventilated houses will have higher radon levels than well-ventilated houses, and if those houses also have smokers ...

The correct approach is not house by house, but area by area.  The question that should be asked is:  Do people have higher cancer rates if they live in houses built over rocks that tend to emit radon, as compared to houses built over rocks that do not tend to emit radon?

And when this question, the right question, is asked, we get the result that radon is good for your health.

To get the desired result, radon bad for health, we have to ask a different question: one that does not directly reflect radiation levels alone.
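To make the ventilation objection concrete, here is a toy confounding sketch (every parameter is invented; nothing is calibrated to real houses or real dose-response data).  In this toy world radon is causally inert, yet house-level radon still comes out associated with cancer, purely because poor ventilation drives both indoor radon and indoor smoke:

Code:

import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Toy model; every parameter below is invented for illustration.
geology = rng.lognormal(0.0, 0.5, n)       # area-level radon source (the rocks)
ventilation = rng.lognormal(0.0, 0.5, n)   # how well each house is aired

house_radon = geology / ventilation        # indoor radon rises as ventilation falls
smoker = rng.binomial(1, 0.3, n)
indoor_smoke = smoker / ventilation        # smoke also lingers in stale air

# Radon is causally inert here: cancer risk depends on indoor smoke alone.
risk = 0.01 + 0.02 * indoor_smoke
cancer = rng.random(n) < risk

print("corr(house-level radon, cancer):", round(np.corrcoef(house_radon, cancer)[0, 1], 3))
print("corr(area-level geology, cancer):", round(np.corrcoef(geology, cancer)[0, 1], 3))
# The house-level measure picks up a positive association through ventilation;
# the area-level source term, which reflects only the rocks, shows roughly none.

The numbers themselves mean nothing; the point is only that a house-level exposure measure can inherit an association from anything else that covaries with ventilation.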

I've looked into the methodology of the two main meta-analyses behind the conclusions reached in that review on household exposure and no, they really didn't.

Yet oddly, when I just looked into the methodology, I found use of notoriously unreliable methods that have a tendency to magnify and rest upon publication bias (meta-analysis), and the deliberate search for radiation data that is likely to reflect factors other than radiation (house-by-house radon levels rather than area-by-area radon levels, and the use of radiation responders rather than radiation evacuees).

When you pool data from different studies operating on different principles, the number of ways of "analyzing" the data increases combinatorially, guaranteeing that you can find an analysis that shows whatever you want.

The risk is not so high as when you don't pool the data from all the valid studies

Pooling data from different studies is well known to be bad practice, for numerous reasons, most notoriously publication bias, even on questions that are not politically sensitive.  It is a well-known problem.

Quoting from data pooled from multiple different studies done in multiple different ways as evidence for the danger of low dose radiation is like quoting from mental patients as evidence for visits from flying saucers.  That one is reduced to such evidence is pretty good evidence for lack of visits.

Some substantial populations get 250 mSv per year. [of natural background radiation]

Source, please.

http://www.ncbi.nlm.nih.gov/pubmed/11769138
« Last Edit: July 10, 2012, 08:49:15 am by sam »

paddyfool on July 10, 2012, 07:03:57 am
2.  If they don't show it, it is data that would get them in trouble.

They're reviewing the research done on the leukaemia data over a few different prior studies.  It's not exactly commonplace to reproduce the raw data from said studies when you're doing this.

You, and they, are not producing evidence, but making claims about what authority claims, authority telling us that authority tells us.  That is not science, and on a question as politically dangerous as the effects of low-dose radiation, authority is not evidence.  If Authority had evidence for the politically correct view, they would show us that evidence, instead of assuring us that it exists somewhere, in some unspecified place, in some form we are insufficiently learned to understand.

Grr.  OK, if you insist on taking this argument in that direction:

Your comments show an instinctive, rather than a reasoned, dismissal of every single bit of data I put in front of you simply because it disagrees with your a priori beliefs.  You use little bits of evidence like a drunkard uses a lamp-post - for support rather than illumination - and when I show you how that evidence is incomplete, alongside other information from the very same source, you dismiss that too.  And now you're accusing me of arguing from authority, when I have claimed none, while you persist in making a host of unsubstantiated claims and a series of strawman arguments about my position, when you aren't trying to move the goalposts.

What can you possibly learn if you aren't interested in actually putting your beliefs to the test, but only in finding things that prop up what you already believe?

Quote
Radon gas is extremely patchy, and the patches have no correlation with other population characteristics that might affect cancer.  If radon gas caused cancer, the correct approach would be to look at populations exposed to high levels of radon gas because of the rocks that happen to be underneath them, not lifestyle characteristics.  That is the only kind of survey that counts, and all such surveys show no harmful radiation effects, and arguably beneficial radiation effects.  Any other kind of data is deliberately looking for bad data to grind your axe - just as looking at responders rather than evacuees is deliberately looking for bad data to grind your axe.

No, it really isn't.

The issue with radon and lung cancer is that radon is a much smaller risk factor than smoking.  So its effect is easy to miss if you don't control for smoking properly.  And you can't get complete smoking data out of the ecological data that Cohen used, which is why they had to do the case-control studies (I'm much less convinced by the studies on miners, because radon is far from the only additional carcinogen to which miners would be exposed).
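A minimal sketch of that ecological problem (all parameters invented, not calibrated to Cohen's data or to anything else): in this toy world radon carries a small genuine extra risk for individuals, but smoking prevalence happens to run opposite to radon across areas, and the area-level regression flips the sign.

Code:

import numpy as np

rng = np.random.default_rng(2)
n_areas, n_per_area = 200, 1000

# Toy model; every parameter below is invented for illustration.  Across
# areas, smoking prevalence runs opposite to radon (say, urban areas have
# low radon and high smoking).
area_radon = rng.uniform(0.5, 4.0, n_areas)
smoking_prev = np.clip(0.6 - 0.1 * area_radon + rng.normal(0, 0.05, n_areas), 0.05, 0.95)

area_rates = []
for radon, prev in zip(area_radon, smoking_prev):
    smoker = rng.binomial(1, prev, n_per_area)
    # True individual-level risk: smoking dominates, radon adds a little.
    risk = 0.005 + 0.10 * smoker + 0.002 * radon
    area_rates.append(float((rng.random(n_per_area) < risk).mean()))

# Ecological (area-level) regression of cancer rate on radon.
slope = np.polyfit(area_radon, area_rates, 1)[0]
print("ecological slope: %+.4f per unit radon" % slope)
# The slope comes out negative - radon looks protective - even though radon
# is mildly harmful to individuals, because high-radon areas have fewer smokers.

Whether smoking prevalence really ran opposite to radon across real areas is exactly what individual-level case-control data are needed to check; the sketch only shows why aggregate data alone cannot check it.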

sam on July 10, 2012, 08:57:44 am
Your comments show an instinctive, rather than a reasoned, dismissal of every single bit of data I put in front of you simply because it disagrees with your a priori beliefs.  

If you want to determine whether radon causes harmful effects, you compare houses built over high-radon rocks with houses built over low-radon rocks.  In which case you get the result that high radon levels improve health.

If instead you compare high-radon houses with low-radon houses, rather than houses built over high-radon rocks with houses built over low-radon rocks, you are primarily comparing ill-ventilated houses with well-ventilated houses.  Lo and behold, surprise surprise, ill-ventilated houses have harmful effects.

In which case, if you knowingly use obviously bad data, it is pretty obvious you want a particular result.

This is not "Instinctive dismissal of data that disagrees."  It is rational dismissal of data cooked up with a blatantly obvious axe to grind.
« Last Edit: July 16, 2012, 06:23:57 pm by sam »
