This is not the first time that a protocol that restores or protects energy metabolism has been found effective, and not just in early-onset Parkinson's. See for example the work of Birkmayer, who developed a protocol around NADH and Co-enzyme Q10 (both co-enzymes active in glucose metabolism). Or the use of coconut oil (for the lauric acid contained therein) as a dietary addition to provide ketones as an alternative to glucose for energizing cells: also found effective for many Alzheimer's patients.
The ketogenic approach is easy to try, as coconut oil is readily available. The Birkmayer protocol requires a few supplements, in particular stabilized NADH taken on an empty stomach.
The lists of Nobel nominees are not publicly available. Claiming someone has been nominated a certain number of times is clearly a falsehood.
Nonsense. Those who nominated a peer for a Nobel can and have given publicity to their actions, particularly when they felt him or her to have been unjustly denied the award.
In contrast to the sweeping unsubstantiated WP quote you offer, there is plenty of reliable evidence for the effectiveness of the Budwig protocol: it moved out of the research stage decades ago and has been used with success in practice since then, as is evident from the many testimonials (here and elsewhere) of those who had their lives saved.
So how come this, or the effectiveness of several other alternative treatments, is not acknowledged? A good place to start are the words of two-time Nobel Prize winner Linus Pauling: "Everyone should know that the 'war on cancer' is largely a fraud."
Though obviously not ideal, the images do say something about intensity: you can calculate what the surface black-body flux in the covered wavelength region should be. Since the images show the surface as relatively dark (the surface BB emission should be fairly homogeneous and isotropic, so you can take the darkest pixel regions inside the disk as an upper bound on it), it is easy to see (under any reasonable choice of coloring scale in which intensity increases with photon flux) that in the covered EUV and X-ray wavelengths, the emissions from the corona are a sizable multiple of the surface black-body spectrum. And yes, the flux is not that high anymore in the X-ray region. The point, though, is that it is extremely variable there, and also varies strongly over long (solar-cycle) time periods in the not-that-narrow 26-34 nm band of the EUV.
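To see just how small the photospheric contribution is at these wavelengths, here is a minimal sketch using Planck's law, assuming an effective surface temperature of about 5800 K (the exact value and the comparison wavelengths are my assumptions, chosen for illustration):

```python
import math

# Physical constants (SI)
h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_m, T):
    """Black-body spectral radiance B_lambda in W m^-2 sr^-1 m^-1."""
    x = h * c / (wavelength_m * k * T)
    return (2 * h * c**2 / wavelength_m**5) / math.expm1(x)

T_sun = 5800.0  # assumed effective surface temperature, K
b_visible = planck(500e-9, T_sun)  # visible light, 500 nm
b_euv = planck(30e-9, T_sun)       # middle of the 26-34 nm EUV band

print(f"B(500 nm) = {b_visible:.3e} W m^-2 sr^-1 m^-1")
print(f"B(30 nm)  = {b_euv:.3e} W m^-2 sr^-1 m^-1")
# At 30 nm the Boltzmann factor exp(hc / (lambda k T)) is of order e^83,
# so the ~5800 K surface radiates essentially nothing there: any
# observed 30 nm flux must come from the much hotter corona.
```

The ratio between the two values is tens of orders of magnitude, which is why even a "relatively dark" coronal pixel dwarfs the surface black-body contribution in these bands.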
In any case, considerable effects of this short-wavelength flux on the Earth's atmosphere have been observed. Unwelcome observations, so for political correctness some ludicrous CO2 spin had to be put on that too. Realize that about half of the radiative emissions of the thermosphere go towards the Earth's surface and as such affect surface temperatures.
Look, solar irradiance averages about 1366 W/m^2 and has a variation of about 1 W/m^2 (using a one-year moving average). That's 0.073%.
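For what it's worth, that percentage checks out from the two numbers quoted:

```python
# Quick check of the quoted figures: a ~1 W/m^2 swing on a
# ~1366 W/m^2 mean irradiance (one-year moving average).
tsi_mean = 1366.0      # W/m^2
tsi_variation = 1.0    # W/m^2
pct = 100 * tsi_variation / tsi_mean
print(f"{pct:.3f}%")   # prints 0.073%
```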
You are referring to the Total Solar Irradiance (TSI). But it is not total: the satellites used to measure it have a spectral window from 2000 nm down to 200 nm. That leaves out the EUV and X-ray region. There, the variation is huge. See for example this factor-of-three variation over the solar cycle in the 26-34 nm band.
In the X-ray region, variations can be orders of magnitude. Looking at any EUV or X-ray image, it is obvious that the short-wavelength intensity from the corona much exceeds the black-body radiation coming off the surface. So the conventional view that the EUV and X-ray region is just an irrelevant tail of the black-body curve is wrong: the flux there is much more intense.
Then there are serious doubts about whether the TSI time series as published are actually all that constant. There have been per-instrument aging calibrations that have removed slopes in the raw data. The question though is whether this slope was really due to aging or due to a systematic trend in the solar irradiance. Also, the long-term TSI curve spans a number of instruments (satellites) with some gap in between. There is a lot of discussion about whether this gap has been bridged without skewing the data towards less variance than there really is.
There. A tiny bit more research shows that the sun can have a rather greater effect on Earth's temperature than it is given credit for.
And no, climate scientists are not familiar with this. The importance of the EUV and X-Ray region has been overlooked in the past and only recently has started to gain attention.
However, variation in solar radiation is not the cause (that is what is taken into account in climate models); rather, the magnetic fields and the solar wind appear to play a much larger role.
I would not be so sure about that since there is a bit of a blind spot in the theories, models and observations: EUV and X-Ray radiation. Take, for example, this time graph of the 26-34 nm EUV band. A factor of three or so variation in flux over the course of the solar cycle.
Look at any EUV or X-ray image of the sun, and it is obvious that we are talking about radiation that much exceeds what would be expected from the short-wavelength tail of the solar black-body curve (the surface, which is the source of that tail, appears relatively "dark" at those short wavelengths). Indeed, the spatial distribution of the source of the short-wavelength emissions looks to be determined by magnetic field loops and surface bundles, as can be seen in this three-color composite EIT synoptic image in 171 Å (blue), 195 Å (green), and 284 Å (red). So yes, there is a correlation with magnetic fields and the solar wind, but it is likely still direct EUV and X-ray radiation (absorbed in the very upper layer of the atmosphere) that affects the climate on Earth.
These scare stories seem designed to keep people from consuming healthy foods such as raw milk. Another example: the many stories about mercury in fish. What you won't read about is that the selenium present in the same fish compensates for what little mercury is there.
Well, no, not my interpretation, theirs. Quoting:
"Today, astronomers unveiled the most complete 3-D map of the local universe (out to a distance of 380 million light-years) ever created."
A press release like that implies a spatial interpretation: "map", "3-D", "distance"...
You are trying to make a distinction where there is none, on two counts. Firstly, taking redshift and using it as a spatial dimension for a map implies a distance interpretation or, if you wish, a univariate spatial interpretation of redshift. And this interpretation is obviously wrong if, as the observational evidence indicates, there are objects with high redshift co-located with objects that have much lower redshift: the mapping then projects two co-located objects onto different parts of the map, yielding, by any commonsense definition, a wrong map.
Secondly, the redshift-is-proportional-to-distance dogma is so ingrained in astronomical teaching and literature that authors assume distance to be implied by a mere mention of a redshift measurement: the reader is supposed to know how to factor out the Hubble constant if units of distance are desired. As such, the use of redshift as a map coordinate can be assumed to imply a spatial dimension unless explicitly stated otherwise.
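The conversion being taken for granted is just Hubble's law, d = c·z / H0. A minimal sketch (the H0 value and the sample redshift below are my assumptions, for illustration only):

```python
# Standard redshift-to-distance conversion assumed by such maps:
# Hubble's law, d = c * z / H0.
c_km_s = 299792.458  # speed of light, km/s
H0 = 70.0            # Hubble constant, km/s/Mpc (assumed value)
MLY_PER_MPC = 3.2616 # one megaparsec in millions of light-years

def distance_mly(z):
    """Map a redshift z to a distance in millions of light-years,
    assuming ALL of the redshift is cosmological (the disputed step)."""
    return (c_km_s * z / H0) * MLY_PER_MPC

# A redshift of ~0.027 lands near the press release's quoted depth:
print(round(distance_mly(0.027)))  # prints 377 (million light-years)
```

Note that the disputed assumption sits entirely in that one line of algebra: if part of z has a non-cosmological origin, the computed d, and hence the map coordinate, is wrong.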
To construct the map, the standard assumption that redshift is proportional to distance was made. However, a growing body of observational evidence indicates that there are further sources of redshift not related to distance. This implies that the map must be wrong, since it is based on an incomplete interpretation of redshift measurements.
For a good documentary in which the mentioned growing body of evidence is discussed by astronomers and astrophysicists, see "The Universe - Cosmology Quest". A torrent can be found here, for example.
The 1987 Montreal Protocol banning chlorofluorocarbons (CFCs) is considered a textbook case where science and responsible governance lead to a landmark treaty for the benefit of the Earth and all its inhabitants. How often does that happen?
At about the time that the DuPont patent on Freon(TM), the most widely used CFC refrigerant in the world, was expiring, the mainstream media picked up on otherwise arcane scientific observations and hypotheses about ozone concentration in the upper atmosphere near the poles.
The result was an international mobilization to criminalize CFCs, and DuPont developed and patented a replacement refrigerant that was promptly certified for use.
A Nobel Prize in chemistry was awarded in 1995 for a laboratory demonstration that CFCs could deplete ozone in simulated atmospheric conditions. In 2007 it was shown that the latter work may have been seriously flawed by overestimating the depletion rate by an order of magnitude, thereby invalidating the proposed mechanism for CFC-driven ozone depletion. Not to mention that any laboratory experiment is somewhat different from the actual upper atmosphere. Is the Nobel tainted by media and special-interest lobbying?
Nature 449, 382-383 (2007).
I concur with most of what you say, but in some of your statements you persist in assigning primacy to theory. Saying, for example, that "theory is central, because without it you have but a bunch of anecdotes" does not reflect some necessary aspects of science. One I already pointed out: when you apply the scientific method to a new field of endeavor of which little is known, no theory is available yet. By necessity you have to start with observations, then develop hypotheses and experiments to test those hypotheses. As these hypotheses grow in number and detail, theories with some predictive power may emerge.
Another way to see that experiments and observation should have precedence is this: nature does not care about theory; it is what it is. Aspects of what nature is can be observed through experiment and observation. Theories are crutches for human understanding: approximate models small enough to fit into our feeble minds. But theories are risky to rely on, because even the most well-established theory may in the end be found flawed by an experiment in a domain where it has not yet been tested.
In the particular field under discussion, the actual situation is more complex because the experiments are interpreted in the context of a stack of theories. Measuring the degree of double-strand DNA breakage via a comet assay, as Lai and Singh have done, makes implicit use of a lot of physical and biochemical theories. This stack of theories lacks a theory that provides a well-verified explanation of the precise biophysical and biochemical pathways by which electromagnetic fields can induce such excess DNA damage. There are multiple candidates for an explanation, and the comet assay is exactly the kind of experimentation that can help you narrow down the candidates. As I mentioned, they found a reduced signal when the iron in the blood of the rats was first chelated. Precisely the kind of finding that can help you eliminate candidate hypotheses and move other hypotheses on towards the status of theory.
In short, they have been doing perfectly fine science. And even though a complete theory is not there yet, their experiments unambiguously indicate that electromagnetic fields can pose a mutagenic risk even when those EM fields induce no appreciable thermal heating and the photon energy (proportional to frequency) is in the non-ionizing regime: establishing that does not require a theory of the damage mechanism; it merely requires the theories lower in the stack, which suffice to interpret the comet assay methodology. Hence, their experimental findings are very much worth knowing about and giving publicity to. Sadly, they have received repression and censorship instead.