
Submission + - Imprisoned journalist/hacktivist MartyG to be sent to prison unit for terrorists (rt.com)

Danngggg writes: Longtime Slashdot readers may remember the case of Martin "MartyG" Gottesfeld, human rights activist and Anonymous hacktivist, and now a conservative journalist published from behind bars at The Intercept, The Western Journal, Infowars, and elsewhere.

A teenager named Justina Pelletier had been abused in Boston Children's Hospital's psych ward for over a year when Marty launched a DDoS attack to pressure the hospital into releasing the girl to her family. The DDoS knocked all of Harvard off the internet, and it succeeded in that she was returned shortly afterward, albeit crippled and worse for wear. His jury found that Marty did not harm, or even potentially harm, any patients during the DDoS. Nonetheless Judge Nathaniel Gorton, the same judge Aaron Swartz faced, sentenced him to 10 years in federal prison and nearly half a million dollars in restitution.

RT covers the latest in the journalist's case — from being placed in "the hole" at MDC Brooklyn to an upcoming transfer to a "Communications Management Unit," where they send terrorists and, according to The Center for Constitutional Rights, those who are "politically inconvenient." Is anyone paying attention?

Submission + - Chelating agent selectively grabs uranium from oceans (acs.org)

webofslime writes: The world’s oceans contain some 4 billion metric tons of dissolved uranium. That’s roughly 1,000 times as much as all known terrestrial sources combined, and enough to fuel the global nuclear power industry for centuries. But the oceans are so vast, and uranium’s concentration in seawater is so low—roughly 3 ppb—that extracting it remains a formidable challenge. That task may have just become easier thanks to a new adsorbent material based on a bioinspired chelating agent (Nat. Commun. 2019 DOI: 10.1038/s41467-019-08758-1).

Researchers have been looking for ways to extract uranium from seawater for more than 50 years. In the 1980s, surveys pointed to amidoxime-type chelating agents, which have a knack for latching onto uranyl ions, the aqueous form of uranium.

Nearly 20 years ago, the Japan Atomic Energy Agency (JAEA) confirmed that amidoxime-functionalized polymers could soak up uranium reliably even under harsh marine conditions. But that type of adsorbent has not been implemented on a large scale because it has a higher affinity for vanadium than uranium. Separating the two ions raises production costs.

Alexander S. Ivanov of Oak Ridge National Laboratory, together with colleagues there and at Lawrence Berkeley National Laboratory and other institutions, may have come up with a solution. Using computational methods, the team identified a highly selective triazine chelator known as H2BHT that resembles iron-sequestering compounds found in bacteria and fungi. Starting with low-cost reagents, the team prepared fibers containing polyethylene and polyacrylic acid, functionalized them with H2BHT, and analyzed their performance as adsorbents.

Submission + - Magnetic north moving 'pretty fast' towards Russia (independent.co.uk)

AmiMoJo writes: Magnetic North is not where it used to be. Instead, our compasses are pointing ever closer to Russia as the magnetic pole drifts away from the Canadian Arctic and towards Siberia. The pole is now travelling about 34 miles (55 kilometres) a year. It crossed the international date line in 2017 in its lurch eastwards.

The constant roaming of the pole is a problem for compasses in smartphones and some consumer electronics. Aeroplanes and boats also rely on magnetic north, usually as backup navigation, said University of Colorado geophysicist Arnaud Chulliat, lead author of the newly issued World Magnetic Model. GPS is not affected because it is satellite-based.

Submission + - Lab Creates Quantum Objects That May Have Birthed Dark Matter in Early Universe (vice.com)

dmoberhaus writes: Physicists in Finland have experimentally created quantum structures that some cosmologists believe were formed seconds after the Big Bang, and may have given birth to dark matter. Although the existence of these structures has been predicted for decades, this is the first time these objects were created in a lab using superfluid He-3.

Submission + - Simulation Showcases the Ferocious Power of a Solar Flare (astroengine.com)

astroengine writes: For the first time, scientists have created a computer model that can simulate the evolution of a solar flare, from thousands of miles below the photosphere to the eruption itself in the lower corona — the sun’s multimillion degree atmosphere. And the results are not only scientifically impressive, the visualization is gorgeous.

Submission + - GOP lawmaker seeks 'virtual Congress' with telecommuting (thehill.com)

Applehu Akbar writes: New Mexico Congressman Steve Pearce has an idea: why not use today's videoconferencing tech to let representatives perform most Congressional business from their home districts? Because Congresspeople serve short terms and campaign largely on constituent service, they spend a large percentage of their time shuttling between home and Washington. Virtualizing most of their Washington presence would save fuel and energy while giving them more time with their constituents.

In addition, there could be a long-term societal benefit in making Congress less vulnerable to lobbyist influence by keeping members out of the Beltway.

Submission + - An important message from "Google" about Google+ ! (vortex.com)

Lauren Weinstein writes: So we’re shutting down G+. We’ll be shutting it down this coming August, uh April, uh as soon as we can locate the Google+ SRE in charge. We’ve been trying to page them for months but they’re not answering. We’re pretty sure that there’s a G+ control dashboard in our systems somewhere — as soon as we can find it we’ll pull the switch and you’ll all be history.

Submission + - System Down: A systemd-journald exploit (qualys.com)

walterbyrd writes: Qualys Security Advisory: We discovered three vulnerabilities in systemd-journald.

To the best of our knowledge, all systemd-based Linux distributions are vulnerable, but SUSE Linux Enterprise 15, openSUSE Leap 15.0, and Fedora 28 and 29 are not exploitable because their user space is compiled with GCC's -fstack-clash-protection.

This confirms https://grsecurity.net/an_anci...:
"It should be clear that kernel-only attempts to solve [the Stack Clash] will necessarily always be incomplete, as the real issue lies in the lack of stack probing."

Submission + - SPAM: Was Low-Dose Radiation From the Atomic Bombs Beneficial?

schwit1 writes: Survivors of the Hiroshima and Nagasaki atomic bombings who were subjected to lower doses of radiation may actually have had extended lifespans and reduced cancer mortality. That is the finding of an article recently published in the journal Genes and Environment.

Researcher Shizuyo Sutou of Shujitsu Women's University is the author of the paper. Sutou examined data from the Life Span Study, which has followed 120,000 survivors of the atomic bomb blasts since 1950. His analysis showed that survivors exposed to between 0.005 and 0.5 gray (Gy) of radiation had lower relative mortality than control subjects who were not exposed to atomic bomb radiation.

Sutou's finding is in line with the hormetic theory of radiation (hormesis), which states that very low doses of ionizing radiation might actually be beneficial, producing adaptive responses like stimulating the repair of DNA damage, removing aberrant cells via programmed cell death, and eliminating cancer cells through learned immunity.

Link to Original Source

Submission + - Germany's green transition has hit a brick wall (energycentral.com)

Joe_Dragon writes:

More people are finally beginning to realize that supplying the world with sufficient, stable energy solely from sun and wind power will be impossible.

Germany took on that challenge, to show the world how to build a society based entirely on “green, renewable” energy. It has now hit a brick wall. Despite huge investments in wind, solar and biofuel energy production capacity, Germany has not reduced CO2 emissions over the last ten years. However, during the same period, its electricity prices have risen dramatically, significantly impacting factories, employment and poor families.

Germany has installed solar and wind power to such an extent that it should theoretically be able to satisfy the power requirement on any day that provides sufficient sunshine and wind. However, since sun and wind are often lacking – in Germany even more so than in other countries like Italy or Greece – the country only manages to produce around 27% of its annual power needs from these sources.

Equally problematic, when solar and wind production are at their maximum, the wind turbines and solar panels often overproduce – that is, they generate more electricity than Germany needs at that time – creating major problems in balancing production and consumption. If the electric power system's frequency is to be kept close to 50 Hz (50 cycles per second), it is no longer possible to increase the amount of solar and wind production in Germany without additional, costly measures.

Production is often too high to keep the network frequency stable without disconnecting some solar and wind facilities. This leads to major energy losses, and to forced power exports to neighboring countries at negative electricity prices – below the cost of generating the power.

In 2017 about half of Germany’s wind-based electricity production was exported. Neighboring countries typically do not want this often unexpected power, and the German power companies must therefore pay them to get rid of the excess. German customers have to pick up the bill.

If solar and wind power plants are disconnected from actual need in this manner, wind and solar facility owners are paid as if they had produced 90% of rated output. The bill is also sent to customers.

When wind and solar generation declines, and there is insufficient electricity for everyone who needs it, Germany’s utility companies also have to disconnect large power consumers – who then want to be compensated for having to shut down operations. That bill also goes to customers all over the nation.

Power production from the sun and wind is often quite low and sometimes totally absent. This might take place over periods from one day to ten days, especially during the winter months. Conventional power plants (coal, natural gas and nuclear) must then step in and deliver according to customer needs. Hydroelectric and biofuel power can also help, but they are only able to deliver about 10% of the often very high demand, especially if it is really cold.

Alternatively, Germany may import nuclear power from France, oil-fired power from Austria or coal power from Poland.

In practice, this means Germany can never shut down the conventional power plants, as planned. These power plants must be ready and able to meet the total power requirements at any time; without them, a stable network frequency is unobtainable. The same is true for French, Austrian and Polish power plants.

Furthermore, if the AC frequency is allowed to drift too high or too low, the risk of extensive blackouts becomes significant. That was clearly demonstrated by South Australia, which also relies heavily on solar and wind power, and suffered extensive blackouts that shut down factories and cost the state billions of dollars.

The dream of supplying Germany with mainly green energy from sunshine and wind turns out to be nothing but a fading illusion. Solar and wind power today covers only 27% of electricity consumption and only 5% of Germany's total energy needs, while impairing reliability and raising electricity prices to among the highest in the world.

However, the Germans are not yet planning to end this quest for utopian energy. They want to change the entire energy system and include electricity, heat and transportation sectors in their plans. This will require a dramatic increase in electrical energy and much more renewable energy, primarily wind.

To fulfill the German target of getting 60% of their total energy consumption from renewables by 2050, they must multiply the current power production from solar and wind by a factor of 15. They must also expand their output from conventional power plants by an equal amount, to balance and backup the intermittent renewable energy. Germany might import some of this balancing power, but even then the scale of this endeavor is enormous.

Perhaps more important, the amount of land, concrete, steel, copper, rare earth metals, lithium, cadmium, hydrocarbon-based composites and other raw materials required to do this is astronomical. None of those materials is renewable, and none can be extracted, processed and manufactured into wind, solar or fossil power plants without fossil fuels. This is simply not sustainable or ecological.

Construction of solar and wind "farms" has already caused massive devastation to Germany's wildlife habitats, farmlands, ancient forests and historic villages. Even today, the northern part of Germany looks like a single enormous wind farm. Multiplying today's wind power capacity by a factor of 10 or 15 means a 200-meter-tall (650-foot) turbine would have to be installed every 1.5 km (roughly every mile) across the entire country: within cities, on land, on mountains and in water.

In reality, it is virtually impossible to increase production by a factor of 15, as promised by the plans.

The cost of Germany's "Energiewende" (energy transition) is enormous: some 200 billion euros by 2015 – and yet with minimal reduction in CO2 emissions. In fact, coal consumption and CO2 emissions have been stable or have risen slightly over the last seven to ten years. In the absence of a miracle, Germany will not be able to fulfill its self-imposed climate commitments, not by 2020, nor by 2030.

What applies to Germany also applies to other countries that now produce their electricity primarily with fossil or nuclear power plants. To reach development comparable to Germany’s, such countries will be able to replace only about one quarter of their fossil and nuclear power, because these power plants must remain in operation to ensure frequency regulation, balance and back-up power.

Back-up power plants will have to run idle (as "spinning reserve") during periods of high renewable output, while still consuming almost as much fuel as during normal operation. They must always be able to step up to full power, because over the next few hours or days solar or wind output might fail. So they ramp up and down many times per day and week.

The prospects for reductions in CO2 emissions are thus nearly non-existent! Indeed, the backup coal or gas plants must operate so inefficiently in this up-and-down mode that they often consume more fuel and emit more (plant-fertilizing) carbon dioxide than if they were simply operating at full power all the time, and there were no wind or solar installations.

There is no indication that world consumption of coal will decline in the next decades. Large countries in Asia and Africa continue to build coal-fired power plants, and more than 1,500 coal-fired power plants are in planning or under construction.

This will provide affordable electricity 24/7/365 to 1.3 billion people who still do not have access to electricity today. Electricity is essential for the improved health, living standards and life spans that these people expect and are entitled to. To tell them fears of climate change are a more pressing matter is a violation of their most basic human rights.

Authored by: Oddvar Lundseng, Hans Johnsen and Stein Bergsmark

Oddvar Lundseng is a senior engineer with 43 years of experience in the energy business. Hans Konrad Johnsen, PhD is a former R&D manager with Det Norske Oljeselskap ASA. Stein Storlie Bergsmark has a degree in physics and is a former senior energy researcher and former manager of renewable energy education at the University of Agder.

Submission + - HMV teeters on the brink of collapse

Retron writes: The UK's largest High Street retailer of CDs, DVDs and Blu-rays has called in administrators for the second time in six years. The sector as a whole is struggling thanks to the rise of streaming services from the likes of Netflix, with customers increasingly shunning physical media in favour of lower-quality but more convenient subscription streaming.

HMV made losses last year of £7.5m on sales of £278m. Business rates alone (the tax for being on the High Street) accounted for £15m of costs. This is despite HMV selling roughly a quarter of all physical DVDs and Blu-rays in the UK last year, as well as around a third of all physical music recordings. Indeed, the Telegraph reports HMV overtook Amazon in terms of sales of physical DVDs this year in the UK, as Amazon has focused more on its streaming services.

However, demand is expected to decline by 17% next year, heaping further woes on a company struggling to adapt to modern trends. With a bleak future ahead for physical media sales, it remains to be seen whether HMV — which has been around since 1921 — can survive to see its one hundredth anniversary.

Sources: https://www.telegraph.co.uk/bu... (Paywalled), https://www.bbc.co.uk/news/bus...

Submission + - The Internet Is Mostly Fake Now (nymag.com) 3

AmiMoJo writes: In late November, the Justice Department unsealed indictments against eight people accused of fleecing advertisers of $36 million in two of the largest digital ad-fraud operations ever uncovered. Hucksters infected 1.7 million computers with malware that remotely directed traffic to “spoofed” websites.

How much of the internet is fake? Studies generally suggest that, year after year, less than 60 percent of web traffic is human; some years, according to some researchers, a healthy majority of it is bot. For a period of time in 2013, the Times reported this year, a full half of YouTube traffic was “bots masquerading as people,” a portion so high that employees feared an inflection point after which YouTube’s systems for detecting fraudulent traffic would begin to regard bot traffic as real and human traffic as fake. They called this hypothetical event “the Inversion.”

In the future, when I look back from the high-tech gamer jail in which President PewDiePie will have imprisoned me, I will remember 2018 as the year the internet passed the Inversion, not in some strict numerical sense, since bots already outnumber humans online more years than not, but in the perceptual sense. Everything that once seemed definitively and unquestionably real now seems slightly fake; everything that once seemed slightly fake now has the power and presence of the real. The “fakeness” of the post-Inversion internet is less a calculable falsehood and more a particular quality of experience — the uncanny sense that what you encounter online is not “real” but is also undeniably not “fake,” and indeed may be both at once, or in succession, as you turn it over in your head.

Submission + - Microsoft announces Project Mu, an open-source release of the UEFI core (betanews.com)

Mark Wilson writes: Microsoft has a new open source project — Project Mu. This is the company's open-source release of the Unified Extensible Firmware Interface (UEFI) core which is currently used by Surface devices and Hyper-V.

With the project, Microsoft hopes to make it easier to build scalable and serviceable firmware, and it embraces the idea of Firmware as a Service (FaaS). This allows for fast and efficient updating of firmware after release, with both security patches and performance-enhancing updates.

Submission + - Ask Slashdot: Why Don't HDR TVs Have An sRGB Or AdobeRGB Rating?

dryriver writes: As anyone who buys professional computer monitors knows, the color gamut of the display device you are looking at can be expressed quite usefully in terms of percentage sRGB coverage and percentage AdobeRGB coverage. The higher each percentage, the wider the range of colors the screen panel can reproduce. People who work with professional video and photographs typically aim for a display with 100 percent sRGB coverage and at least 70 to 80 percent AdobeRGB coverage. Laptop review site Notebookcheck ( https://www.notebookcheck.net/ ), for example, uses professional optical testing equipment to check whether the advertised sRGB and AdobeRGB percentages and the brightness in nits of a laptop display panel hold up in real life. This being the case, why do quote-unquote "High Dynamic Range" capable TVs — which seem to be mostly 10 bits per channel to begin with — not have an sRGB or AdobeRGB rating quoted anywhere in their technical specs? Why don't professional TV reviewers use readily available optical testing equipment to measure the real-world gamut and dynamic range of HDR and non-HDR TVs objectively, in hard numbers? Why do they simply say "the blacks on this TV were deep and pleasing, and the lighter tones were..." when this can be expressed more objectively in measured numbers or percentages? Do they think consumers are too unsophisticated to understand a simple number like "this OLED TV achieves a fairly average 66 percent AdobeRGB coverage"?
