Businesses

Amazon's Shifting Definition of What Is 'Essential' (themarkup.org) 136

Maddy Varner, reporting for The Markup: On March 17, Amazon informed U.S. sellers that it would no longer accept nonessential products at its warehouses. To the casual shopper, it might have sounded similar to the pledges Amazon has made in Italy, France, and India to stop taking orders from customers entirely for nonessential goods. But examining the fine print reveals that it was nothing of the sort. The original pledge -- which was announced as policy for March 17 to April 5 -- allowed Amazon to ship nonessential items that were already stocked in its warehouse, and sellers could also stock nonessential items in their own warehouses and ship directly to customers.

Amazon defined essential loosely, saying that "most of the products" it would accept were in the categories of "Baby Products," "Health & Household," "Beauty & Personal Care," "Grocery," "Industrial & Scientific," and "Pet Supplies." Since that mid-March announcement, Amazon has quietly relaxed even further its definition of what is essential, while also extending indefinitely the date by which "operations will be fully restored." On March 27, archived snapshots of the page indicated that Amazon would broaden the list of new shipments it would accept from sellers, on an unspecified "item-by-item" basis. As of April 6, in the United States, you could still order a bowling ball, a 10-pack of rubber chickens, and a prom dress and have them show up at your door within a week. All of the items are described on the website as either "Fulfilled by Amazon" or "Ships from and sold by Amazon.com," and none of the items are in the categories previously deemed essential.

Open Source

What Linus Torvalds Gets Wrong About ZFS (arstechnica.com) 279

Ars Technica recently ran a rebuttal by author, podcaster, coder, and "mercenary sysadmin" Jim Salter to some comments Linus Torvalds made last week about ZFS.

While it's reasonable for Torvalds to oppose integrating the CDDL-licensed ZFS into the kernel, Salter argues, he believes Torvalds' characterization of the filesystem was "inaccurate and damaging."
Torvalds dips into his own impressions of ZFS itself, both as a project and a filesystem. This is where things go badly off the rails, as Torvalds states, "Don't use ZFS. It's that simple. It was always more of a buzzword than anything else, I feel... [the] benchmarks I've seen do not make ZFS look all that great. And as far as I can tell, it has no real maintenance behind it any more..."

This jaw-dropping statement makes me wonder whether Torvalds has ever actually used or seriously investigated ZFS. Keep in mind, he's not merely making this statement about ZFS now, he's making it about ZFS for the last 15 years -- and is relegating everything from atomic snapshots to rapid replication to on-disk compression to per-block checksumming to automatic data repair and more to the status of "just buzzwords."

[The 2,300-word article goes on to describe ZFS features like per-block checksumming, automatic data repair, rapid replication, and atomic snapshots -- as well as "performance wins" including its Adaptive Replacement caching algorithm and its inline compression, which allows datasets to be live-compressed with a choice of algorithms.]
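Salter's feature list is easy to ground in code. Below is a toy Python sketch (not ZFS code, and far simpler than ZFS's Merkle-tree design) of the core idea behind per-block checksumming and automatic data repair: every block is stored with its checksum, reads verify the checksum, and a copy that fails verification is healed from a good mirror. All class and variable names here are invented for illustration.

```python
import hashlib

class MirroredStore:
    """Toy illustration of per-block checksumming with self-healing
    from a mirror copy. This is NOT ZFS code -- just the core idea:
    every block is stored with its checksum, reads verify the
    checksum, and a bad block is repaired from a good mirror."""

    def __init__(self):
        # Two mirrored "disks": block_id -> (data, checksum)
        self.disks = [{}, {}]

    @staticmethod
    def _checksum(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def write(self, block_id: int, data: bytes) -> None:
        record = (data, self._checksum(data))
        for disk in self.disks:
            disk[block_id] = record

    def read(self, block_id: int) -> bytes:
        for disk in self.disks:
            data, stored = disk[block_id]
            if self._checksum(data) == stored:
                # Heal any mirror whose copy fails verification.
                for other in self.disks:
                    if self._checksum(other[block_id][0]) != other[block_id][1]:
                        other[block_id] = (data, stored)
                return data
        raise IOError(f"block {block_id}: all copies corrupt")

store = MirroredStore()
store.write(0, b"important data")
# Simulate silent corruption ("bit rot") on disk 0: data changes,
# the stored checksum does not, so the mismatch is detectable.
store.disks[0][0] = (b"garbage!!", store.disks[0][0][1])
assert store.read(0) == b"important data"         # served from the good mirror
assert store.disks[0][0][0] == b"important data"  # and disk 0 was repaired
```

A plain filesystem without checksums would have returned the garbage bytes silently; detection plus a redundant copy is what turns corruption into a repairable event.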

The TL;DR here is that it's not really accurate to make blanket statements about ZFS performance, absent a very particular, well-understood workload to measure that performance on. But more importantly, quibbling about the fastest possible benchmark rather loses the main point of ZFS. This filesystem is meant to provide an eminently scalable filesystem that's extremely resistant to data loss; those are points Torvalds notably never so much as touches on....

Meanwhile, OpenZFS is actively consumed, developed, and in some cases commercially supported by organizations ranging from the Lawrence Livermore National Laboratory (where OpenZFS is the underpinning of some of the world's largest supercomputers) through Datto, Delphix, Joyent, iXsystems, Proxmox, Canonical, and more...

It's possible to not have a personal need for ZFS. But to write it off as "more of a buzzword than anything else" seems to expose massive ignorance on the subject... Torvalds' status within the Linux community grants his words an impact that can be entirely out of proportion to Torvalds' own knowledge of a given topic -- and this was clearly one of those topics.

AI

How Google Researchers Used Neural Networks To Make Weather Forecasts (arstechnica.com) 45

A research team at Google has developed a deep neural network that can make fast, detailed rainfall forecasts. Google says that its forecasts are more accurate than conventional weather forecasts, at least for time periods under six hours. Ars Technica reports: The researchers say their results are a dramatic improvement over previous techniques in two key ways. One is speed. Google says that leading weather forecasting models today take one to three hours to run, making them useless if you want a weather forecast an hour in the future. By contrast, Google says its system can produce results in less than 10 minutes -- including the time to collect data from sensors around the United States. A second advantage: higher spatial resolution. Google's system breaks the United States down into squares 1km on a side. Google notes that in conventional systems, by contrast, "computational demands limit the spatial resolution to about 5 kilometers."

Interestingly, Google's model is "physics-free": it isn't based on any a priori knowledge of atmospheric physics. The software doesn't try to simulate atmospheric variables like pressure, temperature, or humidity. Instead, it treats precipitation maps as images and tries to predict the next few images in the series based on previous snapshots. It does this using convolutional neural networks, the same technology that allows computers to correctly label images. Specifically, it uses a popular neural network architecture called a U-Net that was first developed for diagnosing medical images. The U-Net has several layers that downsample an image from its initial 256-by-256 shape, producing a lower-resolution image where each "pixel" represents a larger region of the original image. Google doesn't explain the exact parameters, but a typical U-Net might convert a 256-by-256 grid to a 128-by-128 grid, then convert that to a 64-by-64 grid, and finally a 32-by-32 grid. While the number of pixels is declining, the number of "channels" -- variables that capture data about each pixel -- is growing.

The second half of the U-Net then upsamples this compact representation -- converting back to 64, 128, and finally 256-pixel representations. At each step, the network copies over the data from the corresponding downsampling step. The practical effect is that the final layer of the network has both the original full-resolution image and summary data reflecting high-level features inferred by the neural network. To produce a weather forecast, the network takes an hour's worth of previous precipitation maps as inputs. Each map is a "channel" in the input image, just as a conventional image has red, blue, and green channels. The network then tries to output a series of precipitation maps reflecting the precipitation over the next hour. Like any neural network, this one is trained with past real-world examples. After repeating this process millions of times, the network gets pretty good at approximating future precipitation patterns for data it hasn't seen before.
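The downsample/upsample bookkeeping described above can be sketched in plain NumPy. The shapes and channel counts below are assumptions for illustration only (Google hasn't published the exact parameters), and a real U-Net interleaves learned convolutions at every scale, which are omitted here; the sketch just shows how skip connections leave the final layer with both full-resolution input and coarser summaries.

```python
import numpy as np

def downsample(x):
    """2x average pooling over (H, W, C): halves the spatial size."""
    h, w, c = x.shape
    return x.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

def upsample(x):
    """2x nearest-neighbour upsampling: doubles the spatial size."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

# An hour of past precipitation maps as input channels (assumed count,
# per the article's description of the inputs).
x = np.random.rand(256, 256, 6)

# Encoder: shrink spatially, saving a skip copy at each scale.
skips = []
for _ in range(3):                 # 256 -> 128 -> 64 -> 32
    skips.append(x)
    x = downsample(x)

# Decoder: grow back, concatenating each saved skip connection so the
# final layer sees both coarse summaries and full-resolution detail.
for skip in reversed(skips):       # 32 -> 64 -> 128 -> 256
    x = upsample(x)
    x = np.concatenate([x, skip], axis=-1)

print(x.shape)  # (256, 256, 24): full resolution, more channels
```

In the real network the channel growth comes from convolutions rather than raw concatenation, but the shape discipline -- spatial size down then up, channels accumulating -- is the same.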

Open Source

FSF-Approved Hyperbola GNU/Linux Forking OpenBSD, Citing 'User Freedom' Concerns (hyperbola.info) 135

Long-time Slashdot reader twocows writes: Hyperbola GNU/Linux, an FSF-approved distribution of GNU/Linux, has declared its intent to fork OpenBSD and become HyperbolaBSD...
The news came earlier this week in a roadmap announcement promising "a completely new OS derived from several BSD implementations" (though Hyperbola was originally based on Arch snapshots and Debian development).

"This was not an easy decision to make, but we wish to use our time and resources to create a viable alternative to the current operating system trends which are actively seeking to undermine user choice and freedom." In 2017 Hyperbola dropped its support for systemd -- but its concerns go far beyond that: This will not be a "distro", but a hard fork of the OpenBSD kernel and userspace including new code written under GPLv3 and LGPLv3 to replace GPL-incompatible parts and non-free ones.

Reasons for this include:

- Linux kernel forcing adoption of DRM, including HDCP.

- Linux kernel's proposed usage of Rust (which contains freedom flaws and a centralized code repository that is more prone to cyber attack, and generally requires internet access to use).

- Linux kernel being written without security in mind. (KSPP is basically a dead project and Grsec is no longer free software.)

- Many GNU userspace and core utils are forcing adoption of features without build-time options to disable them (e.g. PulseAudio, systemd, Rust, and Java as forced dependencies).

HyperbolaBSD is intended to be modular and minimalist so other projects will be able to re-use the code under free license.

Ubuntu

Canonical Releases Ubuntu Linux 19.10 Eoan Ermine with GNOME 3.34, Light Theme, and Raspberry Pi 4 Support (betanews.com) 50

Following the beta period, one of the best and most popular Linux-based desktop operating systems reaches a major milestone -- you can now download Ubuntu 19.10! Code-named "Eoan Ermine", the distro is better and faster than ever. From a report: By default, Ubuntu 19.10 comes with one of the greatest desktop environments -- GNOME 3.34. In addition, users will be delighted by an all-new optional Yaru light theme. There is even baked-in support for the Raspberry Pi 4. The kernel is based on Linux 5.3 and comes with support for AMD Navi GPUs. There are plenty of excellent pre-installed programs too, such as LibreOffice 6.3, Firefox 69, and Thunderbird 68. While many users will be quick to install Google Chrome, I would suggest giving Firefox a try -- it has improved immensely lately. "With GNOME 3.34, Ubuntu 19.10 is the fastest release yet with significant performance improvements delivering a more responsive and smooth experience, even on older hardware. App organization is easier with the ability to drag and drop icons into categorized folders, while users can select light or dark Yaru theme variants depending on their preference or for improved viewing accessibility. Native support for ZFS on the root partition is introduced as an experimental desktop installer option. Coupled with the new zsys package, benefits include automated snapshots of file system states, allowing users to boot to a previous update and easily roll forwards and backwards in case of failure," says Canonical.

Earth

Brazil Tries Deploying Its Military To Fight Fires in the Amazon (businesstimes.com.sg) 44

"As an ecological disaster in the Amazon escalated into a global political crisis, Brazil's president, Jair Bolsonaro, took the rare step on Friday of mobilising the armed forces to help contain blazes of a scale not seen in nearly a decade," reports the New York Times: The sudden reversal, after days of dismissing growing concern over hundreds of fires raging across the Amazon, came as international outrage grew over the rising deforestation in the world's largest tropical rainforest. European leaders threatened to cancel a major trade deal, protesters staged demonstrations outside Brazilian embassies and calls for a boycott of Brazilian products snowballed on social media. As a chorus of condemnation intensified, Brazil braced for the prospect of punitive measures that could severely damage an economy that is already sputtering...
"[E]xperts say the clearing of land during the months-long dry season to make way for crops or grazing has accelerated the deforestation," reports AFP. "More than half of the fires are in the Amazon, and some 1,663 new fires were ignited between Thursday and Friday, according to the National Institute for Space Research."

terrancem quotes the non-profit environmental news site Mongabay: High-resolution images from satellite company Planet are revealing glimpses of some of the fires currently devastating the Amazon rainforest... Beyond dramatic snapshots, those images also provide data that can be mined for critical insights into what's happening in the Amazon on a basin-wide scale, according to Greg Asner, the director of the Center for Global Discovery and Conservation Science at Arizona State University, whose team is using Planet's data to assess the impact of the fires on carbon emissions.

"If you took all of the carbon stored in every tropical forest on Earth and burned it up, you would emit about five times the carbon dioxide into the atmosphere that is already there. The Amazon rainforest represents about half of this forest carbon to give you an idea of how serious this current situation is and the kind of impact it will have on climate change."

Cloud

Hundreds of Exposed Amazon Cloud Backups Found Leaking Sensitive Data (techcrunch.com) 16

An anonymous reader quotes a report from TechCrunch: New research just presented at the Def Con security conference reveals how companies, startups and governments are inadvertently leaking their own files from the cloud. You may have heard of exposed S3 buckets -- those Amazon-hosted storage servers packed with customer data but often misconfigured and inadvertently set to "public" for anyone to access. But you may not have heard about exposed EBS snapshots, which pose as much, if not greater, risk. These elastic block storage (EBS) snapshots are the "keys to the kingdom," said Ben Morris, a senior security analyst at cybersecurity firm Bishop Fox, in a call with TechCrunch ahead of his Def Con talk. EBS snapshots store all the data for cloud applications. "They have the secret keys to your applications and they have database access to your customers' information," he said.

Morris built a tool using Amazon's own internal search feature to query and scrape publicly exposed EBS snapshots, then attach them, make copies, and list the contents of the volumes on his system. It took him two months to build up a database of exposed data and just a few hundred dollars spent on Amazon cloud resources. Once he validates each snapshot, he deletes the data. Morris found dozens of snapshots exposed publicly in one region alone, he said, including application keys, critical user or administrative credentials, source code and more. He found data belonging to several major companies, including healthcare providers and tech companies. He also found VPN configurations, which he said could allow him to tunnel into a corporate network. Morris said he did not use any credentials or sensitive data, as it would be unlawful.
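For readers who want to audit their own AWS account rather than scrape anyone else's, the defensive check is small. The sketch below is offline and illustrative: it filters CreateVolumePermission entries shaped like those EC2's DescribeSnapshotAttribute API returns; with boto3 one could instead query EC2 directly, e.g. `describe_snapshots(OwnerIds=["self"], RestorableByUserIds=["all"])`, to list your own publicly restorable snapshots. The sample data here is invented.

```python
# Offline sketch of the defensive half of this research: flag any of
# *your own* snapshots whose permissions make them publicly restorable.
# The dict mirrors the shape of EC2 DescribeSnapshotAttribute output.

def public_snapshot_ids(snapshot_permissions):
    """snapshot_permissions maps snapshot-id -> list of
    CreateVolumePermission entries; a {"Group": "all"} entry means
    anyone with an AWS account can copy and mount the snapshot."""
    return sorted(
        snap_id
        for snap_id, perms in snapshot_permissions.items()
        if any(p.get("Group") == "all" for p in perms)
    )

sample = {
    "snap-0aaa": [{"Group": "all"}],            # public -- flag it
    "snap-0bbb": [{"UserId": "123456789012"}],  # shared with one account
    "snap-0ccc": [],                            # private
}
print(public_snapshot_ids(sample))  # ['snap-0aaa']
```

The snapshot IDs and account number above are placeholders; in a real audit the permissions would come from the EC2 API, one DescribeSnapshotAttribute call per snapshot.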

Security

Google Admits Bug Could Let People Spy On Nest Cameras (dailydot.com) 30

Google on Thursday confirmed that a bug in its Nest security cameras could have allowed users to be spied on. The Daily Dot reports: The issue was first raised by a user on Facebook who recently sold his Nest Cam Indoor yet was still able to access its feed. The problem involves Wink, an app that lets people manage multiple smart devices regardless of their developer. The Facebook user noted that despite carrying out a factory reset on his Nest camera before selling it, his Wink account remained connected to the device, allowing him to view snapshots of the buyer's live feed.

Wirecutter tested the vulnerability on its own Nest Cam by linking it to a Wink account and then performing a factory reset. The publication also found it was receiving "a series of still images snapped every several seconds" via its Wink account. "In simpler terms: If you buy and set up a used Nest indoor camera that has been paired with a Wink hub, the previous owner may have unfettered access to images from that camera," Wirecutter says. "And we currently don't know of any cure for this problem."
Google responded to the report and said it has fixed the problem. "We were recently made aware of an issue affecting some Nest cameras connected to third-party partner services via Works with Nest," a spokesperson told Wirecutter. "We've since rolled out a fix for this issue that will update automatically, so if you own a Nest camera, there's no need to take any action."

The Internet

W3C and WHATWG Sign Agreement To Collaborate on a Single Version of HTML and DOM (w3.org) 104

W3C and the WHATWG signed an agreement today to collaborate on the development of a single version of the HTML and DOM specifications. From a blog post: The Memorandum of Understanding jointly published as the WHATWG/W3C Joint Working Mode gives the specifics of this collaboration. This is the culmination of a careful exploration of effective partnership mechanisms since December 2017, after the WHATWG adopted many shared features as their work mode and an IPR policy. The HTML Working Group, which we will soon recharter, will assist the W3C community in raising issues and proposing solutions for the HTML and DOM specifications, and bring WHATWG Review Drafts to Recommendation.

Motivated by the belief that having two distinct HTML and DOM specifications claiming to be normative is generally harmful for the community, and the mutual desire to bring the work back together, W3C and WHATWG agree to the following terms: W3C and WHATWG work together on HTML and DOM, in the WHATWG repositories, to produce a Living Standard and Recommendation/Review Draft-snapshots. WHATWG maintains the HTML and DOM Living Standards. W3C facilitates community work directly in the WHATWG repositories (bridging communities, developing use cases, filing issues, writing tests, mediating issue resolution). W3C stops independent publishing of a designated list of specifications related to HTML and DOM and instead will work to take WHATWG Review Drafts to W3C Recommendations.

Open Source

Databricks Open-Sources Delta Lake To Make Data Lakes More Reliable (techcrunch.com) 15

Databricks, the company founded by the original developers of the Apache Spark big data analytics engine, today announced that it has open-sourced Delta Lake, a storage layer that makes it easier to ensure data integrity as new data flows into an enterprise's data lake by bringing ACID transactions to these vast data repositories. TechCrunch reports: Delta Lake, which has long been a proprietary part of Databricks' offering, is already in production use by companies like Viacom, Edmunds, Riot Games and McGraw Hill. The tool provides the ability to enforce specific schemas (which can be changed as necessary), to create snapshots and to ingest streaming data or backfill the lake as a batch job. Delta Lake also uses the Spark engine to handle the metadata of the data lake (which by itself is often a big data problem). Over time, Databricks also plans to add an audit trail, among other things.

What's important to note here is that Delta Lake runs on top of existing data lakes and is compatible with the Apache Spark APIs. The company is still looking at how the project will be governed in the future. "We are still exploring different models of open source project governance, but the GitHub model is well understood and presents a good trade-off between the ability to accept contributions and governance overhead," said Ali Ghodsi, co-founder and CEO at Databricks. "One thing we know for sure is we want to foster a vibrant community, as we see this as a critical piece of technology for increasing data reliability on data lakes. This is why we chose to go with a permissive open source license model: Apache License v2, same license that Apache Spark uses." To invite this community, Databricks plans to take outside contributions, just like the Spark project.
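The core mechanism -- an ordered, append-only log of commits layered over a data lake -- is what buys atomic appends and "time travel" to earlier snapshots. The following toy Python sketch illustrates that idea only; it is not the real Delta Lake protocol or API, and every name in it is invented.

```python
import json, os, tempfile

class ToyDeltaLog:
    """Toy illustration of the Delta Lake idea: a table is a set of
    data files plus an ordered, append-only log of JSON commits.
    Readers reconstruct a consistent snapshot by replaying the log up
    to a version, which is what enables time travel and atomic
    appends. (A sketch of the concept only, not the Delta protocol.)"""

    def __init__(self, path):
        self.log_dir = os.path.join(path, "_toy_log")
        os.makedirs(self.log_dir, exist_ok=True)

    def commit(self, added_files):
        version = len(os.listdir(self.log_dir))
        entry = os.path.join(self.log_dir, f"{version:020d}.json")
        # Write-then-rename makes the commit atomic: readers see the
        # whole commit file or none of it, never a partial write.
        tmp = entry + ".tmp"
        with open(tmp, "w") as f:
            json.dump({"add": added_files}, f)
        os.rename(tmp, entry)
        return version

    def files_at(self, version=None):
        commits = sorted(os.listdir(self.log_dir))
        if version is not None:
            commits = commits[: version + 1]
        files = []
        for name in commits:
            with open(os.path.join(self.log_dir, name)) as f:
                files += json.load(f)["add"]
        return files

with tempfile.TemporaryDirectory() as d:
    log = ToyDeltaLog(d)
    v0 = log.commit(["part-000.parquet"])
    v1 = log.commit(["part-001.parquet"])
    current = log.files_at()
    at_v0 = log.files_at(version=v0)

print(current)  # ['part-000.parquet', 'part-001.parquet']
print(at_v0)    # ['part-000.parquet'] -- the table as of version 0
```

The real system stores richer actions (removes, schema changes, stats) and uses Parquet checkpoints to avoid replaying long logs, but the snapshot-by-log-replay structure is the same.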

Programming

The Most Loved and Most Disliked Programming Languages Revealed in Stack Overflow Survey (stackoverflow.com) 268

angel'o'sphere shares a report: The annual Stack Overflow survey is one of the most comprehensive snapshots of how programmers work, with this year's poll being taken by almost 90,000 developers across the globe. This year's survey details which languages developers enjoy using, which are associated with the best paid jobs, which are most commonly used, as well as developers' preferred frameworks, databases, and integrated development environments.

Python's versatility continues to fuel its rise through Stack Overflow's rankings for the "most popular" languages, which list the languages most widely used by developers. This year's survey finds Python to be the fastest-growing major programming language, with Python edging out enterprise workhorse Java to become the fourth most commonly used language. [...] More importantly for developers, this popularity overlaps with demand for the language, with Julia Silge, data scientist at Stack Overflow, saying that jobs data gathered by Stack Overflow also shows Python to be one of the most in-demand languages sought by employers.

[...] Rust may not have as many users as Python or JavaScript but it has earned a lot of affection from those who use it. For the fourth year running, the language tops Stack Overflow's list of "most-loved" languages, which means the proportion of Rust developers who want to continue working with it is larger than that of any other language.[...] Go stands out as a language that is well paid, while also being sought after and where developers report high levels of job satisfaction.
Full report here.

Space

Astronomers Discover 83 Supermassive Black Holes at the Edge of the Universe (cnet.com) 86

"A team of international astronomers have been hunting for ancient, supermassive black holes -- and they've hit the motherlode, discovering 83 previously unknown quasars," reports CNET: The Japanese team turned the ultra-powerful "Hyper Suprime-Cam", mounted to the Subaru Telescope in Hawaii, toward the cosmos' darkest corners, surveying the sky over a period of five years. By studying the snapshots, they've been able to pick potential quasar candidates out of the dark. Notably, their method of probing populations of supermassive black holes that are similar in size to the ones we see in today's universe has given us a window into their origins.

After identifying 83 potential candidates, the team used a suite of international telescopes to confirm their findings. The quasars they've plucked out are from the very early universe, about 13 billion light years away. Practically, that means the researchers are looking into the past, at objects formed less than a billion years after the Big Bang. "It is remarkable that such massive dense objects were able to form so soon after the Big Bang," said Michael Strauss, who co-authored the paper, in a press release. Scientists aren't sure how black holes formed in the early universe, so being able to detect them this far back in time provides new avenues of exploration.

Earth

Bioacoustic Devices Could Help Save Rainforests (qz.com) 30

"Researchers writing in Science argue that networked audio recording devices mounted in trees could be used to monitor wildlife populations and better evaluate whether conservation projects are working or not," writes Slashdot reader Damien1972. From the report: Compared to ground surveys and camera traps, the technology provides cheap, continuous, real-time biodiversity monitoring at the landscape scale. Thousands of hours of recordings can now be collected with long-lasting batteries and stored digitally. In sites with solar power and cellular signal, multi-year recordings have now been transmitted and saved to scientists' databases. That's possible thanks to the steep drop in the price of equipment that enables researchers to collect more than short, isolated sound snapshots.

The key, say co-authors Eddie Game of The Nature Conservancy and Zuzana Burivalova at Princeton University, is to build out enough data to understand how changing soundscapes reflect biodiversity on the ground. Game says he has found plenty of "high-conservation value" tropical forests that are devoid of key species. This is common in reserves set aside by owners of plantation crops such as palm oil. Algorithms can use these recordings to learn the sound of healthy forests, and infer the composition of their species. In Papua New Guinea, for example, the researchers found soundscapes in fragmented forests were far quieter during the dawn and evening choruses, the short cacophonous periods during the changing of day and night. Once enough data has been collected [...] the technology can be applied to the zero deforestation commitments set by corporations.
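As an illustrative sketch only (not the researchers' actual method), the simplest soundscape comparison is loudness over time: reduce each recording to a per-second RMS envelope and compare, say, dawn-chorus energy between sites. The synthetic signals below stand in for real forest recordings, and the sample rate is a toy value.

```python
import numpy as np

# Two synthetic one-minute "recordings": a loud, busy dawn chorus and a
# sparse, quiet one, standing in for intact vs. fragmented forest.
rng = np.random.default_rng(1)
rate = 1000                                    # samples per second (toy rate)

healthy = 0.8 * rng.normal(size=60 * rate)     # busy dawn chorus
fragmented = 0.2 * rng.normal(size=60 * rate)  # sparse, quieter chorus

def rms_per_second(signal, rate):
    """Collapse a waveform to one RMS loudness value per second."""
    frames = signal.reshape(-1, rate)
    return np.sqrt((frames ** 2).mean(axis=1))

louder = rms_per_second(healthy, rate).mean() > rms_per_second(fragmented, rate).mean()
print(louder)  # True: the intact forest's chorus carries more energy
```

Real soundscape analyses go well beyond loudness (acoustic indices, species classifiers over spectrograms), but this per-window reduction is the first step most of them share.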

Space

NASA Releases First Clear Images of Distant Kuiper Belt Object (engadget.com) 135

NASA's New Horizons team has released the promised first images from its history-making flyby of (486958) 2014 MU69. "The snapshots, captured from as close as 17,000 miles away, show that the 21-mile-long Kuiper Belt object is a 'contact binary' where two spheres slowly collided and fused with each other," reports Engadget. "The two may have linked up '99 percent of the way' to the start of the Solar System, Johns Hopkins University APL said." From the report: Capturing a true representation of 2014 MU69 is difficult, at least with the initial batch of pictures. There's a visible light camera onboard the New Horizons probe (shown on the left), but the Long-Range Reconnaissance Imager (center) is much sharper. To create an accurate image (on the right), scientists had to produce a composite. Higher-resolution pictures and additional scientific data will keep flowing over the "next weeks and months," the New Horizons team said.

Social Networks

The Latest Crop of Instagram Influencers? Medical Students. (slate.com) 48

An anonymous reader shares a report: Celebrity physicians often catapult to fame via their mastery of traditional media, like television or radio or books or magazines, and we're used to seeing medical advice and expertise there. What you may have yet to encounter, or haven't fully noticed yet, is the growing group of current medical students who are perhaps on track to achieve even greater fame, through their prodigious and aggressive use of social media, particularly Instagram. Even before receiving their medical degrees, these future doctors are hard at work growing their audiences (many have well into the thousands of followers), arguably in ways even more savvy than the physicians on social media today.

I first learned of the medical student Instagram influencer community a few months ago, when a friend shared links to a few of these accounts with me, asking if this is what medical school was really like. Curated and meticulously organized, these accounts posted long reflections after anatomy lab sessions, video stories of students huddled around a defibrillator during a CPR training session, pictures of neat study spaces featuring board-prep textbooks next to cups of artisan coffee, and 5 a.m. selfies taken in the surgery locker-room before assisting with a C-section. Initially, I cringed. Sure, they looked vaguely familiar -- they were (literally) rose-tinted, glamorized snapshots of relatable moments dispersed over the past few years of my life. But interspersed, and even integrated, into those relatable moments were advertisements and discount codes for study materials and scrub clothing brands. Something about that, in particular, felt instinctively antithetical to my (perhaps wide-eyed) interpretation of medicine's ideals, of service to others over self-promotion.

Sufficiently intrigued, I fell into a digital rabbit hole that surfaced dozens of fellow med students moonlighting as social media influencers, and the partnerships grew ever more questionable. Some accounts featured sponsored posts advertising watches and clothes from Lululemon; another linked back to a personal blog that included a page that allowed followers to "shop my Instagram." A popular fitness-oriented account, hosted by an aspiring M.D., promoted protein powder and pre-workout supplements. A future dermatologist showcased skin care products. Another future M.D.'s account highlights the mattresses, custom maps, furniture rental services, and food brand that, according to the posts, help her seamlessly live the life of a third-year med student.

AI

Google Researchers Created An Amazing Scene-Rendering AI (arstechnica.com) 50

Researchers from Google's DeepMind subsidiary have developed deep neural networks that "have a remarkable capacity to understand a scene, represent it in a compact format, and then 'imagine' what the same scene would look like from a perspective the network hasn't seen before," writes Timothy B. Lee via Ars Technica. From the report: A DeepMind team led by Ali Eslami and Danilo Rezende has developed software based on deep neural networks with these same capabilities -- at least for simplified geometric scenes. Given a handful of "snapshots" of a virtual scene, the software -- known as a generative query network (GQN) -- uses a neural network to build a compact mathematical representation of that scene. It then uses that representation to render images of the room from new perspectives -- perspectives the network hasn't seen before.

Under the hood, the GQN is really two different deep neural networks connected together. On the left, the representation network takes in a collection of images representing a scene (together with data about the camera location for each image) and condenses these images down to a compact mathematical representation (essentially a vector of numbers) of the scene as a whole. Then it's the job of the generation network to reverse this process: starting with the vector representing the scene, accepting a camera location as input, and generating an image representing how the scene would look from that angle. The team used the standard machine learning technique of stochastic gradient descent to iteratively improve the two networks. The software feeds some training images into the network, generates an output image, and then observes how much this image diverges from the expected result. [...] If the output doesn't match the desired image, then the software back-propagates the errors, updating the numerical weights on the thousands of neurons to improve the network's performance.
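The feed-forward/compare/back-propagate loop described above can be shown at toy scale. In the sketch below the "network" is a single linear layer learning an invented scene-to-image map -- nothing like the GQN's deep convolutional networks -- but the stochastic-gradient-descent mechanics are the same: forward pass, measure the error against the target, nudge the weights downhill, repeat.

```python
import numpy as np

# Toy SGD loop: a single linear layer learns an (invented) mapping from
# a "viewpoint" vector to an "image" vector. Real GQN training updates
# thousands of convolutional weights; the update rule is analogous.
rng = np.random.default_rng(0)

true_w = np.array([[2.0, -1.0], [0.5, 3.0]])  # unknown scene->image map
W = np.zeros((2, 2))                           # network weights to learn
lr = 0.1                                       # learning rate

for step in range(1000):
    x = rng.normal(size=2)        # a random "viewpoint" input
    target = true_w @ x           # the image we should have produced
    output = W @ x                # forward pass through the network
    error = output - target       # how far off was the output?
    W -= lr * np.outer(error, x)  # back-propagate: squared-error gradient

print(np.allclose(W, true_w, atol=1e-2))  # True: the weights converged
```

After enough random examples, the weights recover the hidden map -- the same reason the GQN, after millions of scenes, "gets pretty good at approximating" views it has never seen.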

AI

DeepMind Self-training Computer Creates 3D Model From 2D Snapshots (ft.com) 27

DeepMind, Google's artificial intelligence subsidiary in London, has developed a self-training vision computer that generates "a full 3D model of a scene from just a handful of 2D snapshots," according to its chief executive. From a report: The system, called the Generative Query Network, can then imagine and render the scene from any angle [Editor's note: the link may be paywalled; alternative source], said Demis Hassabis. GQN is a general-purpose system with a vast range of potential applications, from robotic vision to virtual reality simulation. Details were published on Thursday in the journal Science. "Remarkably, [the DeepMind scientists] developed a system that relies only on inputs from its own image sensors -- and that learns autonomously and without human supervision," said Matthias Zwicker, a computer scientist at the University of Maryland who was not involved in the research. This is the latest in a series of high-profile DeepMind projects, which are demonstrating a previously unanticipated ability by AI systems to learn by themselves, once their human programmers have set the basic parameters.

Communications

Germany Unveils World's Most Powerful X-Ray Laser (theguardian.com) 49

An anonymous reader quotes a report from The Guardian: The world's most powerful X-ray laser has begun operating at a facility where scientists will attempt to recreate the conditions deep inside the sun and produce film-like sequences of viruses and cells. The machine, called the European X-ray Free Electron Laser (XFEL), acts as a high-speed camera that can capture images of individual atoms in a few millionths of a billionth of a second. Unlike a conventional camera, though, everything imaged by the X-ray laser is obliterated -- its beam is 100 times more intense than if all the sunlight hitting the Earth's surface were focused onto a single thumbnail. The facility near Hamburg, housed in a series of tunnels up to 38 meters underground, will allow scientists to explore the architecture of viruses and cells, create jittery films of chemical reactions as they unfold and replicate conditions deep within stars and planets.

XFEL is the world's third major X-ray laser facility -- projects in Japan and the U.S. have already spawned major advances in structural biology and materials science. The European beam is more powerful, but most significantly has a far higher pulse rate than either of its predecessors. "They can send 100 pulses out per second, we can send 27,000," said Robert Feidenhan'l, chairman of the European XFEL management board. This matters because to study chemical reactions or biological processes, the X-ray strobe is used to capture flickering snapshots of the same system at different time-points that can be stitched together into a film sequence.
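The pulse-rate gap can be made concrete with a quick back-of-envelope calculation using only the figures quoted above (the one-minute experiment duration is an arbitrary illustration):

```python
# Pulse rates from the quote: ~100/s at the predecessor facilities,
# 27,000/s at the European XFEL.
predecessor_rate_hz = 100
xfel_rate_hz = 27_000

speedup = xfel_rate_hz / predecessor_rate_hz
print(f"European XFEL delivers {speedup:.0f}x more pulses per second")

# Snapshots collected over a one-minute run at each facility --
# more snapshots per unit time means smoother "film sequences"
# of a reaction when the frames are stitched together.
for name, rate in [("predecessor", predecessor_rate_hz),
                   ("European XFEL", xfel_rate_hz)]:
    print(f"{name}: {rate * 60:,} snapshots per minute")
```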

Operating Systems

OpenBSD Will Get Unique Kernels On Each Reboot (bleepingcomputer.com) 162

An anonymous reader quotes a report from Bleeping Computer: A new feature added in test snapshots for the upcoming OpenBSD 6.2 release will create a unique kernel every time an OpenBSD user reboots or upgrades his computer. This feature is named KARL -- Kernel Address Randomized Link -- and works by relinking internal kernel files in a random order so that it generates a unique kernel binary blob every time. Currently, for stable releases, the OpenBSD kernel uses a predefined order to link and load internal files inside the kernel binary, resulting in the same kernel for all users.

Developed by Theo de Raadt, KARL will work by generating a new kernel binary at install, upgrade, and boot time. If the user boots up, upgrades, or reboots his machine, the most recently generated kernel will replace the existing kernel binary, and the OS will generate a new kernel binary that will be used on the next boot/upgrade/reboot, constantly rotating kernels on reboots or upgrades.

KARL should not be confused with ASLR -- Address Space Layout Randomization -- a technique that randomizes the memory address where application code is executed, so exploits can't target a specific area of memory where an application or the kernel is known to run. A similar technique exists for randomizing the memory location where the kernel loads -- called KASLR. The difference between the two is that KARL loads a different kernel binary in the same place, while KASLR loads the same binary in random locations. Currently Linux and Windows only support KASLR.
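The core idea -- a fresh random link order on every boot, so internal kernel functions land at different offsets in each binary -- can be sketched in a few lines. The object-file names below are made up, and real KARL operates at the linker level on the kernel's actual build artifacts, not in Python:

```python
import random

# Hypothetical kernel object files; a real kernel links hundreds.
objects = ["init_main.o", "kern_exec.o", "kern_fork.o",
           "vfs_syscalls.o", "uipc_socket.o"]

def randomized_link_order(objs, seed=None):
    """Return a freshly permuted link order, as KARL computes
    before invoking the linker at install/upgrade/boot time."""
    rng = random.Random(seed)
    shuffled = list(objs)
    rng.shuffle(shuffled)
    return shuffled

# Each "boot" yields a different order, hence a unique kernel
# binary -- same code, but internal symbols at different offsets,
# defeating exploits that rely on a known kernel layout.
print(randomized_link_order(objects))
```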

Software

Computer Scientists Have Created the Most Accurate Digital Model of a Human Face (sciencemag.org) 28

sciencehabit quotes a report from Science Magazine: If you've used the smartphone application Snapchat, you may have turned a photo of yourself into a disco bear or melded your face with someone else's. Now, a group of researchers has created the most advanced technique yet for building 3D facial models on the computer. The system could improve personalized avatars in video games, facial recognition for security, and -- of course -- Snapchat filters. The team also trained its program to turn casual 2D snapshots into accurate 3D models. The method could be used to view what a criminal suspect caught on camera would look like from a different angle, or 20 years older. One could also flesh out and animate historical figures from portraits. The "large scale facial model," or LSFM, may soon have medical applications, too. If someone has lost a nose, the technology could help plastic surgeons determine how a new one should look, given the rest of the face.
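The standard statistical recipe behind such models is to register many 3D face scans into correspondence and run PCA, so that any face becomes the mean shape plus a weighted sum of deformation modes. A toy NumPy sketch of that idea, with invented sizes and random stand-in data in place of real scans:

```python
import numpy as np

# Toy sketch of a statistical "morphable" face model: PCA over many
# registered 3D meshes yields a mean face plus deformation modes.
# Sizes and data are stand-ins; real models use thousands of scans.
rng = np.random.default_rng(0)
n_scans, n_vertices = 200, 50
scans = rng.normal(size=(n_scans, n_vertices * 3))  # flattened (x,y,z) meshes

mean_face = scans.mean(axis=0)
centered = scans - mean_face

# Right singular vectors = principal modes of facial variation
_, _, components = np.linalg.svd(centered, full_matrices=False)

def synthesize(weights, k=10):
    """Generate a new face mesh from k mode weights."""
    return mean_face + weights @ components[:k]

# A novel face sampled from the model's space of variation
new_face = synthesize(rng.normal(size=10))
print(new_face.shape)  # one flattened 3D mesh
```

Fitting such a model to a single 2D photo (estimating the weights that best explain the image) is what lets a casual snapshot be turned into a full 3D shape that can be re-rendered from another angle or aged.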
