Emulation (Games)

Leaked Game Boy Emulators For Switch Were Made By Nintendo, Experts Suggest (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: In most cases, the release of yet another classic console emulator for the Switch wouldn't be all that noteworthy. But experts tell Ars that a pair of Game Boy and Game Boy Advance emulators for the Switch that leaked online Monday show signs of being official products of Nintendo's European Research & Development division (NERD). That has some industry watchers hopeful that Nintendo may be planning official support for some emulated classic portable games through the Nintendo Switch Online subscription service in the future. The two leaked emulators -- codenamed Hiroko for Game Boy and Sloop for Game Boy Advance -- first hit the Internet as fully compiled NSP files and encrypted NCA files linked from a 4chan thread posted to the Pokemon board Monday afternoon. Later in that thread, the original poster suggested that these emulators "are official in-house development versions of Game Boy Color/Advance emulators for Nintendo Switch Online, which have not been announced or released."

In short order, dataminers examining the package found a .git folder in the ROM. That folder includes commit logs that reference supposed development work circa August 2020 from a NERD employee and, strangely enough, a developer at Panasonic Vietnam. NERD's history includes work on the software for the NES Classic and SNES Classic, as well as the GameCube emulation technology in 2020's Super Mario 3D All-Stars, so the division's supposed involvement wouldn't be out of the ordinary. Footage from the leaked Game Boy Advance emulator also includes "(c) Nintendo" and "(c) 2019 -- 2020 Nintendo" notices at various points. While suggestive, none of this is exactly hard evidence of Nintendo's involvement in making these emulators. Some skepticism might be warranted, too, because there is some historical precedent for an emulator developer trying to get more attention by pretending their homebrew product is a "leaked" official Nintendo release.

Some observers also pointed to other reasons to doubt that these leaks were an "official" Nintendo work product. ModernVintageGamer and others noted that the leaked GBA emulator includes an "export state to Flashcart" option designed "to confirm original behavior" on "original hardware," according to the GUI. That option is illustrated with a picture of an EZFlash third-party flash cartridge in the emulator interface, an odd choice given Nintendo's previous litigious attacks on such flashcart makers. A "savedata memory" option in the emulator also references the ability to "inter-operate with flashcarts, other emulators, [and] fan websites..." That's a list that would serve as a decent Johnny Carson "Carnac the Magnificent" setup for "things Nintendo wouldn't want to reference in an official product."
A prominent video game historian that Ars consulted with said they were "99.9% sure [the emulators are] real" and that "personally I'm absolutely convinced of its legitimacy."
Facebook

Gizmodo Publishes Massive New Leaked Trove of Internal Facebook Papers (gizmodo.com)

"Big scoop from Gizmodo today: 'Gizmodo has reviewed, redacted, and published more than two dozen leaked Facebook documents, the first of hundreds to come,'" writes Slashdot reader DevNull127. From the report: We have undertaken this project to help better inform the public about Facebook's role in a wide range of controversies, as well as to provide researchers with access to materials that we hope will advance general knowledge of social media's role in modern history's most troubling crises [...]. The documents will reveal to you, for instance, an internal analysis of the many groups that Facebook knew to be prolific sources of both voter suppression efforts and hate speech targeting its most marginalized users. The records show the company was privately aware of the growing fears among users of being exposed to election-related falsehoods. The papers show that Meta's own data pinpointed the account of then-President Trump as being principally responsible for a surge in reports concerning violations of its violence and incitement rules.

Today's release is the first of a series of posts from Gizmodo to be published in tandem with legal and academic partners. Our goal is to minimize any costs to individuals' privacy and any furtherance of other harms while ensuring the responsible disclosure of the greatest amount of information in the public interest possible. Future releases will be added to this page, a directory that will eventually offer our readers links to all of the leaked internal documents we have published.

Sony

Epic Games Lands $2 Billion As Sony Bets On Its Metaverse (crunchbase.com)

No one wants to be left behind in the metaverse. From a report: Two more big companies tossed their hats -- or at least dollars -- into the metaverse ring on Monday, as North Carolina-based Epic Games announced a $2 billion investment that values it at $31.5 billion. Epic said the new funding will "advance the company's vision to build the metaverse and support its continued growth." Under the terms of the agreement, Sony and KIRKBI -- the family-owned holding and investment company behind The LEGO Group -- will each invest $1 billion. Just last week, Epic and Lego announced a partnership to develop a "family-friendly" metaverse for kids.
AI

AI-Powered Artificial Fingertip Gives Robots a Nearly Humanlike Touch (science.org)

Slashdot reader sciencehabit shares this article from Science magazine: Robots can be programmed to lift a car and even help perform some surgeries, but when it comes to picking up an object they have not touched before, such as an egg, they often fail miserably. Now, engineers have come up with an artificial fingertip that overcomes that limitation. The advance enables machines to sense surface textures a lot like a human fingertip does....

[W]hen researchers at the University of Bristol began designing an artificial fingertip in 2009, they used human skin as a guide. Their first fingertip — assembled by hand — was about the size of a soda can. By 2018, they had switched to 3D printing. That made it possible to make the tip and all its components about the size of an adult's big toe and more easily create a series of layers approximating the multilayered structure of human skin. More recently, the scientists have incorporated neural networks into the fingertip, which they call TacTip. The neural networks help a robot quickly process what it's sensing and react accordingly — seemingly just like a real finger.

In our fingertips, a layer of nerve endings deforms when skin contacts an object and tells the brain what's happening. These nerves send "fast" signals to help us avoid dropping something and "slow" signals to convey an object's shape. TacTip's equivalent signals come from an array of pinlike projections underneath a rubbery surface layer that move when the surface is touched. The array's pins are like a hairbrush's bristles: stiff but bendable. Beneath that array is, among other things, a camera that detects when and how the pins move. The amount of bending of the pins provides the slow signal and the speed of bending provides the fast signal. The neural network translates those signals into the fingertip's actions, making it grip more tightly for example, or adjust the angle of the fingertip....
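The slow/fast split described above maps naturally onto a few lines of code. This is only a toy Python sketch, not the Bristol team's actual pipeline; the pin values, time step, and slip threshold are all invented for illustration.

```python
# Toy sketch of TacTip-style signal extraction (all numbers invented):
# the camera reports each pin's bend over time; bend magnitude plays the
# role of the "slow" shape signal, bend rate the "fast" slip signal.

def tactip_signals(bend_samples, dt=0.01):
    """bend_samples: pin-bend magnitudes (mm), one per camera frame."""
    slow = bend_samples[-1]  # how far the pin is currently bent
    # finite-difference rate between the last two frames
    fast = (bend_samples[-1] - bend_samples[-2]) / dt
    return slow, fast

def react(slow, fast, slip_threshold=50.0):
    # a rapid change in bend suggests the object is slipping: grip harder
    if abs(fast) > slip_threshold:
        return "tighten grip"
    return "hold"

frames = [0.0, 0.1, 0.3, 1.2]  # sudden jump in the last frame
slow, fast = tactip_signals(frames)
print(react(slow, fast))  # rate = 0.9 mm / 0.01 s = 90 mm/s -> "tighten grip"
```

In the real device, a trained neural network replaces the hand-written threshold, mapping raw pin motion to grip adjustments.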

In a second project, the Bristol team, led by Nathan Lepora, added more pins and a microphone to TacTip. The microphone mimics another set of nerve endings deep within our skin that sense vibrations felt as we run our fingers across a surface. These nerve endings enhance our ability to feel how rough a surface is. The microphone did likewise when the researchers tested the enhanced fingertip's ability to differentiate among 13 fabrics.

The article points out that in testing, the artificial fingertip's output "closely matched the neuronal signaling patterns of human fingertips undergoing the same tests."
AMD

AMD To Acquire Pensando in a $1.9 Billion Bid for Networking Tech (protocol.com)

AMD said early Monday that it plans to acquire networking chip maker Pensando for $1.9 billion in cash, in a bid to arm itself with tech that competes directly with Nvidia and Intel's data-center chip packages. From a report: Pensando was founded by several former Cisco engineers, and makes edge computing technology that competes with AWS Nitro, Intel's DPU launched last year, and Nvidia's data processing units called BlueField. In a release distributed in advance of the announcement, AMD said that buying the closely held Pensando will give it a networking platform that will bolster its existing server chip lineup. Pensando's chips are an increasingly important part of data center design, as it becomes impossible to simply throw larger numbers of processors at demanding computing tasks. As regular chips scale up, the networking connections become a bottleneck, and the DPU's goal (Intel calls it an IPU) is to free up the central processor to perform other functions.
Science

First Complete Gap-Free Human Genome Sequence Published (theguardian.com)

An anonymous reader quotes a report from the Guardian: More than two decades after the draft human genome was celebrated as a scientific milestone, scientists have finally finished the job. The first complete, gap-free sequence of a human genome has been published in an advance expected to pave the way for new insights into health and what makes our species unique. Until now, about 8% of the human genome was missing, including large stretches of highly repetitive sequences, sometimes described as "junk DNA." In reality, though, these repeated sections were omitted due to technical difficulties in sequencing them, rather than pure lack of interest.

Sequencing a genome is something like slicing up a book into snippets of text then trying to reconstruct the book by piecing them together again. Stretches of text that contain a lot of common or repeated words and phrases would be harder to put in their correct place than more unique pieces of text. New "long-read" sequencing techniques that decode big chunks of DNA at once -- enough to capture many repeats -- helped overcome this hurdle. Scientists were able to simplify the puzzle further by using an unusual cell type that only contains DNA inherited from the father (most cells in the body contain two genomes -- one from each parent). Together these two advances allowed them to decode the more than 3 billion letters that comprise the human genome.
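The book analogy can be made concrete with a toy example (the "genome" string and reads below are invented): a short fragment drawn from a repetitive region matches many positions, while a longer read that spans past the repeat places itself uniquely.

```python
# Toy illustration of why repeats defeat short reads (sequence invented).
genome = "ATCGATCGATCGTTAGGC"  # contains the repeat "ATCG" three times

def placements(read, reference):
    """All positions where a read matches the reference exactly."""
    return [i for i in range(len(reference) - len(read) + 1)
            if reference[i:i + len(read)] == read]

short_read = "ATCG"          # fits in three places: ambiguous
long_read = "ATCGATCGTTAG"   # extends past the repeats: unique
print(placements(short_read, genome))  # [0, 4, 8]
print(placements(long_read, genome))   # [4]
```

This is the intuition behind the "long-read" techniques the article credits: a read long enough to contain the repeat plus its unique flanking sequence has only one consistent placement.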
The science behind the sequencing effort and some initial analysis of the new genome regions are outlined in six papers published in the journal Science.
The Military

The Drone Operators Who Halted Russian Convoy Headed For Kyiv (theguardian.com)

"Ukrainian special forces teamed up with IT professionals on ATV four-wheelers to target the infamous Kiev convoy," writes longtime Slashdot reader darkseid. "Every Help Desk Geek's Walter Mitty fantasy!" The Guardian reports: One week into its invasion of Ukraine, Russia massed a 40-mile mechanized column in order to mount an overwhelming attack on Kyiv from the north. But the convoy of armored vehicles and supply trucks ground to a halt within days, and the offensive failed, in significant part because of a series of night ambushes carried out by a team of 30 Ukrainian special forces and drone operators on quad bikes, according to a Ukrainian commander.

The drone operators were drawn from an air reconnaissance unit, Aerorozvidka, which began eight years ago as a group of volunteer IT specialists and hobbyists designing their own machines and has evolved into an essential element in Ukraine's successful David-and-Goliath resistance. [...] The unit's commander, Lt Col Yaroslav Honchar, gave an account of the ambush near the town of Ivankiv that helped stop the vast, lumbering Russian offensive in its tracks. He said the Ukrainian fighters on quad bikes were able to approach the advancing Russian column at night by riding through the forest on either side of the road leading south towards Kyiv from the direction of Chernobyl.

The Ukrainian soldiers were equipped with night vision goggles, sniper rifles, remotely detonated mines, drones equipped with thermal imaging cameras and others capable of dropping small 1.5kg bombs. "This one little unit in the night destroyed two or three vehicles at the head of this convoy, and after that it was stuck. They stayed there two more nights, and [destroyed] many vehicles," Honchar said. The Russians broke the column into smaller units to try to make headway towards the Ukrainian capital, but the same assault team was able to mount an attack on its supply depot, he claimed, crippling the Russians' capacity to advance. "The first echelon of the Russian force was stuck without heat, without oil, without bombs and without gas. And it all happened because of the work of 30 people," Honchar said.
"The Aerorozvidka unit also claims to have helped defeat a Russian airborne attack on Hostomel airport, just north-west of Kyiv, in the first day of the war," adds the Guardian. Similar to the convoy ambush, they "[used] drones to locate, target and shell about 200 Russian paratroopers concealed at one end of the airfield."
Businesses

Workers Are Trading Staggering Amounts of Data for 'Payday Loans' (wired.com)

Companies are offering interest-free advances to people with poor credit in exchange for detailed personal data. Wired: Tulloch [Editor's note: the anecdote character in the story] is one of a growing number of US workers turning their personal data over to private companies in exchange for paycheck advances, fueling an industry potentially worth up to $12 billion, by some estimates. In 2020, $9.5 billion in wages were accessed early, according to the research firm Aite-Novarica Group, up from $6.3 billion in 2019. These early payouts can be habit-forming; a 2021 report from the Financial Health Network found that more than 70 percent of pay advance users took out consecutive advances.
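For scale, the wage-access figures quoted above work out to roughly a 50% year-over-year jump, a quick sanity check:

```python
# Year-over-year growth in early-accessed wages, from the figures above
# ($6.3B in 2019, $9.5B in 2020, per Aite-Novarica Group).
wages_2019 = 6.3e9
wages_2020 = 9.5e9
growth = (wages_2020 - wages_2019) / wages_2019
print(f"{growth:.0%}")  # about 51%
```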

What Tulloch didn't know was that when he signed up for the app, a company called Argyle was retrieving the data that would be used to decide how much money to give him. It builds the technology that allows companies like B9 to extract a wealth of data from payroll accounts -- up to 140 data points. These can include shifts worked, time off, earnings and promotions history, health care and retirement contributions, even reputational markers like on-time rate or a gig worker's star rating and deactivation history. For every worker that uses its product, Argyle charges customers like B9 a fee, plus an additional monthly charge for continuous monitoring. This makes for a valuable data trove; it's further upstream than banking data, providing a fuller picture of a worker's earnings, deductions, and behavior. Some estimate that payroll data could be worth $10 billion. Argyle pegs it at 10 times higher.

Argyle is part of an emerging set of payroll data companies founded over the last four years to cash in on workers' personal information. They build secure connections between payroll providers like Paychex and businesses that want to access that data, like B9. Argyle acts like a courier, shuttling data from one account to another, the same way banking data is transmitted to apps like Venmo. Its competitors include Atomic, Pinwheel, Truv, and Plaid (which builds those bank integrations but recently began releasing payroll products). The data that workers provide can be used to underwrite financial products like loans, mortgages, insurance policies, and buy-now-pay-later apps; simplify direct deposit switching; or verify income and employment for apartment and job applications.

News

Dirty Bomb Ingredients Go Missing From Chornobyl Monitoring Lab (science.org)

Insecure radioactive materials are the latest worry as Russia continues occupation of infamous nuclear reservation. schwit1 shares a report: When the lights went out at Chornobyl Nuclear Power Plant on 9 March, the Russian soldiers holding Ukrainian workers at gunpoint became the least of Anatolii Nosovskyi's worries. More urgent was the possibility of a radiation accident at the decommissioned plant. If the plant's emergency generators ran out of fuel, the ventilators that keep explosive hydrogen gas from building up inside a spent nuclear fuel repository would quit working, says Nosovskyi, director of the Institute for Safety Problems of Nuclear Power Plants (ISPNPP) in Kyiv. So would sensors and automated systems to suppress radioactive dust inside a concrete "sarcophagus" that holds the unsettled remains of Chornobyl's Unit Four reactor, which melted down in the infamous 1986 accident.

Although power was restored to Chornobyl on 14 March, Nosovskyi's worries have multiplied. In the chaos of the Russian advance, he told Science, looters raided a radiation monitoring lab in Chornobyl village -- apparently making off with radioactive isotopes used to calibrate instruments and pieces of radioactive waste that could be mixed with conventional explosives to form a "dirty bomb" that would spread contamination over a wide area. ISPNPP has a separate lab in Chornobyl with even more dangerous materials: "powerful sources of gamma and neutron radiation" used to test devices, Nosovskyi says, as well as intensely radioactive samples of material left over from the Unit Four meltdown. Nosovskyi has lost contact with the lab, he says, so "the fate of these sources is unknown to us."

Movies

Are Movies Dying? (nytimes.com)

As viewership drops for Hollywood's annual Academy Awards ceremony, "Everyone has a theory about the decline..." argues an opinion piece in the New York Times.

"My favored theory is that the Oscars are declining because the movies they were made to showcase have been slowly disappearing." When the nominees were announced in February, nine of the 10 had made less than $40 million in domestic box office. The only exception, "Dune," barely exceeded $100 million domestically, making it the 13th-highest-grossing movie of 2021. All told, the 10 nominees together have earned barely one-fourth as much at the domestic box office as "Spider-Man: No Way Home." Even when Hollywood tries to conjure the old magic, in other words, the public isn't there for it anymore.... Sure, non-superhero-movie box office totals will bounce back in 2022, and next year's best picture nominees will probably earn a little more in theaters. Within the larger arc of Hollywood history, though, this is the time to call it: We aren't just watching the decline of the Oscars; we're watching the End of the Movies....

[W]hat looks finished is The Movies — big-screen entertainment as the central American popular art form, the key engine of American celebrity, the main aspirational space of American actors and storytellers, a pop-culture church with its own icons and scriptures and rites of adult initiation.... The internet, the laptop and the iPhone personalized entertainment and delivered it more immediately, in a way that also widened Hollywood's potential audience — but habituated people to small screens, isolated viewing and intermittent watching, the opposite of the cinema's communalism. Special effects opened spectacular (if sometimes antiseptic-seeming) vistas and enabled long-unfilmable stories to reach big screens. But the effects-driven blockbuster, more than its 1980s antecedents, empowered a fandom culture that offered built-in audiences to studios, but at the price of subordinating traditional aspects of cinema to the demands of the Jedi religion or the Marvel cult. And all these shifts encouraged and were encouraged by a more general teenage-ification of Western culture, the extension of adolescent tastes and entertainment habits deeper into whatever adulthood means today....

Under these pressures, much of what the movies did in American culture, even 20 years ago, is essentially unimaginable today. The internet has replaced the multiplex as a zone of adult initiation. There's no way for a few hit movies to supply a cultural lingua franca, given the sheer range of entertainment options and the repetitive and derivative nature of the movies that draw the largest audiences. The possibility of a movie star as a transcendent or iconic figure, too, seems increasingly dated. Superhero franchises can make an actor famous, but often only as a disposable servant of the brand. The genres that used to establish a strong identification between actor and audience — the non-superhero action movie, the historical epic, the broad comedy, the meet-cute romance — have all rapidly declined...

[T]he caliber of instantly available TV entertainment exceeds anything on cable 20 years ago. But these productions are still a different kind of thing from The Movies as they were — because of their reduced cultural influence, the relative smallness of their stars, their lost communal power, but above all because stories told for smaller screens cede certain artistic powers in advance.

The article argues that episodic TV also cedes the Movies' power of an-entire-story-in-one-go condensation. ("This power is why the greatest movies feel more complete than almost any long-form television.") And it ultimately suggests that like opera or ballet, these grand old movies need "encouragement and patronage, to educate people into loves that earlier eras took for granted," and maybe even "an emphasis on making the encounter with great cinema a part of a liberal arts education."

In 2014 one lone film-maker had even argued that Ben Stiller's spectacular-yet-thoughtful Secret Life of Walter Mitty "might be the last of a dying breed."
Medicine

Scientists Say They Can Read Nearly the Whole Genome of an IVF-Created Embryo (science.org)

sciencehabit shares a report from Science.org: A California company says it can decipher almost all the DNA code of a days-old embryo created through in vitro fertilization (IVF) -- a challenging feat because of the tiny volume of genetic material available for analysis. The advance depends on fully sequencing both parents' DNA and "reconstructing" an embryo's genome with the help of those data. And the company suggests it could make it possible to forecast risk for common diseases that develop decades down the line. Currently, such genetic risk prediction is being tested in adults, and sometimes offered clinically. The idea of applying it to IVF embryos has generated intense scientific and ethical controversy. But that hasn't stopped the technology from galloping ahead.

Predicting a person's chance of a specific illness by blending this genetic variability into what's called a "polygenic risk score" remains under study in adults, in part because our understanding of how gene variants come together to drive or protect against disease remains a work in progress. In embryos it's even harder to prove a risk score's accuracy, researchers say. The new work on polygenic risk scores for IVF embryos is "exploratory research," says Premal Shah, CEO of MyOme, the company reporting the results. Today in Nature Medicine, the MyOme team, led by company co-founders and scientists Matthew Rabinowitz and Akash Kumar, along with colleagues elsewhere, describe creating such scores by first sequencing the genomes of 10 pairs of parents who had already undergone IVF and had babies. The researchers then used data collected during the IVF process: The couples' embryos, 110 in all, had undergone limited genetic testing at that time, a sort of spot sequencing of cells, called microarray measurements. Such analysis can test for an abnormal number of chromosomes, certain genetic diseases, and rearrangements of large chunks of DNA, and it has become an increasingly common part of IVF treatment in the United States. By combining these patchy embryo data with the more complete parental genome sequences, and applying statistical and population genomics techniques, the researchers could account for the gene shuffling that occurs during reproduction and calculate which chromosomes each parent had passed down to each embryo. In this way, they could predict much of that embryo's DNA.
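A heavily simplified sketch of the reconstruction idea, with made-up sequences and none of MyOme's actual statistical machinery: the embryo's sparse microarray calls vote for whichever parental haplotype they match, and the winner fills in the positions the array never tested.

```python
# Toy model (all sequences invented): each parent carries two candidate
# haplotypes per chromosome region; the embryo's few microarray-typed
# positions identify the one the parent transmitted, which then supplies
# the untyped sites "for free."

def infer_transmitted(haplotypes, sparse_calls):
    """haplotypes: candidate parental sequences; sparse_calls: {position: allele}."""
    def score(hap):
        return sum(hap[pos] == allele for pos, allele in sparse_calls.items())
    return max(haplotypes, key=score)

mother_haps = ["AAGCTT", "ATGCAT"]
# The microarray typed only positions 1 and 5; both match the second haplotype.
embryo_calls = {1: "T", 5: "T"}
print(infer_transmitted(mother_haps, embryo_calls))  # prints ATGCAT
```

The real method must also handle recombination (a transmitted chromosome switches between a parent's two haplotypes partway along) and genotyping error, which is where the population-genomics statistics come in.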

The researchers had a handy way to see whether their reconstruction was accurate: Check the couples' babies. They collected cheek swab samples from the babies and sequenced their full genome, just as they'd done with the parents. They then compared that "true sequence" with the reconstructed genome for the embryo from which the child originated. The comparison revealed, essentially, a match: For a 3-day-old embryo, at least 96% of the reconstructed genome aligned with the inherited gene variants in the corresponding baby; for a 5-day-old embryo, it was at least 98%. (Because much of the human genome is the same across all people, the researchers focused on the DNA variability that made the parents, and their babies, unique.) Once they had reconstructed embryo genomes in hand, the researchers turned to published data from large genomic studies of adults with or without common chronic diseases and the polygenic risk score models that were derived from that information. Then, MyOme applied those models to the embryos, crunching polygenic risk scores for 12 diseases, including breast cancer, coronary artery disease, and type 2 diabetes. The team also experimented with combining the reconstructed embryo sequence of single genes, such as BRCA1 and BRCA2, that are known to dramatically raise risk of certain diseases, with an embryo's polygenic risk scores for that condition -- in this case, breast cancer.
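At its core, a polygenic risk score is a weighted sum: each risk variant's published effect size times the number of copies the genome carries. A minimal sketch with invented variant names and weights:

```python
# Minimal polygenic risk score: sum of (effect size x allele count).
# Variant IDs and weights below are invented for illustration; real
# models derive their weights from large published association studies.

def polygenic_risk_score(effect_sizes, genotype):
    """effect_sizes: {variant: weight}; genotype: {variant: 0, 1, or 2 copies}."""
    return sum(w * genotype.get(variant, 0)
               for variant, w in effect_sizes.items())

weights = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}
embryo_genotype = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
print(round(polygenic_risk_score(weights, embryo_genotype), 2))  # 0.19
```

As the article stresses, the hard part is not this arithmetic but whether the weights, estimated in adults, predict anything reliable for an embryo.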

Programming

Developers Debate Denying Updates for Open Source Software to Russia (thenewstack.io)

Russia's invasion of Ukraine turns up in Mike Melanson's column "This Week in Programming": While the Open Source Initiative's (OSI) definition of open source software is quite clear on the matter — there must be "no discrimination against persons or groups" and "no discrimination against fields of endeavor" — the issue of who should be allowed to use open source software, according to ethical considerations, has long been debated.

Over the last month, this topic has again become a focus of debate as Russia's invasion of Ukraine has led to developers calling for blanket bans by companies like GitHub and GitLab; and to some developers even taking action. Earlier this month, we wrote about how open source gateway Scarf began limiting access to open source packages for the Russian government and military entities, via its gateway.

As we noted at the time, there was a primary distinction made when Scarf took this action: distribution of open source software is separate from the licensing of it. Those points of the OSI definition pertain to the licensing, not to some entity actively providing the software to others.

Since then, discussions around these ideas have continued, and this week an essay by Bradley M. Kuhn, a policy fellow and hacker-in-residence at the Software Freedom Conservancy, argues that copyleft won't solve all problems, just some of them.

The essay specifically takes to task the idea that open source software can effect change by way of licensing limitations. He spent nearly 3,000 words on the topic before pointedly addressing the issue of Russia -- with a similar conclusion to the one reached by Scarf earlier this month. Kuhn argues that "FOSS licenses are not an effective tool to advance social justice causes other than software freedom" and that, instead, developers have a moral obligation to take stances by way of other methods.

"For example, FOSS developers should refuse to work specifically on bug reports from companies who don't pay their workers a living wage," Kuhn offers in an example.

Regarding Russia specifically, Kuhn again points to distribution as an avenue of protest, while still remaining in line with the principles of free and open source software.

"Every FOSS license in existence permits capricious distribution; software freedom guarantees the right to refuse to distribute new versions of the software. (i.e., Copyleft does not require that you publish all your software on the Internet for everyone, or that you give equal access to everyone — rather, it merely requires that those whom you chose to give legitimate access to the software also receive CCS). FOSS projects should thus avoid providing Putin easy access to updates to their FOSS," writes Kuhn.

Science

'Quantum Hair' Could Resolve Hawking's Black Hole Paradox, Say Scientists (theguardian.com)

Stephen Hawking's black hole information paradox has bedevilled scientists for half a century and led some to question the fundamental laws of physics. Now scientists say they may have resolved the infamous problem by showing that black holes have a property known as "quantum hair." From a report: If correct, this would mark a momentous advance in theoretical physics. Prof Xavier Calmet, of the University of Sussex, who led the work, said that after working on the mathematics behind the problem for a decade, his team made a rapid advance last year that gave them confidence that they had finally cracked it. "It was generally assumed within the scientific community that resolving this paradox would require a huge paradigm shift in physics, forcing the potential reformulation of either quantum mechanics or general relativity," said Calmet. "What we found -- and I think is particularly exciting -- is that this isn't necessary."

Hawking's paradox boils down to the following: the rules of quantum physics state that information is conserved. Black holes pose a challenge to this law because once an object enters a black hole, it is essentially gone for good -- along with any information encoded in it. Hawking identified this paradox and for decades it has continued to confound scientists. There have been innumerable proposed solutions, including a "firewall theory" in which information was assumed to burn up before entering the black hole, the "fuzzball theory" in which black holes were thought to have indistinct boundaries, and various exotic branches of string theory. But most of these proposals required rewriting of the laws of quantum mechanics or Einstein's theory of gravity, the two pillars of modern physics.

Open Source

The Free Software Foundation Appoints a New Executive Director (fsf.org)

The Free Software Foundation announced its new executive director this week.

John Sullivan became the Free Software Foundation's executive director back in 2010; last year, after more than 11 years in the role, he decided to resign.

Taking his place will be the FSF's program manager for the last three years, who writes in a new blog post: The past three years working at the FSF as program manager have been educational and motivational. They have reinforced my belief that what we do is important, and that our goal to give the four freedoms to all computer users continues to be crucial. The work we do reminds people to recognize the power they have to demand change. This change will help free their own digital lives, and their loved ones'.

I am grateful to John Sullivan for his leadership and support. His legacy of nineteen years will be hard to live up to, and I look forward to working with him, the FSF board, and the staff on this transition....

We will continue our unwavering focus on our mission, especially working to increase understanding and adoption of copyleft, and bringing new people into the movement by communicating the necessity of the four freedoms. In the short term, we're focused on making the upcoming LibrePlanet conference [March 19-20] the best online edition yet for you. After that, I plan to reach out and ask for your thoughts and ideas on what else the FSF can do this year and beyond to advance the cause of user freedom.

As a free software activist, like many of you, each day, I am presented with almost innumerable choices between freedom and convenience, and each day I choose freedom wherever I can. I have learned to do this by questioning my tools, by joining this community, and by learning more and more about the ways that I can stand up for myself. If I can do that, I firmly believe we can reach anyone. I hope that you'll join me in rejecting the ways that Big Tech tries to deprive us of our freedoms, and to help set a positive example for computer users around the globe.

In freedom,

Zoë Kooyman
Executive Director

From the FSF's announcement: Kooyman assumes the executive director role following a series of recent steps taken to make the non-profit's governance and board recruitment practices more transparent and participatory, including a new community engagement process that empowers associate members of the FSF to nominate and evaluate candidates for the board of directors for the first time in the organization's 37-year history.

"I want to learn from the community, and will focus on relationship building, and on strengthening the free software movement together," Kooyman said. "Our immediate priority is to convene another successful LibrePlanet conference on March 19 and 20, bringing community activists, domain experts, and other users together to discuss current issues in technology and ethics. With the current and future threats users face, it's critical that we spread the free software message wider than ever before and that we help people understand the steps they can take to defend our user rights and freedom."

Science

Physicists Produce Biggest Time Crystal Yet (science.org) 38

sciencehabit shares a report from Science.org: Physicists in Australia have programmed a quantum computer half a world away to make, or at least simulate, a record-size time crystal -- a system of quantum particles that locks into a perpetual cycle in time, somewhat akin to the repeating spatial pattern of atoms in an actual crystal. The new time crystal comprises 57 quantum particles, more than twice the size of a 20-particle time crystal simulated last year by scientists at Google. That's so big that no conventional computer could simulate it, says Chetan Nayak, a condensed matter physicist at Microsoft, who was not involved in the work. "So that's definitely an important advance." The work shows the power of quantum computers to simulate complex systems that may otherwise exist only in physicists' theories.

[Philipp Frey and Stephan Rachel, theorists at the University of Melbourne] performed the simulation remotely, using quantum computers built and run by IBM in the United States. The qubits, which can be set to 0, 1, or a superposition of both at once, can be programmed to interact like magnets. For certain settings of their interactions, the researchers found, any initial setting of the 57 qubits, such as 01101101110 ..., remains stable, returning to its original state every two pulses, the researchers report today in Science Advances. [...] Whereas more than 100 researchers worked on the Google simulation, Frey and Rachel worked alone to perform their larger demonstration, submitting it to the IBM computers over the internet. "It was just me, my graduate student, and a laptop," Rachel says, adding that "Philipp is brilliant!" The entire project took about 6 months, he estimates. The demonstration isn't perfect, Rachel says. The flipping pattern ought to last indefinitely, he says, but the qubits in IBM's machines can only hold their states long enough to simulate about 50 cycles. Ultimately, the stabilizing effect of the interactions might be used to store the state of a string of qubits in a kind of memory for a quantum computer, he notes, but realizing such an advance will take -- what else? -- time.
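The period-doubling behavior described above, where any initial bit string returns to itself every two drive pulses, can be illustrated with a toy sketch. This is not the authors' code: in the real experiment, imperfect spin flips are stabilized by qubit-qubit interactions, whereas here each pulse is idealized as a perfect flip of every qubit.

```python
# Toy sketch of a discrete time crystal's signature behavior (idealized;
# the real system relies on interactions to stabilize imperfect flips).

def pulse(state):
    """One drive pulse: flip every qubit (idealized X gate on each site)."""
    return [1 - b for b in state]

initial = [0, 1, 1, 0, 1, 1, 0, 1] + [0] * 49   # 57 qubits, arbitrary start
state = initial
matches = []
for cycle in range(4):          # four pulses = two full time-crystal periods
    state = pulse(state)
    matches.append(state == initial)

print(matches)   # [False, True, False, True]: back to the start every 2 pulses
```

The response repeating at twice the drive period, rather than following it, is what distinguishes a time crystal from an ordinary driven system.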

United States

Biden To Congress: Pass The Bill To Fund US Chip Manufacturing (cnet.com) 175

In his State of the Union speech Tuesday, President Joe Biden called on Congress to pass the CHIPS Act, legislation that would provide chipmakers with $52 billion in subsidies to advance semiconductor manufacturing in the United States. From a report: Biden lauded Intel Chief Executive Pat Gelsinger, who last month announced a $20 billion investment for two new chip fabrication facilities, or fabs, that the company will build just west of Columbus, Ohio. Intel plans to spend $100 billion to build the Ohio "megafab" over the next decade, with an eventual total of eight fabs, but the speed of that investment will depend on the US subsidy, Gelsinger has said.

"Intel's CEO, Pat Gelsinger, who is here tonight, told me they are ready to increase their investment from $20 billion to $100 billion. That would be one of the biggest investments in manufacturing in American history," Biden said. "And all they're waiting for is for you to pass this bill. ... Send it to my desk. I'll sign it." The Senate passed a bill funding the CHIPS Act in 2021, and the House of Representatives followed suit in February, but the differences in the bills haven't been ironed out in committee and the subsidy hasn't arrived despite some bipartisan support. The funding would help the US compete with government help in Taiwan and South Korea, where leading chipmakers Taiwan Semiconductor Manufacturing Co. (TSMC) and Samsung have the bulk of their operations. The US subsidies would knock about $3 billion off the $10 billion price tag for a new fab, a subsidy level Intel says matches those in Asia.

Transportation

US Clears Way For Automakers To Install Smart Headlights (axios.com) 39

The Department of Transportation's National Highway Traffic Safety Administration (NHTSA) issued a rule Tuesday to allow adaptive driving beam headlights, or smart headlights, in the U.S. Axios reports: The technology, which relies on sensors and LED light, will help prevent crashes by allowing better illumination of pedestrians, animals and objects without impairing the visibility of drivers in other vehicles, NHTSA said. Adaptive driving beam headlight systems, which are commonplace in Europe and Canada, automatically focus beams on darker, unoccupied areas while reducing the intensity of illumination when there is oncoming traffic. Research released in 2019 by the American Automobile Association found that European vehicles with adaptive headlight systems increase roadway lighting by as much as 86% when compared to U.S. low beam headlights. "NHTSA prioritizes the safety of everyone on our nation's roads, whether they are inside or outside a vehicle. New technologies can help advance that mission," said Steven Cliff, NHTSA's deputy administrator, in a statement. "NHTSA is issuing this final rule to help improve safety and protect vulnerable road users."

Science

Scientists Make Breakthrough In Warping Time At Smallest Scale Ever (vice.com) 64

An anonymous reader quotes a report from Motherboard: [S]cientists at JILA, a joint operation between the National Institute of Standards and Technology and the University of Colorado Boulder, have measured time dilation at the smallest scale ever using the most accurate clocks in the world. The team showed that clocks located just a millimeter apart -- about the width of a pencil tip -- showed slightly different times due to the influence of Earth's gravity. The new experiment paves the way toward clocks with 50 times the precision of those available today, which could be used for a host of practical applications, while also shedding light on fundamental mysteries about our universe, including the long-sought "union of general relativity and quantum mechanics," according to a study published on Wednesday in Nature.

In 2010, JILA scientists used these clocks to measure time dilation at two points with a difference in elevation of 33 centimeters (roughly a foot), a significant advance at the time. After a decade of fine-tuning their clocks, [Jun Ye, a JILA physicist who co-authored the study] and his colleagues have managed to track frequency shifts within a sample of 100,000 extremely cold strontium atoms, enabling them to snag the unprecedented millimeter-scale effects of dilation. What's more, the team managed to keep these atoms dancing in perfect unison for 37 seconds, setting a new record for the duration of "quantum coherence," or the state in which the behavior of these atoms can be predicted.
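A back-of-the-envelope check, not taken from the paper, shows the scale of the effect involved. In Earth's weak gravity, the fractional frequency shift between two clocks separated by a height difference dh is approximately g * dh / c^2:

```python
# Rough estimate of gravitational time dilation between nearby clocks
# (weak-field approximation; illustrative, not the paper's analysis).

g = 9.81           # m/s^2, Earth's surface gravity (approximate)
c = 299_792_458    # m/s, speed of light

def fractional_shift(dh_m):
    """Fractional frequency shift for clocks separated by dh_m meters."""
    return g * dh_m / c**2

mm_shift = fractional_shift(0.001)   # clocks 1 mm apart
cm_shift = fractional_shift(0.33)    # the 2010 experiment's ~33 cm separation

print(f"1 mm:  {mm_shift:.2e}")   # ~1.1e-19
print(f"33 cm: {cm_shift:.2e}")   # ~3.6e-17
```

A fractional shift of about one part in 10^19 over a millimeter is what the clocks had to resolve, which gives a sense of why the measurement took a decade of refinement.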

Privacy

Will ID.Me Destroy the Data of the 7 Million Americans Already Directed to Its Face-Scanning Service? (msn.com) 26

America's Internal Revenue Service abandoned plans to make face-scanning mandatory for access to your tax records.

Unfortunately, before this change of heart the IRS had already directed 7 million Americans to facial recognition vendor ID.me, reports the Washington Post. Now the chair of the House Oversight Committee is urging IRS Commissioner Charles Rettig to instruct ID.me to destroy the biometric data and ensure the data isn't used for "unapproved or unauthorized purposes." "Those Americans' highly personal information may continue to be held by a third party outside of the IRS's direct control — increasing the potential for exposure due to bad actors and other cybersecurity incidents," [committee chair] Maloney wrote.... ID.me said on Wednesday that it would drop the facial recognition requirement in its software, which is used by 30 states and 10 federal agencies. The company also told The Washington Post that effective March 1, anyone would be able to delete their selfie or photo data....

The letter follows years of controversy over the government's expanding use of facial recognition software, despite warnings from the General Services Administration that the face-scanning technology has too many problems to justify its use.... There is no federal law regulating how facial recognition can be used or how it should be secured....

Maloney also writes that 13 percent of ID.me users since June had struggled to use the software and were referred to customer service, where representatives would attempt to verify their identities over video chat. The letter says this underscores the "widespread issues related to the use of the nascent facial recognition technology."

In fact, the Verge reports that "Internal documents and former ID.me employees say the company was beset by disorganization and staffing shortages throughout 2021, as shortcomings in the automated systems created tensions among the company's workforce, particularly the human verification workers who have to step in when the algorithms fail." Current and former employees who spoke to The Verge paint a picture of a company described as being in "permanent crisis mode," changing policies rapidly to keep up with fluctuating demand for its services and fight a slew of negative press. In particular, they say a lack of human review capacity has been a chokepoint for the company, leading to stress, pressure, and a failure to meet quality standards. It's an unexpected challenge for a biometrics system that's usually seen as automatic, pointing to the often-ignored workers needed to support automated systems at scale.

When the automated systems fail — ID.me says roughly 10 percent of users will need video chat assistance — it's workers and subjects who are left to manage the consequences.... To keep up with demand, the company added 1,300 new employees between January and September 2021, including 500 to be based in a new office in Tampa, Florida, dedicated to customer support. But as adoption increased, so did complaints. A Vice report found dozens of complaints from applicants who said they had been locked out of unemployment benefits when ID.me's verification service had failed to identify them. When the automated system failed, applicants often faced long wait times to reach human reviewers, according to the report — wait times that became even more burdensome and difficult to navigate for people without access to reliable internet connections....

Many staff were unhappy about the end of work-from-home policies, which were being phased out at the company at the same time as first the delta and then omicron variants hit the US. As in-office staffing levels rose, more ID.me employees began to contract COVID at work, sources said, in some cases taking whole teams offline at once.

One ID.me employee complained to the Verge that "In terms of worker treatment, it's like the Amazon of identity protection."

The article also notes that an ID.me video chat agent was terminated after engaging in "inappropriate conduct," and while the company added new procedures to prevent this, "sources said that these quality checks have begun to fall by the wayside under the pressure of clearing through the backlog of video verification requests."

Power

Time-Shifted Computing Could Slash Data Center Energy Costs By Up To 30% (arstechnica.com) 66

An anonymous reader quotes a report from Ars Technica: Recently, two computer scientists had an idea: if computers use energy to perform calculations, could stored data be a form of stored energy? Why not use computing as a way to store energy? What if information could be a battery, man? As it turns out, the idea isn't as far-fetched as it may sound. The "information battery" concept, fleshed out in a recent paper (PDF), would perform certain computations in advance when power is cheap -- like when the sun is shining or the wind is blowing -- and cache the results for later. The process could help data centers replace up to 30 percent of their energy use with surplus renewable power.

The beauty of the system is that it requires no specialized hardware and imposes very little overhead. "Information Batteries are designed to work with existing data centers," write authors Jennifer Switzer, a doctoral student at UC San Diego, and Barath Raghavan, an assistant professor at the University of Southern California. "Some very limited processing power is reserved for the IB [information battery] manager, which manages the scheduling of both real-time computational tasks and precomputation. A cluster of machines or VMs is designated for precomputation. The IB cache, which stores the results of these precomputations, is kept local for quick retrieval. No additional infrastructure is needed."

In the model Switzer and Raghavan created to test the concept, the IB manager queried grid operators every five minutes -- the smallest time interval the operators offered -- to check the price of power to inform its predictions. When prices dipped below a set threshold, the manager green-lit a batch of computations and cached them for later. The system was pretty effective at reducing the need for expensive "grid power," as the authors call it, even when the pre-computation engine did a relatively poor job of predicting which tasks would be needed in the near future. At just 30 percent accuracy, the manager could begin to make the most of the so-called "opportunity power" that is created when there is excess wind or solar power. In a typical large data center, workloads can be predicted around 90 minutes in advance with about 90 percent accuracy, the authors write. With a more conservative prediction window of 60 minutes, "such a data center could store 150 MWh, significantly more than most grid-scale battery-based storage projects," they say. An equivalent grid-scale battery would cost around $50 million, they note.
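The scheduling loop described above can be sketched in a few lines. This is a hedged illustration of the idea, not the authors' implementation: the manager polls the electricity price, precomputes predicted tasks while power is cheap, and serves later requests from the cache so they never draw expensive grid power. The threshold, predictor, and workload here are all invented for illustration.

```python
# Toy sketch of an "information battery" manager (illustrative only).
# Cheap-power intervals are used to precompute and cache predicted tasks;
# cache hits later stand in for "stored" energy.

PRICE_THRESHOLD = 0.03   # $/kWh -- invented cutoff for "cheap" surplus power

class InfoBatteryManager:
    def __init__(self, compute_fn, predictor):
        self.compute = compute_fn     # the expensive computation
        self.predict = predictor      # guesses upcoming task inputs
        self.cache = {}
        self.grid_computations = 0    # work that had to run at grid price

    def on_price_tick(self, price):
        """Called every polling interval (the paper's model used 5 minutes)."""
        if price < PRICE_THRESHOLD:
            for task in self.predict():
                if task not in self.cache:
                    self.cache[task] = self.compute(task)

    def run(self, task):
        """Serve a real-time request: a cache hit avoids grid power."""
        if task in self.cache:
            return self.cache[task]
        self.grid_computations += 1
        return self.compute(task)

# Usage: the predictor is only partly right, like the low-accuracy case above.
mgr = InfoBatteryManager(compute_fn=lambda x: x * x,
                         predictor=lambda: [2, 3, 5])
mgr.on_price_tick(0.01)                  # cheap power: precompute predictions
results = [mgr.run(t) for t in (2, 3, 7)]
print(results, mgr.grid_computations)    # [4, 9, 49] 1 -- only task 7 missed
```

Even with an imperfect predictor, every cache hit shifts one computation's energy cost from the expensive interval to the cheap one, which is why the paper finds benefits starting at just 30 percent prediction accuracy.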

Slashdot Top Deals