Games

The 2023 Video Game Hall of Fame Inductees (museumofplay.org) 44

Slashdot reader Dave Knott shares the four class-of-2023 inductees into the Video Game Hall Of Fame, announced today at The Strong National Museum of Play. From the press release: Barbie Fashion Designer: "The 1996 hit Barbie Fashion Designer emerged at a time when many games were marketed to male players. Published by Digital Domain/Mattel Media, it proved that a computer game targeted to girls could succeed, selling more than 500,000 copies in two months. The game helped greatly expand the market for video games and in the process opened important -- and ongoing -- discussions about gender and stereotypes in gaming. Barbie Fashion Designer was also innovative in bridging the gap between the digital and the physical, allowing players to design clothes for their Barbie dolls and print them on special fabric."

Computer Space: "Nutting Associates' Computer Space appeared in 1971 and was the first commercial video game. Inspired by the early minicomputer game -- and previous World Video Game Hall of Fame inductee -- Spacewar! (1962), the coin-operated Computer Space proved that video games could reach an audience outside of computer labs. While not a best-seller, it was a trailblazer in the video game world and inspired its creators to go on to establish Atari Inc., a video game giant in the 1970s and 1980s."

The Last of Us: "Released by Naughty Dog and Sony Interactive Entertainment in 2013, The Last of Us jumped into an oversaturated field of post-apocalyptic zombie games and quickly stood out among the rest with its in-depth storytelling, intimate exploration of humanity, thrilling game jumps and cutscenes, and its memorable characters. More than 200 publications named it the game of the year in 2013. Its story has since made the jump to Hollywood, inspiring an HBO adaptation in 2023 watched weekly by millions."

Wii Sports: "Wii Sports launched with the Nintendo Wii home video game system in 2006 and introduced motion-based technology to living rooms across the world. With a simple swipe of the controller, players could serve a tennis ball, hurl a bowling ball, throw a left hook, or drive a golf ball. The simple mechanics made the game accessible to almost anyone -- allowing it to be played by young children and seniors alike -- and helped to redefine the idea of who is a 'gamer.' Ultimately, the game helped Nintendo to sell more than 100 million Wii consoles worldwide."
These four titles beat out several other incredibly popular nominees, including Angry Birds, Age of Empires, Call of Duty 4: Modern Warfare, GoldenEye 007, NBA 2K, FIFA International Soccer, Quake, and Wizardry.
Security

EarSpy: Spying On Phone Calls Via Ear Speaker Vibrations Captured By Accelerometer (securityweek.com) 27

An anonymous reader quotes a report from SecurityWeek: As smartphone manufacturers are improving the ear speakers in their devices, it can become easier for malicious actors to leverage a particular side-channel for eavesdropping on a targeted user's conversations, according to a team of researchers from several universities in the United States. The attack method, named EarSpy, is described in a paper published just before Christmas by researchers from Texas A&M University, Temple University, New Jersey Institute of Technology, Rutgers University, and the University of Dayton. EarSpy relies on the phone's ear speaker -- the speaker at the top of the device that is used when the phone is held to the ear -- and the device's built-in accelerometer for capturing the tiny vibrations generated by the speaker.

The researchers discovered that attacks such as EarSpy are becoming increasingly feasible due to the improvements made by smartphone manufacturers to ear speakers. They conducted tests on the OnePlus 7T and the OnePlus 9 smartphones -- both running Android -- and found that significantly more data can be captured by the accelerometer from the ear speaker due to the stereo speakers present in these newer models compared to the older model OnePlus phones, which did not have stereo speakers. The experiments conducted by the academic researchers analyzed the reverberation effect of ear speakers on the accelerometer by extracting time-frequency domain features and spectrograms. The analysis focused on gender recognition, speaker recognition, and speech recognition.
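To make the feature-extraction step concrete, here is a minimal Python sketch of turning one accelerometer axis into a spectrogram and per-band features. The sampling rate, window sizes, and the synthetic signal are illustrative assumptions, not values from the paper, whose full pipeline (and its machine-learning classifiers) is more elaborate.

```python
import numpy as np
from scipy.signal import spectrogram

# Assumed sampling rate: Android accelerometers typically report a few
# hundred samples per second, so only the low end of speech energy
# (roughly the pitch/prosody band) is observable at all.
FS = 500  # Hz

def accel_features(axis_signal: np.ndarray):
    """Time-frequency features from one accelerometer axis (illustrative)."""
    freqs, times, sxx = spectrogram(axis_signal, fs=FS, nperseg=256, noverlap=192)
    log_spec = np.log1p(sxx)                  # compress dynamic range
    return log_spec.mean(axis=1), log_spec    # per-band means + full spectrogram

# Synthetic stand-in for a vibration trace captured during a call.
sig = np.random.randn(FS * 5) * 1e-3
band_means, spec = accel_features(sig)
```

Features like these would then feed classifiers for the gender, speaker, and speech-recognition tasks described above.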

In the gender recognition test, whose goal is to determine whether the target is male or female, the EarSpy attack had a 98% accuracy. The accuracy was nearly as high, at 92%, for detecting the speaker's identity. When it comes to actual speech, the accuracy was up to 56% for capturing digits spoken in a phone call. "[This] accuracy still exhibits five times greater accuracy than a random guess, which implies that vibration due to the ear speaker induced a reasonable amount of distinguishable impact on accelerometer data," the researchers said.

Science

Physicists Use Google's Quantum Computer to Create Holographic Wormhole Between Black Holes (quantamagazine.org) 55

"In an experiment that ticks most of the mystery boxes in modern physics, a group of researchers announced Wednesday that they had simulated a pair of black holes in a quantum computer," reports the New York Times [alternate URL here. But in addition, the researchers also sent a message between their two black holes, the Times reports, "through a shortcut in space-time called a wormhole.

"Physicists described the achievement as another small step in the effort to understand the relation between gravity, which shapes the universe, and quantum mechanics, which governs the subatomic realm of particles....

Quanta magazine reports: The wormhole emerged like a hologram out of quantum bits of information, or "qubits," stored in tiny superconducting circuits. By manipulating the qubits, the physicists then sent information through the wormhole, they reported Wednesday in the journal Nature. The team, led by Maria Spiropulu of the California Institute of Technology, implemented the novel "wormhole teleportation protocol" using Google's quantum computer, a device called Sycamore housed at Google Quantum AI in Santa Barbara, California. With this first-of-its-kind "quantum gravity experiment on a chip," as Spiropulu described it, she and her team beat a competing group of physicists who aim to do wormhole teleportation with IBM and Quantinuum's quantum computers.

When Spiropulu saw the key signature indicating that qubits were passing through the wormhole, she said, "I was shaken."

The experiment can be seen as evidence for the holographic principle, a sweeping hypothesis about how the two pillars of fundamental physics, quantum mechanics and general relativity, fit together.... The holographic principle, ascendant since the 1990s, posits a mathematical equivalence or "duality" between the two frameworks. It says the bendy space-time continuum described by general relativity is really a quantum system of particles in disguise. Space-time and gravity emerge from quantum effects much as a 3D hologram projects out of a 2D pattern. Indeed, the new experiment confirms that quantum effects, of the type that we can control in a quantum computer, can give rise to a phenomenon that we expect to see in relativity — a wormhole....

To be clear, unlike an ordinary hologram, the wormhole isn't something we can see. While it can be considered "a filament of real space-time," according to co-author Daniel Jafferis of Harvard University, lead developer of the wormhole teleportation protocol, it's not part of the same reality that we and the Sycamore computer inhabit. The holographic principle says that the two realities — the one with the wormhole and the one with the qubits — are alternate versions of the same physics, but how to conceptualize this kind of duality remains mysterious. Opinions will differ about the fundamental implications of the result. Crucially, the holographic wormhole in the experiment consists of a different kind of space-time than the space-time of our own universe. It's debatable whether the experiment furthers the hypothesis that the space-time we inhabit is also holographic, patterned by quantum bits.

"I think it is true that gravity in our universe is emergent from some quantum [bits] in the same way that this little baby one-dimensional wormhole is emergent" from the Sycamore chip, Jafferis said. "Of course we don't know that for sure. We're trying to understand it."

Here's how principal investigator Spiropulu summarizes their experiment. "We found a quantum system that exhibits key properties of a gravitational wormhole yet is sufficiently small to implement on today's quantum hardware."
Space

Scientists Build 'Baby' Wormhole (reuters.com) 117

An anonymous reader quotes a report from Reuters: Scientists have long pursued a deeper understanding of wormholes and now appear to be making progress. Researchers announced on Wednesday that they forged two minuscule simulated black holes -- those extraordinarily dense celestial objects with gravity so powerful that not even light can escape -- in a quantum computer and transmitted a message between them through what amounted to a tunnel in space-time. It was a "baby wormhole," according to Caltech physicist Maria Spiropulu, a co-author of the research published in the journal Nature. But scientists are a long way from being able to send people or other living beings through such a portal, she said.

"Experimentally, for me, I will tell you that it's very, very far away. People come to me and they ask me, 'Can you put your dog in the wormhole?' So, no," Spiropulu told reporters during a video briefing. "...That's a huge leap." [...] Spiropulu said the researchers found a quantum system that exhibits key properties of a gravitational wormhole but was small enough to implement on existing quantum hardware. The researchers said no rupture of space and time was created in physical space in the experiment, though a traversable wormhole appeared to have emerged based on quantum information teleported using quantum codes on the quantum processor.
"There's a difference between something being possible in principle and possible in reality," added physicist and study co-author Joseph Lykken of Fermilab, America's particle physics and accelerator laboratory. "So don't hold your breath about sending your dog through the wormhole. But you have to start somewhere. And I think to me it's just exciting that we're able to get our hands on this at all."

"It looks like a duck, it walks like a duck, it quacks like a duck. So that's what we can say at this point -- that we have something that in terms of the properties we look at, it looks like a wormhole," Lykken said.
Twitter

Twitter Is Now an Elon Musk Company (theverge.com) 446

Elon Musk has "added [Twitter] to his business empire after months of legal skirmishes," writes The Verge's Elizabeth Lopatto, citing reports from CNBC, The Washington Post and Insider. From the report: Musk's first move on Thursday was to oust Parag Agrawal, who was Twitter's last CEO as a public company. Chief financial officer Ned Segal and Vijaya Gadde, the company's policy chief whom Musk had publicly criticized, have also reportedly left the building. Sean Edgett, the general counsel, is also gone, The New York Times reports, adding that at least one of these executives was walked out by security. Chief customer officer Sarah Personette was also fired, Insider reports. The execs received handsome payouts for their trouble, according to Insider: Agrawal got $38.7 million, Segal got $25.4 million, Gadde got $12.5 million, and Personette, who tweeted yesterday about how excited she was for Musk's takeover, got $11.2 million.

Questions still remain about what Musk plans to do with Twitter now that he owns it, though he's made a number of public comments. The Washington Post reported that Musk planned to cull 75 percent of Twitter's employees, citing estimates given to prospective Twitter investors. Musk told Twitter staffers that the 75 percent figure was inaccurate, Bloomberg reported. In Musk's text messages, provided during discovery to Twitter's lawyers, he and entrepreneur Jason Calacanis, a friend of his, discussed cutting staff by requiring a return to office. "Day zero," Calacanis texted Musk. "Sharpen your blades boys." Requiring Twitter employees to return to offices would mean 20 percent of the staff would leave voluntarily, Calacanis wrote. Also, Calacanis told Musk, "Twitter CEO is my dream job."

Twitter also faces challenges to its free speech stance in court, as the Supreme Court agreed to take up two cases that will determine its liability for illegal content. Musk, who is also CEO of Tesla and SpaceX, has suggested he'll change the way Twitter's moderation works, potentially relaxing the kinds of policies that saw former President Donald Trump permanently banned from the platform. Although Musk has said that his Twitter acquisition is "not a way to make money," he's reportedly raised ideas for cost cutting and increasing revenue. Governments and corporations could be charged a "slight cost" to use Twitter, and there could be job cuts on the table to improve the company's bottom line. Some of Twitter's current employees have criticized Musk's plans for the platform as "incoherent" and lacking in detail. More broadly, Musk has talked about using Twitter to create "X, the everything app." This is a reference to China's WeChat app, which started life as a messaging platform, but has since grown to encompass multiple businesses, from shopping to payments to gaming. "You basically live on WeChat in China," Musk told Twitter employees in June. "If we can recreate that with Twitter, we'll be a great success."

Apple

Newest Apple Museum Claims To Be 'Biggest and Most Complete' With 1,600 Exhibits (9to5mac.com) 43

An anonymous reader quotes a report from 9to5Mac: Apple Museum of Poland is now open, boasting the "biggest and most complete" collection in the world. With over 1,600 exhibits, the museum is the result of years of dedication from Polish collector and architect Jacek Lupina and spans the company's 46-year history. The Apple Museum, located in a former metalworking factory in Warsaw, features a replica of the Apple 1 at its entrance. Released in 1976, the Apple 1 was the first personal computer that Steve Jobs and Steve Wozniak sold. Additionally, the motherboard of the museum's Apple 1 replica includes a signature from Steve Wozniak himself.

Lupina's goal is to showcase how far the company has come and how much things have changed in over four decades. [...] While there's a lot to show, the Apple Museum isn't displaying every exhibit at once; it rotates subjects periodically. The collection exhibits Apple, Macintosh, and NeXT computers as well as iPhones, iPods, and iPads. Also, on the walls, there are vintage advertisements like the well-known "Think Different" campaign from 1997.

Earth

The Ocean Is Starting To Lose Its Memory, Scientists Warn (sciencealert.com) 70

An anonymous reader quotes a report from ScienceAlert: The oceans that surround us are transforming. As our climate changes, the world's waters are shifting too, with abnormalities evident not only in the ocean's temperature, but also its structure, currents, and even its color. As these changes manifest, the usually stable environment of the ocean is becoming more unpredictable and erratic, and in some ways the phenomenon is akin to the ocean losing its memory, scientists suggest. "Ocean memory, the persistence of ocean conditions, is a major source of predictability in the climate system beyond weather time scales," researchers explain in a new paper led by first author and climate researcher Hui Shi from the Farallon Institute in Petaluma, California. "We show that ocean memory, as measured by the year-to-year persistence of sea surface temperature anomalies, is projected to steadily decline in the coming decades over much of the globe."
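The persistence metric here is essentially lag-1 autocorrelation: how strongly this year's sea surface temperature anomaly predicts next year's. A minimal Python sketch of that measurement (synthetic data, not the paper's code):

```python
import numpy as np

def ocean_memory(sst_annual: np.ndarray) -> float:
    """Year-to-year persistence of SST anomalies, measured as the
    lag-1 autocorrelation of the annual anomaly series."""
    anomalies = sst_annual - sst_annual.mean()
    return float(np.corrcoef(anomalies[:-1], anomalies[1:])[0, 1])

# Synthetic example: an AR(1) process whose persistence is set to 0.8.
rng = np.random.default_rng(1)
x = [0.0]
for _ in range(199):
    x.append(0.8 * x[-1] + rng.normal())
print(f"memory (lag-1 autocorrelation): {ocean_memory(np.array(x)):.2f}")
```

A declining ocean memory corresponds to this correlation drifting toward zero, at which point one year's conditions say little about the next.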

In the research, the team studied sea surface temperatures (SSTs) in the shallow top layer of the ocean, called the upper-ocean mixed layer (MLD). Despite the MLD's relative shallowness -- extending only about 50 meters below the ocean's surface -- this upper layer of water exhibits a lot of persistence over time in terms of thermal inertia, especially compared to the variations seen in the atmosphere above. In the future, however, modeling suggests that this 'memory' effect of thermal inertia in the upper ocean is set to decline globally over the rest of the century, with dramatically greater variations in temperature predicted over coming decades.

According to the researchers, shoaling effects in the MLD will introduce greater levels of water-mixing in the upper ocean, effectively thinning out the top layer. This is expected to lower the ocean's capacity for thermal inertia, rendering the upper ocean more susceptible to random temperature anomalies. Just what that means for marine wildlife is unclear, but the researchers note that "consequential impacts on populations are likely," although some species are expected to fare better than others in terms of adaptation. On another note, the ocean memory decline is expected to make it significantly harder for scientists to forecast upcoming ocean dynamics, reducing reliable lead times for all sorts of predictions related to SSTs. This will hinder our ability to project monsoons, marine heatwaves (MHWs), and periods of extreme weather, among other things.
The findings have been published in the journal Science Advances.
Hardware

Retro Computing Museum In Ukraine Destroyed By Russian Bomb (pcgamer.com) 131

A privately owned collection of more than 500 pieces of retro computer and technology history has been destroyed by a Russian bomb in the city of Mariupol. PC Gamer reports: The destruction was highlighted by Mark Howlett on Twitter, and confirmed by the Ukrainian Software and Computer Museum account, which operates museums in Kharkiv and Kyiv. The owner of the Mariupol collection, Dmitry Cherepanov, is reportedly safe, though his collection of computers, consoles, and assorted tech from fifty years of computing has been wiped out. "There is neither my museum nor my house," writes Cherepanov on his Facebook page, it8bit.club.

The museum itself may be gone, but Cherepanov has been chronicling his collection of exhibits online for some time now, and though this is all that's left, it is still a resource worth checking out. There are a host of fascinating old machines, including the Commodore C64 [...]. Alongside images and information about all 120 computers and consoles in his collection, Cherepanov also hosts RetroBit Radio on the site. He has set up a PayPal account for donations, the details of which you can find on his Facebook page.

Programming

'A Quadrillion Mainframes On Your Lap' (ieee.org) 101

"Your laptop is way more powerful than you might realize," writes long-time Slashdot reader fahrbot-bot.

"People often rhapsodize about how much more computer power we have now compared with what was available in the 1960s during the Apollo era. Those comparisons usually grossly underestimate the difference."

Rodney Brooks, emeritus professor of robotics at MIT (and former director of its AI Lab and CSAIL) explains in IEEE Spectrum: By 1961, a few universities around the world had bought IBM 7090 mainframes. The 7090 was the first line of all-transistor computers, and it cost US $20 million in today's money, or about 6,000 times as much as a top-of-the-line laptop today. Its early buyers typically deployed the computers as a shared resource for an entire campus. Very few users were fortunate enough to get as much as an hour of computer time per week.

The 7090 had a clock cycle of 2.18 microseconds, so the operating frequency was just under 500 kilohertz. But in those days, instructions were not pipelined, so most took more than one cycle to execute. Some integer arithmetic took up to 14 cycles, and a floating-point operation could hog up to 15. So the 7090 is generally estimated to have executed about 100,000 instructions per second. Most modern computer cores can operate at a sustained rate of 3 billion instructions per second, with much faster peak speeds. That is 30,000 times as fast, so a modern chip with four or eight cores is easily 100,000 times as fast.

Unlike the lucky person in 1961 who got an hour of computer time, you can run your laptop all the time, racking up more than 1,900 years of 7090 computer time every week....

But, really, this comparison is unfair to today's computers. Your laptop probably has 16 gigabytes of main memory. The 7090 maxed out at 144 kilobytes. To run the same program would require an awful lot of shuffling of data into and out of the 7090 — and it would have to be done using magnetic tapes. The best tape drives in those days had maximum data-transfer rates of 60 KB per second. Although 12 tape units could be attached to a single 7090 computer, that rate needed to be shared among them. But such sharing would require that a group of human operators swap tapes on the drives; to read (or write) 16 GB of data this way would take three days. So data transfer, too, was slower by a factor of about 100,000 compared with today's rate.

So now the 7090 looks to have run at about a quadrillionth (10^-15) the speed of your 2021 laptop. A week of computing time on a modern laptop would take longer than the age of the universe on the 7090.
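The arithmetic is easy to sanity-check. Here is a back-of-envelope Python version of the article's own numbers (the three rounded ~10^5 factors for compute, memory, and I/O are the article's; the ~13.8-billion-year age of the universe is the standard figure):

```python
laptop_ips  = 8 * 3e9                   # 8 cores x 3e9 instructions/s
ibm7090_ips = 1e5                       # ~100,000 instructions/s
speed_ratio = laptop_ips / ibm7090_ips  # ~2.4e5 -> the "100,000x" claim

mem_ratio = 16e9 / 144e3                # 16 GB vs 144 KB: ~1.1e5

tape_days = 16e9 / 60e3 / 86400         # 16 GB at 60 KB/s: ~3.1 days
io_ratio  = 1e5                         # the article's rounded factor

overall = 1e5 ** 3                      # three ~1e5 factors -> ~1e15

# A week of laptop time, slowed down 1e15x, expressed in years:
week_on_7090_years = 7 * overall / 365.25   # ~1.9e13 years
age_of_universe_years = 1.38e10
assert week_on_7090_years > age_of_universe_years
```

At roughly 1.9 x 10^13 years, the week-of-laptop-work figure exceeds the age of the universe by a factor of more than a thousand.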

United States

Saving History With Sandbags: Climate Change Threatens the Smithsonian (nytimes.com) 125

President Warren Harding's blue silk pajamas. Muhammad Ali's boxing gloves. The Star-Spangled Banner, stitched by Mary Pickersgill. Scripts from the television show "M*A*S*H." Nearly two million irreplaceable artifacts that tell the American story are housed in the National Museum of American History, part of the Smithsonian Institution, the biggest museum complex in the world. Now, because of climate change, the Smithsonian stands out for another reason: Its cherished buildings are extremely vulnerable to flooding, and some could eventually be underwater. From a report: Eleven palatial Smithsonian museums and galleries form a ring around the National Mall, the grand two-mile park lined with elms that stretches from the Lincoln Memorial to the U.S. Capitol. But that land was once marsh. And as the planet warms, the buildings face two threats. Rising seas will eventually push in water from the tidal Potomac River and submerge parts of the Mall, scientists say. More immediately, increasingly heavy rainstorms threaten the museums and their priceless holdings, particularly since many are stored in basements. At the American History Museum, water is already intruding.

It gurgles up through the floor in the basement. It finds the gaps between ground-level windows, puddling around exhibits. It sneaks into the ductwork, then meanders the building and drips onto display cases. It creeps through the ceiling in locked collection rooms, thief-like, and pools on the floor. Staff have been experimenting with defenses: Candy-red flood barriers lined up outside windows. Sensors that resemble electronic mouse traps, deployed throughout the building, that trigger alarms when wet. Plastic bins on wheels, filled with a version of cat litter, to be rushed back and forth to soak up the water. So far, the museum's holdings have escaped damage. But "We're kind of in trial and error," said Ryan Doyle, a facilities manager at the Smithsonian. "It's about managing water." An assessment of the Smithsonian's vulnerabilities, released last month, reveals the scale of the challenge: Not only are artifacts stored in basements in danger, but floods could knock out electrical and ventilation systems in the basements that keep the humidity at the right level to protect priceless art, textiles, documents and specimens on display. Of all its facilities, the Smithsonian ranks American History as the most vulnerable, followed by its next door neighbor, the National Museum of Natural History.

The Courts

Adobe Uses DMCA To Nuke Project That Keeps Flash Alive, Secure and Adware Free (torrentfreak.com) 69

An anonymous reader quotes a report from TorrentFreak: In January 2021, development and support for Adobe Flash was discontinued. That marked the end of an era, but in reality, Flash wasn't quite dead. Flash Player is still available in China, something that was exploited by the Clean Flash project to continue making the software more widely and safely available. The Chinese version of Flash receives one security update per month and can be freely downloaded from Flash.cn, but it also has significant strings attached: it comes bundled with an adware program called Flash Helper which, according to security sources, exhibits malicious behavior. Developed by 'darktohka' and previously located on Github, Clean Flash Installer solves these problems and more. "Clean Flash Installer installs this up-to-date freely available version of Flash, but it comes WITHOUT the adware program," darktohka informs TorrentFreak. "As such Clean Flash Installer can be used by anyone to use a relatively secure version of Flash Player after the support for Flash ended."

The developer says that he was inspired to create his tool to keep Flash content alive, something which he says was a huge part of his childhood. Adobe appears to be less enthusiastic about his work and following a DMCA notice filed with Github, the developer platform has nuked the project. In a DMCA complaint filed with Github on October 4, 2021, a legal representative acting for Adobe explains that the Clean Flash Installer project breaches copyright law. "Adobe Inc. is the copyright owner and I am authorized to act on its behalf. Our Adobe Flash Player software has been infringed. The files in question contain our proprietary Adobe Inc. owned copyrighted materials (software code)," it reads, adding that the project must be removed.
"As this is my passion project, I am deeply disappointed with Adobe's action. The repository in question only hosts the installer code for the project, which was written by myself and does not contain any infringing code," explains darktohka. "Adobe Flash was a huge part of our childhood, and it's gut-wrecking that Adobe would rather have everyone use super out-of-date versions of the software when versions with security updates are freely available. It makes no sense for them to DMCA an installer that was written independently and makes use of the freely available and downloadable version of the project."
Science

Humans Probably Can't Live Longer Than 150 Years, New Research Finds (cnet.com) 109

Science is once again casting doubt on the notion that we could live to be nearly as old as the biblical Methuselah or Mel Brooks' 2,000-year-old man. From a report: New research [PDF] from Singapore-based biotech company Gero looks at how well the human body bounces back from disease, accidents or just about anything else that puts stress on its systems. This basic resilience declines as people age, with an 80-year-old requiring three times as long to recover from stresses as a 40-year-old on average. This should make sense if you've ever known an elderly person who has taken a nasty fall. Recovery from such a spill can be life-threatening for a particularly frail person, whereas a similar fall might put a person half as old out of commission for just a short time, and teenagers might simply dust themselves off and keep going.

Extrapolate this decline further, and human body resilience is completely gone at some age between 120 and 150, according to new analysis performed by the researchers. In other words, at some point your body loses all ability to recover from pretty much any potential stressor. The researchers arrived at this conclusion by looking at health data for large groups from the US, the UK and Russia. They looked at blood cell counts as well as step counts recorded by wearables. As people experienced different stressors, fluctuations in blood cell and step counts showed that recovery time grew longer as individuals grew older. "Aging in humans exhibits universal features common to complex systems operating on the brink of disintegration," Peter Fedichev, co-founder and CEO of Gero, said in a statement.
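As a toy illustration of the extrapolation idea (emphatically not the paper's model), one can treat resilience as the reciprocal of recovery time, fit a straight line against age, and solve for the zero crossing. The Python sketch below uses made-up numbers calibrated only to the article's "three times as long at 80 as at 40" example; note that this single factoid alone extrapolates to roughly age 100, and it is the paper's much richer analysis of blood-count and step-count fluctuations that produces the 120-150 range.

```python
import numpy as np

# Synthetic recovery times (arbitrary units): 3x slower at 80 than at 40.
ages       = np.array([40.0, 60.0, 80.0])
recovery_t = np.array([1.0, 1.8, 3.0])
resilience = 1.0 / recovery_t          # higher = bounces back faster

slope, intercept = np.polyfit(ages, resilience, 1)
zero_age = -intercept / slope          # age at which resilience hits zero
print(f"toy extrapolation: resilience reaches zero near age {zero_age:.0f}")
```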

Data Storage

400 TB Storage Drives In Our Future: Fujifilm (anandtech.com) 51

One of the two leading manufacturers of tape cartridge storage, Fujifilm, claims to have a technology roadmap through 2030 that builds on the current magnetic tape paradigm to enable 400 TB per tape. AnandTech reports: As reported by Chris Mellor of Blocks and Files, Fujifilm points to using Strontium Ferrite grains in order to enable an areal data density on tape of 224 Gbit-per-square-inch, which would enable 400 TB tapes. IBM and Sony have already demonstrated 201 Gbit-per-square-inch technology in 2017, with a potential release of the technology for high volume production in 2026. Current tapes are over an order of magnitude less dense, at 8 Gbit-per-square-inch; however, the delay between research and mass production is quite significant.

Strontium Ferrite would replace Barium Ferrite in current LTO cartridges. Strontium sits one row above Barium in the periodic table, indicating a much smaller atom. This allows much smaller particles to be placed into tracks, and thankfully, according to Fujifilm, Strontium Ferrite exhibits properties along the same lines as Barium Ferrite, but more so, enabling higher performance while simultaneously increasing particle density. [...] Fujifilm states that 400 TB is the limit of Strontium Ferrite, indicating that new materials would be needed to go beyond. That said, we are talking about only 224 Gbit-per-square-inch for storage; compared to mechanical hard disks going beyond 1000 Gbit-per-square-inch today, there would appear to be plenty of room at the top if the technologies could converge.
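As a quick plausibility check (a sketch using assumed media dimensions, not figures from the article), the recording area that 400 TB implies at 224 Gbit per square inch fits comfortably on a tape of typical LTO length:

```python
# All tape dimensions below are assumptions based on standard LTO media.
bits      = 400e12 * 8               # 400 TB expressed in bits
density   = 224e9                    # bits per square inch
area_in2  = bits / density           # ~14,300 square inches
area_m2   = area_in2 * 6.4516e-4     # 1 in^2 = 6.4516e-4 m^2 -> ~9.2 m^2

tape_width_m    = 12.65e-3           # standard LTO tape width
needed_length_m = area_m2 / tape_width_m
print(f"~{needed_length_m:.0f} m of tape needed")  # ~730 m
```

An LTO-9 cartridge holds roughly a kilometer of tape, so the quoted density and capacity are geometrically consistent.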

AI

'Biologically Plausible' Deep Learning Neurons Predict the Chords of Bach (ibm.com) 24

IBM's research blog shares an article about "polyphonic music prediction using the Johann Sebastian Bach chorales dataset" achieved by using "biologically plausible neurons," a new approach to deep learning "that incorporates biologically-inspired neural dynamics and enables in-memory acceleration, bringing it closer to the way in which the human brain works." At IBM Research Europe we have been investigating both Spiking Neural Networks (SNNs) and Artificial Neural Networks (ANNs) for more than a decade, and one day we were struck with the thought: "Could we combine the characteristics of the neural dynamics of a spiking neuron and an ANN?" The answer is yes, we could. More specifically, we have modelled a spiking neuron using a construct comprising two recurrently-connected artificial neurons — we call it a spiking neural unit (SNU)... It enables a reuse of architectures, frameworks, training algorithms and infrastructure. From a theoretical perspective, the unique biologically-realistic dynamics of SNNs become available for the deep learning community...
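As a rough Python sketch of what such a unit can look like (an illustrative reading of the construct described above, not IBM's exact parameterization): a recurrent state integrates input, leaks over time, and is gated to zero after the unit fires, with a hard threshold producing spikes and a sigmoid producing a soft variant that ordinary gradient-based training can handle.

```python
import numpy as np

def snu_step(x, s_prev, y_prev, W, b, decay=0.9, soft=False):
    """One time step of an SNU-style cell (illustrative).

    s: membrane-like state; it leaks by `decay` each step and is
       reset (gated to zero) if the unit fired on the previous step.
    y: output -- a hard spike for the spiking variant, a sigmoid
       for a soft variant usable with ordinary backpropagation.
    """
    s = np.maximum(0.0, W @ x + decay * s_prev * (1.0 - y_prev))
    pre = s + b                        # negative bias acts as a firing threshold
    y = 1.0 / (1.0 + np.exp(-pre)) if soft else (pre > 0).astype(float)
    return s, y

# Toy run: 4 inputs feeding 3 units over 20 random time steps.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(3, 4))
b = -0.5 * np.ones(3)
s, y = np.zeros(3), np.zeros(3)
for _ in range(20):
    s, y = snu_step(rng.random(4), s, y, W, b)
```

Because the unit is expressed entirely with standard tensor operations, it slots into existing deep-learning frameworks and training loops, which is the reuse of architectures and infrastructure the authors describe.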

Furthermore, a spiking neural unit lends itself to efficient implementation in artificial neural network accelerators and is particularly well-suited for applications using in-memory computing. In-memory computing is a promising new approach for AI hardware that takes inspiration from the architecture of the brain, in which memory and computations are combined in the neurons. In-memory computing avoids the energy cost of shuffling data back and forth between separate memory and processors by performing computations in memory — phase change memory technology is a promising candidate for such implementation, which is well understood and is on its way to commercialization in the coming years. Our work involves experimental demonstration of in-memory spiking neural unit implementation that exhibits a robustness to hardware imperfections that is superior to that of other state-of-the-art artificial neural network units...

The task of polyphonic music prediction on the Johann Sebastian Bach dataset was to predict at each time step the set of notes, i.e. a chord, to be played in the subsequent time step. We used an SNU-based architecture with an output layer of sigmoidal neurons that allows a direct comparison of the obtained loss values to those from ANNs. The SNU-based network achieved an average loss of 8.72 and set the SNN state-of-the-art performance for the Bach chorales dataset. An sSNU-based network further reduced the average loss to 8.39 and surpassed corresponding architectures using state-of-the-art ANN units.

Slashdot reader IBMResearch notes that besides being energy-efficient, the results "point towards the broad adoption of more biologically-realistic deep learning for applications in artificial intelligence."
Google

Playing Around With the Fuchsia OS (quarkslab.com) 102

Security and software development company Quarkslab played around with Google's new Fuchsia operating system, which could one day replace Android on smartphones and Chrome OS on laptops. The researchers "decided to give a quick look at Fuchsia, learn about its inner design, security properties, strengths and weaknesses, and find ways to attack it." Here's what they concluded: Fuchsia's microkernel is called Zircon. It is written in C++. [...] Contrary to every other major OS, it appears rather difficult to target the Zircon kernel directly. A successful RCE (Remote Code Execution) on the world-facing parts of the system (USB, Bluetooth, network stack, etc.) will only give you control over the targeted components, but they run in independent userland processes, not in the kernel. From a component, you then need to escalate privileges to the kernel using the limited number of syscalls you can access with the handles you have. Overall, it seems easier to target other components rather than the kernel, and to focus on components that you can talk to via IPC and that you know have interesting handles.

Overall, Fuchsia exhibits interesting security properties compared to other OSes such as Android. A few days of vulnerability research allowed us to conclude that the common programming bugs found in other OSes can also be found in Fuchsia. However, while these bugs can often be considered as vulnerabilities in other OSes, they turn out to be uninteresting on Fuchsia, because their impact is, for the most part, mitigated by Fuchsia's security properties. We note however that these security properties do not -- and in fact, cannot -- hold in the lowest layers of the kernel related to virtualization, exception handling and scheduling, and that any bug here remains exploitable just like on any other OS. All the bugs we found were reported to Google, and are now fixed.

Again, it is not clear where Fuchsia is heading, and whether it is just a research OS as Google claims or a real OS intended for use in future products. What's clear, though, is that it has the potential to significantly increase the difficulty for attackers to compromise devices.

Hardware

Vulcan Is Closing 'The Living Computers: Museum + Labs' In Seattle (seattletimes.com) 23

Flexagon writes: Buried in the news of several closures by Vulcan, a venture of the late Paul Allen, is that Seattle's Living Computers museum is among them, along with Seattle's Cinerama movie theater.

"Two museums under the Vulcan wing, closed because of the pandemic, will also remain shuttered: the Living Computers: Museum + Labs and the Flying Heritage & Combat Armor Museum," reports The Seattle Times. "For both, the Vulcan statement said, the coming months will be a time to evaluate 'if, how and when to reopen.' The Living Computers: Museum + Labs, described on Vulcan's website as 'the world's largest collection of fully restored supercomputers, mainframes, minicomputers and more,' opened in Sodo in 2012 and was expanded in 2016. Its offerings included not only selections from Allen's vast personal collection, but hands-on exhibits on virtual reality, self-driving cars, robotics, and computer-generated art and music."

Medicine

Scientists Find Brain Center That 'Profoundly' Shuts Down Pain (sciencedaily.com) 66

A research team from Duke University has found a small area of the brain in mice that can profoundly shut down pain. "It's located in an area where few people would have thought to look for an anti-pain center, the amygdala, which is often considered the home of negative emotions and responses, like the fight or flight response and general anxiety," reports ScienceDaily. From the report: The researchers found that general anesthesia also activates a specific subset of inhibitory neurons in the central amygdala, which they have called the CeAga neurons (CeA stands for central amygdala; ga indicates activation by general anesthesia). Mice have a relatively larger central amygdala than humans, but [senior author Fan Wang, the Morris N. Broad Distinguished Professor of neurobiology in the School of Medicine] said she had no reason to think we have a different system for controlling pain. Using technologies that Wang's lab has pioneered to track the paths of activated neurons in mice, the team found the CeAga was connected to many different areas of the brain, "which was a surprise," Wang said.

By giving mice a mild pain stimulus, the researchers could map all of the pain-activated brain regions. They discovered that at least 16 brain centers known to process the sensory or emotional aspects of pain were receiving inhibitory input from the CeAga. Using a technology called optogenetics, which uses light to activate a small population of cells in the brain, the researchers found they could turn off the self-caring behaviors a mouse exhibits when it feels uncomfortable by activating the CeAga neurons. Paw-licking or face-wiping behaviors were "completely abolished" the moment the light was switched on to activate the anti-pain center.

When the scientists dampened the activity of these CeAga neurons, the mice responded as if a temporary insult had become intense or painful again. They also found that low-dose ketamine, an anesthetic drug that allows sensation but blocks pain, activated the CeAga center and wouldn't work without it. Now the researchers are going to look for drugs that can activate only these cells to suppress pain as potential future pain killers, Wang said.
The study has been published in the journal Nature Neuroscience.
Electronic Frontier Foundation

Court Upholds Public Right of Access To Court Documents (eff.org) 19

An anonymous reader quotes a report from the Electronic Frontier Foundation: A core part of EFF's mission is transparency and access to information, because we know that in a nation bound by the rule of law, the public must have the ability to know the law and how it is being applied. That's why the default rule is that the public must have full access to court records -- even if those records contain unsavory details. Any departure from that rule must be narrow and well-justified. But litigants and judges aren't always rigorous in upholding that principle. For example, when Brian Fargo sued Jennifer Tejas for allegedly defamatory Instagram posts, he asked that the court seal portions of his filings that contained those posts, references to other people and private medical information. The court granted Fargo's request, with little explanation or apparent care.

That approach set a dangerous precedent for others. The public has a right to know what courts consider defamatory. So, with help from the First Amendment Clinic at UCLA School of Law, EFF and the First Amendment Coalition moved to unseal the records containing the Instagram posts and references to other people. The judge denied that request. Undeterred, we appealed -- and won (PDF download). The appeals court chided the trial court for its failure to adequately justify its sealing order, and its equal failure to make sure the order was narrowly tailored so that as little as possible would be hidden from the public. While it did allow some information to remain sealed -- information related to private medical records can be kept from the public, and pseudonyms should be used in some exhibits to protect the privacy of third parties -- it ordered the rest released.

The Almighty Buck

What the Hell Happened To Mint? (fastcompany.com) 89

An anonymous reader quotes a report from Fast Company: Intuit's Mint personal-finance service wants me to know it's sorry. Again. "We're sorry!" its investments page bleats when I try to view my mutual funds' performance. "Our graphs require the latest version of Adobe Flash player." That site has spent years apologizing to me for needing Adobe's vulnerability-riddled plug-in: since I long ago booted Flash from my browser, since Adobe said in 2017 that it would drop Flash by the end of 2020, since Intuit told me in 2018 that Mint would wean itself from Flash "in the coming months."

But that's in keeping with this fossilized financial tool. Mint still provides a valuable service for free in aggregating transaction data from multiple financial institutions to clarify where your money comes and goes -- and in the bargain suggests hopefully-better financial products from advertisers -- but this app exhibits severe symptoms of neglect. It's as if Mint, with 13 million-plus registered users, were a resource-constrained startup instead of a property of Intuit, the Microsoft of personal finance. But more than a decade after the firm behind TurboTax and QuickBooks (and, until 2016, Quicken) bought Mint for $170 million, neatly taking a competitor off the map, this once-groundbreaking app might as well be streaked with cobwebs.
The report goes on to note the "updates" category of Mint's blog "reveals no new features since April 2019's revised financial-advice interfaces in the mobile apps it introduced soon after the acquisition."

"It could be doing much more," says Aaron Patzer, founder of Mint. He points in particular to the lack of integration between Mint and TurboTax, saying, "I had a dream that TurboTax would take you about five minutes."

Another explanation for why the personal-finance service has gone neglected is the success of TurboTax, which generates roughly 10 to 20 times the revenue of Mint. Fast Company also notes that Mint "benefits from a lack of serious competition," as Quicken requires an annual subscription and remains desktop-bound, and the free Personal Capital web app is more geared toward investment management.
Earth

Do Elephants Belong In Zoos? Extinction Policy Under Scrutiny (conservationaction.co.za) 88

Long-time Slashdot reader retroworks writes: In "Zoos Called It a 'Rescue.' But Are the Elephants Really Better Off?" New York Times reporter Charles Siebert does much to dispel the idea that zoos are a solution to extinction. In the first half of the article, the cruelty of zoos is in focus. "Neuroimaging has shown that elephants possess in their cerebral cortex the same elements of neural wiring we long thought exclusive to us, including spindle and pyramidal neurons, associated with higher cognitive functions like self-recognition, social awareness and language."

The second half of the article questions whether any current (expensive) efforts to "save" the elephants offers anything more than window dressing. Ted Reilly [founder and executive director of a game preserve] is quoted that, "The greatest threat to wildlife in Africa today is the uncontrolled spread of human sprawl. As far as it sprawls, nature dies. And that's the reality on the ground. It's not the nice idea that people cook up and suggest, but that's the reality. And in my view, an equally important threat, serious threat, is dependence on donor money. If you become dependent on donor money, you will inevitably become dictated to in terms of your policies. And your management integrity will be interfered with. And it's not possible to be totally free of corruptive influences if you're not financially independent."

Does this type of reporting improve the situation, or cause despondence and abandonment of the extinction cause?

The 7,000-word article points out that 22 American zoos had already closed their elephant exhibits (or were phasing them out) by 2012 (according to a depressing study by the Seattle Times).

The New York Times adds that "an increasing awareness of nonhuman animal sentience is now compelling many to question the very existence of zoos."
