ISS

ISS Astronauts are Safe. But NASA and Russia Disagree on How to Fix Leak (space.com) 39

"NASA has emphasized the ISS crew is in no immediate danger," reports Space.com. "The leak in the Russian segment of the orbital complex has been ongoing for five years," and "there was a temporary increase in the leak rate that was patched earlier this year..."

During a brief livestreamed meeting on Wednesday, former astronaut Bob Cabana emphasized that troubleshooting is ongoing. But NASA and Roscosmos "don't have a common understanding of what the likely root causes or the severity of the consequences of these leaks." "The Russian position is that the most probable cause of the cracks is high cycling caused by micro-vibrations," Cabana said, referring to flexing of metal and similar components that heat and cool as the ISS orbits in and out of sunlight. "NASA believes the PrK cracks are likely multi-causal — including pressure and mechanical stress, residual stress, material properties and environmental exposures," Cabana continued.

NASA and Russia disagree about whether "continued operations are safe", he added, but the remedy for now is to keep the hatch closed between the U.S. and Russian side as investigations continue.

The two agencies will continue meeting to seek "common understanding of the structural integrity", Cabana pledged, but he did not provide a timeline. Academic and industry experts will also be consulted.

IT

Second Life for Server Components (ieee.org) 31

Scientists have developed a method to reuse components from decommissioned data center servers, potentially reducing the carbon footprint of cloud computing infrastructure.

The research team from Microsoft, Carnegie Mellon University and the University of Washington demonstrated that older RAM modules and solid-state drives can be safely repurposed in new server builds without compromising performance, according to papers presented at recent computer architecture conferences.

When combined with energy-efficient processors, the prototype servers achieved an 8% reduction in total carbon emissions during Azure cloud service testing. Researchers estimate the approach could cut global carbon emissions by up to 0.2% if widely adopted. The cloud computing industry currently accounts for 3% of global energy consumption and could represent 20% of emissions by 2030, according to computing experts. Most data centers, including Microsoft's Azure, typically replace servers every 3-5 years.

Programming

Will We Care About Frameworks in the Future? (kinlan.me) 67

Paul Kinlan, who leads the Chrome and the Open Web Developer Relations team at Google, asks and answers the question (his answer: no): Frameworks are abstractions over a platform designed for people and teams to accelerate their teams' new work and maintenance while improving the consistency and quality of the projects. They also frequently force a certain type of structure and architecture on your code base. This isn't a bad thing; team productivity is an important aspect of any software.

I'm of the belief that software development is entering a radical shift that is currently driven by agents like Replit's and there is a world where a person never actually has to manipulate code directly anymore. As I was making broad and sweeping changes to the functionality of the applications by throwing the Agent a couple of prompts here and there, the software didn't seem to care that there was repetition in the code across multiple views, it didn't care about shared logic, extensibility or inheritability of components... it just implemented what it needed to do and it did it as vanilla as it could.

I was just left wondering if there will be a need for frameworks in the future? Do the architecture patterns we've learnt over the years matter? Will new patterns for software architecture appear that favour LLM management?

Canada

Canada Passes New Right To Repair Rules With the Same Old Problem (theregister.com) 16

An anonymous reader quotes a report from The Register: Royal assent was granted to two right to repair bills last week that amend Canada's Copyright Act to allow the circumvention of technological protection measures (TPMs) if this is done for the purposes of "maintaining or repairing a product, including any related diagnosing," and "to make the program or a device in which it is embedded interoperable with any other computer program, device or component." The pair of bills allow device owners to not only repair their own stuff regardless of how a program is written to prevent such non-OEM measures, but said owners can also make their devices work with third-party components without needing to go through the manufacturer to do so.

Bills C-244 (repairability) and C-294 (interoperability) go a long way toward advancing the right to repair in Canada and, as iFixit pointed out, are the first federal laws anywhere that address how TPMs restrict the right to repair -- but they're hardly final. TPMs can take a number of forms, from simple administrative passwords to encryption, registration keys, or even the need for a physical object like a USB dongle to unlock access to copyrighted components of a device's software. Most commercially manufactured devices with proprietary embedded software include some form of TPM, and neither C-244 nor C-294 places any restrictions on the use of such measures by manufacturers. As iFixit points out, neither Copyright Act amendment does anything to expand access to the tools needed to circumvent TPMs. That puts Canadians in a similar position to US repair advocates, who in 2021 saw the US Copyright Office loosen DMCA restrictions to allow limited repairs of some devices despite TPMs, but without allowing access to the tools needed to do so. [...]

Canadian Repair Coalition co-founder Anthony Rosborough said last week that the new repairability and interoperability rules represent considerable progress, but like similar changes in the US, don't actually amount to much without the right to distribute tools. "New regulations are needed that require manufacturers and vendors to ensure that products and devices are designed with accessibility of repairs in mind," Rosborough wrote in an op-ed last week. "Businesses need to be able to carry out their work without the fear of infringing various intellectual property rights."

Earth

SpaceX Alums Find Traction On Earth With Their Mars-Inspired CO2-To-Fuel Tech (techcrunch.com) 49

An anonymous reader quotes a report from TechCrunch: A trend has emerged among a small group of climate tech founders who start with their eyes fixed on space and soon realize their technology would do a lot more good here on Earth. Halen Mattison and Luke Neise fit the bill. Mattison spent time at SpaceX, while Neise worked at Vanderbilt Aerospace Design Laboratory and Varda Space Industries. The pair originally wanted to sell reactors to SpaceX that could turn carbon dioxide into methane for use on Mars. Today, they're building them to replace natural gas that's pumped from underground. Their company, General Galactic, which emerged from stealth in April, has built a pilot system that can produce 2,000 liters of methane per day. Neise, General Galactic's CTO, told TechCrunch that he expects that figure to rise as the company replaces off-the-shelf components with versions designed in-house.

"We think that's a big missing piece in the energy mix right now," said Mattison, the startup's CEO. "Being able to own our supply chains, to be able to fully control all of the parameters, to challenge the requirements between components, all of that unlocks some real elegance in the engineering solution." At commercial scale, the company's reactors will be assembled using mass production techniques. It's a contrast to how most petrochemical and energy facilities are built today. General Galactic is focused on producing methane. However, Mattison said the company isn't necessarily looking to displace the fuel from heating and energy. "Those are generally going toward electrification," he said. Instead, it intends to sell its methane to companies that use it as an ingredient or to power a process, like in chemical or plastic manufacturing. The company isn't ruling out transportation entirely either. Mattison hinted that General Galactic is working on other hydrocarbons that could be used for transportation, like jet fuel. "Stay tuned," he said.
General Galactic plans to deploy its first modules next year. The startup "hopes its modules will be able to plug into existing infrastructure, speeding its adoption relative to other fuels like hydrogen," notes TechCrunch.
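TechCrunch doesn't detail the chemistry, but the classic route for converting CO2 to methane — and the one long proposed for making propellant on Mars — is the Sabatier reaction, which combines carbon dioxide with hydrogen over a catalyst (whether General Galactic's reactors use exactly this process is an assumption here):

```latex
\mathrm{CO_2} + 4\,\mathrm{H_2} \longrightarrow \mathrm{CH_4} + 2\,\mathrm{H_2O}
\quad (\text{exothermic, typically over a Ni or Ru catalyst})
```

The hydrogen input would typically come from electrolysis of water, which is what makes the loop attractive both on Mars and as a substitute for natural gas pumped from underground.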

AMD

AMD's Desktop PC Market Share Skyrockets Amid Intel's Raptor Lake CPU Crashing Scandal (tomshardware.com) 33

An anonymous reader shares a report: AMD has gained a substantial 5.7 percentage points of share of the desktop x86 CPU market in the third quarter compared to Q2, the largest quarterly share gain since we began tracking the market share reports in 2016. It also represents an incredible ten percentage point improvement over the prior year. AMD also raked in a strong increase in revenue share, jumping 8.5 percentage points over the prior quarter, indicating that it is selling a strong mix of higher-end CPU models.

During the quarter, AMD launched its new Ryzen 9000-series family of processors amid a scandal related to stability issues with Intel's Raptor Lake chips (which generated a flood of negative press for Intel over the course of several months) and inventory adjustments for one of Intel's customers. AMD now commands 28.7% of the desktop processor market. AMD also continued to gain share in the laptop and server markets, though its gains on the desktop side of the business were the most impressive, according to Mercury Research.

Space

World's First Wood-Paneled Satellite Launched Into Space (bbc.com) 47

SpaceX has launched the world's first wood-paneled satellite into space "to test the suitability of timber as a renewable building material in future exploration of destinations like the Moon and Mars," reports the BBC. From the report: Made by researchers in Japan, the tiny satellite weighing just 900g is heading for the International Space Station on a SpaceX mission. It will then be released into orbit above the Earth. Named LignoSat, after the Latin word for wood, its panels have been built from a type of magnolia tree, using a traditional technique without screws or glue. Researchers at Kyoto University who developed it hope it may be possible in the future to replace some metals used in space exploration with wood.

"Wood is more durable in space than on Earth because there's no water or oxygen that would rot or inflame it," Kyoto University forest science professor Koji Murata told Reuters news agency. "Early 1900s airplanes were made of wood," Prof Murata said. "A wooden satellite should be feasible, too." If trees could one day be planted on the Moon or Mars, wood might also provide material for colonies in space in the future, the researchers hope. Along with its wood panels, LignoSat also incorporates traditional aluminium structures and electronic components. It has sensors on board to monitor how its wood reacts to the extreme environment of space during the six months it will orbit the Earth.
You can watch the launch on YouTube.

AMD

AMD Overtakes Intel in Datacenter Sales For First Time (tomshardware.com) 48

AMD has surpassed Intel in datacenter processor sales for the first time in history, marking a dramatic shift in the server chip market. AMD's datacenter revenue hit $3.549 billion in Q3, edging out Intel's $3.3 billion, according to SemiAnalysis.

The milestone ends Intel's decades-long dominance in server processors, where it held over 90% market share until recent years. AMD's EPYC processors now power many high-end servers, commanding premium prices even as they sell for less than comparable Intel chips.

Media

FFmpeg Devs Boast of Up To 94x Performance Boost After Implementing Handwritten AVX-512 Assembly Code (tomshardware.com) 135

Anton Shilov reports via Tom's Hardware: FFmpeg is an open-source video decoding project developed by volunteers who contribute to its codebase, fix bugs, and add new features. The project is led by a small group of core developers and maintainers who oversee its direction and ensure that contributions meet certain standards. They coordinate the project's development and release cycles, merging contributions from other developers. This group of developers tried to implement a handwritten AVX512 assembly code path, something that has rarely been done before, at least not in the video industry.

The developers have created an optimized code path using the AVX-512 instruction set to accelerate specific functions within the FFmpeg multimedia processing library. By leveraging AVX-512, they were able to achieve significant performance improvements -- from three to 94 times faster -- compared to standard implementations. AVX-512 enables processing large chunks of data in parallel using 512-bit registers, each of which can hold 16 single-precision or 8 double-precision floating-point values and process them in a single operation. That makes the optimization ideal for compute-heavy tasks in general, and for video and image processing in particular.

The benchmarking results show that the new handwritten AVX-512 code path performs considerably faster than other implementations, including baseline C code and lower SIMD instruction sets like AVX2 and SSSE3. In some cases, the revamped AVX-512 codepath achieves a speedup of nearly 94 times over the baseline, highlighting the efficiency of hand-optimized assembly code for AVX-512.
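The arithmetic behind that speedup is easier to picture with a toy example. A 512-bit AVX-512 register holds 16 single-precision floats, so one vector instruction can transform 16 samples where baseline C handles one. The sketch below is hypothetical, not FFmpeg code; it uses NumPy's vectorized operations as a software stand-in for the hardware data parallelism the hand-written assembly exploits.

```python
import numpy as np

def scale_scalar(pixels, gain):
    """One multiply per element -- the shape of the baseline C code."""
    return [p * gain for p in pixels]

def scale_vector(pixels, gain):
    """Whole-array multiply -- the SIMD shape: conceptually 16
    float32 lanes per 512-bit register operation."""
    return np.asarray(pixels, dtype=np.float32) * np.float32(gain)

pixels = list(range(16))  # one full 512-bit register's worth of floats
print(scale_scalar(pixels, 2.0)[:3])  # [0.0, 2.0, 4.0]
print(scale_vector(pixels, 2.0)[:3])
```

Real SIMD gains come from the hardware doing all 16 multiplies in one instruction; the 94x figure also reflects hand-scheduling that compilers rarely match from plain C.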

Apple

Apple Delays Cut-price Vision Headset Until 2027, Analyst Ming-Chi Kuo Says 24

Apple has scrapped plans for a budget mixed-reality headset initially slated for 2025, pushing the launch to 2027, according to supply chain analyst Ming-Chi Kuo. The company will instead focus on releasing an upgraded Vision Pro next year featuring its M5 chip and enhanced AI capabilities, he said. The canceled lower-cost model would have stripped features like EyeSight and used cheaper components to target mainstream consumers.

Open Source

New 'Open Source AI Definition' Criticized for Not Opening Training Data (slashdot.org) 38

Long-time Slashdot reader samj — also a long-time Debian developer — tells us there's some opposition to the newly-released Open Source AI definition. He calls it a "fork" that undermines the original Open Source definition (which was originally derived from Debian's Free Software Guidelines, written primarily by Bruce Perens), and points us to a new domain with a petition declaring that instead Open Source shall be defined "solely by the Open Source Definition version 1.9. Any amendments or new definitions shall only be recognized with clear community consensus via an open and transparent process."

This move follows some discussion on the Debian mailing list: Allowing "Open Source AI" to hide their training data is nothing but setting up a "data barrier" protecting the monopoly, disabling anybody other than the first party to reproduce or replicate an AI. Once passed, OSI is making a historical mistake towards the FOSS ecosystem.
They're not the only ones worried about data. This week TechCrunch noted an August study which "found that many 'open source' models are basically open source in name only. The data required to train the models is kept secret, the compute power needed to run them is beyond the reach of many developers, and the techniques to fine-tune them are intimidatingly complex. Instead of democratizing AI, these 'open source' projects tend to entrench and expand centralized power, the study's authors concluded."

samj shares the concern about training data, arguing that training data is the source code and that this new definition has real-world consequences. (On a personal note, he says it "poses an existential threat to our pAI-OS project at the non-profit Kwaai Open Source Lab I volunteer at, so we've been very active in pushing back past few weeks.")

And he also came up with a detailed response by asking ChatGPT. What would be the implications of a Debian disavowing the OSI's Open Source AI definition? ChatGPT composed a 7-point, 14-paragraph response, concluding that this level of opposition would "create challenges for AI developers regarding licensing. It might also lead to a fragmentation of the open-source community into factions with differing views on how AI should be governed under open-source rules." But "Ultimately, it could spur the creation of alternative definitions or movements aimed at maintaining stricter adherence to the traditional tenets of software freedom in the AI age."

However the official FAQ for the new Open Source AI definition argues that training data "does not equate to a software source code." Training data is important to study modern machine learning systems. But it is not what AI researchers and practitioners necessarily use as part of the preferred form for making modifications to a trained model.... [F]orks could include removing non-public or non-open data from the training dataset, in order to train a new Open Source AI system on fully public or open data...

[W]e want Open Source AI to exist also in fields where data cannot be legally shared, for example medical AI. Laws that permit training on data often limit the resharing of that same data to protect copyright or other interests. Privacy rules also give a person the rightful ability to control their most sensitive information — like decisions about their health. Similarly, much of the world's Indigenous knowledge is protected through mechanisms that are not compatible with later-developed frameworks for rights exclusivity and sharing.

Read on for the rest of their response...

Movies

ASWF: the Open Source Foundation Run By the Folks Who Give Out Oscars (theregister.com) 18

This week's Ubuntu Summit 2024 was attended by Lproven (Slashdot reader #6,030). He's also a FOSS correspondent for the Register, where he's filed this report: One of the first full-length sessions was presented by David Morin, executive director of the Academy Software Foundation, introducing his organization in a talk about Open Source Software for Motion Pictures. Morin linked to the Visual Effects Society's VFX/Animation Studio Workstation Linux Report, highlighting the market share pie-chart, showing Rocky Linux 9 at some 58 percent and the RHELatives in general at 90 percent of the market. Ubuntu 22 and 24 — the report's nomenclature, not this vulture's — got just 10.5 percent. We certainly didn't expect to see that at an Ubuntu event, with the latest two versions of Rocky Linux taking 80 percent of the studio workstation market...

What also struck us over the next three quarters of an hour is that Linux and open source in general seem to be huge components of the movie special effects industry — to an extent that we had not previously realized.

There's a "sizzle reel" showing examples of how major motion pictures used OpenColorIO, an open-source production tool for syncing color representations originally developed by Sony Pictures Imageworks. That tool is hosted by a collaboration between the Linux Foundation and the Science and Technology Council of the Academy of Motion Picture Arts and Sciences (the "Academy" of the Academy Awards). The collaboration — which goes by the name of the Academy Software Foundation — hosts 14 different projects. The ASWF hasn't been around all that long — it was only founded in 2018. Despite the impact of the COVID pandemic, by 2022 it had achieved enough to fill a 45-page history called Open Source in Entertainment [PDF]. Morin told the crowd that it runs events, provides project marketing and infrastructure, as well as funding, training and education, and legal assistance. It tries to facilitate industry standards and does open source evangelism in the industry. An impressive list of members — with 17 Premier companies, 16 General ones, and another half a dozen Associate members — shows where some of the money comes from. It's a big list of big names. [Adobe, AMD, AWS, Autodesk...]
The presentation started with OpenVDB, a C++ library developed and donated by DreamWorks for working with three-dimensional voxel-based shapes. (In 2020 they created this sizzle reel, but this year they've unveiled a theme song.) Also featured was OpenEXR, originally developed at Industrial Light and Magic in 1999. (The article calls it "a specification and reference implementation of the EXR file format — a losslessly compressed image storage format for moving images at the highest possible dynamic range.")

"For an organization that is not one of the better-known ones in the FOSS space, we came away with the impression that the ASWF is busy," the article concludes. (Besides running Open Source Days and ASWF Dev Days, it also hosts several working groups: the Language Interop Project works on Rust bindings, and the Continuous Integration Working Group on CI tools.) There's generally very little of the old razzle-dazzle in the Linux world, but with the demise of SGI as the primary maker of graphics workstations — its brand now absorbed by Hewlett Packard Enterprise — the visual effects industry moved to Linux and it's doing amazing things with it. And Kubernetes wasn't even mentioned once.

Power

The 'Passive Housing' Trend is Booming (yahoo.com) 145

The Washington Post reports that a former Etsy CEO remodeled their home into what's known as a passive house. It's "designed to be as energy efficient as possible, typically with top-notch insulation and a perfect seal that prevents outside air from penetrating the home; air flows in and out through filtration and exhaust systems only."

Their benefits include protection from pollution and pollen, noise insulation and a stable indoor temperature that minimizes energy needs. That translates to long-term savings on heating and cooling.

While the concept has been around for about 50 years, experts say that the United States is on the cusp of a passive house boom, driven by lowered costs, state-level energy code changes and a general greater awareness of — and desire for — more sustainable housing... Massachusetts — which alongside New York and Pennsylvania is one of the leading states in passive house adoption — has 272 passive house projects underway thanks to an incentive program, says Zack Semke [the director of the Passive House Accelerator, a group of industry professionals who aim to spread lessons in passive house building]. Consumer demand for passive houses is also increasing, says Michael Ingui, an architect in New York City and the founder of the Passive House Accelerator... The need to lower our energy footprint is so much more top-of-mind today than it was 10 years ago, Ingui says, and covid taught us about the importance of good ventilation and filtered fresh air. "People are searching for the healthiest house," he says, "and that's a passive house...."

These days, new passive houses are usually large, multifamily apartment buildings or high-end single-family homes. But that leaves out a large swath of homeowners in the middle. To widen passive house accessibility to include all types of people and their housing needs, we need better energy codes and even more policies and incentives, says In Cho, a sustainability architect, educator and a co-founder of the nonprofit Passive House for Everyone! Passive houses "can and should serve folks from all socioeconomic backgrounds," she says. Using a one-two punch of mandates for energy efficient buildings and greater awareness to the public, that increased demand for passive houses will lead to more supply, Cho says. And we're already seeing those changes in the market.

Take triple-pane windows, for example, which are higher performing and more insulating than their double-pane counterparts. Even just 10 to 20 years ago, the difference in price between the two was high enough to make triple-pane windows cost-prohibitive for a lot of people, Cho says. Over the years, as the benefits of higher performing windows became more well-known, and as cities and states changed their energy codes, more companies began producing better windows. Now they're basically at price parity, she says. If we keep pushing for greater awareness and further policy changes, it's possible that all of the components of passive house buildings could follow that trend.

"For large multifamily projects, we're already seeing price parity in some cases, Semke says...

"But as it stands, single-family passive houses are still likely to cost a margin more than non-passive houses, he says. This is because price parity is easier to achieve when working at larger scales, but also because many of the housing policies and incentives encouraging passive house buildings are geared toward these larger projects."

China

How America's Export Controls Failed to Keep Cutting-Edge AI Chips from China's Huawei (stripes.com) 40

An anonymous reader shared this report from the Washington Post: A few weeks ago, analysts at a specialized technological lab put a microchip from China under a powerful microscope. Something didn't look right... The microscopic proof was there that a chunk of the electronic components from Chinese high-tech champion Huawei Technologies had been produced by the world's most advanced chipmaker, Taiwan Semiconductor Manufacturing Company.

That was a problem because two U.S. administrations in succession had taken actions to assure that didn't happen. The news of the breach of U.S. export controls, first reported in October by the tech news site the Information, has sent a wave of concern through Washington... The chips were routed to Huawei through Sophgo Technologies, the AI venture of a Chinese cryptocurrency billionaire, according to two people familiar with the matter, speaking on the condition of anonymity to discuss a sensitive topic... "It raises some fundamental questions about how well we can actually enforce these rules," said Emily Kilcrease, a senior fellow at the Center for a New American Security in Washington... Taiwan's Ministry of Economic Affairs confirmed that TSMC recently halted shipments to a "certain customer" and notified the United States after suspecting that customer might have directed its products to Huawei...

There's been much intrigue in recent days in the industry over how the crypto billionaire's TSMC-made chips reportedly ended up at Huawei. Critics accuse Sophgo of working to help Huawei evade the export controls, but it is also possible that they were sold through an intermediary, which would align with Sophgo's denial of having any business relationship with Huawei... While export controls are often hard to enforce, semiconductors are especially hard to manage due to the large and open nature of the global chip trade. Since the Biden administration implemented sweeping controls in 2022, there have been reports of widespread chip smuggling and semiconductor black markets allowing Chinese companies to access necessary chips...

Paul Triolo, technology policy lead at Albright Stonebridge Group, said companies were trying to figure out what lengths they had to go to for due diligence: "The guidelines are murky."

AMD

Chip Designers Recall the Big AMD-Intel Battle Over x86-64 Support (tomshardware.com) 47

Tom's Hardware reports on some interesting hardware history being shared on X.com: AMD engineer Phil Park identified a curious nugget of PC architectural history from, of all places, a year-old Quora answer posted by former Intel engineer [and Pentium Pro architect] Robert Colwell. The nugget indicates that Intel could have beaten AMD to the x86-64 punch if the former wasn't dead-set on the 64-bit-only Itanium line of CPUs.
Colwell had responded on Quora to the question "Shouldn't Intel with its vast resources have been able to develop both architectures?" This was a marketing decision by Intel — they believed, probably rightly, that bringing out a new 64-bit feature in the x86 would be perceived as betting against their own native-64-bit Itanium, and might well severely damage Itanium's chances. I was told, not once, but twice, that if I "didn't stop yammering about the need to go 64-bits in x86 I'd be fired on the spot" and was directly ordered to take out that 64-bit stuff. I decided to split the difference, by leaving in the gates but fusing off the functionality. That way, if I was right about Itanium and what AMD would do, Intel could very quickly get back in the game with x86. As far as I'm concerned, that's exactly what did happen.
Phil Park continued the discussion on X.com. "He didn't quite get what he wanted, but he got close since they had x86-64 support in subsequent products when Intel made their comeback." (So, Park posted later in the thread, "I think he won the long game.")

Park also shared a post from Nicholas Wilt (NVIDIA CUDA designer who earlier did GPU computing work at Microsoft and built the prototype for Windows Desktop Manager): I have an x86-64 story of my own. I pressed a friend at AMD to develop an alternative to Itanium. "For all the talk about Wintel," I told him, "these companies bear no love for one another. If you guys developed a 64-bit extension of x86, Microsoft would support it...."

Interesting coda: When it became clear that x86-64 was beating Itanium in the market, Intel reportedly petitioned Microsoft to change the architecture and Microsoft told Intel to pound sand.

Power

Amazon Joins Push For Nuclear Power To Meet Data Center Demand (reuters.com) 83

Amazon said on Wednesday it has signed three agreements on developing the nuclear power technology called small modular reactors, becoming the latest big tech company to push for new sources to meet surging electricity demand from data centers. From a report: Amazon said it will fund a feasibility study for an SMR project near an Energy Northwest site in Washington state. The SMR is planned to be developed by X-Energy. Financial details were not disclosed. Under the agreement, Amazon will have the right to purchase electricity from four modules. Energy Northwest, a consortium of state public utilities, will have the option to add up to eight 80 MW modules, resulting in a total capacity of up to 960 MW, or enough to power the equivalent of more than 770,000 U.S. homes. The additional power would be available to Amazon and utilities to power homes and businesses. "Our agreements will encourage the construction of new nuclear technologies that will generate energy for decades to come," said Matt Garman, CEO of Amazon Web Services. SMRs will have their components built in a factory to reduce construction costs. [...]

Amazon said it is also leading a funding round for $500 million to support X-Energy's development of SMRs. Amazon and X-Energy aim to bring more than 5 gigawatts online in the United States by 2039, which the companies call the largest commercial deployment target of SMRs yet. Amazon also signed an agreement with Dominion Energy to explore the development of an SMR project near the utility's existing power station in Virginia. The roughly 300-megawatt project would help meet power needs in a region where demand is expected to jump 85% in 15 years, Dominion said.
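The Washington-state capacity figures line up if the 960 MW total counts the four initial modules plus all eight optional ones, at 80 MW each. A quick back-of-the-envelope check (the per-home draw is inferred here, not stated in the report):

```python
MODULE_MW = 80
initial_modules = 4        # modules Amazon has purchase rights to
optional_modules = 8       # Energy Northwest's expansion option

total_mw = (initial_modules + optional_modules) * MODULE_MW
print(total_mw)            # 960

# "more than 770,000 U.S. homes" implies an assumed average draw of
# roughly 1.25 kW per home:
print(round(total_mw * 1000 / 770_000, 2))  # 1.25
```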

Data Storage

SSD Prices Set To Fall 10% in Q4 as AI PC Demand Lags - TrendForce (tomshardware.com) 30

SSD prices are set to drop up to 10% in Q4 2024, market research firm TrendForce has reported. The decline stems from increased production and weakening demand, particularly in the consumer sector. Enterprise SSD prices, however, may see a slight increase. TrendForce analysts attribute the softer demand partly to slower-than-expected adoption of AI PCs. The mobile storage market could experience even steeper price cuts, with eMMC and UFS components potentially falling 13% as smartphone makers deplete inventories. The forecast follows modest price reductions observed in Q3 2024.

Medicine

Human Sense of Smell Is Faster Than Previously Thought, New Study Suggests 26

A new study reveals that the human sense of smell is far more sensitive than previously thought, capable of distinguishing odors and their sequences within just 60 milliseconds. CNN reports: In a single sniff, the human sense of smell can distinguish odors within a fraction of a second, working at a level of sensitivity that is "on par" with how our brains perceive color, "refuting the widely held belief that olfaction is our slow sense," a new study finds. Humans also can discern between various sequences of odors -- distinguishing a sequence of "A" before "B" from sequence "B" before "A" -- when the interval between odorant A and odorant B is merely 60 milliseconds, according to the study, published Monday in the journal Nature Human Behaviour. [...]

The new findings challenge previous research, in which discriminating between odor sequences took around 1,200 milliseconds, Dr. Dmitry Rinberg, a professor in the Department of Neuroscience and Physiology at NYU Langone Health in New York, wrote in an editorial accompanying the study in Nature Human Behaviour. "The timing of individual notes in music is essential for conveying meaning and beauty in a melody, and the human ear is very sensitive to this. However, temporal sensitivity is not limited to hearing: our sense of smell can also perceive small temporal changes in odor presentations," he wrote. "Similar to how timing affects the perception of notes in a melody, the timing of individual components in a complex odor mixture that reaches the nose may be crucial for our perception of the olfactory world."

The ability to tell apart odors within a single sniff might be an important way in which animals detect both what a smell is and where it might be in space, said Dr. Sandeep Robert Datta, a professor in the Department of Neurobiology at Harvard Medical School, who was not involved in the new study. "The demonstration that humans can tell apart smells as they change within a sniff is a powerful demonstration that timing is important for smell across species, and therefore is a general principle underlying olfactory function. In addition, this study sheds important light on the mysterious mechanisms that support human odor perception," Datta wrote in an email. "The study of human olfaction has historically lagged that of vision and hearing, because as humans we think of ourselves as visual creatures that largely use speech to communicate," he said, adding that the new study helps "fill a critical gap in our understanding of how we as humans smell."

Intel

Intel Unveils Arrow Lake Desktop Processors, Promising Power Efficiency Gains (pcworld.com) 46

Intel has announced its new Arrow Lake desktop processors, marking a significant shift in the company's approach to chip design and power efficiency. The Core Ultra 200S series, set to launch on October 24, 2024, introduces a disaggregated architecture manufactured using TSMC's advanced nodes.

The flagship Core Ultra 9 285K boasts 24 cores (8 performance, 16 efficiency) and can boost up to 5.7 GHz, priced at $589. Intel claims the new chips offer comparable performance to their predecessors while consuming significantly less power, with reductions of up to 136 watts in some gaming scenarios.

Arrow Lake utilizes a tiled design, combining compute, GPU, SoC, and I/O components manufactured by TSMC and packaged using Intel's Foveros technology. The compute tile is built on TSMC's N3B process, while the GPU tile uses TSMC's N5P, and the I/O and SoC tiles are on TSMC's N6. Intel's Roger Chandler stated, "Arrow Lake picks up the mantle of Raptor Lake's top-end gaming performance and delivers parity performance at about half the power."

Intel acknowledges that gaming performance may lag slightly behind the previous generation, with a 5% deficit in some benchmarks compared to the Core i9-14900K. The company is positioning Arrow Lake as a balanced solution, emphasizing power efficiency and content creation capabilities. The new processors require a new LGA 1851 socket and Z890 chipset, necessitating motherboard upgrades. Memory support extends to DDR5-6400, with XMP profiles potentially reaching DDR5-8000.

Earth

Plastic-Eating Bacteria Could Combat Pollution Problems, Scientists Hope (msn.com) 68

The Washington Post on scientists who "discovered that bacteria commonly found in wastewater can break down plastic to turn it into a food source, a finding that researchers hope could be a promising answer to combat one of Earth's major pollution problems." In a study published Thursday in Environmental Science and Technology, scientists laid out their examination of Comamonas testosteroni, a bacteria that grows on polyethylene terephthalate, or PET, a plastic commonly found in single-use food packaging and water bottles. PET makes up about 12 percent of global solid waste, with roughly 90 million tons of the plastic produced each year... Unlike most other bacteria, which thrive on sugar, C. testosteroni has a more refined palate, including chemically complex materials from plants and plastics that take longer to decompose.

The researchers are the first not only to demonstrate that these bacteria can break down plastic but also to show exactly how they do it. Through six meticulous steps involving complex imaging and gene-editing techniques, the authors found that the bacteria first physically break down plastic by chewing it into smaller pieces. Then, they release enzymes -- cell components that speed up chemical reactions -- to chemically break down the plastic into a carbon-rich food source known as terephthalate...

The bacteria take a few months to break down chunks of plastic, according to Rebecca Wilkes [a lead author on the study and postdoctoral researcher at the National Renewable Energy Laboratory]. As a result, if the bacteria are going to be efficient tools, a lot of optimization needs to take place to speed up the rate at which they decompose pollutants. One approach is to promote bacterial growth by providing them with an additional food source, such as a chemical known as acetate.

A senior author on the study (and associate professor of civil and environmental engineering at Northwestern University) tells the Washington Post that "The machinery in environmental microbes is still a largely untapped potential for uncovering sustainable solutions we can exploit."
