Wireless Networking

SpaceX Gets E-Band Radio Waves To Boost Starlink Broadband (spacenews.com) 26

Jason Rainbow reports via SpaceNews: SpaceX has secured conditional approval to use extremely high-frequency E-band radio waves to improve the capacity of its low Earth orbit Starlink broadband constellation. The Federal Communications Commission said March 8 it is allowing SpaceX to use E-band frequencies between second-generation Starlink satellites and gateways on the ground, alongside already approved spectrum in the Ka and Ku bands. Specifically, SpaceX is now also permitted to transmit between 71 and 76 gigahertz from space to Earth, and between 81 and 86 GHz from Earth to space, across the up to 7,500 Gen2 satellites SpaceX is allowed to deploy.

SpaceX has plans for 30,000 Gen2 satellites, on top of the 4,400 Gen1 satellites already authorized by the FCC. However, the FCC deferred action in December 2022 on whether to allow SpaceX to deploy the other three-quarters of its Gen2 constellation, which includes spacecraft closer to Earth to improve broadband speeds. The regulator also deferred action at the time on SpaceX's plans to use E-band frequencies, citing a need to first establish ground rules for using them in space. In a March 8 regulatory filing, the FCC said it found "SpaceX's proposed operations in the E-band present no new or increased frequency conflicts with other satellite operations." But the order comes with multiple conditions, including potentially forcing SpaceX to modify operations if another satellite operator also seeks to use the radio waves.

Power

New 'Water Batteries' Are Cheaper, Recyclable, And Won't Explode (sciencealert.com) 73

Clare Watson reports via ScienceAlert: By replacing the hazardous chemical electrolytes used in commercial batteries with water, scientists have developed a recyclable 'water battery' -- and solved key issues with the emerging technology, which could be a safer and greener alternative. 'Water batteries' are formally known as aqueous metal-ion batteries. These devices use metals such as magnesium or zinc, which are cheaper to assemble and less toxic than the materials currently used in other kinds of batteries.

Batteries store energy through chemical reactions at two electrodes and release it as a flow of electrons: during discharge, electrons travel through the external circuit from the negative electrode (the anode) to the positive electrode (the cathode), and charging drives them the opposite way. The fluid inside the battery, the electrolyte, completes the circuit internally by shuttling ions between the two electrodes. In a water battery, that electrolytic fluid is water with a few added salts, instead of something like sulfuric acid or a lithium-salt solution. Crucially, the team behind this latest advancement came up with a way to prevent these water batteries from short-circuiting. This happens when tiny spiky metallic growths called dendrites form on the metal anode inside a battery, busting through battery compartments. [...]

To inhibit this, the researchers coated the zinc anode of the battery with bismuth metal, which oxidizes to form a rust-like layer of bismuth oxide. This creates a protective coating that stops dendrites from forming. The feature also helps the prototype water batteries last longer, retaining more than 85 percent of their capacity after 500 cycles, the researchers' experiments showed. According to Royce Kurmelovs at The Guardian, the team has so far developed water-based prototypes of coin-sized batteries used in clocks, as well as cylindrical batteries similar to AA or AAA batteries. The team is working to improve the energy density of their water batteries, to make them comparable to the compact lithium-ion batteries found inside pocket-sized devices. Magnesium is their preferred material, being lighter than zinc with a greater potential energy density. [I]f magnesium-ion batteries can be commercialized, the technology could replace bulky lead-acid batteries within a few years.
The study has been published in the journal Advanced Materials.
Microsoft

Microsoft Engineer Warns Company's AI Tool Creates Violent, Sexual Images, Ignores Copyrights (cnbc.com) 75

An anonymous reader shares a report: On a late night in December, Shane Jones, an AI engineer at Microsoft, felt sickened by the images popping up on his computer. Jones was noodling with Copilot Designer, the AI image generator that Microsoft debuted in March 2023, powered by OpenAI's technology. Like with OpenAI's DALL-E, users enter text prompts to create pictures. Creativity is encouraged to run wild. Since the month prior, Jones had been actively testing the product for vulnerabilities, a practice known as red-teaming. In that time, he saw the tool generate images that ran far afoul of Microsoft's oft-cited responsible AI principles.

The AI service has depicted demons and monsters alongside terminology related to abortion rights, teenagers with assault rifles, sexualized images of women in violent tableaus, and underage drinking and drug use. All of those scenes, generated in the past three months, have been recreated by CNBC this week using the Copilot tool, which was originally called Bing Image Creator. "It was an eye-opening moment," Jones, who continues to test the image generator, told CNBC in an interview. "It's when I first realized, wow this is really not a safe model."

Jones has worked at Microsoft for six years and is currently a principal software engineering manager at corporate headquarters in Redmond, Washington. He said he doesn't work on Copilot in a professional capacity. Rather, as a red teamer, Jones is among an army of employees and outsiders who, in their free time, choose to test the company's AI technology and see where problems may be surfacing. Jones was so alarmed by his experience that he started internally reporting his findings in December. While the company acknowledged his concerns, it was unwilling to take the product off the market. Jones said Microsoft referred him to OpenAI and, when he didn't hear back from the company, he posted an open letter on LinkedIn asking the startup's board to take down DALL-E 3 (the latest version of the AI model) for an investigation.

China

China To Debut Large Reusable Rockets In 2025 and 2026 (spacenews.com) 43

Andrew Jones reports via SpaceNews: The China Aerospace Science and Technology Corporation (CASC) plans to launch four-meter and five-meter-diameter reusable rockets for the first time in 2025 and 2026 respectively, Wang Wei, a deputy to the National People's Congress, told China News Service March 4. The reports do not clearly identify the two rockets. CASC is known to be developing a new, 5.0m-diameter crew launch vehicle, known as the Long March 10. A single stick version would be used to launch a new-generation crew spacecraft to low Earth orbit and could potentially fly in 2025. A three-core variant will launch the "Mengzhou" crew spacecraft into trans-lunar orbit.

The rocket is key to China's plans to put astronauts on the moon before 2030. The Long March 10 lunar variant will be 92 meters long and be able to launch 27 tons into trans-lunar orbit. The 4.0-meter-diameter launcher could be a rocket earlier proposed by CASC's Shanghai Academy of Spaceflight Technology (SAST). That rocket would be able to launch up to 6,500 kg of payload to 700-kilometer sun-synchronous orbit (SSO). It would notably use engines developed by the commercial engine maker Jiuzhou Yunjian.

CASC's first move to develop a reusable rocket centered on making a recoverable version of the Long March 8. That plan appears to have been abandoned. SAST also plans to debut the 3.8m-diameter Long March 12 later this year from a new commercial launch site. While the Long March 10 has specific, defined uses for lunar and human spaceflight, the second reusable rocket would appear to be in competition with China's commercial rocket companies. While this suggests duplication of effort, it also fits into a national strategy to develop reusable rockets and support commercial ecosystems. The moves would greatly boost China's options for launch and access to space. They would also provide new capacity needed to help construct planned low Earth orbit megaconstellations.

Microsoft

Microsoft Accuses the New York Times of Doom-Mongering in OpenAI Lawsuit (engadget.com) 55

Microsoft has filed a motion seeking to dismiss key parts of a lawsuit The New York Times filed against the company and OpenAI, accusing them of copyright infringement. From a report: If you'll recall, The Times sued both companies for using its published articles to train their GPT large language models (LLMs) without permission or compensation. In its filing, the company has accused The Times of pushing "doomsday futurology" by claiming that AI technologies pose a threat to independent journalism. It follows OpenAI's court filing from late February that also seeks to dismiss some important elements of the case.

Like OpenAI before it, Microsoft accused The Times of crafting "unrealistic prompts" in an effort to "coax the GPT-based tools" to spit out responses matching its content. It also compared the media organization's lawsuit to Hollywood studios' efforts to "stop a groundbreaking new technology": the VCR. Instead of destroying Hollywood, Microsoft explained, the VCR helped the entertainment industry flourish by opening up revenue streams. LLMs are a breakthrough in artificial intelligence, it continued, and Microsoft collaborated with OpenAI to "help bring their extraordinary power to the public" because it "firmly believes in LLMs' capacity to improve the way people live and work."

NASA

Blue Origin Targets 2025 For Cargo Lander's Inaugural Moon Trip, With Humans To Follow (geekwire.com) 19

In an update on CBS' "60 Minutes" on Sunday, Blue Origin said it was aiming to send an uncrewed lander to the surface of the moon in the next 12 to 16 months. A crewed version is expected to follow. GeekWire reports: "We're expecting to land on the moon between 12 and 16 months from today," [said John Couluris, senior vice president for lunar permanence at Blue Origin]. "I understand I'm saying that publicly, but that's what our team is aiming towards." Couluris was referring to a pathfinder version of Blue Origin's nearly three-story-tall Blue Moon Mark 1 cargo lander, which is taking shape at Blue Origin's production facility in Huntsville, Ala. The Pathfinder Mission would demonstrate the MK1's capabilities -- including its hydrogen-fueled BE-7 engine, its precision landing system and its ability to deliver up to 3 tons of payload anywhere on the moon.

Blue Origin envisions building multiple cargo landers, as well as a crewed version of the Blue Moon lander that could transport NASA astronauts to and from the lunar surface. The MK1 cargo lander is designed for a single launch and delivery, but the crewed lander would be reusable. "We'll launch them to lunar orbit, and we'll leave them there," Couluris explained. "And we'll refuel them in orbit, so that multiple astronauts can use the same vehicle back and forth."

The Pathfinder Mission would be funded by Blue Origin, but NASA is providing support for other Blue Moon missions. Blue Origin's $3.4 billion contract with NASA calls for the crewed lander to be available for the Artemis 5 moon mission by 2029, with an uncrewed test flight as part of the buildup. The in-space refueling operation would make use of a cislunar transporter, built by Lockheed Martin, that could travel between low Earth orbit and lunar orbit with supplies. "We are now building with NASA the infrastructure to ensure lunar permanency," Couluris said. NASA is providing funding for the Blue Moon landing system as an alternative to SpaceX's Starship system, which is under development at SpaceX's Starbase in South Texas. The crewed Starship lunar lander is scheduled to come into play for Artemis 3, a milestone landing mission that's currently scheduled for 2026. [...]

Blue Origin plans to send the MK1 lander to the moon on its reusable New Glenn rocket, which is also under development. A couple of weeks ago, a pathfinder version of that rocket was raised on a Florida launch pad for the first time, and it's currently going through a series of cryogenic tanking tests. Blue Origin CEO Dave Limp, who was brought over to the company from Amazon last year to accelerate work on New Glenn, said in a LinkedIn post that he's "looking forward to bringing this heavy-lift capacity to our customers later this year." One of the early launches is tasked with sending a pair of NASA probes to Mars.

AI

How AI is Taking Water From the Desert (msn.com) 108

Microsoft built two datacenters west of Phoenix, with plans for seven more (serving, among other companies, OpenAI). "Microsoft has been adding data centers at a stupendous rate, spending more than $10 billion on cloud-computing capacity in every quarter of late," writes The Atlantic. "One semiconductor analyst called this 'the largest infrastructure buildout that humanity has ever seen.'"

But is this part of a concerning trend? Microsoft plans to absorb its excess heat with a steady flow of air and, as needed, evaporated drinking water. Use of the latter is projected to reach more than 50 million gallons every year. That might be a burden in the best of times. As of 2023, it seemed absurd. Phoenix had just endured its hottest summer ever, with 55 days of temperatures above 110 degrees. The weather strained electrical grids and compounded the effects of the worst drought the region has faced in more than a millennium. The Colorado River, which provides drinking water and hydropower throughout the region, has been dwindling. Farmers have already had to fallow fields, and a community on the eastern outskirts of Phoenix went without tap water for most of the year... [T]here were dozens of other facilities I could visit in the area, including those run by Apple, Amazon, Meta, and, soon, Google. Not too far from California, and with plenty of cheap land, Greater Phoenix is among the fastest-growing hubs in the U.S. for data centers....

Microsoft, the biggest tech firm on the planet, has made ambitious plans to tackle climate change. In 2020, it pledged to be carbon-negative (removing more carbon than it emits each year) and water-positive (replenishing more clean water than it consumes) by the end of the decade. But the company also made an all-encompassing commitment to OpenAI, the most important maker of large-scale AI models. In so doing, it helped kick off a global race to build and deploy one of the world's most resource-intensive digital technologies. Microsoft operates more than 300 data centers around the world, and in 2021 declared itself "on pace to build between 50 and 100 new datacenters each year for the foreseeable future...."

Researchers at UC Riverside estimated last year... that global AI demand could cause data centers to suck up 1.1 trillion to 1.7 trillion gallons of freshwater by 2027. A separate study from a university in the Netherlands, this one peer-reviewed, found that AI servers' electricity demand could grow, over the same period, to be on the order of 100 terawatt hours per year, about as much as the entire annual consumption of Argentina or Sweden... [T]ensions over data centers' water use are cropping up not just in Arizona but also in Oregon, Uruguay, and England, among other places in the world.

The article points out that Microsoft "is transitioning some data centers, including those in Arizona, to designs that use less or no water, cooling themselves instead with giant fans." And an analysis (commissioned by Microsoft) on the impact of one building said it would use about 56 million gallons of drinking water each year, equivalent to the amount used by 670 families, according to the article. "In other words, a campus of servers pumping out ChatGPT replies from the Arizona desert is not about to make anyone go thirsty."
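The 670-family equivalence can be sanity-checked with simple division. Both input figures come from the article; the per-day conversion below is our own addition, and the resulting ~229 gallons per family per day is broadly in line with typical US household use:

```python
# Simple division check of the article's figures; only the per-day
# conversion is added here.
GALLONS_PER_YEAR = 56_000_000   # one building's projected annual water use
FAMILIES = 670                  # the analysis's stated household equivalent

per_family_year = GALLONS_PER_YEAR / FAMILIES
per_family_day = per_family_year / 365

print(f"~{per_family_year:,.0f} gallons per family per year")
print(f"~{per_family_day:,.0f} gallons per family per day")
```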
Intel

Intel Puts 1nm Process (10A) on the Roadmap For 2027 (tomshardware.com) 35

Intel's previously-unannounced Intel 10A (analogous to 1nm) will enter production/development in late 2027, marking the arrival of the company's first 1nm node, and its 14A (1.4nm) node will enter production in 2026. The company is also working to create fully autonomous AI-powered fabs in the future. Tom's Hardware: Intel's Keyvan Esfarjani, the company's EVP and GM of Foundry Manufacturing and Supply, held a very insightful session that covered the company's latest developments and showed how the roadmap unfolds over the coming years. Here, we can see two charts, with the first outlining the company's K-WSPW (thousands of wafer starts per week) capacity for Intel's various process nodes. Notably, capacity typically indicates how many wafers can be started, but not the total output -- output varies based on yields. You'll notice there isn't a label for the Y-axis, which would give us a direct read on Intel's production volumes. However, this does give us a solid idea of the proportionality of Intel's planned node production over the next several years.

Intel did not specify the arrival date of its coming 14A node in its previous announcements, but here, the company indicates it will begin production of the Intel 14A node in 2026. Even more importantly, Intel will begin production/development of its as-yet-unannounced 10A node in late 2027, filling out its roster of nodes produced with EUV technology. Intel's 'A' suffix in its node naming convention represents Angstroms, and 10 Angstroms converts to 1nm, meaning this is the company's first 1nm-class node. Intel hasn't shared any details about the 10A/1nm node but has told us that it classifies a new node as at least having a double-digit power/performance improvement. Intel CEO Pat Gelsinger has told us the cutoff for a new node is around a 14% to 15% improvement, so we can expect that 10A will have at least that level of improvement over the 14A node. (For example, the difference between Intel 7 and Intel 4 was a 15% improvement.)
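Since the "A" suffix just denotes angstroms, the name-to-nanometer conversion the article walks through is a one-liner. This helper is purely illustrative, and worth the caveat in its docstring: node names are marketing labels, not measured feature sizes.

```python
def node_to_nm(node: str) -> float:
    """Convert an Intel angstrom-era node name (e.g. '14A', '10A') to nanometers.

    Assumes the numeric prefix is the label in angstroms (1 nm = 10 A).
    Illustrative only -- node names are marketing labels, not gate sizes.
    """
    angstroms = float(node.rstrip("Aa"))
    return angstroms / 10.0

print(node_to_nm("14A"))  # 1.4
print(node_to_nm("10A"))  # 1.0
```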

China

China Breakthrough Promises Optical Discs That Store Hundreds of Terabytes (theregister.com) 38

Optical discs that can store up to 200 TB of data could be possible with a new technology developed in China. If commercialized, it could revive optical media as an alternative to hard disk or tape for cost-effective long-term storage. The Register: Researchers at the University of Shanghai for Science and Technology (USST) and Shanghai Institute of Optics and Fine Mechanics (SIOM) say they have demonstrated that optical storage is possible up to the petabit level by using hundreds of layers, while also claiming to have broken the optical diffraction barrier limiting how close together recorded features can be.

In an article published in Nature titled "A 3D nanoscale optical disk memory with petabit capacity," the researchers detail how they developed a novel optical storage medium they call dye-doped photoresist (DDPR) with aggregation-induced emission luminogens (AIE-DDPR). When applied as a recording layer, this is claimed to outperform other optical systems and hard drives in terms of areal density -- the amount of storage per unit of area. To be specific, the researchers claim it to be 125 times that of a multi-layer optical disk based on gold nanorods, and 24 times that of the most advanced hard drives (based on data from 2022). The proposed recording and retrieval processes for this medium calls for two laser beams each. For optical writing, a 515 nm femtosecond Gaussian laser beam and a doughnut-shaped 639 nm continuous wave laser beam are focused on the recording area.

Microsoft

Microsoft Strikes Deal With Mistral in Push Beyond OpenAI (ft.com) 13

Microsoft has struck a deal with French AI startup Mistral as it seeks to broaden its involvement in the fast-growing industry beyond OpenAI. From a report: The US tech giant will provide the 10-month-old Paris-based company with help in bringing its AI models to market. Microsoft will also take a minor stake in Mistral, although the financial details have not been disclosed. The partnership makes Mistral the second company to provide commercial language models available on Microsoft's Azure cloud computing platform. Microsoft has already invested about $13 billion in San Francisco-based OpenAI, an alliance that is being reviewed by competition watchdogs in the US, EU and UK. Other Big Tech rivals, such as Google and Amazon, are also investing heavily in building generative AI -- software that can produce text, images and code in seconds -- which analysts believe has the capacity to shake up industries across the world. WSJ adds: On Monday, Mistral plans to announce a new AI model, called Mistral Large, that chief executive Arthur Mensch said can perform some reasoning tasks comparably with GPT-4, OpenAI's most advanced language model to date, and Gemini Ultra, Google's new model. Mensch said his new model cost less than 20 million euros, the equivalent of roughly $22 million, to train. By contrast OpenAI Chief Executive Sam Altman said last year after the release of GPT-4 that training his company's biggest models cost "much more than" $50 million to $100 million.
Power

Are Corporate Interests Holding Back US Electrical Grid Expansion? (ieee.org) 133

Long-time Slashdot reader BishopBerkeley writes: Though it does not come as much of a surprise, a new study highlighted in IEEE Spectrum delves into how corporate profit motives are preventing the upgrading and the expansion of the U.S. electrical grid. The full report can be downloaded here from the source [the nonprofit economic research group NBER].

Beyond not wanting to open the market to competition, utilities don't want to lose control over regional infrastructure, writes IEEE Spectrum. "[I]nterregional lines threaten utility companies' dominance over the nation's power supply. In the power industry, asset ownership provides control over rules that govern energy markets and transmission service and expansion. When upstart entities build power plants and transmission lines, they may be able to dilute utility companies' control over power-industry rules and prevent utilities from dictating decisions about transmission expansion."

The article begins by noting that "The United States is not building enough transmission lines to connect regional power networks. The deficit is driving up electricity prices, reducing grid reliability, and hobbling renewable-energy deployment." Utilities can stall transmission expansion because out-of-date laws sanction these companies' sweeping control over transmission development... One of the main values of connecting regional networks is that it enables, and is in fact critical for, incorporating renewable energy... Plus, adding interregional transmission for renewables can significantly reduce costs for consumers. Such connections allow excess wind and solar power to flow to neighboring regions when weather conditions are favorable and allow the import of energy from elsewhere when renewables are less productive.

Even without renewables, better-integrated networks generally lower costs for consumers because they reduce the amount of generation capacity needed overall and decrease energy market prices. Interregional transmission also enhances reliability, particularly during extreme weather...

Addressing the transmission shortage is on the agenda in Washington, but utility companies are lobbying against reforms.

The article points out that now investors and entrepreneurs "are developing long-distance direct-current lines, which are more efficient at moving large amounts of energy over long distances, compared with AC," and also "sidestep the utility-dominated transmission-expansion planning processes."

They're already in use in China, and are also becoming Europe's preferred choice...
Data Storage

Scientists Create DVD-Sized Disk Storing 1 Petabit (125,000 Gigabytes) of Data (popsci.com) 113

Popular Science points out that for encoding data, "optical disks almost always offer just a single, 2D layer — that reflective, silver underside."

"If you could boost a disk's number of available, encodable layers, however, you could hypothetically gain a massive amount of extra space..." Researchers at the University of Shanghai for Science and Technology recently set out to do just that, and published the results earlier this week in the journal Nature. Using a laser spot just 54 nanometers wide, the team managed to record 100 layers of data onto an optical disk, with each tier separated by just 1 micrometer. The final result is an optical disk with a three-dimensional stack of data layers capable of holding a whopping 1 petabit (Pb) of information — that's equivalent to 125,000 gigabytes of data...

As Gizmodo offers for reference, that same petabit of information would require roughly a six-and-a-half-foot-tall stack of hard disk drives — if you tried to encode the same amount of data onto Blu-rays, you'd need around 10,000 blank ones to complete your (extremely inefficient) challenge.
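The headline conversion is straightforward decimal (SI) unit arithmetic, as a quick check confirms:

```python
# Decimal (SI) unit conversion for the headline capacity.
bits = 1e15                 # 1 petabit
bytes_total = bits / 8
gigabytes = bytes_total / 1e9
terabytes = bytes_total / 1e12

print(f"{gigabytes:,.0f} GB")   # 125,000 GB
print(f"{terabytes:,.0f} TB")   # 125 TB
```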

To pull off their accomplishment, engineers needed to create an entirely new material for their optical disk's film... AIE-DDPR film utilizes a combination of specialized, photosensitive molecules capable of absorbing photonic data at a nanoscale level, which is then encoded using a high-tech dual-laser array. Because AIE-DDPR is so incredibly transparent, designers could apply layer-upon-layer to an optical disk without worrying about degrading the overall data. This basically generated a 3D "box" for digitized information, thus exponentially raising the normal-sized disk's capacity.

Thanks to long-time Slashdot reader hackingbear for sharing the news.
iPhone

Apple Says the iPhone 15's Battery Has Double the Promised Lifespan (engadget.com) 51

Apple has updated the iPhone 15's battery lifespan, noting the new handsets can retain 80 percent of their original charging capacity after 1,000 cycles -- double the company's previous estimate -- without any new hardware or software updates. From a report: Not so coincidentally, the change will arrive in time for upcoming EU regulations that will assign an energy grade for phones' battery longevity. Before today, Apple's online support documents quoted iPhone batteries as maintaining 80 percent of their original full charge after 500 cycles. But after the company retested long-term battery health in its 2023 smartphones -- iPhone 15, iPhone 15 Plus, iPhone 15 Pro and iPhone 15 Pro Max -- it found they can retain 80 percent capacity after at least 1,000 cycles. The company said its support documents will be updated on Tuesday to reflect the new estimate.
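One way to read the revised figure: if capacity fades geometrically with each cycle (a simplifying assumption of ours, not Apple's stated model), the implied per-cycle loss roughly halves under the new estimate:

```python
# If capacity fades geometrically each cycle -- a simplifying assumption,
# not Apple's stated model -- r satisfies r ** cycles = remaining fraction.
def per_cycle_retention(remaining: float, cycles: int) -> float:
    """Implied average fraction of capacity kept per charge cycle."""
    return remaining ** (1 / cycles)

old = per_cycle_retention(0.80, 500)    # prior estimate: ~0.99955
new = per_cycle_retention(0.80, 1000)   # revised estimate: ~0.99978
print(f"implied loss per cycle: old {1 - old:.4%}, new {1 - new:.4%}")
```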
Biotech

What Happens After Throughput to DNA Storage Drives Surpasses 2 Gbps? (ieee.org) 35

High-capacity DNA data storage "is closer than you think," Slashdot wrote in 2019.

Now IEEE Spectrum brings an update on where we're at — and where we're headed — by a participant in the DNA storage collaboration between Microsoft and the Molecular Information Systems Lab of the Paul G. Allen School of Computer Science and Engineering at the University of Washington. "Organizations around the world are already taking the first steps toward building a DNA drive that can both write and read DNA data," while "funding agencies in the United States, Europe, and Asia are investing in the technology stack required to field commercially relevant devices." The challenging part is learning how to get the information into, and back out of, the molecule in an economically viable way... For a DNA drive to compete with today's archival tape drives, it must be able to write about 2 gigabits per second, which at demonstrated DNA data storage densities is about 2 billion bases per second. To put that in context, I estimate that the total global market for synthetic DNA today is no more than about 10 terabases per year, which is the equivalent of about 300,000 bases per second over a year. The entire DNA synthesis industry would need to grow by approximately 4 orders of magnitude just to compete with a single tape drive. Keeping up with the total global demand for storage would require another 8 orders of magnitude of improvement by 2030. But humans have done this kind of scaling up before. Exponential growth in silicon-based technology is how we wound up producing so much data. Similar exponential growth will be fundamental in the transition to DNA storage...
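The scale-up the author describes checks out with back-of-the-envelope arithmetic; both inputs below come straight from the article (a tape-competitive drive needs ~2 billion bases per second, while today's entire synthesis market is ~10 terabases per year):

```python
import math

# Figures from the article: a tape-competitive drive needs ~2 billion bases/s,
# while the whole synthesis market supplies ~10 terabases/year.
target_rate = 2e9                          # bases per second
seconds_per_year = 365 * 24 * 3600
market_rate = 10e12 / seconds_per_year     # ~317,000 bases per second

gap = target_rate / market_rate
print(f"market rate: ~{market_rate:,.0f} bases/s")
print(f"required scale-up: {gap:,.0f}x, ~{math.log10(gap):.0f} orders of magnitude")
```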

Companies like DNA Script and Molecular Assemblies are commercializing automated systems that use enzymes to synthesize DNA. These techniques are replacing traditional chemical DNA synthesis for some applications in the biotechnology industry... [I]t won't be long before we can combine the two technologies into one functional device: a semiconductor chip that converts digital signals into chemical states (for example, changes in pH), and an enzymatic system that responds to those chemical states by adding specific, individual bases to build a strand of synthetic DNA. The University of Washington and Microsoft team, collaborating with the enzymatic synthesis company Ansa Biotechnologies, recently took the first step toward this device... The path is relatively clear; building a commercially relevant DNA drive is simply a matter of time and money...

At the same time, advances in DNA synthesis for DNA storage will increase access to DNA for other uses, notably in the biotechnology industry, and will thereby expand capabilities to reprogram life. Somewhere down the road, when a DNA drive achieves a throughput of 2 gigabases per second (or 120 gigabases per minute), this box could synthesize the equivalent of about 20 complete human genomes per minute. And when humans combine our improving knowledge of how to construct a genome with access to effectively free synthetic DNA, we will enter a very different world... We'll be able to design microbes to produce chemicals and drugs, as well as plants that can fend off pests or sequester minerals from the environment, such as arsenic, carbon, or gold. At 2 gigabases per second, constructing biological countermeasures against novel pathogens will take a matter of minutes. But so too will constructing the genomes of novel pathogens. Indeed, this flow of information back and forth between the digital and the biological will mean that every security concern from the world of IT will also be introduced into the world of biology...
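The genome-per-minute figure works out if a "complete human genome" means the roughly 6-gigabase diploid genome — an assumption on our part, since the article doesn't state which genome size it uses:

```python
# The article's arithmetic made explicit. Assumption (ours): a "complete
# human genome" here means the ~6-gigabase diploid genome.
rate = 2e9                      # bases per second
per_minute = rate * 60          # 120 gigabases per minute
human_genome = 6e9              # bases (diploid)
genomes_per_minute = per_minute / human_genome
print(genomes_per_minute)  # 20.0
```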

The future will be built not from DNA as we find it, but from DNA as we will write it.

The article makes an interesting point — that biology labs around the world already order chemically-synthesized ssDNA, "delivered in lengths of up to several hundred bases," and sequence DNA molecules up to thousands of bases in length.

"In other words, we already convert digital information to and from DNA, but generally using only sequences that make sense in terms of biology."
Education

NYC Fails Controversial Remote Learning Snow Day 'Test,' Public Schools Chancellor Says (nbcnews.com) 60

New York City's public schools chancellor said the city did not pass Tuesday's remote learning "test" due to technical issues. From a report: "As I said, this was a test. I don't think that we passed this test," David Banks said during a news briefing, adding that he felt "disappointed, frustrated and angry" as a result of the technical issues. NYC Public Schools did a lot of work to prepare for the remote learning day, Banks said, but shortly before 8 a.m. they were notified that parents and students were having difficulty signing onto remote learning.

This is the first time NYC Public Schools has implemented remote learning on a snow day since introducing the no snow day policy in 2022. The district serves 1.1 million students in more than 1,800 schools. Banks blamed the technical issues on IBM, which helps facilitate the city's remote learning program. "IBM was not ready for primetime," Banks said, adding that the company was overwhelmed with the surge of people signing on for school. IBM has since expanded their capacity and a total of 850,000 students and teachers are currently online, Banks said. "We'll work harder to do better next time," he said, adding that there will be a deeper analysis into what went wrong.

Power

28-Ton, 1.2-Megawatt Tidal Kite Is Now Exporting Power To the Grid (newatlas.com) 65

Minesto, a marine energy tech developer based in Sweden, has deployed its new Dragon 12 tidal energy harvester to the Faroe Islands. Operating like an underwater kite, the Dragon 12 "uses lift generated by tidal flows to fly patterns faster than the currents, harvesting renewable energy," reports New Atlas. From the report: Where devices like Orbital's O2 tidal turbine more or less just sit there in the water harvesting energy from tidal currents, Minesto's Dragon series are anchored to the sea bed, and fly around like kites, treating the currents like wind. Just as land-based wind energy kites fly in figure 8 patterns to accelerate themselves faster than the wind, so does the Dragon underwater. This, says Minesto, lets the Dragon pull more energy from a given tidal current than other designs -- and it also changes the economic equations for relevant sites, making slower tidal flows worth exploiting.

These are by no means small kites -- the Dragon 12 needs to be disassembled to fit in a shipping container. It rocks a monster 12-meter (39-ft) wingspan, and weighs no less than 28 tons. But compared to other offshore power options like wind turbines, it's an absolute minnow, and extremely easy to install using a single smallish boat and a sea bed tether. As with any renewable energy project, the key figure here is LCoE (levelized cost of energy) -- so what's it gonna cost? Well, back in 2017, Minesto projected about US$108/MWh once its first hundred megawatts of capacity are installed -- with costs falling thereafter as low as $54/MWh.
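LCoE is, at its simplest, lifetime cost divided by lifetime energy delivered, usually with both discounted. A minimal sketch; the function and every number below are illustrative assumptions, not Minesto's figures:

```python
def lcoe(capex, annual_opex, annual_mwh, years, discount_rate=0.07):
    """Levelized cost of energy ($/MWh): discounted lifetime cost
    divided by discounted lifetime energy output."""
    cost = capex + sum(annual_opex / (1 + discount_rate) ** y
                       for y in range(1, years + 1))
    energy = sum(annual_mwh / (1 + discount_rate) ** y
                 for y in range(1, years + 1))
    return cost / energy

# Illustrative inputs only: a 1.2 MW device at a 40% capacity factor
# would deliver about 4,200 MWh/year (1.2 MW * 8,760 h * 0.4).
print(f"${lcoe(5_000_000, 150_000, 4_200, 20):.0f}/MWh")
```

The same structure explains why slower tidal sites become "worth exploiting": anything that raises annual_mwh for a given capex pushes the $/MWh figure down.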

The Dragon 12, like other tidal devices, will be more effective in some places than others -- and Denmark's Faroe Islands, an archipelago in the chilly North Atlantic between Scotland and Iceland, offer ideal conditions. Home to about 55,000 people and more than a million puffins, the Faroe Islands funnel tidal currents through a number of slim channels. This accelerates the water significantly, and thus increases the energy that devices like the Dragon 12 can harvest. That's where the first Dragon has been deployed, and on Friday, it was connected to the local power grid to begin delivering energy.
You can watch a video of the Dragon 12 on YouTube.
Businesses

Sam Altman Seeks Trillions of Dollars To Reshape Business of Chips and AI (wsj.com) 54

Sam Altman was already trying to lead the development of human-level artificial intelligence. Now he has another great ambition: raising trillions of dollars to reshape the global semiconductor industry. From a report: The OpenAI chief executive officer is in talks with investors including the United Arab Emirates government to raise funds for a wildly ambitious tech initiative that would boost the world's chip-building capacity, expand its ability to power AI, among other things, and cost several trillion dollars, according to people familiar with the matter. The project could require raising as much as $5 trillion to $7 trillion, one of the people said.

The fundraising plans, which face significant obstacles, are aimed at solving constraints to OpenAI's growth, including the scarcity of the pricey AI chips required to train large language models behind AI systems such as ChatGPT. Altman has often complained that there aren't enough of these kinds of chips -- known as graphics processing units, or GPUs -- to power OpenAI's quest for artificial general intelligence, which it defines as systems that are broadly smarter than humans. Such a sum of investment would dwarf the current size of the global semiconductor industry. Global sales of chips were $527 billion last year and are expected to rise to $1 trillion annually by 2030. Global sales of semiconductor manufacturing equipment -- the costly machinery needed to run chip factories -- last year were $100 billion, according to an estimate by the industry group SEMI.

Japan

TSMC To Build Second Japan Chip Factory, Raising Investment To $20 Billion (reuters.com) 44

Taiwanese chipmaker TSMC announced plans to build a second chip factory in Japan by the end of 2027, bringing total investment in its Japan venture to more than $20 billion. "Taiwan Semiconductor Manufacturing Co announced plans in 2021 to build a $7 billion chip plant in Kumamoto in southern Japan's Kyushu," notes Reuters. From the report: In a statement, TSMC, the world's largest contract chipmaker, said its majority-owned unit Japan Advanced Semiconductor Manufacturing in Kumamoto would build a second fabrication plant, or fab, in response to rising customer demand. The second fab will begin construction by the end of this year, and with both factories the site is expected to have a total monthly capacity of more than 100,000 12-inch wafers for automotive, industrial, consumer and high-performance computing-related applications, TSMC said. The capacity plan may be further adjusted based on customer demand, it added.

TSMC's expansion in Kyushu is central to the Japanese government's efforts to rebuild the country's position as a leading chip manufacturing centre and to ensure a stable supply of chips amid trade tensions between the United States and China. The decision to build a second fab is a vote of confidence by TSMC in Japan, where construction of the first fab has run smoothly and which, Reuters has reported, TSMC sees as a source of diligent workers and a government that is easy to deal with.

Data Storage

Report Reveals Decline In Quality of USB Sticks, MicroSD Cards (techspot.com) 71

A new report from German data recovery company CBL found that devices using NAND chips from reputable brands are declining in quality, with reduced capacity and their manufacturers' logos removed. Furthermore, some USB sticks use the old trick of soldering a microSD card onto the board. TechSpot reports: Most of the janky USB sticks CBL examined were promotional gifts, the kind given away free with products or by companies at conferences. However, there were some "branded" products that fell into the same inferior-quality category, though CBL didn't say if these were well-known mainstream brands or the kind of brands you've probably never heard of.

Technological advancements have also affected these NAND chips, but not in a good way. The chips originally used single-level cell (SLC) memory cells that only stored one bit each, offering less data density but better performance and reliability. In order to increase the amount of storage the chips offered, manufacturers started moving to four bits per cell (QLC), decreasing the endurance and retention. Combined with the questionable components, it's why CBL warns that "You shouldn't rely too much on the reliability of flash memory."
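The density-versus-reliability trade-off follows directly from the cell physics: storing n bits per cell means distinguishing 2^n charge levels within roughly the same voltage window, so the margin between adjacent levels shrinks exponentially. A rough sketch (the normalized window is an assumption for illustration; real windows depend on the process):

```python
# More bits per cell = exponentially more charge levels to tell apart.
VOLTAGE_WINDOW = 1.0  # normalized; real windows vary by NAND process

for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3), ("QLC", 4)]:
    levels = 2 ** bits
    margin = VOLTAGE_WINDOW / (levels - 1)  # spacing between adjacent levels
    print(f"{name}: {bits} bit(s)/cell, {levels} levels, "
          f"relative margin {margin:.3f}")
```

QLC must resolve 16 levels where SLC resolves 2, which is why charge leakage that SLC would shrug off can flip a QLC read -- hence the reduced endurance and retention.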

The report illustrates how some of the components found in the devices had their manufacturers' names removed or obscured. One simply had text printed over the top of the company name, while on another the name had been scrubbed off completely. There's also a photo of a microSD card found inside a USB stick that had all of its identifying markings removed. It's always wise to be careful when choosing your storage device and to beware of offers that seem too good to be true.

Bitcoin

Over 2 Percent of the US's Electricity Generation Now Goes To Bitcoin (arstechnica.com) 106

"In the last few years, the U.S. has seen a boom in cryptocurrency mining," writes Ars Technica. But they add that the U.S. government, specifically a crucial branch of the U.S. Department of Energy, "is now trying to track exactly what that means for the consumption of electricity."

"While its analysis is preliminary, the Energy Information Administration (EIA) estimates that large-scale cryptocurrency operations are now consuming over 2 percent of the U.S.'s electricity. That's roughly the equivalent of having added an additional state to the grid over just the last three years."

While there is some small-scale mining that goes on with personal computers and small rigs, most cryptocurrency mining has moved to large collections of specialized hardware. While this hardware can be pricy compared to personal computers, the main cost for these operations is electricity use, so the miners will tend to move to places with low electricity rates. The EIA report notes that, in the wake of a crackdown on cryptocurrency in China, a lot of that movement has involved relocation to the U.S., where keeping electricity prices low has generally been a policy priority.

One independent estimate made by the Cambridge Centre for Alternative Finance had the US as the home of just over 3 percent of the global bitcoin mining at the start of 2020. By the start of 2022, that figure was nearly 38 percent... The EIA decided it needed a better grip on what was going on... To better understand the implications of this major new drain on the U.S. electric grid, the EIA will be performing monthly analyses of bitcoin operations during the first half of 2024.

The Energy Information Administration identified 137 bitcoin mining operators, of which 101 responded to inquiries about their full-capacity power supply. "If running all-out, those 101 facilities would consume 2.3 percent of the US's average power demand," the article points out. And they add that in at least five instances, the agency found bitcoin operators had "moved in near underutilized power plants and sent generation soaring again...

"These are almost certainly fossil fuel plants that might be reasonable candidates for retirement if it weren't for their use to supply bitcoin miners."
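To put the 2.3 percent figure in absolute terms, a rough conversion (the US annual generation total below is an assumed approximation, not a number from the EIA report):

```python
# Rough scale of the EIA's 2.3% full-capacity estimate.
US_ANNUAL_GENERATION_TWH = 4_200  # approximate US total, assumption
BITCOIN_SHARE = 0.023             # EIA preliminary full-capacity estimate

btc_twh = US_ANNUAL_GENERATION_TWH * BITCOIN_SHARE
avg_gw = btc_twh * 1_000 / 8_760  # TWh/year -> average gigawatts
print(f"about {btc_twh:.0f} TWh/year, "
      f"or {avg_gw:.1f} GW of continuous demand")
```

Roughly 11 GW of continuous demand is on the order of a mid-sized state's average load, consistent with the article's "additional state" comparison.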
