Microsoft

How Microsoft Employees Pressured the Company Over Its Oil Industry Ties (grist.org) 1

The non-profit environmental site Grist reports on "an internal, employee-led effort to raise ethical concerns about Microsoft's work helping oil and gas producers boost their profits by providing them with cloud computing resources and AI software tools." There have been some disappointments — but also some successes, starting with the founding of an internal sustainability group within Microsoft that grew to nearly 10,000 employees: Former Microsoft employees and sources familiar with tech industry advocacy say that, broadly speaking, employee pressure has had an enormous impact on sustainability at Microsoft, encouraging it to announce industry-leading climate goals in 2020 and support key federal climate policies.

But convincing the world's most valuable company to forgo lucrative oil industry contracts proved far more difficult... Over the past seven years, Microsoft has announced dozens of new deals with oil and gas producers and oil field services companies, many explicitly aimed at unlocking new reserves, increasing production, and driving up oil industry profits...

As concerns over the company's fossil fuel work mounted, Microsoft was gearing up to make a big sustainability announcement. In January 2020, the company pledged to become "carbon negative" by 2030, meaning that in 10 years, the tech giant would pull more carbon out of the air than it emitted on an annual basis... For nearly two years, employees watched and waited. Following its carbon negative announcement, Microsoft quickly expanded its internal carbon tax, which charges the company's business groups a fee for the carbon they emit via electricity use, employee travel, and more. It also invested in new technologies like direct air capture and purchased carbon removal contracts from dozens of projects worldwide.
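For readers curious how such an internal carbon fee works mechanically, here is a minimal sketch: each business group is charged a flat rate per ton of CO2-equivalent it reports. The rate and the emission categories below are hypothetical placeholders, not Microsoft's actual figures.

```python
# Illustrative sketch of an internal carbon fee of the kind described above.
# The rate and the emission categories are hypothetical placeholders, not
# Microsoft's actual figures.

HYPOTHETICAL_FEE_PER_TON_USD = 15.0  # assumed rate, for illustration only

def internal_carbon_fee(emissions_tons_by_source: dict[str, float]) -> float:
    """Charge a business group a flat fee per metric ton of CO2-equivalent."""
    total_tons = sum(emissions_tons_by_source.values())
    return total_tons * HYPOTHETICAL_FEE_PER_TON_USD

# Example: a business group reporting electricity use and employee travel.
group_emissions = {"electricity": 1200.0, "employee_travel": 300.0}  # tons CO2e
print(f"Internal fee owed: ${internal_carbon_fee(group_emissions):,.2f}")
```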

But Microsoft's work with the oil industry continued unabated, with the company announcing a slew of new partnerships in 2020 and 2021 aimed at cutting fossil fuel producers' costs and boosting production.

The last straw for one technical account manager was a 2023 LinkedIn post by a Microsoft technical architect about the company's work on oil and gas industry automation. The post said Microsoft's cloud service was "unlocking previously inaccessible reserves" for the fossil fuel industry, promising that with Microsoft's Azure service, "the future of oil and gas exploration and production is brighter than ever."

The technical account manager resigned from the position they'd held for nearly a decade, citing the blog post in a resignation letter which accused Microsoft of "extending the age of fossil fuels, and enabling untold emissions."

Thanks to Slashdot reader joshuark for sharing the news.
Businesses

Squarespace To Go Private in $6.9 Billion Deal With Permira 1

Squarespace said on Monday it has agreed to be taken private by private equity firm Permira in an all-cash deal valued at approximately $6.9 billion. Under the terms of the agreement, Squarespace stockholders will receive $44.00 per share in cash, representing a premium of about 29% over the company's 90-day volume weighted average trading price.
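The premium figure follows from simple arithmetic; the quick sketch below back-calculates the implied 90-day VWAP from the announced $44.00 price and the stated 29% premium (the VWAP here is derived for illustration, not a separately reported number).

```python
# Back-of-the-envelope check of the stated premium. The implied VWAP is
# derived from the announced numbers, not a separately reported figure.

offer_price = 44.00      # all-cash offer per share
stated_premium = 0.29    # ~29% over the 90-day volume-weighted average price

implied_vwap = offer_price / (1 + stated_premium)
print(f"Implied 90-day VWAP: ~${implied_vwap:.2f}")  # roughly $34
```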

Upon completion of the transaction, Squarespace will become a privately held company. Founder and CEO Anthony Casalena will continue to lead the business and be one of the largest shareholders following the deal. "Squarespace has been at the forefront of providing services to businesses looking to establish themselves online for more than two decades. We are excited to continue building on that foundation, and expanding our offerings, for years to come," said Casalena in a statement.

"We are thrilled to be partnering with Permira on this new leg of our journey, alongside our existing long-term investors General Atlantic and Accel, who strongly believe in the future of Squarespace," he added.
AI

OpenAI's Sam Altman Wants AI in the Hands of the People - and Universal Basic Compute? (youtube.com) 26

OpenAI CEO Sam Altman gave an hour-long interview to the "All-In" podcast (hosted by Chamath Palihapitiya, Jason Calacanis, David Sacks and David Friedberg).

And when asked about this summer's launch of the next version of ChatGPT, Altman said they hoped to "be thoughtful about how we do it, like we may release it in a different way than we've released previous models..."

Altman: One of the things that we really want to do is figure out how to make more advanced technology available to free users too. I think that's a super-important part of our mission, and this idea that we build AI tools and make them super-widely available — free or, you know, not-that-expensive, whatever that is — so that people can use them to go kind of invent the future, rather than the magic AGI in the sky inventing the future, and showering it down upon us. That seems like a much better path. It seems like a more inspiring path.

I also think it's where things are actually heading. So it makes me sad that we have not figured out how to make GPT-4-level technology available to free users. It's something we really want to do...

Q: It's just very expensive, I take it?

Altman: It's very expensive.

But Altman said later he's confident they'll be able to reduce cost. Altman: I don't know, like, when we get to intelligence too cheap to meter, and so fast that it feels instantaneous to us, and everything else, but I do believe we can get there for, you know, a pretty high level of intelligence. It's important to us, it's clearly important to users, and it'll unlock a lot of stuff.
Altman also thinks there are "great roles for both" open-source and closed-source models, saying "We've open-sourced some stuff, we'll open-source more stuff in the future.

"But really, our mission is to build toward AGI, and to figure out how to broadly distribute its benefits... " Altman even said later that "A huge part of what we try to do is put the technology in the hands of people..." Altman: The fact that we have so many people using a free version of ChatGPT that we don't — you know, we don't run ads on, we don't try to make money on it, we just put it out there because we want people to have these tools — I think has done a lot to provide a lot of value... But also to get the world really thoughtful about what's happening here. It feels to me like we just stumbled on a new fact of nature or science or whatever you want to call it... I am sure, like any other industry, I would expect there to be multiple approaches and different peoiple like different ones.
Later Altman said he was "super-excited" about the possibility of an AI tutor that could reinvent how people learn, and "doing faster and better scientific discovery... that will be a triumph."

But at some point the discussion led him to where the power of AI intersects with the concept of a universal basic income: Altman: Giving people money is not going to go solve all the problems. It is certainly not going to make people happy. But it might solve some problems, and it might give people a better horizon with which to help themselves.

Now that we see some of the ways that AI is developing, I wonder if there's better things to do than the traditional conceptualization of UBI. Like, I wonder — I wonder if the future looks something more like Universal Basic Compute than Universal Basic Income, and everybody gets like a slice of GPT-7's compute, and they can use it, they can re-sell it, they can donate it to somebody to use for cancer research. But what you get is not dollars but this like slice — you own part of the productivity.
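Altman's "slice of GPT-7's compute" is only a thought experiment, but the use, re-sell, and donate mechanics he describes can be sketched as a simple transferable ledger. Everything in the sketch below (the account class, the unit, the transfer rules) is purely illustrative, not anything OpenAI has proposed concretely.

```python
# Purely illustrative sketch of the "universal basic compute" idea described
# above: each person holds a transferable allotment of compute that can be
# used, re-sold, or donated. The class, units, and rules are hypothetical.

from dataclasses import dataclass

@dataclass
class ComputeAccount:
    owner: str
    units: float = 0.0  # hypothetical "slice of GPT-7's compute"

    def use(self, amount: float) -> None:
        """Spend units on inference for yourself."""
        if amount > self.units:
            raise ValueError("insufficient compute")
        self.units -= amount

    def transfer(self, other: "ComputeAccount", amount: float) -> None:
        """Re-sell or donate units to another account holder."""
        self.use(amount)       # deduct from the sender
        other.units += amount  # credit the receiver

# Example: donating part of an allotment to a cancer-research account.
alice = ComputeAccount("alice", units=100.0)
lab = ComputeAccount("cancer_research_lab")
alice.transfer(lab, 40.0)
print(alice.units, lab.units)  # 60.0 40.0
```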

Altman was also asked about the "ouster" period where he was briefly fired from OpenAI — to which he gave a careful response: Altman: I think there's always been culture clashes at — look, obviously not all of those board members are my favorite people in the world. But I have serious respect for the gravity with which they treat AGI and the importance of getting AI safety right. And even if I stringently disagree with their decision-making and actions, which I do, I have never once doubted their integrity or commitment to the sort of shared mission of safe and beneficial AGI...

I think a lot of the world is, understandably, very afraid of AGI, or very afraid of even current AI, and very excited about it — and even more afraid, and even more excited about where it's going. And we wrestle with that, but I think it is unavoidable that this is going to happen. I also think it's going to be tremendously beneficial. But we do have to navigate how to get there in a reasonable way. And, like a lot of stuff is going to change. And change is pretty uncomfortable for people. So there's a lot of pieces that we've got to get right...

I really care about AGI and think this is like the most interesting work in the world.

Transportation

Will Smarter Cars Bring 'Optimized' Traffic Lights? (apnews.com) 47

"Researchers are exploring ways to use features in modern cars, such as GPS, to make traffic safer and more efficient," reports the Associated Press.

"Eventually, the upgrades could do away entirely with the red, yellow and green lights of today, ceding control to driverless cars." Among those reimagining traffic flows is a team at North Carolina State University led by Ali Hajbabaie, an associate engineering professor. Rather than doing away with today's traffic signals, Hajbabaie suggests adding a fourth light, perhaps a white one, to indicate when there are enough autonomous vehicles on the road to take charge and lead the way. "When we get to the intersection, we stop if it's red and we go if it's green," said Hajbabaie, whose team used model cars small enough to hold. "But if the white light is active, you just follow the vehicle in front of you."
He points out that this approach could be years away, since it requires self-driving capability in 40% to 50% of the cars on the road.
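The white-light idea boils down to a small piece of decision logic at the intersection. Here is a minimal sketch of that logic; the 40% to 50% threshold comes from the article, while the function names and signal representation are illustrative assumptions.

```python
# Minimal sketch of the proposed "white phase" logic described above. The
# 40% to 50% autonomous-vehicle threshold comes from the article; the
# function names and signal representation are illustrative assumptions.

WHITE_PHASE_THRESHOLD = 0.45  # within the 40-50% AV share the researcher cites

def signal_phase(av_fraction: float, conventional_phase: str) -> str:
    """Return the phase shown at the intersection."""
    if av_fraction >= WHITE_PHASE_THRESHOLD:
        return "white"             # enough AVs present: they coordinate and lead
    return conventional_phase      # otherwise the normal red/yellow/green applies

def driver_action(phase: str) -> str:
    if phase == "red":
        return "stop"
    if phase == "green":
        return "go"
    if phase == "white":
        return "follow the vehicle in front of you"
    return "prepare to stop"       # yellow

print(driver_action(signal_phase(av_fraction=0.6, conventional_phase="red")))
```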

But the article notes another approach which could happen sooner, talking to Henry Liu, a civil engineering professor who is leading a study through the University of Michigan: They conducted a pilot program in the Detroit suburb of Birmingham using insights from the speed and location data found in General Motors vehicles to alter the timing of that city's traffic lights. The researchers recently landed a U.S. Department of Transportation grant under the bipartisan infrastructure law to test how to make the changes in real time... Liu, who has been leading the Michigan research, said even with as little as 6% of the vehicles on Birmingham's streets connected to the GM system, they provide enough data to adjust the timing of the traffic lights to smooth the flow... "The beauty of this is you don't have to do anything to the infrastructure," Liu said. "The data is not coming from the infrastructure. It's coming from the car companies."

Danielle Deneau, director of traffic safety at the Road Commission in Oakland County, Michigan, said the initial data in Birmingham only adjusted the timing of green lights by a few seconds, but it was still enough to reduce congestion.
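The Michigan approach is, in effect, a feedback loop from connected-vehicle speed data to signal timing. The rough sketch below nudges a green phase by a few seconds, in the spirit of the Birmingham pilot; the speed thresholds and adjustment step are assumptions, not the study's actual parameters.

```python
# Rough sketch of retiming a green phase from connected-vehicle speed data,
# in the spirit of the Birmingham pilot described above. The thresholds and
# the few-second adjustment step are illustrative assumptions, not the
# study's actual parameters.

def retime_green(current_green_s: float,
                 avg_approach_speed_mph: float,
                 free_flow_speed_mph: float,
                 step_s: float = 3.0,
                 min_green_s: float = 10.0,
                 max_green_s: float = 60.0) -> float:
    """Lengthen green slightly when probe vehicles crawl, shorten when they flow freely."""
    if avg_approach_speed_mph < 0.5 * free_flow_speed_mph:
        current_green_s += step_s      # congestion: give this approach a few more seconds
    elif avg_approach_speed_mph > 0.9 * free_flow_speed_mph:
        current_green_s -= step_s      # free flow: return the time to the cross street
    return max(min_green_s, min(max_green_s, current_green_s))

print(retime_green(current_green_s=30, avg_approach_speed_mph=12, free_flow_speed_mph=35))
```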

"Even bigger changes could be in store under the new grant-funded research, which would automate the traffic lights in a yet-to-be announced location in the county."
Australia

Australia Criticized For Ramping Up Gas Extraction Through '2050 and Beyond' (bbc.com) 91

Slashdot reader sonlas shared this report from the BBC: Australia has announced it will ramp up its extraction and use of gas until "2050 and beyond", despite global calls to phase out fossil fuels. Prime Minister Anthony Albanese's government says the move is needed to shore up domestic energy supply while supporting a transition to net zero... Australia — one of the world's largest exporters of liquefied natural gas — has also said the policy is based on "its commitment to being a reliable trading partner". Released on Thursday, the strategy outlines the government's plans to work with industry and state leaders to increase both the production and exploration of the fossil fuel. The government will also continue to support the expansion of the country's existing gas projects, the largest of which are run by Chevron and Woodside Energy Group in Western Australia...

The policy has sparked fierce backlash from environmental groups and critics — who say it puts the interests of powerful fossil fuel companies before people. "Fossil gas is not a transition fuel. It's one of the main contributors to global warming and has been the largest source of increases of CO2 [emissions] over the last decade," Prof Bill Hare, chief executive of Climate Analytics and author of numerous UN climate change reports, told the BBC... Successive Australian governments have touted gas as a key "bridging fuel", arguing that turning it off too soon could have "significant adverse impacts" on Australia's economy and energy needs. But Prof Hare and other scientists have warned that building a net zero policy around gas will "contribute to locking in 2.7-3C global warming, which will have catastrophic consequences".

Linux

Linux Kernel 6.9 Officially Released (9to5linux.com) 22

"6.9 is now out," Linus Torvalds posted on the Linux kernel mailing list, "and last week has looked quite stable (and the whole release has felt pretty normal)."

Phoronix writes that Linux 6.9 "has a number of exciting features and improvements for those habitually updating to the newest version." And Slashdot reader prisoninmate shared this report from 9to5Linux: Highlights of Linux kernel 6.9 include Rust support on AArch64 (ARM64) architectures, support for the Intel FRED (Flexible Return and Event Delivery) mechanism for improved low-level event delivery, support for AMD SNP (Secure Nested Paging) guests, and a new dm-vdo (virtual data optimizer) target in device mapper for inline deduplication, compression, zero-block elimination, and thin provisioning.

Linux kernel 6.9 also supports the Named Address Spaces feature in GCC (GNU Compiler Collection) that allows the compiler to better optimize per-CPU data access, adds initial support for FUSE passthrough to allow the kernel to serve files from a user-space FUSE server directly, adds support for the Energy Model to be updated dynamically at run time, and introduces a new LPA2 mode for ARM 64-bit processors...

Linux kernel 6.9 will be a short-lived branch supported for only a couple of months. It will be succeeded by Linux kernel 6.10, whose merge window has now been officially opened by Linus Torvalds. Linux kernel 6.10 is expected to be released in mid or late July 2024.

"Rust language has been updated to version 1.76.0 in Linux 6.9," according to the article. And Linus Torvalds shared one more details on the Linux kernel mailing list.

"I now have a more powerful arm64 machine (thanks to Ampere), so the last week I've been doing almost as many arm64 builds as I have x86-64, and that should obviously continue during the upcoming merge window too."
Social Networks

Reddit Grows, Seeks More AI Deals, Plans 'Award' Shops, and Gets Sued (yahoo.com) 34

Reddit reported its first results since going public in late March. Yahoo Finance reports: Daily active users increased 37% year over year to 82.7 million. Weekly active unique users rose 40% from the prior year. Total revenue improved 48% to $243 million, nearly doubling the growth rate from the prior quarter, due to strength in advertising. The company delivered adjusted operating profits of $10 million, versus a $50.2 million loss a year ago. [Reddit CEO Steve] Huffman declined to say when the company would be profitable on a net income basis, noting it's a focus for the management team. Other areas of focus include rolling out a new user interface this year, introducing shopping capabilities, and searching for another artificial intelligence content licensing deal like the one with Google.
Bloomberg notes that already Reddit "has signed licensing agreements worth $203 million in total, with terms ranging from two to three years. The company generated about $20 million from AI content deals last quarter, and expects to bring in more than $60 million by the end of the year."

And elsewhere Bloomberg writes that Reddit "plans to expand its revenue streams outside of advertising into what Huffman calls the 'user economy' — users making money from others on the platform... " In the coming months Reddit plans to launch new versions of awards, which are digital gifts users can give to each other, along with other products... Reddit also plans to continue striking data licensing deals with artificial intelligence companies, expanding into international markets and evaluating potential acquisition targets in areas such as search, he said.
Meanwhile, ZDNet notes that this week a Reddit announcement "introduced a new public content policy that lays out a framework for how partners and third parties can access user-posted content on its site." The post explains that more and more companies are using unsavory means to access user data in bulk, including Reddit posts. Once a company gets this data, there's no limit to what it can do with it. Reddit will continue to block "bad actors" that use unauthorized methods to get data, the company says, but it's taking additional steps to keep users safe from the site's partners.... Reddit still supports using its data for research: It's creating a new subreddit — r/reddit4researchers — to support these initiatives, and partnering with OpenMined to help improve research. Private data is, however, going to stay private.

If a company wants to use Reddit data for commercial purposes, including advertising or training AI, it will have to pay. Reddit made this clear by saying, "If you're interested in using Reddit data to power, augment, or enhance your product or service for any commercial purposes, we require a contract." To be clear, Reddit is still selling users' data — it's just making sure that unscrupulous actors have a tougher time accessing that data for free and researchers have an easier time finding what they need.

And finally, there's some court action, according to the Register. Reddit "was sued by an unhappy advertiser who claims that internet giga-forum sold ads but provided no way to verify that real people were responsible for clicking on them." The complaint [PDF] was filed this week in a U.S. federal court in northern California on behalf of LevelFields, a Virginia-based investment research platform that relies on AI. It says the biz booked pay-per-click ads on the discussion site starting September 2022... That arrangement called for Reddit to use reasonable means to ensure that LevelFields' ads were delivered to and clicked on by actual people rather than bots and the like. But according to the complaint, Reddit broke that contract...

LevelFields argues that Reddit is in a particularly good position to track click fraud because it's serving ads on its own site, as opposed to third-party properties where it may have less visibility into network traffic... Nonetheless, LevelFields's effort to obtain IP address data to verify the ads it was billed for went unfulfilled. The social media site "provided click logs without IP addresses," the complaint says. "Reddit represented that it was not able to provide IP addresses."

"The plaintiffs aspire to have their claim certified as a class action," the article adds — along with an interesting statistic.

"According to Juniper Research, 22 percent of ad spending last year was lost to click fraud, amounting to $84 billion."
AI

OpenAI's Sam Altman on iPhones, Music, Training Data, and Apple's Controversial iPad Ad (youtube.com) 27

OpenAI CEO Sam Altman gave an hour-long interview to the "All-In" podcast (hosted by Chamath Palihapitiya, Jason Calacanis, David Sacks and David Friedberg). And speaking on technology's advance, Altman said "Phones are unbelievably good.... I personally think the iPhone is like the greatest piece of technology humanity has ever made. It's really a wonderful product."


Q: What comes after it?

Altman: I don't know. I mean, that was what I was saying. It's so good, that to get beyond it, I think the bar is quite high.

Q: You've been working with Jony Ive on something, right?

Altman: We've been discussing ideas, but I don't — like, if I knew...


Altman said later he thought voice interaction "feels like a different way to use a computer."

But the conversation turned to Apple in another way. It happened in a larger conversation where Altman said OpenAI has "currently made the decision not to do music, and partly because exactly these questions of where you draw the lines..."

Altman: Even the world in which — if we went and, let's say we paid 10,000 musicians to create a bunch of music, just to make a great training set, where the music model could learn everything about song structure and what makes a good, catchy beat and everything else, and only trained on that — let's say we could still make a great music model, which maybe we could. I was posing that as a thought experiment to musicians, and they were like, "Well, I can't object to that on any principle basis at that point — and yet there's still something I don't like about it." Now, that's not a reason not to do it, um, necessarily, but it is — did you see that ad that Apple put out... of like squishing all of human creativity down into one really thin iPad...?

There's something about — I'm obviously hugely positive on AI — but there is something that I think is beautiful about human creativity and human artistic expression. And, you know, for an AI that just does better science, like, "Great. Bring that on." But an AI that is going to do this deeply beautiful human creative expression? I think we should figure out — it's going to happen. It's going to be a tool that will lead us to greater creative heights. But I think we should figure out how to do it in a way that preserves the spirit of what we all care about here.

What about creators whose copyrighted materials are used for training data? Altman had a ready answer — but also some predictions for the future. "On fair use, I think we have a very reasonable position under the current law. But I think AI is so different that for things like art, we'll need to think about them in different ways..." Altman: I think the conversation has been historically very caught up on training data, but it will increasingly become more about what happens at inference time, as training data becomes less valuable and what the system does accessing information in context, in real-time... what happens at inference time will become more debated, and what the new economic model is there.
Altman gave the example of an AI which was never trained on any Taylor Swift songs — but could still respond to a prompt requesting a song in her style. Altman: And then the question is, should that model, even if it were never trained on any Taylor Swift song whatsoever, be allowed to do that? And if so, how should Taylor get paid? So I think there's an opt-in, opt-out in that case, first of all — and then there's an economic model.
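Altman's opt-in/opt-out plus "economic model" idea can be pictured as an inference-time check: consult a registry before generating in a named artist's style and, if allowed, accrue a payout. The registry, rate, and function names in the sketch below are entirely hypothetical, not anyone's actual policy or API.

```python
# Entirely hypothetical sketch of the inference-time opt-out and payment idea
# discussed above. The registry, the payout rate, and every name here are
# illustrative assumptions, not anyone's actual policy or API.

OPT_OUT_REGISTRY = {"example_artist_a"}   # artists who declined style mimicry
STYLE_ROYALTY_PER_REQUEST = 0.01          # hypothetical per-request payout, USD

def generate_in_style(artist: str, prompt: str) -> tuple[str, float]:
    """Refuse opted-out artists; otherwise generate and accrue a royalty."""
    if artist in OPT_OUT_REGISTRY:
        raise PermissionError(f"{artist} has opted out of style-based generation")
    payout = STYLE_ROYALTY_PER_REQUEST    # would accrue to the named artist
    return f"[song in the style of {artist}: {prompt}]", payout

song, owed = generate_in_style("example_artist_b", "an upbeat breakup anthem")
print(song, f"(royalty accrued: ${owed:.2f})")
```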
Altman also wondered if there are lessons in the history and economics of music sampling...
Space

Webb Telescope Finds a (Hot) Earth-Sized Planet With an Atmosphere (apnews.com) 16

An anonymous reader shared this report from the Associated Press: A thick atmosphere has been detected around a planet that's twice as big as Earth in a nearby solar system, researchers reported Wednesday.

The so-called super Earth — known as 55 Cancri e — is among the few rocky planets outside our solar system with a significant atmosphere, wrapped in a blanket of carbon dioxide and carbon monoxide. The exact amounts are unclear. Earth's atmosphere is a blend of nitrogen, oxygen, argon and other gases. "It's probably the firmest evidence yet that this planet has an atmosphere," said Ian Crossfield, an astronomer at the University of Kansas who studies exoplanets and was not involved with the research.

The research was published in the journal Nature.

"The boiling temperatures on this planet — which can reach as hot as 4,200 degrees Fahrenheit (2,300 degrees Celsius) — mean that it is unlikely to host life," the article points out.

"Instead, scientists say the discovery is a promising sign that other such rocky planets with thick atmospheres could exist that may be more hospitable."
Power

Could Atomically Thin Layers Bring A 19x Energy Jump In Battery Capacitors? (popularmechanics.com) 20

Researchers believe they've discovered a new material structure that can improve the energy storage of capacitors. The structure allows for increased energy storage while improving the efficiency of ultrafast charging and discharging. The new find needs optimization but has the potential to help power electric vehicles. An anonymous reader shared this report from Popular Mechanics: In a study published in Science, lead author Sang-Hoon Bae, an assistant professor of mechanical engineering and materials science, demonstrates a novel heterostructure that curbs energy loss, enabling capacitors to store more energy and charge rapidly without sacrificing durability... Within capacitors, ferroelectric materials offer high maximum polarization. That's useful for ultra-fast charging and discharging, but it can limit the effectiveness of energy storage or the "relaxation time" of a conductor. "This precise control over relaxation time holds promise for a wide array of applications and has the potential to accelerate the development of highly efficient energy storage systems," the study authors write.

Bae makes the change — one he unearthed while working on something completely different — by sandwiching 2D and 3D materials in atomically thin layers, using chemical and nonchemical bonds between each layer. He says a thin 3D core inserts between two outer 2D layers to produce a stack that's only 30 nanometers thick, about 1/10th that of an average virus particle... The sandwich structure isn't quite fully conductive or nonconductive.

This semiconducting material, then, allows the energy storage, with a density up to 19 times higher than commercially available ferroelectric capacitors, while still achieving 90 percent efficiency — also better than what's currently available.
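The headline figure is an energy-density comparison. As a reminder of the quantity involved, the toy calculation below uses the standard capacitor energy formula, E = ½CV², and scales it by the reported 19x factor; the capacitance and voltage values are arbitrary examples, not figures from the study.

```python
# Toy illustration of capacitor energy storage, E = 0.5 * C * V**2, and of what
# a 19x energy-density improvement means at the device level. The capacitance
# and voltage values are arbitrary examples, not figures from the study.

def stored_energy_joules(capacitance_farads: float, voltage_volts: float) -> float:
    return 0.5 * capacitance_farads * voltage_volts ** 2

baseline = stored_energy_joules(capacitance_farads=1e-6, voltage_volts=50)  # 1 uF at 50 V
print(f"Baseline device: {baseline * 1e3:.2f} mJ")
print(f"Same volume at 19x energy density: {baseline * 19 * 1e3:.2f} mJ")
```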

Thanks to long-time Slashdot reader schwit1 for sharing the article.
Transportation

Photographer Sets World Record for Fastest Drone Flight at 298 MPH (petapixel.com) 25

An anonymous reader shared this report from PetaPixel: A photographer and content creator has set the world record for the fastest drone flight after his custom-made aircraft achieved a staggering 298.47 miles per hour (480.2 kilometers per hour). Guinness confirmed the record noting that Luke Maximo Bell and his father Mike achieved the "fastest ground speed by a battery-powered remote-controlled (RC) quadcopter."

Luke, who has previously turned his GoPro into a tennis ball, describes it as the most "frustrating and difficult project" he has ever worked on after months of working on prototypes that frequently caught fire. From the very first battery tests for the drone that Luke calls Peregrine 2, there were small fires as it struggled to cope with the massive amount of current which caused it to heat up to over 266 degrees Fahrenheit (130 degrees Celsius). The motor wires also burst into flames during full load testing causing Luke and Mike to use thicker ones so they didn't fail...

After 3D-printing the final model and assembling all the parts, Luke took it for a maiden flight which immediately resulted in yet another fire. This setback made Bell almost quit the project but he decided to remake all the parts and try again — which also ended in fire. This second catastrophe prompted Luke and his Dad to "completely redesign the whole drone body." It meant weeks of work as the new prototype was once again tested, 3D-printed, and bolted together.

Space

Is Dark Matter's Main Rival Theory Dead? (theconversation.com) 56

"One of the biggest mysteries in astrophysics today is that the forces in galaxies do not seem to add up," write two U.K. researchers in the Conversation: Galaxies rotate much faster than predicted by applying Newton's law of gravity to their visible matter, despite those laws working well everywhere in the Solar System. To prevent galaxies from flying apart, some additional gravity is needed. This is why the idea of an invisible substance called dark matter was first proposed. But nobody has ever seen the stuff. And there are no particles in the hugely successful Standard Model of particle physics that could be the dark matter — it must be something quite exotic.

This has led to the rival idea that the galactic discrepancies are caused instead by a breakdown of Newton's laws. The most successful such idea is known as Milgromian dynamics or Mond [also known as modified Newtonian dynamics], proposed by Israeli physicist Mordehai Milgrom in 1982. But our recent research shows this theory is in trouble...

Due to a quirk of Mond, the gravity from the rest of our galaxy should cause Saturn's orbit to deviate from the Newtonian expectation in a subtle way. This can be tested by timing radio pulses between Earth and Cassini. Since Cassini was orbiting Saturn, this helped to measure the Earth-Saturn distance and allowed us to precisely track Saturn's orbit. But Cassini did not find any anomaly of the kind expected in Mond. Newton still works well for Saturn... Another test is provided by wide binary stars — two stars that orbit a shared centre several thousand AU apart. Mond predicted that such stars should orbit around each other 20% faster than expected with Newton's laws. But one of us, Indranil Banik, recently led a very detailed study that rules out this prediction. The chance of Mond being right given these results is the same as a fair coin landing heads up 190 times in a row. Results from yet another team show that Mond also fails to explain small bodies in the distant outer Solar System...
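Two of the numbers in that passage are easy to make concrete: the roughly 20% boost Mond predicts for wide-binary orbital speeds, and the "190 heads in a row" figure, which is simply 0.5^190. The toy calculation below uses the Newtonian circular-orbit speed with arbitrary illustrative masses and separation.

```python
# Toy numbers for two claims above: Mond's predicted ~20% boost to wide-binary
# orbital speed, and the quoted odds of 190 consecutive heads. The masses and
# separation are arbitrary illustrative values.

import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
AU = 1.496e11        # astronomical unit, m

total_mass = 2 * M_SUN      # two Sun-like stars (illustrative)
separation = 5000 * AU      # "several thousand AU apart"

v_newton = math.sqrt(G * total_mass / separation)  # Newtonian circular-orbit speed
v_mond = 1.2 * v_newton                            # the ~20% faster orbit Mond predicts

print(f"Newtonian: {v_newton:.0f} m/s, Mond-like: {v_mond:.0f} m/s")
print(f"Chance of 190 heads in a row: {0.5 ** 190:.1e}")  # about 6e-58
```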

The standard dark matter model of cosmology isn't perfect, however. There are things it struggles to explain, from the universe's expansion rate to giant cosmic structures. So we may not yet have the perfect model. It seems dark matter is here to stay, but its nature may be different to what the Standard Model suggests. Or gravity may indeed be stronger than we think — but on very large scales only.

"Ultimately though, Mond, as presently formulated, cannot be considered a viable alternative to dark matter any more," the researchers conclude. "We may not like it, but the dark side still holds sway."
Data Storage

Father of SQL Says Yes to NoSQL (theregister.com) 50

An anonymous reader shared this report from the Register: The co-author of SQL, the standardized query language for relational databases, has come out in support of the NoSQL database movement that seeks to escape the tabular confines of the RDBMS. Speaking to The Register as SQL marks its 50th birthday, Donald Chamberlin, who first proposed the language with IBM colleague Raymond Boyce in a 1974 paper [PDF], explains that NoSQL databases and their query languages could help perform the tasks relational systems were never designed for. "The world doesn't stay the same thing, especially in computer science," he says. "It's a very fast, evolving, industry. New requirements are coming along and technology has to change to meet them, I think that's what's happening. The NoSQL movement is motivated by new kinds of applications, particularly web applications, that need massive scalability and high performance. Relational databases were developed in an earlier generation when scalability and performance weren't quite as important. To get the scalability and performance that you need for modern apps, many systems are relaxing some of the constraints of the relational data model."

[...] A long-time IBMer, Chamberlin is now semi-retired, but finds time to fulfill a role as a technical advisor for NoSQL company Couchbase. In the role, he has become an advocate for a new query language designed to overcome the "impedance mismatch" between data structures in the application language and a database, he says. UC San Diego professor Yannis Papakonstantinou has proposed SQL++ to solve this problem, with a view to addressing the impedance mismatch between heavily object-based JavaScript, the core language for web development, and the assumed relational approach embedded in SQL. Like C++, SQL++ is designed as a compatible extension of an earlier language, SQL, but is touted as better able to handle the JSON file format inherent in JavaScript. Couchbase and AWS have adopted the language, although the cloud giant calls it PartiQL.
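The "impedance mismatch" Chamberlin refers to is the gap between nested, JSON-shaped application objects and flat relational rows. The small sketch below illustrates that gap in plain Python; the document and the flattening are illustrative, and this is not SQL++ itself.

```python
# Small illustration of the "impedance mismatch" discussed above: applications
# work with nested, JSON-shaped objects, while a relational schema wants flat
# rows. The document and the flattening are illustrative; this is not SQL++.

order = {  # what the application (or a JSON document store) holds
    "order_id": 1001,
    "customer": {"name": "Ada", "email": "ada@example.com"},
    "items": [
        {"sku": "A-1", "qty": 2, "price": 9.99},
        {"sku": "B-7", "qty": 1, "price": 24.50},
    ],
}

# To fit a relational model, the nested structure is flattened into rows
# across several tables and re-joined at query time.
order_rows = [(order["order_id"], order["customer"]["name"], order["customer"]["email"])]
item_rows = [(order["order_id"], it["sku"], it["qty"], it["price"]) for it in order["items"]]

print(order_rows)
print(item_rows)
```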

At the end of the interview, Chamberlin adds that "I don't think SQL is going to go away. A large part of the world's business data is encoded in SQL, and data is very sticky. Once you've got your database, you're going to leave it there. Also, relational systems do a very good job of what they were designed to do...

"[I]f you're a startup company that wants to sell shoes on the web or something, you're going to need a database, and one of those SQL implementations will do the job for free. I think relational databases and the SQL language will be with us for a long time."
AMD

AMD Core Performance Boost For Linux Getting Per-CPU Core Controls (phoronix.com) 4

An anonymous reader shared this report from Phoronix: For the past several months AMD Linux engineers have been working on AMD Core Performance Boost support for their P-State CPU frequency scaling driver. The ninth iteration of these patches was posted on Monday and, besides the global enabling/disabling support for Core Performance Boost, it's now possible to selectively toggle the feature on a per-CPU core basis...

The new interface is under /sys/devices/system/cpu/cpuX/cpufreq/amd_pstate_boost_cpb for each CPU core. Thus users can tune whether particular CPU cores are boosted above the base frequency.
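A minimal user-space sketch of toggling that per-core knob, using the sysfs path quoted above; the assumption that the attribute accepts plain "0"/"1" writes is mine, and root privileges plus a kernel carrying these patches are required.

```python
# Minimal user-space sketch for the per-core boost control described above,
# using the sysfs path quoted in the article. The assumption that the
# attribute accepts plain "0"/"1" writes is mine; a patched kernel and root
# privileges are required.

from pathlib import Path

def boost_path(cpu: int) -> Path:
    return Path(f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/amd_pstate_boost_cpb")

def set_core_boost(cpu: int, enabled: bool) -> None:
    boost_path(cpu).write_text("1" if enabled else "0")

def get_core_boost(cpu: int) -> str:
    return boost_path(cpu).read_text().strip()

# Example (run as root on a kernel carrying these patches):
# set_core_boost(3, False)   # keep core 3 at or below its base frequency
# print(get_core_boost(3))
```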

Power

Are Small Modular Nuclear Reactors Costly and Unviable? (cosmosmagazine.com) 184

The Royal Institution of Australia is a national non-profit hub for science communication, publishing the science magazine Cosmos four times a year.

This month they argued that small modular nuclear reactors "don't add up as a viable energy source." Proponents assert that SMRs would cost less to build and thus be more affordable. However, when evaluated on the basis of cost per unit of power capacity, SMRs will actually be more expensive than large reactors. This 'diseconomy of scale' was demonstrated by the now-terminated proposal to build six NuScale Power SMRs (77 megawatts each) in Idaho in the United States. The project's final cost estimate per megawatt was around 250 percent more than the initial per-megawatt cost of the 2,200-megawatt Vogtle nuclear power plant being built in Georgia, US. Previous small reactors built in various parts of America also shut down because they were uneconomical.
The cost was four to six times that of the same electricity from wind and solar photovoltaic plants, according to estimates from the Australian Commonwealth Scientific and Industrial Research Organisation and the Australian Energy Market Operator. "The money invested in nuclear energy would save far more carbon dioxide if it were instead invested in renewables," the article argues: Small reactors also raise all of the usual concerns associated with nuclear power, including the risk of severe accidents, the linkage to nuclear weapons proliferation, and the production of radioactive waste that has no demonstrated solution because of technical and social challenges. One 2022 study calculated that various radioactive waste streams from SMRs would be larger than the corresponding waste streams from existing light water reactors...
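The "diseconomy of scale" argument above is ultimately a cost-per-megawatt comparison. The sketch below shows that arithmetic; the capacities come from the article, while the dollar figures are placeholders chosen only so the output lands near the roughly 250% gap the article reports.

```python
# Sketch of the cost-per-megawatt comparison behind the "diseconomy of scale"
# argument above. Capacities come from the article; the dollar figures are
# placeholders chosen only so the output lands near the ~250% gap reported.

def cost_per_mw(total_cost_usd: float, capacity_mw: float) -> float:
    return total_cost_usd / capacity_mw

smr_capacity_mw = 6 * 77        # six NuScale modules at 77 MW each
large_capacity_mw = 2200        # the Vogtle capacity cited in the article

hypothetical_smr_cost = 10.3e9      # placeholder project cost, USD
hypothetical_large_cost = 14.0e9    # placeholder initial cost, USD

smr = cost_per_mw(hypothetical_smr_cost, smr_capacity_mw)
large = cost_per_mw(hypothetical_large_cost, large_capacity_mw)
print(f"SMR: ${smr / 1e6:.1f}M per MW, large reactor: ${large / 1e6:.1f}M per MW, "
      f"about {(smr / large - 1) * 100:.0f}% more per MW")
```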

Nuclear energy itself has been declining in importance as a source of power: the fraction of the world's electricity supplied by nuclear reactors has declined from a maximum of 17.5 percent in 1996 down to 9.2 percent in 2022. All indications suggest that the trend will continue if not accelerate. The decline in the global share of nuclear power is driven by poor economics: generating power with nuclear reactors is costly compared to other low-carbon, renewable sources of energy and the difference between these costs is widening.

Thanks to Slashdot reader ZipNada for sharing the article.
