Unix

What Made Bell Labs So Successful? (msn.com) 70

Bell Labs "created many of the foundational innovations of the modern age," writes Jon Gertner, author of The Idea Factory: Bell Labs and the Great Age of American Innovation — from transistors and telecommunications satellites to Unix and the C programming language.

But what was the secret to its success? That's the question Gertner explores in a new article for the Wall Street Journal. Start with its lucky arrival in a "problem-rich" environment, suggests Arno Penzias, winner of one of Bell Labs' 11 Nobel Prizes: It was Bell Labs' responsibility, in other words, to create technologies for designing, expanding and improving an unruly communications network of cables and microwave links and glass fibers. The Labs also had to figure out ways to create underwater conduits, as well as switching centers that could manage the growing number of customers and escalating amounts of data.... Money mattered, too. Being connected to AT&T, the largest company in the world, was an advantage. The Labs' budget was enormous, and accounting conventions allowed its parent company to make huge and continuing investments in R&D. The generous funding, moreover, allowed scientists and engineers to buy and build expensive equipment — for instance, anechoic chambers to create the world's quietest rooms...

The most fortunate part of Bell Labs' situation, however, was that in being attached to a monopoly it could partake in long-term thinking... Without competition nipping at its heels, Bell Labs engineers had the luxury of working out difficult ideas over decades. The first conceptualization of a cellular phone network, for instance, came out of the Labs in the late 1940s; it wasn't until the late 1970s that technicians began testing one in Chicago to gauge its potential. The challenge of deploying these technologies was immense. (The regulatory hurdles were formidable, too....)

The article also credits the visionary management of Mervin Kelly — who fortunately also "had access to funding in a decade when most executives and universities didn't" to hire the brightest people. (By the early 1980s Bell Labs employed about 25,000 researchers, technicians and support staff, with an annual budget of $2 billion — roughly $7 billion in today's dollars.) "The Labs' involvement in World War II suggested to Kelly that an exciting postwar era of electronics was approaching, but that the technical problems would be so complex that they required a mix of expertise — not just physicists, but material scientists, chemists, electrical engineers, circuitry experts and the like." At Bell Labs, Kelly would sometimes handpick teams and create such a mix, as was the case for the transistor invention in the late 1940s. He came to see innovation arising not from like-minded or similarly trained people conversing with each other, but from a friction of ideas and approaches. It meant hiring researchers who had different personalities and favored a range of experimental angles. It also meant personally designing a campus in Murray Hill where departments were spread apart, so that scientists and engineers would be forced to walk, mingle and engage in serendipitous conversations and debate ideas. Meanwhile, under Kelly, the Labs focused on hiring people who were deeply curious, not just smart. Kelly saw it as his professional duty to do far more than what was expected, with his laboratory and vast resources, to create new technologies...

The breakup of AT&T's monopoly, which led to a steady shrinking of Bell Labs' staff, budget and remit, shows us that no matter how forward-looking your employees and managers may be, they will not necessarily see the future coming. It likewise suggests that technological progress is too unpredictable for one organization, no matter how powerful or smart, to control. Famously, Bell Labs managers didn't see value in the Arpanet, which eventually led to today's internet.

And yet, for at least five decades, Bell Labs created a blueprint for the global development of communications and electronics. In understanding why it did so, I tend to think its ultimate secret may be hiding in plain sight. The secret has to do with Bell Labs' structure — not only being connected to a fabulously profitable monopoly, but being connected to a company that could move theoretical and applied research into a huge manufacturing division that made telecom equipment (at Western Electric) and ultimately into a dynamic operating system (the AT&T network)... Scientists and engineers at the Labs understood their ideas would be implemented, if they passed muster, into the huge system its parent company was running.

Bell Labs racked up about 30,000 patents, according to the article, and celebrated its 100th anniversary last April.

It is now part of Finland-based Nokia.
AI

Scientists Launch AI DinoTracker App That Identifies Dinosaur Footprints (theguardian.com) 7

Scientists have released DinoTracker, a free AI-powered app that identifies dinosaur footprints by analyzing shape patterns rather than relying on potentially flawed historical labels. "When we find a dinosaur footprint, we try to do the Cinderella thing and find the foot that matches the slipper," said Prof Steve Brusatte, a co-author of the work. "But it's not so simple, because the shape of a dinosaur footprint depends not only on the shape of the dinosaur's foot but also the type of sand or mud it was walking through, and the motion of its foot." The Guardian reports: [...] Brusatte, [Dr Gregor Hartmann, the first author of the new research from Helmholtz-Zentrum in Germany] and colleagues fed their AI system with 2,000 unlabelled footprint silhouettes. The system then determined how similar or different the imprints were from each other by analysing eight features it identified as meaningful. The researchers discovered that these features reflected variations in the imprints' shapes, such as the spread of the toes, the amount of ground contact and the heel position. The team have turned the system into a free app called DinoTracker that allows users to upload the silhouette of a footprint, explore the seven other footprints most similar to it and manipulate the footprint to see how varying the eight features can affect which other footprints are deemed most similar. Hartmann said that at present experts had to double-check whether factors such as the material the footprints were made in, and their age, matched the scientific hypothesis, but the system clustered prints with those expected from classifications made by human experts about 90% of the time. The findings have been published in the journal PNAS.
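The paper's code isn't reproduced here, but the retrieval step the app performs — finding the seven prints closest to an uploaded silhouette in a learned eight-dimensional feature space — can be sketched as a plain nearest-neighbour search. Everything below (the random feature values, the Euclidean distance metric, the function name) is an illustrative assumption, not the study's actual implementation:

```python
import math
import random

# Stand-in 8-dimensional feature vectors for 2,000 footprint silhouettes.
# In the real system these would come from the trained model (toe spread,
# ground contact, heel position, ...); here they are random placeholders.
random.seed(0)
footprints = [[random.random() for _ in range(8)] for _ in range(2000)]

def most_similar(query, features, k=7):
    """Return indices of the k footprints closest to `query`,
    ranked by Euclidean distance in feature space."""
    dists = [(math.dist(query, f), i) for i, f in enumerate(features)]
    return [i for _, i in sorted(dists)[:k]]

# The app's "seven most similar footprints" step for one uploaded print:
neighbors = most_similar(footprints[0], footprints[1:], k=7)
```

Manipulating a feature slider in the app would amount to editing one coordinate of `query` and re-running the search, which is why changing a single feature can reshuffle which prints are deemed most similar.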
Earth

Iceland Deems Possible Atlantic Current Collapse A Security Risk 62

Iceland has formally classified the potential collapse of a major Atlantic Ocean current system as a national security threat, warning that a disruption could trigger a modern-day ice age in Northern Europe and destabilize global weather systems. The move elevates the risk across government and enables it to strategize for worst-case scenarios. Reuters reports: The Atlantic Meridional Overturning Circulation, or AMOC, current brings warm water from the tropics northward toward the Arctic, and the flow of warm water helps keep Europe's winters mild. But as warming temperatures speed the thaw of Arctic ice and cause meltwater from Greenland's ice sheet to pour into the ocean, scientists warn the cold freshwater could disrupt the current's flow.

A potential collapse of AMOC could trigger a modern-day ice age, with winter temperatures across Northern Europe plummeting to new cold extremes, bringing far more snow and ice. The AMOC has collapsed in the past -- notably before the last Ice Age that ended about 12,000 years ago. "It is a direct threat to our national resilience and security," Iceland Climate Minister Johann Pall Johannsson said by email. "(This) is the first time a specific climate-related phenomenon has been formally brought before the National Security Council as a potential existential threat."

Elevation of the issue means Iceland's ministries will be on alert and coordinating a response, Johannsson said. The government is assessing what further research and policies are needed, with work underway on a disaster preparedness policy. Risks being evaluated span a range of areas, from energy and food security to infrastructure and international transportation.
"Sea ice could affect marine transport; extreme weather could severely affect our capabilities to maintain any agriculture and fisheries, which are central to our economy and food systems," Johannsson said. "We cannot afford to wait for definitive, long-term research before acting."
Medicine

What Researchers Suspect May Be Fueling Cancer Among Millennials (msn.com) 171

Cancer rates among people aged 15 to 49 have increased 10% since 2000 even as rates have fallen among older populations. Young women face an 83% higher cancer rate than men in the same age range. A 150,000-person study presented at the American Association for Cancer Research meeting found millennials appear to be aging biologically faster than previous generations based on blood biomarkers. That acceleration was associated with up to 42% increased risk for certain cancers including lung, gastrointestinal and uterine malignancies.

Researchers are examining the "exposome" -- the full range of environmental exposures across a person's life. Studies have linked early-onset cancers to medications taken during pregnancy, ultra-processed foods that now account for more than half of daily calorie intake in the United States, circadian rhythm disruption from artificial light and shift work, and chemical exposures. Gary Patti at Washington University is using zebrafish exposed to known and suspected carcinogens to track tumor development. His lab has developed systems to scan blood samples for tens of thousands of chemicals simultaneously to identify signatures appearing more frequently in early-onset cancer patients.
EU

Switzerland Approves Digital ID In Narrow Vote, UK Proposes One Too (theguardian.com) 63

"Swiss voters have backed plans for electronic identity cards by a wafer-thin margin," reports the Guardian, "in the second nationwide vote on the issue." In a referendum on Sunday, 50.4% of voters supported an electronic ID card, while 49.6% were against, confounding pollsters who had forecast stronger support for the "yes" vote. Turnout was 49.55%, higher than expected... [V]oters rejected an earlier version of the e-ID in 2021, largely over objections to the role of private companies in the system. In response to these concerns, the Swiss state will now provide the e-ID, which will be optional and free of charge... To ensure security, the e-ID is linked to a single smartphone; users will have to get a new e-ID if they change their device... An ID card containing biometric data — fingerprints — will be available from the end of next year.

Critics of the e-ID scheme raised data protection concerns and said it opened the door to mass surveillance. They also fear the voluntary scheme will become mandatory and disadvantage people without smartphones. The referendum was called after a coalition of rightwing and data-privacy parties collected more than 50,000 signatures against e-ID cards, triggering the vote.

"To further ease privacy concerns, a particular authority seeking information on a person — such as proof of age or nationality, for example — will only be able to check for those specific details," notes the BBC: Supporters of the Swiss system say it will make life much easier for everyone, allowing a range of bureaucratic procedures — from getting a telephone contract to proving you are old enough to buy a bottle of wine — to happen quickly online. Opponents of digital ID cards, who gathered enough signatures to force another referendum on the issue, argue that the measure could still undermine individual privacy. They also fear that, despite the new restrictions on how data is collected and stored, it could still be used to track people and for marketing purposes.
The BBC adds that the UK government also announced plans earlier this week to introduce its own digital ID, "which would be mandatory for employment. The proposed British digital ID would have fewer intended uses than the Swiss version, but has still raised concerns about privacy and data security."

The Guardian reports: The referendum came soon after the UK government announced plans for a digital ID card, which would sit in the digital wallets of smartphones, using state-of-the-art encryption. More than 1.6 million people have signed a petition opposing e-ID cards, which would be mandatory for people working in the UK by 2029.
Thanks to long-time Slashdot reader schwit1 for sharing the news.
The Almighty Buck

Gen Z Leads Biggest Drop In FICO Scores Since Financial Crisis 111

An anonymous reader quotes a report from Bloomberg: Gen Z borrowers took the biggest hit of any age group this year, helping pull overall credit scores lower in the worst year for US consumer credit quality since the global financial crisis roiled the world's economy. The average FICO score slipped to 715 in April from 717 a year earlier, marking the second consecutive year-over-year drop, according to a report released Tuesday by Fair Isaac Corp. The average score dropped three points to 687 in 2009.

Gen Z borrowers saw the largest drop, not only this year, but of any age group since 2020, with their average score falling three points to 676, the Montana-based creator of the FICO credit score said. FICO scores are a measure of consumer credit risk and are frequently used by US banks to assess whether to provide loans. The scores typically range from 300 to 850. The credit scoring agency attributed the recent overall drop to higher rates of utilization and delinquency, including the resumption of reporting student loan delinquencies -- a category that hit a record high of 3.1% of the entire scorable population. [...] While the overall average score dropped, the median FICO score continued to rise to 745 from 744 a year ago, indicating that a large drop in scores at the low end dragged down the average.
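The report's closing point — that the average fell while the median rose — is a textbook skew effect: a sharp deterioration concentrated at the low end of a distribution drags the mean down without moving the middle value. A toy sketch with invented numbers (not actual FICO data) shows the mechanism:

```python
# Five hypothetical borrowers' scores, before and after a downturn in
# which the weakest borrower drops sharply while mid-range scores inch up.
before = [580, 650, 720, 750, 790]
after  = [520, 650, 722, 752, 790]  # low scorer falls 60 pts

mean_before = sum(before) / len(before)            # 698.0
mean_after = sum(after) / len(after)               # 686.8 -- mean falls
median_before = sorted(before)[len(before) // 2]   # 720
median_after = sorted(after)[len(after) // 2]      # 722  -- median rises

assert mean_after < mean_before and median_after > median_before
```

The same arithmetic, scaled up to the whole scorable population, is consistent with FICO's reading that losses concentrated among low-scoring (disproportionately Gen Z) borrowers pulled the national average down.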
Medicine

5% of Americans are Cancer Survivors - and They're Living Longer (msn.com) 109

"The U.S. is currently home to more than 18 million cancer survivors," reports the Wall Street Journal, "over 5% of the total population" (including those who are living with the disease).

Their article tells the story of Gwen Orilio, who was diagnosed with stage-four lung cancer at age 31. Ten years later she's still alive — and she still has metastatic cancer... Keeping her going is a string of new treatments that don't cure the disease but can buy months — even years — of time, with the hope that once one drug stops working a new one will come along. Orilio started on chemotherapy, and then switched to a new treatment, and then another, and another, and another... A small but growing population is living longer with incurable or advanced cancer, navigating the rest of their lives with a disease increasingly akin to a chronic illness. The trend, which started in breast cancer, has expanded to patients with melanoma, kidney cancer, lung cancer and others. The new drugs can add years to a life, even for some diagnoses like Orilio's that were once swift death sentences. They also put people in a state of limbo, living on a knife's edge waiting for the next scan to say a drug has stopped working and doctors need to find a new one. The wide range of survival times has made it more difficult for cancer doctors to predict how much time a patient might have left. For most, the options eventually run out....

More than 690,000 people were projected to be living with stage-four or metastatic disease of the six most common cancers — melanoma, breast, bladder, colorectal, prostate or lung cancer — in 2025, according to a 2022 report from the National Cancer Institute. That's an increase from 623,000 in 2018 and a significant rise since 1990, the report found... Nearly 30% of survivors diagnosed with metastatic melanoma and 20% of those diagnosed with metastatic colorectal or breast cancer had been living with their disease for a decade or more, the NCI paper estimated... Even for lung cancer, the biggest U.S. cancer killer, the five-year relative survival rate for advanced disease has inched up, from 3.7% for patients diagnosed in 2004 to 9.2% for patients diagnosed in 2017, federal data show. The overall lung cancer survival rate has risen by 26% in the past five years, according to the American Lung Association, as declining cigarette use, screening and new drugs have driven down deaths.

The expanding number of therapies that target a cancer's mutations or boost the immune system are improving the outlook for several cancers. In breast cancer, treatment for metastatic disease accounted for 29% of the drop in deaths between 1975 and 2019, according to one 2024 estimate, with screening and treatment for early-stage disease accounting for the rest.

"The number of American cancer survivors (or those living with cancer) is expected to grow to 26 million by 2040," the article points out.
Programming

How Do You Teach Computer Science in the Age of AI? (thestar.com.my) 177

"A computer science degree used to be a golden ticket to the promised land of jobs," a college senior tells the New York Times. But "That's no longer the case."

The article notes that in the last three years there's been a 65% drop in job listings from companies seeking workers with two years of experience or less (according to an analysis by technology research/education organization CompTIA), with tech companies "relying more on AI for some aspects of coding, eliminating some entry-level work."

So what do college professors teach when AI "is coming fastest and most forcefully to computer science"? Computer science programs at universities across the country are now scrambling to understand the implications of the technological transformation, grappling with what to keep teaching in the AI era. Ideas range from placing less emphasis on mastering programming languages to focusing on hybrid courses designed to inject computing into every profession, as educators ponder what the tech jobs of the future will look like in an AI economy... Some educators now believe the discipline could broaden to become more like a liberal arts degree, with a greater emphasis on critical thinking and communication skills.

The National Science Foundation is funding a program, Level Up AI, to bring together university and community college educators and researchers to move toward a shared vision of the essentials of AI education. The 18-month project, run by the Computing Research Association, a research and education nonprofit, in partnership with New Mexico State University, is organising conferences and roundtables and producing white papers to share resources and best practices. The NSF-backed initiative was created because of "a sense of urgency that we need a lot more computing students — and more people — who know about AI in the workforce," said Mary Lou Maher, a computer scientist and a director of the Computing Research Association.

The future of computer science education, Maher said, is likely to focus less on coding and more on computational thinking and AI literacy. Computational thinking involves breaking down problems into smaller tasks, developing step-by-step solutions and using data to reach evidence-based conclusions. AI literacy is an understanding — at varying depths for students at different levels — of how AI works, how to use it responsibly and how it is affecting society. Nurturing informed skepticism, she said, should be a goal.

The article raises other possibilities. Experts also suggest the possibility of "a burst of technology democratization as chatbot-style tools are used by people in fields from medicine to marketing to create their own programs, tailored for their industry, fed by industry-specific data sets." Stanford CS professor Alex Aiken even argues that "The growth in software engineering jobs may decline, but the total number of people involved in programming will increase."

Last year, Carnegie Mellon actually endorsed using AI for its introductory CS courses. The dean of the school's undergraduate programs believes that coursework "should include instruction in the traditional basics of computing and AI principles, followed by plenty of hands-on experience designing software using the new tools."
United States

Young Americans Are Spending a Whole Lot Less On Video Games This Year (gamespot.com) 68

An anonymous reader quotes a report from GameSpot: Perhaps responding to economic uncertainty and narrowing job prospects, young people in the United States are significantly cutting back on spending on video games compared to this time last year. While 18- to 24-year-olds aren't buying as much across a range of different categories, losses are concentrated in games. New data published by market research firm Circana and reported by The Wall Street Journal suggests that young adults spent nearly 25% less on video game products in a four-week span in April than in the same timeframe last year. Other categories also saw dramatic drops: Accessories (down 18%), technology (down 14%), and furniture (down 12%).

All categories combined, the 18-24 age group spent around 13% less than last year. This decrease is not reflected among older cohorts, whose spending has been mostly stable year-over-year. The WSJ report suggests that the economic context could be driving young adults to pull back; a tighter labor market, increased economic uncertainty, and student-loan payments restarting all may be contributing to an environment hostile to the spending habits of 18- to 24-year-olds in particular.

AI

AI Use Damages Professional Reputation, Study Suggests (arstechnica.com) 90

An anonymous reader quotes a report from Ars Technica: Using AI can be a double-edged sword, according to new research from Duke University. While generative AI tools may boost productivity for some, they might also secretly damage your professional reputation. On Thursday, the Proceedings of the National Academy of Sciences (PNAS) published a study showing that employees who use AI tools like ChatGPT, Claude, and Gemini at work face negative judgments about their competence and motivation from colleagues and managers. "Our findings reveal a dilemma for people considering adopting AI tools: Although AI can enhance productivity, its use carries social costs," write researchers Jessica A. Reif, Richard P. Larrick, and Jack B. Soll of Duke's Fuqua School of Business.

The Duke team conducted four experiments with over 4,400 participants to examine both anticipated and actual evaluations of AI tool users. Their findings, presented in a paper titled "Evidence of a social evaluation penalty for using AI," reveal a consistent pattern of bias against those who receive help from AI. What made this penalty particularly concerning for the researchers was its consistency across demographics. They found that the social stigma against AI use wasn't limited to specific groups.
"Testing a broad range of stimuli enabled us to examine whether the target's age, gender, or occupation qualifies the effect of receiving help from AI on these evaluations," the authors wrote in the paper. "We found that none of these target demographic attributes influences the effect of receiving AI help on perceptions of laziness, diligence, competence, independence, or self-assuredness. This suggests that the social stigmatization of AI use is not limited to its use among particular demographic groups. The result appears to be a general one."
News

Pope Francis Has Died (sky.com) 181

Pope Francis has died at the age of 88, the Vatican said Monday. The pontiff, who was Bishop of Rome and head of the Catholic Church, became pope in 2013 after his predecessor Benedict XVI resigned. On February 14, the Pope was admitted to hospital for bronchitis treatment. From a report: Born in 1936, Francis was the first pope from South America. His papacy was marked by his championing of those escaping war and hunger, as well as those in poverty, earning him the moniker the "People's Pope." In 2016, he washed the feet of refugees from different religions at an asylum centre outside Rome in a "gesture of humility and service."

He also made his views known on a wide range of issues, from climate change to wealth inequality and the role of women in the Catholic Church.

Biotech

Technology For Lab-Grown Eggs Or Sperm On Brink of Viability, UK Watchdog Finds (theguardian.com) 99

An anonymous reader quotes a report from The Guardian: Bolstered by Silicon Valley investment, scientists are making such rapid progress that lab-grown human eggs and sperm could be a reality within a decade, a meeting of the Human Fertilisation and Embryology Authority board heard last week (PDF). In-vitro gametes (IVGs), eggs or sperm that are created in the lab from genetically reprogrammed skin or stem cells, are viewed as the holy grail of fertility research. The technology promises to remove age barriers to conception and could pave the way for same-sex couples to have biological children together. It also poses unprecedented medical and ethical risks, which the HFEA now believes need to be considered in a proposed overhaul of fertility laws.

Peter Thompson, chief executive of the HFEA, said: "In-vitro gametes have the potential to vastly increase the availability of human sperm and eggs for research and, if proved safe, effective, and publicly acceptable, to provide new fertility treatment options for men with low sperm counts and women with low ovarian reserve." The technology also heralds more radical possibilities including "solo parenting" and "multiplex parenting." Julia Chain, chair of the HFEA, said: "It feels like we ought to have Steven Spielberg on this committee," in a brief moment of levity in the discussion of how the technology should be regulated. Lab-grown eggs have already been used to produce healthy babies in mice -- including ones with two biological fathers. The equivalent feat is yet to be achieved using human cells, but US startups such as Conception and Gameto claim to be closing in on this prize.

The HFEA meeting noted that estimated timeframes ranged from two to three years -- deemed to be optimistic -- to a decade, with several clinicians at the meeting sharing the view that IVGs appeared destined to become "a routine part of clinical practice." The clinical use of IVGs would be prohibited under current law and there would be significant hurdles to proving that IVGs are safe, given that any unintended genetic changes to the cells would be passed down to all future generations. The technology also opens up myriad ethical issues.
Thompson said: "Research on IVGs is progressing quickly but it is not yet clear when they might be a viable option in treatment. IVGs raise important questions and that is why the HFEA has recommended that they should be subject to statutory regulation in time, and that biologically dangerous use of IVGs in treatment should never be permitted."

"This is the latest of a range of detailed recommendations on scientific developments that we are looking at to future-proof the HFE Act, but any decisions around UK modernizing fertility law are a matter for parliament."
AI

OpenAI CEO Sam Altman Anticipates Superintelligence In 'a Few Thousand Days' 174

In a rare blog post today, OpenAI CEO Sam Altman laid out his vision of the AI-powered future, which he refers to as "The Intelligence Age." Among the most notable claims, Altman said superintelligence might be achieved in "a few thousand days." VentureBeat reports: Specifically, Altman argues that "deep learning works," and can generalize across a range of domains and difficult problem sets based on its training data, allowing people to "solve hard problems," including "fixing the climate, establishing a space colony, and the discovery of all physics." As he puts it: "That's really it; humanity discovered an algorithm that could really, truly learn any distribution of data (or really, the underlying "rules" that produce any distribution of data). To a shocking degree of precision, the more compute and data available, the better it gets at helping people solve hard problems. I find that no matter how much time I spend thinking about this, I can never really internalize how consequential it is."

In a provocative statement that many AI industry participants and close observers have already seized upon in discussions on X, Altman also said that superintelligence -- AI that is "vastly smarter than humans," according to previous OpenAI statements -- may be achieved in "a few thousand days." "This may turn out to be the most consequential fact about all of history so far. It is possible that we will have superintelligence in a few thousand days (!); it may take longer, but I'm confident we'll get there." A thousand days is roughly 2.7 years, much sooner than the five years most experts predict.
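The arithmetic behind the phrase is easy to check. A quick back-of-the-envelope conversion (the day counts below are just plausible readings of "a few thousand," not figures Altman gave) shows how much the timeline stretches depending on how the phrase is read:

```python
DAYS_PER_YEAR = 365.25  # average calendar year, including leap years

# Convert a few candidate readings of "a few thousand days" into years.
estimates = {days: days / DAYS_PER_YEAR for days in (1_000, 2_000, 3_000)}
for days, years in estimates.items():
    print(f"{days:>5} days is about {years:.1f} years")
```

So 1,000 days is roughly 2.7 years, while 3,000 days is already more than 8 years out, which is why the vagueness of "a few" carries so much of the claim's weight.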
Python

Python Developer Survey: 55% Use Linux, 6% Use Python 2 (jetbrains.com) 68

More than 25,000 Python developers from nearly 200 countries took the 7th annual Python Developers Survey between November 2023 and February 2024, with 85% saying Python was their main language.

Some interesting findings:
  • Though Python 2 reached "end-of-life" status in April of 2020, last year's survey found 7% of respondents were still using Python 2. This year's survey found that number has finally dropped... to 6%.

    "Almost half of Python 2 holdouts are under 21 years old," the survey results point out, "and a third are students. Perhaps courses are still using Python 2?"
  • Meanwhile, 73% are using one of the last three versions of Python (3.10, 3.11, or 3.12)
  • "The share of developers using Linux as their development environment has decreased through the years: compared with 2021, it's dropped by 8 percentage points." [The graphic is a little confusing, showing 55% using Linux, 55% using Windows, 29% on MacOS, 2% on BSD, and 1% on "Other."]
  • Visual Studio Code is the most popular IDE (22%), followed by Jupyter Notebook (20%) and Vim (17%). The next-most popular IDEs were PyCharm Community Edition (13%), JupyterLab (12%), Notepad++ (11%) and Sublime Text (9%). Interestingly, just 23% of the 25,000 respondents said they only used one IDE, with 38% saying they used two, 21% using three, and 19% using four or more. [The annual survey is a collaboration between the Python Software Foundation and JetBrains.]
  • 37% said they'd contributed to open-source projects within the last year. (77% of those contributed code, while 38% contributed documentation, 35% contributed governance/leadership/maintainer duties, and 33% contributed tests...)
  • For "age range," nearly one-third (32%) said 21-29 (with another 8% choosing 18-20). Another 33% said 30-39, while 16% said 40-49, 7% said 50-59, and 3% chose "60 or older."

    49% of respondents said they had less than two years of programming experience, with 33% saying "less than 1 year" and 16% saying "1-2 years." (34% of developers also said they practiced collaborative development.)

And here's how the 25,000 developers answered the question: how long have you been programming in Python?

  • Less than 1 year: 25%
  • 1-2 years: 16%
  • 3-5 years: 26%
  • 6-10 years: 19%
  • 11+ years: 13%

So what are they doing with Python? Among those who'd said Python was their main language:

  • Data analysis: 44%
  • Web development: 44%
  • Machine learning: 34%
  • Data engineering: 28%
  • Academic research: 26%
  • DevOps / Systems administration / Writing automation scripts: 26%
  • Programming of web parsers / scrapers / crawlers: 25%

62% were "fully employed by a company," while the next-largest category was "student" (12%), with another 5% identifying as "working students." There were also categories for "self-employed" (6%), "freelancer" (another 6%), and "partially employed by a company" (4%). Another 4% said they were unemployed.

In other news, the Python Software Foundation board has also "decided to invest more in connecting and serving the global Python community" by hosting monthly "office hours" on their Discord channel.


IBM

IBM, Kyndryl Sued For Age Discrimination By Its Own VPs (theregister.com) 64

Thomas Claburn reports via The Register: Once again, IBM has been sued for age discrimination, this time alongside spin-off Kyndryl, for allegedly cutting the jobs of older workers while creating similar positions for younger ones. The complaint [PDF] was filed on Tuesday in New York City, on behalf of five veteran executives and employees who collectively served the two corporations for more than 150 years. The IBM plaintiffs include: Michael Nolan, former Director of Strategy and Planning for IBM's Software Unit; Karla Bousquet, former VP of Events at IBM; Jay Zeltzer, former Business Automation Leader; and Teresa Cook, former VP of Client Experience. Randall Blanchard, former Services Account manager, is suing Kyndryl, having previously been with Big Blue.

Despite IBM chief global HR officer Nickel LaMoreaux's 2022 rejection of what she characterized as "false claims of systemic age discrimination," the lawsuit argues the mainframe titan is still targeting older workers. The legal filing cites a 2021 case, Townsley v. Int'l Bus. Machines Corp, in which executive Sam Ladah, who is accused of attempting "to keep ageist IBM executive level planning documents confidential," said those documents from five to six years earlier were still being used for hiring decisions. To further support the claim that the targeting of older workers continues to this day, the complaint says, "A recently leaked video of [CEO Arvind] Krishna confirms that IBM has continued its practice of using secretive top-down pressure to gerrymander its workforce to reflect the demographic preferences of its executives."

The 2023 video, published by conservative political activist James O'Keefe, appears to show Krishna tying manager bonuses to diversity targets in a context where such targets are alleged to be discriminatory. Basically, IBM has been accused of threatening to withhold bonuses from bosses if they don't hire a diverse enough range of techies -- more Hispanic and Black people -- leading to qualified candidates -- Asian people and others -- being ignored on the basis of their race. The latest lawsuit also points to Wimbish v. IBM, an age discrimination complaint filed in September by two human resources managers. "In their complaint, these fired HR managers alleged that IBM's HR still constantly consider an employee's 'runway' when determining if that worker would be terminated," the complaint says. "'Runway' is coded language for how long IBM HR expects an employee to remain at IBM before they retire, a direct proxy for age."

Technology

Vernor Vinge, Father of the Tech Singularity, Has Died At Age 79 (arstechnica.com) 67

"Vernor Vinge, who three times won the Hugo for best novel, has died," writes Slashdot reader Felix Baum. Ars Technica reports: On Wednesday, author David Brin announced that Vernor Vinge, sci-fi author, former professor, and father of the technological singularity concept, died from Parkinson's disease at age 79 on March 20, 2024, in La Jolla, California. The announcement came in a Facebook tribute where Brin wrote about Vinge's deep love for science and writing. "A titan in the literary genre that explores a limitless range of potential destinies, Vernor enthralled millions with tales of plausible tomorrows, made all the more vivid by his polymath masteries of language, drama, characters, and the implications of science," wrote Brin in his post.

As a sci-fi author, Vinge won Hugo Awards for his novels A Fire Upon the Deep (1993), A Deepness in the Sky (2000), and Rainbows End (2007). He also won Hugos for novellas Fast Times at Fairmont High (2002) and The Cookie Monster (2004). As Mike Glyer's File 770 blog notes, Vinge's novella True Names (1981) is frequently cited as the first presentation of an in-depth look at the concept of "cyberspace." Vinge first coined the term "singularity" as related to technology in 1983, borrowed from the concept of a singularity in spacetime in physics.

When discussing the creation of intelligences far greater than our own in a 1983 op-ed in OMNI magazine, Vinge wrote, "When this happens, human history will have reached a kind of singularity, an intellectual transition as impenetrable as the knotted space-time at the center of a black hole, and the world will pass far beyond our understanding." In 1993, he expanded on the idea in an essay titled The Coming Technological Singularity: How to Survive in the Post-Human Era.

Social Networks

TikTok is Banned in China, Notes X User Community - Along With Most US Social Media (newsweek.com) 148

Newsweek points out that a Chinese government post arguing the bill is "on the wrong side of fair competition" was flagged by users on X. "TikTok is banned in the People's Republic of China," the X community note read. (The BBC reports that "Instead, Chinese users use a similar app, Douyin, which is only available in China and subject to monitoring and censorship by the government.")

Newsweek adds that China "has also blocked access to YouTube, Facebook, Instagram, and Google services. X itself is also banned — though Chinese diplomats use the microblogging app to deliver Beijing's messaging to the wider world."

From the Wall Street Journal: Among the top concerns for [U.S.] intelligence leaders is that they wouldn't even necessarily be able to detect a Chinese influence operation if one were taking place [on TikTok] due to the opacity of the platform and how its algorithm surfaces content to users. Such operations, FBI director Christopher Wray said this week in congressional testimony, "are extraordinarily difficult to detect, which is part of what makes the national-security concerns represented by TikTok so significant...."

Critics of the bill include libertarian-leaning lawmakers, such as Sen. Rand Paul (R., Ky.), who have decried it as a form of government censorship. "The Constitution says that you have a First Amendment right to express yourself," Paul told reporters Thursday. TikTok's users "express themselves through dancing or whatever else they do on TikTok. You can't just tell them they can't do that." In the House, a bloc of 50 Democrats voted against the bill, citing concerns about curtailing free speech and the impact on people who earn income on the app. Some Senate Democrats have raised similar worries, as well as an interest in looking at a range of social-media issues at rival companies such as Meta Platforms.

"The basic idea should be to put curbs on all social media, not just one," Sen. Elizabeth Warren (D., Mass.) said Thursday. "If there's a problem with privacy, with how our children are treated, then we need to curb that behavior wherever it occurs."

Some context from the Columbia Journalism Review: Roughly one-third of Americans aged 18-29 regularly get their news from TikTok, the Pew Research Center found in a late 2023 survey. Nearly half of all TikTok users say they regularly get news from the app, a higher percentage than for any other social media platform aside from Twitter.

Almost 40 percent of young adults were using TikTok and Instagram for their primary Web search instead of the traditional search engines, a Google senior vice president said in mid-2022 — a number that's almost certainly grown since then. Overall, TikTok claims 150 million American users, almost half the US population; two-thirds of Americans aged 18-29 use the app.

Some U.S. politicians believe TikTok "radicalized" some of their supporters "with disinformation or biased reporting," according to the article.

Meanwhile in the Guardian, a Duke University law professor argues "this saga demands a broader conversation about safeguarding democracy in the digital age." The European Union's newly enacted AI act provides a blueprint for a more holistic approach, using an evidence- and risk-based system that could be used to classify platforms like TikTok as high-risk AI systems subject to more stringent regulatory oversight, with measures that demand transparency, accountability and defensive measures against misuse.

Open source advocate Evan Prodromou argues that the TikTok controversy raises a larger issue: If algorithmic curation is so powerful, "who's making the decisions on how they're used?" And he also proposes a solution.

"If there is concern about algorithms being manipulated by foreign governments, using Fediverse-enabled domestic software prevents the problem."

The Courts

Frozen Embryos Are 'Children,' According To Alabama's Supreme Court (arstechnica.com) 557

An anonymous reader quotes a report from Ars Technica: The Alabama Supreme Court on Friday ruled that frozen embryos are "children," entitled to full personhood rights, and anyone who destroys them could be liable in a wrongful death case. The first-of-its-kind ruling throws into question the future use of assisted reproductive technology (ART) involving in vitro fertilization for patients in Alabama -- and beyond. For this technology, people who want children but face challenges to conceiving can create embryos in clinical settings, which may or may not go on to be implanted in a uterus.

In the Alabama case, a hospital patient wandered through an unlocked door, removed frozen, preserved embryos from subzero storage and, suffering an ice burn, dropped the embryos, killing them. Affected IVF patients filed wrongful-death lawsuits against the IVF clinic under the state's Wrongful Death of a Minor Act. The case was initially dismissed in a lower court, which ruled the embryos did not meet the definition of a child. But the Alabama Supreme Court ruled that "it applies to all children, born and unborn, without limitation." In a concurring opinion, Chief Justice Tom Parker cited his religious beliefs and quoted the Bible to support the stance.

"Human life cannot be wrongfully destroyed without incurring the wrath of a holy God, who views the destruction of His image as an affront to Himself," Parker wrote. "Even before birth, all human beings bear the image of God, and their lives cannot be destroyed without effacing his glory." In 2020, the US Department of Health and Human Services estimated that there were over 600,000 embryos frozen in storage around the country, a significant percentage of which will likely never result in a live birth.

The result of this ruling "could mean that any embryos that are destroyed or discarded in the process of IVF or afterward could be the subject of wrongful death lawsuits," notes Ars. [According to national ART data collected by the Centers for Disease Control and Prevention, the percentage of egg retrievals that fail to result in a live birth ranges from 46 percent to 91 percent, depending on the patient's age. Meanwhile, the percentage of fertilized egg or embryo transfers that fail to result in a live birth ranges from 51 percent to 76 percent, depending on age.]

"The ruling creates potentially paralyzing liability for ART clinics and patients who use them. Doctors may choose to only attempt creating embryos one at a time to avoid liability attached to creating extras, or they may decline to provide IVF altogether to avoid liability when embryos do not survive the process. This could exacerbate the already financially draining and emotionally exhausting process of IVF, potentially putting it entirely out of reach for those who want to use the technology and putting clinics out of business."

Classic Games (Games)

Atari Will Release a Mini Edition of Its 1979 Atari 400 (Which Had An 8-Bit MOS 6502 CPU) (extremetech.com) 64

A 1979 Atari 8-bit system re-released in a tiny form factor? Yep.

Retro Games Ltd. is releasing a "half-sized" version of Atari's very first home computer, the Atari 400, "emulating the whole 8-bit Atari range, including the 400/800, XL and XE series, and the 5200 home console." ("In 1979 Atari brought the computer age home," remembers a video announcement, saying the new device represents "The iconic computer now reimagined.")

More info from ExtremeTech: For those of you unfamiliar with it, the Atari 400 and 800 were launched in 1979 as the company's first attempt at a home computer that just happened to double as an incredible game system. That's because, in addition to a faster variant of the excellent 8-bit MOS 6502 CPU found in the Apple II and Commodore PET, they also included Atari's dedicated ANTIC, GTIA, and POKEY coprocessors for graphics and sound, making the Atari 400 and 800 the first true gaming PCs...

If it's as good as the other Retro Games systems, the [new] 400Mini will count as another feather in the cap for Atari Interactive's resurgence following its excellent Atari50 compilation, reissued Atari 2600+ console, and acquisitions of key properties including Digital Eclipse, MobyGames, and AtariAge.

The 2024 version — launching in the U.K. March 28th — will boast high-definition HDMI output at 720p 50 or 60Hz, along with five USB ports. More details from Retro Games Ltd. Also included is THECXSTICK — a superb recreation of the classic Atari CX-40 joystick, with an additional seven seamlessly integrated function buttons. Play one of the included 25 classic Atari games, selected from a simple to use carousel, including all-time greats such as Berzerk, Missile Command, Lee, Millipede, Miner 2049er, M.U.L.E. and Star Raiders II, or play the games you own from USB stick. Plus save and resume your game at any time, or rewind by up to 30 seconds to help you finish those punishingly difficult classics!

Thanks to long-time Slashdot reader elfstones for sharing the article.

Technology

How Thermal Management is Changing in the Age of the Kilowatt Chip (theregister.com) 15

An anonymous reader shares a report: As Moore's Law slowed to a crawl, chips, particularly those used in AI and high-performance computing (HPC), have steadily gotten hotter. In 2023 we saw accelerators enter the kilowatt range with the arrival of Nvidia's GH200 Superchips. We've known these chips would be hot for a while now -- Nvidia has been teasing the CPU-GPU franken-chip for the better part of two years. What we didn't know until recently is how OEMs and systems builders would respond to such a power-dense part. Would most of the systems be liquid cooled? Or, would most stick to air cooling? How many of these accelerators would they try to cram into a single box, and how big would the box be?

Now that the first systems based on the GH200 are making their way to market, it's become clear that form factor is dictated more by power density than anything else. It essentially boils down to how much surface area you have to dissipate the heat. Dig through the systems available today from Supermicro, Gigabyte, QCT, Pegatron, HPE, and others and you'll quickly notice a trend. Up to about 500 W per rack unit (RU) -- 1 kW in the case of Supermicro's MGX ARS-111GL-NHR -- these systems are largely air cooled. While hot, it's still a manageable thermal load to dissipate, working out to about 21-24 kW per rack. That's well within the power delivery and thermal management capacity of modern datacenters, especially those making use of rear door heat exchangers.

However, this changes when system builders start cramming more than a kilowatt of accelerators into each chassis. At this point most of the OEM systems we looked at switched to direct liquid cooling. Gigabyte's H263-V11, for example, offers up to four GH200 nodes in a single 2U chassis. That's two kilowatts per rack unit. So while a system like Nvidia's air-cooled DGX H100 with its eight 700 W H100s and twin Sapphire Rapids CPUs has a higher TDP at 10.2 kW, it's actually less power dense at 1.2 kW/RU.
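The arithmetic behind those comparisons is straightforward: multiply per-rack-unit power by rack height to get total rack load, or divide a chassis's total draw by its height to get power density. A minimal sketch (assuming standard 42U-48U racks, which the article implies but doesn't state):

```python
def rack_load_kw(watts_per_ru: float, rack_units: int) -> float:
    """Total rack heat load in kW at a given per-rack-unit power density."""
    return watts_per_ru * rack_units / 1000.0

def density_kw_per_ru(total_kw: float, chassis_ru: int) -> float:
    """Power density of a single chassis in kW per rack unit."""
    return total_kw / chassis_ru

# ~500 W/RU air-cooled systems fill a standard rack at roughly 21-24 kW:
print(rack_load_kw(500, 42))  # 21.0
print(rack_load_kw(500, 48))  # 24.0

# Nvidia's 8U DGX H100 at 10.2 kW vs. Gigabyte's 2U H263-V11
# with four ~1 kW GH200 nodes: the smaller box is denser.
print(density_kw_per_ru(10.2, 8))  # ~1.275 kW/RU
print(density_kw_per_ru(4.0, 2))   # 2.0 kW/RU
```

This is why the 2U quad-GH200 box needs direct liquid cooling even though its total draw is well below the air-cooled DGX's: the heat is concentrated in a quarter of the surface area.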

Slashdot Top Deals