Social Networks

Is LinkedIn Becoming the Hottest New Dating Site? (businessinsider.com)

Business Insider's Kelli Maria Korducki reports on a growing trend happening on LinkedIn: some people are using the professional network for personal connections, fielding romantic offers amid job postings. But that leaves the question: Is it a good idea to mix work and love? From the report: Dustin Kidd, a professor of sociology at Temple University who researches social media and pop culture, said that dating via LinkedIn belonged to a long tradition of "dating hacks" -- using online tools designed for other purposes to snag a date. "In the aughts, this happened with Friendster and then Myspace," Kidd said, but has since spread to myriad platforms that are ostensibly romance-free. Even fitness-tracking sites such as Strava are fair game. The common thread for love-hijacked social-media sites is a single feature, Kidd said: DMs. "The design of LinkedIn helps to maintain its focus on the professional, but any platform with a direct-messaging option is likely to also be used to pursue sex and dating," he told me. The ease and relative privacy of direct messaging help explain how some people are using LinkedIn for romance, but it doesn't explain why. In an age with so many dedicated dating platforms -- from giants such as Tinder, Bumble, and Hinge to niche apps including Feeld (for the unconventional), Pure (for the noncommittal), and NUiT (for the astrologically inclined) -- why mix Cupid's arrow with corporate updates?

One answer may be the growing number of Americans who have tired of the roulette-like experience of modern dating apps. In a 2023 Pew survey of US adults, nearly one-third of respondents said they had used an online dating site or app at least once. More than half of the women who had used the apps reported feeling overwhelmed by the number of messages they had received in the past year, while 64% of the men said they felt insecure about the lack of messages they had gotten. Though an overwhelming majority of men and women said they'd felt excited about people they connected with, an even larger proportion of respondents said they were sometimes or often disappointed by their matches. [...]

LinkedIn's appeal as a dating site, according to people who use it that way, is the platform's ability to give back some of that control and boost the caliber of their prospects. Because the professional-networking site asks users to link to their current and former employers' profile pages, it offers an additional layer of credibility that other social-media platforms lack. Many profiles also include first-person references from former colleagues and managers -- real people with real profile pages. [...] Even for those who shy away from using LinkedIn to angle for dates, the site has become a go-to tool for vetting romantic candidates found through conventional dating apps or in-person encounters. "Social media is just one big dating app," [said Samuela John, a 24-year-old personal organizer in New York City who developed chemistry with an oil-industry man on the platform]. "Any type of social media where you can see people's pictures can turn into a dating app. And LinkedIn is even better because it's not just showing people's fake lives." [...] "I don't think you should go into it like, 'All right, I'm going to find my husband on LinkedIn,'" John said. "I think you should go about it as if you were just networking, like in a casual sense. And then if you end up meeting the person, see the vibes and then go from there."

Space

New Images of Jupiter's Moon Io Capture Infernal Volcanic Landscape (nytimes.com)

NASA's Juno spacecraft made its closest flyby yet of Io, one of Jupiter's largest moons, sending back images of "sharp cliffs, edgy mountain peaks, lakes of pooled lava and even a volcanic plume," reports the New York Times. From the report: The Juno spacecraft, designed to study the origin and evolution of Jupiter, arrived at the planet in 2016. NASA extended the mission in 2021, and the orbiter has since captured photos of the Jovian moons Ganymede, Europa and most recently Io. [...] Juno conducted a number of more distant observations of Io in recent years. Its latest flyby occurred on Dec. 30, when the spacecraft came within 932 miles of the moon. The images captured during this visit were made with an instrument called JunoCam and are in visible wavelengths. They are some of the highest resolution views of Io's global structure. The mission's managers shared six images of Io on the mission's website, and members of the public have since uploaded digitally enhanced versions that highlight features on Io's surface.

Mission scientists are already at work analyzing these images, searching for differences across Io's surface to learn how often its volcanoes erupt, how bright and hot those eruptions are, and how the resulting lava flows. According to Dr. Scott Bolton, Juno's principal investigator, the team will also compare Juno's images to older views of the Jovian moon to determine what has changed on Io over a variety of encounters. And they'll get a second set of data to work with in a month, when Juno completes another close flyby of the explosive world on Feb. 3.

Power

First EV With Lithium-Free Sodium Battery Hits the Road In January (carnewschina.com)

Deliveries of the world's first mass-produced electric vehicle equipped with a sodium-ion battery will begin in January 2024. According to CarNewsChina, they're being produced by JAC Motors, a Volkswagen-backed Chinese automaker, through its new Yiwei EV brand. From the report: The Yiwei EV hatchback will have a cylindrical sodium-ion pack from Beijing-based HiNa Battery and adopt JAC's UE (Unitized Encapsulation) module technology. UE is also known as a honeycomb design because of its appearance. It is another battery structure concept like CATL's CTP (cell-to-pack) or BYD's Blade battery. Yiwei is a new EV brand under Anhui Jianghuai Automobile (JAC), established in 2023. JAC's parent company, Anhui Jianghuai Automobile Group Holdings (JAG), is 50% state-owned, and 50% belongs to Volkswagen Group. The German automotive giant acquired its stake in 2020 in an unprecedented move to invest in a Chinese state-owned carmaker.

[...] In February 2023, JAC announced it was the first automaker to put a lithium-free sodium-ion battery in an electric vehicle. That EV was a Sehol E10X hatchback, and the Na+ battery had the following specifications: 25 kWh capacity, 120 Wh/kg energy density (140 Wh/kg per cell), 3C to 4C charging (10% - 80% in 20 minutes), 252 km (157 miles) of range for the E10X, and the HiNa NaCR32140 cell. Sehol was a brand under the Volkswagen Anhui joint venture, which VW transferred to JAC in 2021. When the Yiwei brand was launched in May 2023, JAC announced that it would ditch the Sehol brand and rebadge all vehicles as JAC or Yiwei. The pictures JAC released today indicate that the new sodium-ion-powered EV is the Sehol E10X. JAC hasn't yet confirmed the name of the new car under the Yiwei brand; it could be Yiwei E10X, but we have to wait for JAC's confirmation.
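The published figures imply a couple of quick derived numbers. A back-of-the-envelope check, using only the values quoted above (the pack mass itself is inferred, not stated by JAC):

```python
# Sanity check on the E10X's published sodium-ion battery figures.
# Inputs are the values quoted in the report; pack mass is derived.

capacity_kwh = 25.0           # pack capacity
pack_density_wh_per_kg = 120  # pack-level energy density
cell_density_wh_per_kg = 140  # single-cell energy density

# Implied pack mass: 25,000 Wh / 120 Wh/kg
pack_mass_kg = capacity_kwh * 1000 / pack_density_wh_per_kg

# How much of the cell-level density survives packaging into the pack
packaging_efficiency = pack_density_wh_per_kg / cell_density_wh_per_kg

print(f"Implied pack mass: {pack_mass_kg:.0f} kg")                    # 208 kg
print(f"Cell-to-pack density retention: {packaging_efficiency:.0%}")  # 86%
```

At roughly 208 kg for 252 km of range, the numbers line up with sodium-ion's current niche: cheap, cold-tolerant chemistry for small city cars rather than long-range EVs.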

JAC has recently pushed hard into sodium-ion battery R&D. During the Shanghai Auto Show in April 2023, the company showcased its first car under the Yiwei brand, the Yiwei 3, equipped with a sodium-ion battery. However, when the EV launched in June, it came only with a conventional LFP lithium battery, with the Na+ variant promised for later. The Yiwei 3 is a compact hatchback that competes with the Wuling Bingo, BYD Seagull, and ORA Funky Cat. It has two powertrain options, both front-wheel drive: a 70 kW or a 100 kW motor. The maximum cruising range is 505 km CLTC with a 51.5 kWh battery.

AI

'What Kind of Bubble Is AI?' (locusmag.com)

"Of course AI is a bubble," argues tech activist/blogger/science fiction author Cory Doctorow.

The real question is what happens when it bursts?

Doctorow examines history — the "irrational exuberance" of the dotcom bubble, 2008's financial derivatives, NFTs, and even cryptocurrency. ("A few programmers were trained in Rust... but otherwise, the residue from crypto is a lot of bad digital art and worse Austrian economics.") So would an AI bubble leave anything useful behind? The largest of these models are incredibly expensive. They're expensive to make, with billions spent acquiring training data, labelling it, and running it through massive computing arrays to turn it into models. Even more important, these models are expensive to run.... Do the potential paying customers for these large models add up to enough money to keep the servers on? That's the 13 trillion dollar question, and the answer is the difference between WorldCom and Enron, or dotcoms and cryptocurrency. Though I don't have a certain answer to this question, I am skeptical.

AI decision support is potentially valuable to practitioners. Accountants might value an AI tool's ability to draft a tax return. Radiologists might value the AI's guess about whether an X-ray suggests a cancerous mass. But with AIs' tendency to "hallucinate" and confabulate, there's an increasing recognition that these AI judgments require a "human in the loop" to carefully review them... There just aren't that many customers for a product that makes their own high-stakes projects better, but more expensive. There are many low-stakes applications -- say, selling kids access to a cheap subscription that generates pictures of their RPG characters in action -- but they don't pay much. The universe of low-stakes, high-dollar applications for AI is so small that I can't think of anything that belongs in it.

There are some promising avenues, like "federated learning," that hypothetically combine a lot of commodity consumer hardware to replicate some of the features of those big, capital-intensive models from the bubble's beneficiaries. It may be that — as with the interregnum after the dotcom bust — AI practitioners will use their all-expenses-paid education in PyTorch and TensorFlow (AI's answer to Perl and Python) to push the limits on federated learning and small-scale AI models to new places, driven by playfulness, scientific curiosity, and a desire to solve real problems. There will also be a lot more people who understand statistical analysis at scale and how to wrangle large amounts of data. There will be a lot of people who know PyTorch and TensorFlow, too — both of these are "open source" projects, but are effectively controlled by Meta and Google, respectively. Perhaps they'll be wrestled away from their corporate owners, forked and made more broadly applicable, after those corporate behemoths move on from their money-losing Big AI bets.
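The federated-learning idea Doctorow gestures at can be made concrete with a toy sketch: each device trains on its own private shard, and only model weights (never raw data) go back to a coordinator, which averages them. This is a minimal FedAvg-style loop in NumPy with a stub logistic-regression "client"; the data, hyperparameters, and client setup are all invented for illustration:

```python
import numpy as np

def local_update(weights, data, labels, lr=0.1, epochs=5):
    """One client's local training step: full-batch logistic-regression SGD."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1 / (1 + np.exp(-data @ w))           # sigmoid
        grad = data.T @ (preds - labels) / len(labels)
        w -= lr * grad
    return w

def federated_average(weights, clients):
    """FedAvg: clients train locally; server averages, weighted by data size."""
    updates = [local_update(weights, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    return np.average(updates, axis=0, weights=sizes)

# Toy run: three "devices", each holding a private shard of a linearly
# separable problem (label = 1 exactly when the first feature is positive).
rng = np.random.default_rng(0)
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = (X[:, 0] > 0).astype(float)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):               # communication rounds
    w = federated_average(w, clients)

# The averaged model should put nearly all its weight on feature 0.
print(abs(w[0]) > abs(w[1]))
```

Real deployments add the parts that make this hard: non-IID client data, dropped devices, compression of the weight updates, and privacy protections on the aggregated gradients.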

Our policymakers are putting a lot of energy into thinking about what they'll do if the AI bubble doesn't pop — wrangling about "AI ethics" and "AI safety." But — as with all the previous tech bubbles — very few people are talking about what we'll be able to salvage when the bubble is over.

Thanks to long-time Slashdot reader mspohr for sharing the article.
NASA

Nikon Makes Special Firmware For NASA To Block Galactic Cosmic Rays In Photos (petapixel.com)

In an exclusive interview with PetaPixel, astronaut Don Pettit reveals the changes that Nikon makes to its firmware especially for NASA. From the report: Galactic cosmic rays are high-energy particles that originate from outside the solar system that likely come from explosive events such as a supernova. They are bad news for cameras in space -- damaging the sensor and spoiling photos -- so Nikon made special firmware for NASA to limit the harm. Pettit tells PetaPixel that Nikon changed the in-camera noise reduction settings to battle the cosmic rays -- noise is unwanted texture and blur on photos.

Normal cameras apply in-camera noise reduction only to exposures of one second or longer; camera manufacturers assume shorter exposures produce no noise worth reducing. But in space, that's not true. "Our cameras in space get sensor damage from galactic cosmic rays, and after about six months we replace all the cameras, but you still have cameras with significant cosmic ray damage," explains Pettit. "It shows up at fast shutter speeds, not just the slow ones. So we got Nikon to change the algorithm so that it can do in-camera noise reduction at shutter speeds of up to 1/500th of a second."

Pettit says Nikon's in-camera noise reduction "does wonders" for getting rid of the cosmic ray damage and that "trying to get rid of it after the fact is really difficult." That's not the only special firmware feature that Nikon makes for NASA; photographers who shoot enough photos know that the file naming system resets itself eventually, which is no good for the space agency's astronauts. "The file naming system on a standard digital system will repeat every so often and we can't have two pictures with the same number," explains Pettit. "We'll take half a million pictures with the crew on orbit and so Nikon has changed the way the RAW files are numbered so that there will be no two with the same file number."
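The non-repeating numbering Pettit describes amounts to a counter wide enough that half a million frames can never collide. The sketch below is purely illustrative and not Nikon's actual naming scheme; the prefix, digit width, and extension are assumptions:

```python
import itertools

def raw_filenames(prefix="ISS", width=7):
    """Yield RAW file names from a counter that never wraps.

    Standard cameras roll over after four digits (e.g. DSC_9999);
    a 7-digit counter allows 9,999,999 frames before any repeat.
    """
    for n in itertools.count(1):
        yield f"{prefix}_{n:0{width}d}.NEF"

names = raw_filenames()
first = [next(names) for _ in range(3)]
print(first)  # ['ISS_0000001.NEF', 'ISS_0000002.NEF', 'ISS_0000003.NEF']
```

Half a million crew photos per mission fit comfortably inside that namespace, with zero-padding keeping the files in shooting order when sorted lexically.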
The report notes that NASA started using Nikon film cameras in 1971, shortly after the Apollo era; "in part because Nikon is so good at making custom modifications that help the astronauts." Previously, the agency used boxy, black Hasselblad cameras.
AI

New 'Stable Video Diffusion' AI Model Can Animate Any Still Image (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: On Tuesday, Stability AI released Stable Video Diffusion, a new free AI research tool that can turn any still image into a short video -- with mixed results. It's an open-weights preview of two AI models that use a technique called image-to-video, and it can run locally on a machine with an Nvidia GPU. [...] Right now, Stable Video Diffusion consists of two models: one that can produce image-to-video synthesis at 14 frames of length (called "SVD"), and another that generates 25 frames (called "SVD-XT"). They can operate at varying speeds from 3 to 30 frames per second, and they output short (typically 2-4 second-long) MP4 video clips at 576x1024 resolution.

In our local testing, a 14-frame generation took about 30 minutes to create on an Nvidia RTX 3060 graphics card, but users can experiment with running the models much faster on the cloud through services like Hugging Face and Replicate (some of which you may need to pay for). In our experiments, the generated animation typically keeps a portion of the scene static and adds panning and zooming effects or animates smoke or fire. People depicted in photos often do not move, although we did get one Getty image of Steve Wozniak to slightly come to life.

Given these limitations, Stability emphasizes that the model is still early and is intended for research only. "While we eagerly update our models with the latest advancements and work to incorporate your feedback," the company writes on its website, "this model is not intended for real-world or commercial applications at this stage. Your insights and feedback on safety and quality are important to refining this model for its eventual release." Notably, but perhaps unsurprisingly, the Stable Video Diffusion research paper (PDF) does not reveal the source of the models' training datasets, only saying that the research team used "a large video dataset comprising roughly 600 million samples" that they curated into the Large Video Dataset (LVD), which consists of 580 million annotated video clips that span 212 years of content in duration.

Earth

World's Richest 1% Emit As Much Carbon As Bottom Two-Thirds, Report Finds (phys.org)

An anonymous reader quotes a report from Phys.Org: The richest one percent of the global population are responsible for the same amount of carbon emissions as the world's poorest two-thirds, or five billion people, according to an analysis published Sunday by the nonprofit Oxfam International. [...] Among the key findings of this study are that the richest one percent globally -- 77 million people -- were responsible for 16 percent of global emissions related to their consumption. That is the same share as the bottom 66 percent of the global population by income, or 5.11 billion people. The income threshold for being among the global top one percent was adjusted by country using purchasing power parity -- for example in the United States the threshold would be $140,000, whereas the Kenyan equivalent would be about $40,000. Within-country analyses also painted a very stark picture.

For example, in France, the richest one percent emit as much carbon in one year as the poorest 50 percent in 10 years. Excluding the carbon associated with his investments, Bernard Arnault, the billionaire founder of Louis Vuitton and richest man in France, has a footprint 1,270 times greater than that of the average Frenchman. The key message, according to Lawson, was that policy actions must be progressive. These measures could include, for example, a tax on flying more than ten times a year, or a tax on non-green investments that is much higher than the tax on green investments.

While the current report focused only on carbon linked to individual consumption, "the personal consumption of the super-rich is dwarfed by emissions resulting from their investments in companies," the report found. Nor do the wealthy invest in polluting industries at the same ratio as the average investor: previous Oxfam research has shown that billionaires are twice as likely as the Standard & Poor's 500 average to be invested in polluting industries.

AI

White Faces Generated By AI Are More Convincing Than Photos, Finds Survey (theguardian.com)

Nicola Davis reports via The Guardian: A new study has found people are more likely to think pictures of white faces generated by AI are human than photographs of real individuals. "Remarkably, white AI faces can convincingly pass as more real than human faces -- and people do not realize they are being fooled," the researchers report. The team, which includes researchers from Australia, the UK and the Netherlands, said their findings had important implications in the real world, including in identity theft, with the possibility that people could end up being duped by digital impostors.

However, the team said the results did not hold for images of people of color, possibly because the algorithm used to generate AI faces was largely trained on images of white people. Dr Zak Witkower, a co-author of the research from the University of Amsterdam, said that could have ramifications for areas ranging from online therapy to robots. "It's going to produce more realistic situations for white faces than other race faces," he said. The team caution such a situation could also mean perceptions of race end up being confounded with perceptions of being "human," adding it could also perpetuate social biases, including in finding missing children, given this can depend on AI-generated faces.
The findings have been published in the journal Psychological Science.
Space

Euclid Telescope: First Images Revealed From 'Dark Universe' Mission (bbc.com)

AmiMoJo shares a report from the BBC: Europe's Euclid telescope is ready to begin its quest to understand the greatest mysteries in the Universe. Exquisite imagery from the space observatory shows its capabilities to be exceptional. Over the next six years, Euclid will survey a third of the heavens to get some clues about the nature of so-called dark matter and dark energy. The 1.4 billion euro Euclid telescope went into space in July. Since then, engineers have been fine-tuning it. There were some early worries. Initially, Euclid's optics couldn't lock on to stars to take a steady image. This required new software for the telescope's fine guidance sensor. Engineers also found some stray light was polluting pictures when the observatory was pointed in a certain way. But with these issues all now resolved, Euclid is good to go -- as evidenced by the release of five sample images on Tuesday. "No previous space telescope has been able to combine the breadth, depth and sharpness of vision that Euclid can," notes the BBC. "The astonishing James Webb telescope, for example, has much higher resolution, but it can't cover the amount of sky that Euclid does in one shot."
Businesses

An AI Smoothie Shop Opened in San Francisco With Much Hype. Why Is It Closed Already?

In September, a "bespoke AI nutrition" store opened in beleaguered downtown San Francisco to much fanfare, promising smoothie concoctions generated by AI and a much-needed boost to the area. Less than two months later, it has seemingly closed without explanation. From a report: BetterBlends advertised "Your Smoothie, powered by AI" and received positive press upon its opening, ginning up excitement for a new business and a novel use of artificial intelligence. Its AI model would take customer orders and preferences to generate a smoothie recipe that would then be blended by hand by co-founders Michael Parlato and Clayton Reynolds, who worked in the shop. But now the storefront sits empty. On Friday 20 October, the locked doors to BetterBlends featured a sign that read "temporarily closed," stating the shop would reopen in one hour -- but sources in the neighborhood said the storefront had been closed for more than three weeks.

By the following Monday, the sign had been removed, and the inside of the shop was largely cleared of blenders, fruit, vegetables and other supplies -- anything you might need to make a smoothie, with or without AI. Only a trash can and a few plants remained. The store's Google Maps listing speaks to both its problems in the physical world and its roots in AI. A Google Maps review posted two weeks ago, accompanied by a picture of the sign, read: "I was hopeful for this business. The owners however did not understand the discipline to run a restaurant. It was often open late and closed early. They changed their hours after a week of being open. And then 1 day they put up a sign, 'Temporarily closed, be back in an hour.' They have not been back in over two weeks."

Other reviews were positive, awarding BetterBlends four or five stars. The shop owners themselves uploaded pictures of their smoothies to the Google Maps page as well as an image of happy customers that bears the hallmarks of generative AI. The light on their smiling faces is soft and glossy like a photoshoot. The fruits in the store window are, on closer inspection, unrecognizable blobs of fruit-colored things. The clear plastic cups are branded with gibberish characters that don't spell anything and filled with lumpy smoothie-ish mixtures. They are cartoonishly large in the customers' hands, one of which has only three too-long fingers. AI image generators have a documented history of failing to produce text within images or realistic human hands.
AI

Leica Camera Has Built-In Defense Against Misleading AI, Costs $9,125

Scharon Harding reports via Ars Technica: On Thursday, Leica Camera released the first camera that can take pictures with automatically encrypted metadata and provide features such as an editing history. The company believes this system, called Content Credentials, will help photojournalists protect their work and prove authenticity in a world riddled with AI-manipulated content.

Leica's M11-P can store each captured image with Content Credentials, which is based on the Coalition for Content Provenance and Authenticity's (C2PA's) open standard and is being pushed by the Content Authenticity Initiative (CAI). Content Credentials, announced in October, includes encrypted metadata detailing where and when the photo was taken and with what camera and model. It also keeps track of edits and tools used for edits. When a photographer opts to use the feature, they'll see a Content Credentials logo in the camera's display, and images will be signed through the use of an algorithm.

The feature requires the camera to use a specialized chipset for storing digital certificates. Credentials can be verified via Leica's FOTOS app or on the Content Credentials website. Leica's announcement said: "Whenever someone subsequently edits that photo, the changes are recorded to an updated manifest, rebundled with the image, and updated in the Content Credentials database whenever it is reshared on social media. Users who find these images online can click on the CR icon in the [pictures'] corner to pull up all of this historical manifest information as well, providing a clear chain of provenance, presumably, all the way back to the original photographer." The M11-P's Content Credentials is an opt-in feature and can also be erased. As Ars has previously noted, an image edited with tools that don't support Content Credentials can also result in a gap in the image's provenance data.
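The sign-and-verify idea behind Content Credentials can be sketched with standard-library tools. To be clear about the hedges: real C2PA manifests use X.509 certificate chains and a defined binary/JSON structure, not a shared secret; the HMAC key, field names, and layout below are stand-ins so the sketch runs anywhere:

```python
import hashlib, hmac, json

# Toy stand-in for the Content Credentials idea: bind a manifest of
# capture metadata and edit history to the image bytes with a signature,
# so any alteration of either one is detectable.

CAMERA_KEY = b"per-device-secret-held-in-secure-chip"  # hypothetical

def sign_capture(image_bytes, metadata):
    manifest = {
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "assertions": metadata,
        "edits": [],
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(CAMERA_KEY, payload, "sha256").hexdigest()
    return manifest

def verify(image_bytes, manifest):
    claimed = dict(manifest)
    sig = claimed.pop("signature")
    payload = json.dumps(claimed, sort_keys=True).encode()
    sig_ok = hmac.compare_digest(
        sig, hmac.new(CAMERA_KEY, payload, "sha256").hexdigest())
    img_ok = claimed["image_sha256"] == hashlib.sha256(image_bytes).hexdigest()
    return sig_ok and img_ok

photo = b"\x89RAW..."  # stand-in image bytes
m = sign_capture(photo, {"camera": "Leica M11-P", "taken": "2023-10-26T09:00Z"})
print(verify(photo, m))              # untouched image verifies
print(verify(photo + b"edit", m))    # any alteration breaks the chain
```

The asymmetric version (certificate in the camera's secure chip, public verification by anyone) is what makes the real system useful: no shared secret ever leaves the device.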
Advertising

'Pause Ads' Creep Onto Hulu, Peacock and Max As Streamers Seek New Revenue (variety.com)

Brian Steinberg reports via Variety: So-called "pause ads" -- they only turn up a few seconds after a viewer has decided to halt the programming, and not every time one does -- are seeing new movement in the streaming world, with the format appearing more frequently on Hulu since July, according to Josh Mattison, senior vice president of revenue management and operations for Disney Advertising. Pause ads are also in motion in venues such as NBCUniversal's Peacock and Warner Bros. Discovery's Max.

As more media companies seek to goose subscriber rates by offering cheaper ad-supported versions of their streaming services, this type of commercial may become more handy. One of the main attractions of streaming, after all, is that it boasts fewer traditional commercials than its linear TV counterpart. The industry hopes that a pause ad -- other "out of pod" commercial experiences are also in development -- can appear on screen without upsetting a subscriber who gets viscerally roiled by the prospect of a glut of typical TV spots.

Others have also found ways to work ads into the moments when streaming fans come to a stopping point. NBCUniversal's Peacock launched with pause ads, says Peter Blacker, executive vice president of streaming and data products for NBCUniversal's ad-sales division, while Warner Bros. Discovery's Max introduced them in 2022, says Ryan Gould, head of digital ad sales and client partnerships at the company. The format itself isn't new. Hulu has experimented with pauses since at least 2018, and an early version of the idea surfaced last decade when Coca-Cola and Universal Pictures tested concepts with ReplayTV, an early backer of digital video recording technology. Coke, which once used the slogan "the pause that refreshes" to great effect, and Charmin, the Procter & Gamble toilet tissue that can offer succor during many breaks in TV viewing, tested the format with Hulu in 2019.

Android

Google Removes the Photo Sphere Mode From the Pixel 8 Camera (androidauthority.com)

Since 2012, Google Pixel phones have had a Photo Sphere Mode, allowing users to capture 360-degree images. Now, according to Android Authority, Google has dropped the feature from the Pixel 8 series with no explanation given. From the report: Photo Sphere Mode allowed you to capture panoramic 360-degree pictures by stitching multiple images together. The feature was first introduced back in 2012 on the Nexus 4 and persisted well into the Pixel era, with the likes of the Pixel Fold and Pixel 7a still offering it. The act of capturing a Photo Sphere wasn't exactly seamless owing to the sheer number of images required, although it had an admittedly intuitive UI. Significant stitching issues and exposure/white balance differences were also very common.

We're therefore not surprised Google has decided to drop the feature. Even without taking the aforementioned issues into account, the mode's utility seemed limited beyond scenarios like mapping (e.g., viewing environments in Google Maps) and VR. That said, we hope the company rebounds with a more polished take on 360-degree photos in the future.

Programming

Man Trains Home Cameras To Help Repel Badgers and Foxes (bbc.co.uk)

Tom Singleton reports via the BBC: A man got so fed up with foxes and badgers fouling in his garden that he adapted cameras to help repel them. James Milward linked the Ring cameras at his Surrey home to a device that emits high frequency sounds. He then trained the system using hundreds of images of the nocturnal nuisances so it learned to trigger the noise when it spotted them. Mr Milward said it "sounds crazy" but the gadget he called the Furbinator 3000 has kept his garden clean.

Getting the camera system to understand what it was looking at was not straightforward though. "At first it recognised the badger as an umbrella," he said. "I did some fine tuning and it came out as a sink, or a bear if I was lucky. Pretty much a spectacular failure." He fed in pictures of the animals through an artificial intelligence process called machine learning and finally, the device worked. The camera spotted a badger, and the high frequency sound went off to send the unwanted night-time visitor on its way and leave the garden clean for Mr Milward's children to play in.
The code for the Furbinator 3000 is open source, with detailed instructions available in Milward's Medium post.
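The detector-plus-deterrent loop the article describes reduces to a small decision function: a trained classifier labels each camera frame, and the ultrasonic emitter fires only for nuisance classes above a confidence threshold. The class names and threshold below are hypothetical, not taken from Milward's published code:

```python
# Hypothetical sketch of the Furbinator-style trigger logic. The object
# detector itself is stubbed out: we assume it returns (label, confidence)
# pairs for each frame, as most off-the-shelf detectors do.

NUISANCE_CLASSES = {"badger", "fox"}
CONFIDENCE_THRESHOLD = 0.8   # illustrative; tune against false positives

def should_emit_sound(detections):
    """detections: list of (label, confidence) pairs for one camera frame."""
    return any(label in NUISANCE_CLASSES and conf >= CONFIDENCE_THRESHOLD
               for label, conf in detections)

# The early, poorly tuned model: the badger comes out as an umbrella or a sink.
print(should_emit_sound([("umbrella", 0.91), ("sink", 0.40)]))  # False

# After fine-tuning on hundreds of labelled frames:
print(should_emit_sound([("badger", 0.93)]))                    # True
print(should_emit_sound([("cat", 0.97), ("fox", 0.55)]))        # False
```

The hard part, as the article notes, is not this glue logic but the fine-tuning: without enough labelled badger-at-night frames, a generic model confidently emits the wrong labels and the gate never opens.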
Sony

Sony's High-Bitrate Movie Service is Now Available on PS5 and PS4 (theverge.com)

Sony is bringing its own movie streaming service to PlayStation consoles beginning today. From a report: Previously known as Bravia Core, the service is being rebranded to Sony Pictures Core as it arrives on the PS5 and PS4. "Once you sign up for Sony Pictures Core, you will be able to buy or rent up to 2,000 movies straight from your console," Sony's Evan Stern wrote in a blog post. "At launch, this will include blockbuster hits such as Spider-Man: Across the Spider-Verse, Spider-Man: No Way Home, Uncharted, The Equalizer, No Hard Feelings, Bullet Train, and Ghostbusters: Afterlife, among others."

Now, you can rent or buy those movies in any number of places. If you're wondering why you'd want to use Sony's service, the answer is video fidelity. As noted on the Bravia Core website, it includes what the company calls Pure Stream, "which can stream HDR movies at up to 80Mbps -- similar to 4K UHD Blu-ray -- on a wide range of content." That is a significantly higher bitrate than anything Netflix, Amazon Prime Video, Max, Vudu, or other streamers will give you. So, if you're a stickler for picture quality and have the right TV for it, you should notice greater detail when using Pure Stream. In addition to all that, Sony also claims it has the largest collection of IMAX Enhanced films of any streaming service.
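For a sense of scale, a back-of-the-envelope on what an 80 Mbps stream means for data usage; the two-hour runtime and the 15-25 Mbps comparison figures are illustrative assumptions, not Sony's numbers:

```python
# What an 80 Mbps "Pure Stream" works out to for a typical feature film.
# The bitrate is the figure quoted above; the runtime is an assumed example.

bitrate_mbps = 80
runtime_hours = 2.0

# Mbit -> MB (/8) -> GB (/1000)
gigabytes = bitrate_mbps * runtime_hours * 3600 / 8 / 1000
print(f"{gigabytes:.0f} GB for a {runtime_hours:g}-hour film")  # 72 GB

# For comparison, top-tier 4K streams elsewhere run roughly 15-25 Mbps,
# putting a two-hour film in the 14-22 GB range.
low = 15 * runtime_hours * 3600 / 8 / 1000
high = 25 * runtime_hours * 3600 / 8 / 1000
print(f"{low:.0f}-{high:.0f} GB at 15-25 Mbps")
```

At roughly three to five times the data of a typical premium 4K stream, Pure Stream is only practical on a fast, uncapped connection, which is presumably why Sony pitches it at home-theater owners.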

Businesses

Disney VFX Workers Vote Unanimously To Unionize (variety.com)

Jazz Tangcay writes via Variety: Visual effects workers at Walt Disney Pictures have voted unanimously in favor of unionizing with the International Alliance of Theatrical Stage Employees (IATSE) in an election held by the National Labor Relations Board (NLRB). The 13-0 vote comes just weeks after VFX workers at Marvel Studios voted to unionize with IATSE and comes amid the ongoing WGA and SAG-AFTRA strikes, as the guilds continue to seek fair contracts with the Alliance of Motion Picture and Television Producers.

The 18 in-house crew members at Walt Disney Studios who were eligible to vote are seeking fair compensation for all hours worked, adequate health care and retirement benefits. The unionizing VFX workers, who are responsible for creating the special effects across the studio's catalog -- including "Beauty and the Beast," "Aladdin," "The Lion King" and more -- are also seeking the same rights and protections afforded to their unionized coworkers already represented by IATSE. [...] With the vote behind them, the union's next step is to engage in collective bargaining negotiations with Disney execs to draft a contract that addresses the workers' needs. Negotiation dates have yet to be set.

Hardware

Modder Turns Framework Laptop PCB Into a Handheld Gaming PC (tomshardware.com) 17

YouTuber Pitstoptech built a "fully upgradeable gaming handheld" around one of Framework's upgradeable motherboards. Tom's Hardware reports: The handheld model you see in the video is equipped with the following components:

- Framework's Intel Core i7-1260P processor equipped mainboard
- 7-inch FHD touchscreen display
- 16 GB RAM
- 512 GB SSD
- Dual front-facing speakers
- Detachable controllers
- 55 Wh Battery
- High-speed Wi-Fi & Bluetooth

These components appear capable of passable small-screen gaming. In the video, the device also plugs into a larger monitor / TV, where using the controllers in a detached (Bluetooth) configuration may be more comfortable. [...] Pitstoptech intends to prepare and sell handheld DIY kits "soon," based on the prototype design shown in the pictures / video.

Crime

Ignored by Police, Two Women Took Down Their Cyber-Harasser Themselves (msn.com) 104

Here's how the Washington Post tells the story of 34-year-old marketer (and former model) Madison Conradis, who discovered that nude behind-the-scenes photos from 10 years earlier had leaked after a series of photographer websites were breached: Now the photos along with her name and contact information were on 4chan, a lawless website that allows users to post anonymously about topics as varied as music and white supremacy... Facebook users registered under fake names such as "Joe Bummer" sent her direct messages demanding that she send new, explicit photos, or else they would further spread the already leaked photos. Some pictures landed in her father's Instagram messages, while marketing clients told her about the nude images that came their way. Madison was at a friend's party when she got a panicked call from the manager of a hotel restaurant where she had worked: The photos had made their way to his inbox. After two years, hoping a new Florida law against cyberharassment would finally end the torture, Madison walked into her local Melbourne police station and shared everything. But she was told that what she was experiencing was not criminal.

What Madison still did not know was that other women were in the clutches of the same man on the internet — and all faced similar reactions from their local authorities. Without help from the police, they would have to pursue justice on their own.

Some cybersleuthing revealed the four women all had one follower in common on Facebook: Christopher Buonocore. (They were his ex-girlfriend, his ex-fiancée, his relative, and a childhood friend.) Eventually Madison's sister Christine — who had recently passed the bar exam — "prepared a 59-page document mapping the entire case with evidence and relevant statutes in each of the victims' jurisdictions. She sent the document to all the women involved, and each showed up at her respective law enforcement offices, dropped the packet in front of investigators and demanded a criminal investigation." The sheriff in Florida's Manatee County, Christine's locality, passed the case up to federal investigators. And in July 2019, the FBI took over on behalf of all six women on the basis of the evidence of interstate cyberstalking that Christine had compiled...

The U.S. attorney for the Middle District of Florida took action at the end of December 2020, but without a federal law criminalizing the nonconsensual distribution of intimate images, she charged Buonocore with six counts of cyberstalking instead, which can apply to some cases involving interstate communication done with the intent to kill, injure, intimidate, harass or surveil someone. He pleaded guilty to all counts the following January...

U.S. District Judge Thomas Barber sentenced Buonocore to 15 years in federal prison — almost four years more than the prosecutor had requested.

NASA

NASA To Demonstrate Laser Communications From Space Station (nasa.gov) 40

SonicSpike shares a report from NASA: In 2023, NASA is sending a technology demonstration known as the Integrated LCRD Low Earth Orbit User Modem and Amplifier Terminal (ILLUMA-T) to the space station. Together, ILLUMA-T and the Laser Communications Relay Demonstration (LCRD), which launched in December 2021, will complete NASA's first two-way, end-to-end laser relay system. With ILLUMA-T, NASA's Space Communications and Navigation (SCaN) program office will demonstrate the power of laser communications from the space station. Using invisible infrared light, laser communications systems send and receive information at higher data rates. With higher data rates, missions can send more images and videos back to Earth in a single transmission. Once installed on the space station, ILLUMA-T will showcase the benefits higher data rates could have for missions in low Earth orbit.

"Laser communications offer missions more flexibility and an expedited way to get data back from space," said Badri Younes, former deputy associate administrator for NASA's SCaN program. "We are integrating this technology on demonstrations near Earth, at the Moon, and in deep space." In addition to higher data rates, laser systems are lighter and use less power -- a key benefit when designing spacecraft. ILLUMA-T is approximately the size of a standard refrigerator and will be secured to an external module on the space station to conduct its demonstration with LCRD. Currently, LCRD is showcasing the benefits of a laser relay in geosynchronous orbit -- 22,000 miles from Earth -- by beaming data between two ground stations and conducting experiments to further refine NASA's laser capabilities. "Once ILLUMA-T is on the space station, the terminal will send high-resolution data, including pictures and videos to LCRD at a rate of 1.2 gigabits-per-second," said Matt Magsamen, deputy project manager for ILLUMA-T. "Then, the data will be sent from LCRD to ground stations in Hawaii and California. This demonstration will show how laser communications can benefit missions in low Earth orbit."

ILLUMA-T is launching as a payload on SpaceX's 29th Commercial Resupply Services mission for NASA. In the first two weeks after its launch, ILLUMA-T will be removed from the Dragon spacecraft's trunk for installation on the station's Japanese Experiment Module-Exposed Facility (JEM-EF), also known as "Kibo" -- meaning "hope" in Japanese. Following the payload's installation, the ILLUMA-T team will perform preliminary testing and in-orbit checkouts. Once those are complete, the team will work toward the payload's first light -- a critical milestone in which the mission transmits its first beam of laser light through its optical telescope to LCRD. Once first light is achieved, data transmission and laser communications experiments will begin and continue for the duration of the planned mission.

Businesses

Disney VFX Workers File For Union Election (vice.com) 27

Walt Disney Pictures' VFX team filed for a union election with the National Labor Relations Board on Monday. As Motherboard notes, the filing "marks the second time in history that workers in the visual effects industry have announced their intent to organize -- the first being Marvel VFX workers, who did so three weeks prior." From the report: The Walt Disney Pictures workers, who are behind the visual effects in movies like the live-action Aladdin and Pirates of the Caribbean, plan to join the VFX Union, a new branch of the International Alliance of Theatrical Stage Employees (IATSE), which represents much of the entertainment industry behind the scenes. Their filing comes after over 80 percent of the 18 in-house VFX crewmembers at Walt Disney Pictures in Los Angeles signed cards demonstrating their desire to unionize, according to a press release by the union.

"Today, courageous visual effects workers at Walt Disney Pictures overcame the fear and silence that have kept our community from having a voice on the job for decades," said Mark Patch, a IATSE VFX union organizer, in a statement. "With an overwhelming supermajority of these crews demanding an end to 'the way VFX has always been,' this is a clear sign that our campaign is not about one studio or corporation. It's about VFX workers across the industry using the tools at our disposal to uplift ourselves and forge a better path forward."
