
Submission + - Fifteen Years Later, Citizens United Defined the 2024 Election (brennancenter.org)

NewYorkCountryLawyer writes: The influence of wealthy donors and dark money was unprecedented. Much of it would have been illegal before the Supreme Court swept away long-established campaign finance rules. Citizens United v. Federal Election Commission, the Supreme Court’s controversial 2010 decision that swept away more than a century’s worth of campaign finance safeguards, turns 15 this month. The late Justice Ruth Bader Ginsburg called it the worst ruling of her time on the Court. Overwhelming majorities of Americans have consistently expressed disapproval of the ruling, with at least 22 states and hundreds of cities voting to support a constitutional amendment to overturn it. Citizens United reshaped political campaigns in profound ways, giving corporations and billionaire-funded super PACs a central role in U.S. elections and making untraceable dark money a major force in politics. And yet it may only be now, in the aftermath of the 2024 election, that we can begin to understand the full impact of the decision.

Submission + - Anti-Trump Searches Appear Hidden on TikTok (ibtimes.com)

AmiMoJo writes: Searches for anti-Trump content are now appearing hidden on TikTok for many users after the app came back online in the U.S. TikTok users have taken to Twitter to share that when they search for topics negatively related to President Donald Trump, a message pops up saying "No results found" and that the phrases may violate the app's guidelines. One user said that when they tried to search "Donald Trump rigged election" on a U.S. account, they were met with blocked results. Meanwhile, the same phrase searched from a U.K. account prompted results. Another user shared a video of themselves switching between a U.S. and a U.K. VPN to back up the viral claims; the video has since amassed more than 187,000 likes.
Crime

Silk Road Creator Ross Ulbricht Pardoned (bbc.com) 339

Slashdot readers jkister and databasecowgirl share the news of President Donald Trump issuing a pardon to Silk Road creator Ross Ulbricht. An anonymous reader shares a report from the BBC: US President Donald Trump says he has signed a full and unconditional pardon for Ross Ulbricht, who operated Silk Road, the dark web marketplace where illegal drugs were sold. Ulbricht was convicted in 2015 in New York in a narcotics and money laundering conspiracy and sentenced to life in prison. Trump posted on his Truth Social platform that he had called Ulbricht's mother to inform her that he had granted a pardon to her son. Silk Road, which was shut down in 2013 after police arrested Ulbricht, sold illegal drugs using Bitcoin, as well as hacking equipment and stolen passports.

"The scum that worked to convict him were some of the same lunatics who were involved in the modern day weaponization of government against me," Trump said in his post online on Tuesday evening. "He was given two life sentences, plus 40 years. Ridiculous!" Ulbricht was found guilty of charges including conspiracy to commit drug trafficking, money laundering and computer hacking. During his trial, prosecutors said Ulbricht's website, hosted on the hidden "dark web", sold more than $200 million worth of drugs anonymously.

Submission + - Trump Pardons Silk Road Founder (nypost.com)

databasecowgirl writes: President Trump announced Tuesday night that he had granted a "full and unconditional" pardon to Ross Ulbricht, founder of the notorious dark web site Silk Road.

Submission + - Decentralized Social Media Is the Only Alternative to the Tech Oligarchy (404media.co)

An anonymous reader writes: If it wasn’t already obvious, the last 72 hours have made it crystal clear that it is urgent to build and mainstream alternative, decentralized social media platforms that are resistant to government censorship and control, are not owned by oligarchs and dominated by their algorithms, and in which users own their follower list and can port it elsewhere easily and without restriction. [...] Mastodon’s ActivityPub and Bluesky’s AT.Protocol have provided the base technology layer to make this possible, and have laid important groundwork over the last few years to decorporatize and decentralize the social internet.

The problem with decentralized social media platforms thus far is that their user base is minuscule compared to platforms like TikTok, Facebook, and Instagram, meaning the cultural and political influence has lagged behind them. You also cannot directly monetize an audience on Bluesky or Mastodon—which, to be clear, is a feature, not a bug—but also means that the value proposition for an influencer who makes money through the TikTok creator program or a small business that makes money selling chewing gum on TikTok shop or a clothes brand that has figured out how to arbitrage Instagram ads to sell flannel shirts is not exactly clear. I am not advocating for decentralized social media to implement ads and creator payment programs. I’m just saying that many TikTok influencers were directing their collective hundreds of millions of fans to follow them to Instagram or YouTube, not a decentralized alternative.

This doesn’t mean that the fediverse or that a decentralized Instagram or TikTok competitor that runs on the AT.Protocol is doomed. But there is a lot of work to do. There is development work that needs to be done (and is being done) to make decentralized protocols easier to join and use and more interoperable with each other. And there is a massive education and recruitment challenge required to get the masses to not just try out decentralized platforms but to earnestly use them. Bluesky’s growing user base and rise as a legitimately impressive platform that one can post to without feeling like it’s going into the void is a massive step forward, and proof that it is possible to build thriving alternative platforms. The fact that Meta recently blocked links to a decentralized Instagram alternative shows that big tech sees these platforms, potentially, as a real threat.

Submission + - TikTok Is Censoring Anti-Trump Content (newsweek.com)

smooth wombat writes: After going dark for 12 hours in response to a U.S. law saying it must divest from Chinese ownership, TikTok came back online when the new administration took office. However, once up and running, users found one unexpected change: anti-Trump content is now being censored. Words, phrases, and videos that were readily accessible pre-blackout are now unavailable or being removed entirely.

A post on X, formerly Twitter, which had received 4.5 million views at the time of reporting, claims that "TikTok is now region locking Americans from looking up things like 'fascism' and 'Donald Trump rigged election'."

The post includes two screenshots of the TikTok app. Both show the search page, and in both the search term is "Donald Trump rigged election." The post states: "On the left are results from a device in America, and on the right are results from one in the UK."

The screenshot on the left shows a results page stating "No results found," while the one on the right shows two videos of the President.

Another post, from the account Dustin Genereux, said: "Censorship on TikTok is at an all time high with accounts being deleted, posts going back years being flagged, people losing access to the creator fund for saying anything Anti-Trump, MAGA, Elon, etc. But free speech and all that right?"

Earth

Great Barrier Reef Hit By Its Most Widespread Coral Bleaching, Study Finds (theguardian.com) 15

More than 40% of individual corals monitored around a Great Barrier Reef island were killed last year in the most widespread coral bleaching outbreak to hit the reef system, a study has found. The Guardian: Scientists tracked 462 colonies of corals at One Tree Island in the southern part of the Great Barrier Reef after heat stress began to turn the corals white in early 2024. Researchers said they encountered "catastrophic" scenes at the reef.

Only 92 coral colonies escaped bleaching entirely and by July, when the analysis for the study ended, 193 were dead and a further 113 were still showing signs of bleaching. Prof Maria Byrne, a marine biologist at the University of Sydney and lead author of the study, has been researching and visiting the island for 35 years.

Communications

Brendan Carr is Officially in Charge of the FCC (theverge.com) 71

An anonymous reader shares a report: Brendan Carr is now formally the chair of the Federal Communications Commission, giving him the power to set the agency's agenda and usher through a host of regulations with major implications for the tech and media industries as soon as he has a Republican majority. In a statement, Carr named a few areas of focus: "issues ranging from tech and media regulation to unleashing new opportunities for jobs and growth through agency actions on spectrum, infrastructure, and the space economy."

Carr's priorities might also be gleaned from a document you might have already heard about: Project 2025. That's because he authored the FCC chapter of the Heritage Foundation's wishlist for a Donald Trump presidency. In that chapter, Carr proposes actions including: limiting immunity for tech companies under Section 230 of the Communications Decency Act, requiring disclosures about how platforms prioritize content, requiring tech companies to pay into a program that funds broadband access in rural areas, and more quickly approving applications to launch satellites from companies like Elon Musk's Starlink.

AI

Authors Seek Meta's Torrent Client Logs and Seeding Data In AI Piracy Probe (torrentfreak.com) 15

An anonymous reader quotes a report from TorrentFreak: Meta is among a long list of companies being sued for allegedly using pirated material to train its AI models. Meta has never denied using copyrighted works but stressed that it would rely on a fair use defense. However, with rightsholders in one case asking for torrent client data and 'seeding lists' for millions of books allegedly shared in public, the case now takes a geeky turn. [...] A few weeks ago, the plaintiffs asked for permission to submit a third amended complaint (PDF). After uncovering Meta's use of BitTorrent to source copyright-infringing training data from the pirate shadow library LibGen, the request was justified, they argued. Specifically, the authors say that Meta willingly used BitTorrent to download pirated books from LibGen, knowing that was legally problematic. As a result, Meta allegedly shared copies of these books with other people, as is common with the use of BitTorrent.

"By downloading through the bit torrent protocol, Meta knew it was facilitating further copyright infringement by acting as a distribution point for other users of pirated books," the amended complaint notes. "Put another way, by opting to use a bit torrent system to download LibGen's voluminous collection of pirated books, Meta 'seeded' pirated books to other users worldwide." Meta believed that the allegations weren't sufficiently new to warrant an update to the complaint. The company argued that it was already a well-known fact that it used books from these third-party sources, including LibGen. However, the authors maintained that the 'torrent' angle is novel and important enough to warrant an update. Last week, United States District Judge Vince Chhabria agreed, allowing the introduction of these new allegations. In addition to greenlighting the amended complaint, the Judge also allowed the authors to conduct further testimony on the "seeding" angle. "[E]vidence about seeding is relevant to the existing claim because it is potentially relevant to the plaintiffs' assertion of willful infringement or to Meta's fair use defense," Judge Chhabria wrote last week.

With the court recognizing the relevance of Meta's torrenting activity, the plaintiffs requested reconsideration of an earlier order, where discovery on BitTorrent-related matters was denied. Through a filing submitted last Wednesday, the plaintiffs hope to compel Meta to produce its BitTorrent logs and settings, including peer lists and seeding data. "The Order denied Plaintiffs' motion to compel production of torrenting data, including Meta's BitTorrent client, application logs, and peer lists. This data will evidence how much content Meta torrented from shadow libraries and how much it seeded to third parties as a host of this stolen IP," they write. While archiving lists of seeders is not a typical feature for a torrent client, the authors are requesting Meta to disclose any relevant data. In addition, they also want the court to reconsider its ruling regarding the crime-fraud exception. That's important, they suggest, as Meta's legal counsel was allegedly involved in matters related to torrenting. "Meta, with the involvement of in-house counsel, decided to obtain copyrighted works without permission from online databases of copyrighted works that 'we know to be pirated, such as LibGen,'" they write. The authors allege that this involved "seeding" files and that Meta attempted to "conceal its actions" by limiting the amount of data shared with the public. One Meta employee also asked for guidance, as "torrenting from a corporate laptop doesn't feel right."

Comment Re:Haha. (Score 1) 29

Probably different projects though. Not sure what projects the US has, but in the UK you have a number of types of warehouse:

- Fulfilment centres: Where all your normal run-of-the-mill Amazon orders are picked and packed.

- Distribution centres: Centralised locations that orders are routed through; i.e. you might have 20 fulfilment centres sending packages to a distribution centre, which then amalgamates them onto individual trucks (or planes) destined for further-away locations. So imagine 20 trucks bringing packages in from the closest fulfilment centres; those packages are then reorganised such that maybe 4 trucks' worth are amalgamated to go down to London, 4 up to Scotland, 2 to the South West and so on, whilst the remaining ones might go to local delivery stations because they're for local parcels.

- Delivery stations: These are the last mile stations, where parcels are offloaded from trucks onto local delivery vans.

- AMXL warehouses: This is Amazon's extra-large project for oversized goods. If you buy something like a washing machine or fridge on Amazon, it's picked and dispatched through an AMXL centre. They're kitted out with equipment for shifting heavy goods.

- Prime Now warehouses: These are local all-in-one centres that stock a smaller selection of goods people want quickly, typically groceries, the latest video games, and batteries, which are picked, packed, and dispatched all from one place for same-day delivery within 2 hours to people who live close enough to be offered that.

- Returns centres: These handle returns unsurprisingly.

I'm sure there are other types I'm not aware of.

It's not uncommon for Amazon to build clusters, i.e. AMXL, distribution, and delivery sites all next to each other; some are even interconnected, so parcels destined for deliveries local to a distribution centre might travel by conveyor straight from the DC into the delivery station and be routed straight through without packing and unpacking trucks.

So there is method to the madness of them building warehouses next to each other: on the outside they all look the same, but on the inside they're all doing completely different things.
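To make the amalgamation step above concrete, here's a toy sketch of the regrouping a distribution centre performs: inbound parcels from many fulfilment centres are bucketed by destination region and split into outbound truckloads. All names and the capacity figure are invented for illustration; this is a mental model of the process, not anything Amazon actually runs.

```typescript
// Toy model of the cross-docking described above: parcels arriving from many
// fulfilment centres are regrouped into outbound truckloads by destination
// region. All names and the capacity figure are invented for illustration.

interface Parcel {
  id: string;
  destinationRegion: string; // e.g. "London", "Scotland", "South West"
}

const TRUCK_CAPACITY = 1000; // parcels per truck; a made-up figure

// Bucket inbound parcels by region, then split each bucket into truckloads.
function amalgamate(inbound: Parcel[]): Map<string, Parcel[][]> {
  const byRegion = new Map<string, Parcel[]>();
  for (const parcel of inbound) {
    const bucket = byRegion.get(parcel.destinationRegion) ?? [];
    bucket.push(parcel);
    byRegion.set(parcel.destinationRegion, bucket);
  }

  const truckloads = new Map<string, Parcel[][]>();
  for (const [region, parcels] of byRegion) {
    const trucks: Parcel[][] = [];
    for (let i = 0; i < parcels.length; i += TRUCK_CAPACITY) {
      trucks.push(parcels.slice(i, i + TRUCK_CAPACITY));
    }
    truckloads.set(region, trucks);
  }
  return truckloads;
}
```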

Comment Re:Aren't Javascript containers (Score 1) 94

I can kind of understand the use case for this. The problem is that for serverless code execution, cloud providers currently typically use containers to deliver FaaS, and even in the best case you still sometimes have unacceptable cold start times if no instance of the execution environment is cached and available to serve.

This means the promise of cloud-based hyper-scalability through FaaS for web apps has some real problems, at both ends of the scale:

- At the bottom end, with low rates of concurrent execution, FaaS suffers from the cold start problem: you can't realistically serve backend requests for a front-end site using FaaS in this scenario, because there's never an instance ready to serve the request. Each time a user visits the site and a call is made to your FaaS function, the cold start means you could see response times as bad as on the order of seconds, and that's too long for user interaction.

- At the middle range, the cold start problem goes away to some degree because you have enough request frequency that there's always a warm instance available, so most of your users avoid cold starts. You still hit it for some of your users, however, as demand ebbs and flows; your functions still have to scale up and down appropriately, and so it becomes a headache making sure you don't under- or over-provision (note you can have this problem at the low end too, when using things like provisioned concurrency for AWS Lambdas or Azure Premium functions).

- At the high end you have different issues: the promise of FaaS, serverless, and infinite scalability vanishes. AWS only allows 3000 concurrent Lambda executions even in their largest data centres, for example; that's a reasonable load, but I've worked on services where you need to go to, say, 50,000 concurrent requests, and AWS Lambdas just can't do it - your users get sporadic HTTP 429 errors as it throttles your requests. AWS can raise this limit for you, but at that point you're really fudging a solution into a cloud architecture that's straining to cope. Amazon's limits exist because, for all the hype, even Amazon having to fire up 3000 containers for a number of customers at once can strain their capacity; forcing people to explicitly ask for a higher cap means they can better do capacity planning for customers in any given region.

So it's not entirely surprising that there's a push for a more granular type of container: one that's faster to spin up, whilst still being isolated from other execution environments, and with less overhead than even Linux containers. Such a thing is needed to get us closer to that goal of a cloud environment that can support both small and large web app back ends alike, because right now the issues above mean that FaaS is often limited to other types of workload, like background order processing and that kind of thing.
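For what it's worth, the cold start is easy to observe yourself. Below is a minimal sketch assuming a Node.js AWS Lambda-style deployment: module-scope code runs once per new execution environment, so the first invocation in an environment is by definition a cold start. The handler shape is the standard Node.js one; the logged field names are my own invention.

```typescript
// Minimal cold-start probe for a Node.js AWS Lambda-style runtime.
// Module-scope code runs once per new execution environment (a cold start);
// later invocations reuse the warm environment and skip it.

const initTimestamp = Date.now(); // recorded once, at environment start-up
let invocationCount = 0;

export const handler = async (): Promise<{ statusCode: number; body: string }> => {
  invocationCount += 1;
  const coldStart = invocationCount === 1; // first call in this environment
  const environmentAgeMs = Date.now() - initTimestamp;
  console.log(JSON.stringify({ coldStart, environmentAgeMs, invocationCount }));
  return { statusCode: 200, body: JSON.stringify({ coldStart, environmentAgeMs }) };
};
```

Hit an endpoint like this after a quiet period and you'll see coldStart: true alongside a noticeably slower response; hammer it and the warm environments answer quickly.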

I imagine, therefore, that this is what this solution is oriented towards. The problem is that it's not language agnostic, meaning you'd need a similar solution for every other supported language. Ideally you need an execution environment that can guarantee it'll spin up, process, and respond to a request within mere milliseconds, in an isolated execution environment.

Disclaimer: I 100% agree with you about the usage of JS. I use it professionally day in, day out at a FAANG-scale company right now, but have worked with C, C++, Java, C#, PHP, and Python professionally in the past. Companies shouldn't be using JS like they are; it's genuinely leading to poorer quality code, but employers are being baited in by the hordes of cheap young code camp students getting taught JS. We also use TS, but unless you understand OO and types properly these devs just end up treating TS like JS; the second they encounter the need for a complex type they just resort to the any type and go back to the JS way of doing things, and it rapidly becomes a clusterfuck once more. By the time you've taught them to code properly you might as well have just used a more appropriate language like Java, C#, Go, Rust, C++ or similar and trained people from the ground up yourself through an apprenticeship programme, so it's really a false economy using JS because of the "ease of use" or "cheap labour".
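To illustrate the 'any' escape hatch I mean, here's a contrived TypeScript example (all names invented): with 'any' a property typo sails through the compiler and only fails at runtime, while a proper interface turns it into a compile error.

```typescript
// With `any`, the compiler stops checking and JS habits creep back in.
function totalPriceLoose(items: any): number {
  // The typo `prise` compiles fine and silently yields NaN at runtime.
  return items.reduce((sum: number, item: any) => sum + item.prise, 0);
}

// The same function with a real type: the typo becomes a compile-time error.
interface LineItem {
  name: string;
  price: number;
}

function totalPrice(items: LineItem[]): number {
  return items.reduce((sum, item) => sum + item.price, 0);
}

console.log(totalPriceLoose([{ name: "gum", price: 2 }])); // NaN
console.log(totalPrice([{ name: "gum", price: 2 }]));      // 2
```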

Comment Re:I don't think this makes sense (Score 4, Interesting) 25

Probably:

https://earthquaketrack.com/r/...

"North Atlantic Ocean has had: (M1.5 or greater)
6 earthquakes in the past 24 hours
51 earthquakes in the past 7 days
190 earthquakes in the past 30 days
3,109 earthquakes in the past 365 days"

Interestingly, that page shows there was a magnitude 5.3 along the Mid-Atlantic Ridge in the last 24 hours alone. Unfortunately the page is a bit shit to navigate, so I can't find an easy way of seeing the data for the 15th.

I think the problem, though, is that these things happen all the time; whether they cause any surface water movement depends entirely on the unique circumstances of each one. You could have a magnitude 6 that no one notices because it had no real impact, but a magnitude 3 that just happens to trigger a massive underwater landslide resulting in a tsunami.

Having shore-dived the Caribbean regularly, sometimes doing 5 dives a day, I know the waves can vary from fuck all to a metre or so high in the space of just a few hours, separate from standard tidal movements, if the winds pick up. I'd wager there are plenty of 10cm tsunamis, but the vast majority will simply be lost in the natural variation of waves driven by weather. This one was probably only noticed because someone was looking at tidal variation at the time of a big, internationally well-publicised tsunami and has, as a result, gotten a bit overexcited theorising without really thinking it through.

Comment I don't think this makes sense (Score 4, Interesting) 25

Eruption happened at 04:14 UTC; the tsunami hit Japan at 14:14 UTC. Three hours before that would be 11:14 UTC, but the shockwave was travelling at the speed of sound, which is around 761mph at sea level. The Caribbean is at least 7,000 miles away, so at the speed of sound it would take around 9hrs 12mins to get there (7,000 / 761 is roughly 9.2 hours), which would mean about 13:26 UTC at the absolute earliest - only around 45 minutes before the Japanese tsunami, not 3 hours before. Furthermore, if the shockwave itself was causing tsunamis, this doesn't explain why the first Japanese tsunami happened after the first Caribbean one; you'd have expected shockwave-driven tsunamis to appear in Japan before the Caribbean ones regardless, even if the main tsunami driven by the displacement of water at the site of the eruption itself took longer to arrive.
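For reference, here's the travel-time arithmetic as a quick sketch, using only the figures quoted above (761mph, 7,000 miles, 04:14 UTC eruption):

```typescript
// Back-of-the-envelope check of the shockwave timing argument above.
const speedOfSoundMph = 761;   // at sea level, per the comment
const distanceMiles = 7_000;   // minimum Tonga-to-Caribbean distance used above

const travelHours = distanceMiles / speedOfSoundMph;      // ~9.2 hours
const eruptionUtcMinutes = 4 * 60 + 14;                   // 04:14 UTC
const arrivalUtcMinutes = eruptionUtcMinutes + Math.round(travelHours * 60);

const hh = String(Math.floor(arrivalUtcMinutes / 60)).padStart(2, "0");
const mm = String(arrivalUtcMinutes % 60).padStart(2, "0");
console.log(`Earliest shockwave arrival: ${hh}:${mm} UTC`); // ~13:26 UTC
```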

I appreciate it's possible the suggestion is that the shockwave passed through the core of the planet, or similar, rather than around the surface, but I'm not convinced they're not simply confusing correlation for causation here. I suspect what more likely happened is that movements within the earth's core or crust that triggered the Tonga eruption also triggered a minor eruption or shifting of plates, causing an underwater landslide somewhere in the Atlantic around the same time as the eruption near Tonga, resulting in the minor tsunami seen in the Caribbean. This would be a far more plausible explanation, because for starters it would actually be physically possible in the timeframes given.
