Comment Re:Great start! (Score 2) 44

I agree... I would really love a system setting that said "No Shorts, ever, anywhere on the platform", but I suppose we should be grateful that it's only taken YT this many years to finally listen to those of us who've been screaming for today's enhancement.

As others have said... now it's time for a "No AI" filter, but unfortunately YT doesn't ask uploaders whether their video is AI generated or not -- it asks far vaguer questions that also require anyone who uses VFX (as almost any modern feature or indie film does) to tag their uploads as "altered content". Why is that?

Comment "just use a TV" commenters - have you seen one? (Score 1) 52

'The Frame' is different from a standard TV because it has a matte screen. That's the thing - it truly is designed more for static images than for moving glitz and glamour. I like them, and if it weren't for my distrust of Samsung's online behaviour these days I'd likely have got one. I fully understand those who have - it's a nice device that's good at its task.

Comment AI Reaction (Score 5, Insightful) 22

This was an inevitability, and not just because of the tightening job market. The real gasoline dumped on the tire-fire of employment (especially tech employment) in the United States is AI. It used to be that, when I posted a job opening, I got maybe 100 applications over the course of a few weeks from people who moderately embellished their resumes. That's because the conventional wisdom was to spend the time to put together a really class-A application for the jobs that were the best fit for you.

But AI has changed the math.

Now the hours-long process of fine-tuning your resume to fit a job description and crafting a nice cover letter to go with it is the work of a click or two. Candidates have every reason in the world to apply to as many jobs as they can. It's a classic prisoner's dilemma; sure, everyone else is worse off if the HR departments are flooded, but if everyone else does it and you don't, you're never going to land a job.

And so the firehose opens and the job ad that used to get me 100 resumes over the course of a month gets me 1,000 resumes over the course of an hour.

And, just for funsies, most of them are wildly unqualified. AI is happy to lie on your resume for you, and getting it not to do that is hard. So now not only do I have a ton of resumes to go through, I have a crisis of trust on my hands. Which of all these applicants are telling the truth, and which are basically echoing the job ad back at me?

In that situation, it makes a ton of sense to fall back on trust relationships. Harvard, MIT, Stanford, etc. have reputations to uphold; I can count on them to police their candidates. My local university wants to maintain a good relationship with area businesses; if their graduates are BSing me in their resumes, I can meet the head of their career center for coffee and get them to deal with it.

There is no solution here. The employment market is a feedback loop. The bigger the applicant-to-job ratio grows, the harder it is for employers to adequately consider and respond to applications, turning each application into something more akin to a lottery ticket than a proper candidacy. The more that converting an application into a job feels like random chance, the more incentivized applicants are to prioritize quantity over quality, driving the ratio ever higher.

Nothing short of a sudden and profound cratering of the unemployment rate is going to slow this arms race. This isn't the end state of the market; it's going to get worse before it gets better.
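That feedback loop is easy to see in a toy simulation. The sketch below is purely illustrative -- the numbers are invented and it isn't a model of any real market -- but it shows how, once responses become scarce, the rational move is to apply to more jobs, which makes responses scarcer still:

APPLICANTS = 1000          # people looking for work
REVIEW_CAPACITY = 2000     # applications employers can meaningfully review per round

apps_per_applicant = 5.0   # how many jobs each person applies to
for round_num in range(1, 6):
    total_apps = APPLICANTS * apps_per_applicant
    # Chance that any single application gets a real look:
    response_rate = min(1.0, REVIEW_CAPACITY / total_apps)
    print(f"round {round_num}: {apps_per_applicant:6.1f} apps/applicant, "
          f"response rate {response_rate:.1%}")
    # Applicants compensate for worse odds by applying more widely,
    # capped only by how cheap AI makes each extra application.
    apps_per_applicant = min(200.0, apps_per_applicant / response_rate)

Run it and the applications-per-person figure balloons from 5 to the cap within a few rounds while the response rate collapses toward 1% -- the 100-resumes-a-month to 1,000-resumes-an-hour shift described above.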

Comment Re:Why (Score 1) 48

I do find it confusing, yes. What's the difference between the iPad and the iPad Air? Why is the one with the 'Air' moniker not the lightest, and why is the Air not the Mini? What's with the Apple Pencil support being so strange, both in model and functionality? What is a 'Liquid Retina Display' versus an 'Ultra Retina XDR' display?

I believe iPadOS 26 may have sorted this recently, but even within the range when I was looking, maybe a year or 18 months ago, you got different software functionality too, within the same model variation, purely based on screen size. Why could I run Stage Manager on one but not the other? In fact, what the hell is Stage Manager, and why can't I just float a window? All this 'amazing' multitasking - yeah, I've been doing this since the Amiga Workbench days, thank you; nothing new except the weird insistence that it's somehow tied to screen layouts.

As I say, I think iPadOS 26 has cleaned up a bit of the functionality side, so some of my complaints about the OS are now a little out of date, but I was literally standing in the shop with money to burn and I walked out because I was too confused and knew I'd always think I'd missed out somehow. These days I'm not so fussed about one anyway, though I do look in once in a while. I bought the original, the iPad 2 and "the new iPad" (ridiculous name). I haven't bought any since; I still have my iPad 2 but barely charge it - useful for a few hardware synths I have, once in a blue moon.

Comment This is fantastic! (Score 4, Interesting) 99

As a long-term Linux user, I am delighted by this move on the part of Microsoft.

I suspect that at some point even the most die-hard Windows user will tire of AI being shoved down their throat and decide to try out this "Linux thing" they've heard so much about. Given that so many Linux distros are now as easy to use as Windows (or even easier -- to the extent that my 71-year-old wife uses Linux now), this will only boost the market share of the Penguin.

The other benefit is that the more people we have using Linux, the less ability big tech will have to shift us to "hardware as a service" on the back of the massive cost of high-performance desktop computing these days. NVIDIA's GeForce NOW is the perfect example of how we're increasingly being pushed to rent hardware rather than buy it. Today it's GPUs; tomorrow it will be entire systems, because nobody can afford DRAM or a GPU that's up to the task.

I've got Linux running very happily on 3rd and 4th gen Intel i5 processors here with as little as 4GB of RAM and even my most powerful system is just an AMD 5600G with 32GB that serves perfectly well for everything but video editing.

Just keep drilling holes in the bottom of your boat, Microsoft; we don't care.

Comment Re:Why (Score 2) 48

Agreeing with you, and pointing out their worst offender - the iPad range. I have absolutely no idea of the differences between their products there; it makes very little sense to me. And I've been using Apple products for 40 years.

Feels like the new Amelio era to me - ship what we've got in the parts bin and call it a strategy. Too many products.

Comment better sales pitch (Score 2) 85

>DJT made a better pitch

The issue I have with that is that a lot of the sales pitch was a bald-faced lie that anyone with a grade-school education and life experience should have seen through. For example, a president does not have a "grocery price" knob in the Oval Office that he can just turn down on day one. Even if you interpret it generously as "will start implementing policies that will bring down consumer prices," none of that was detailed. In fact, the main talking point about tariffs would obviously have the opposite effect. Anyone with any clue would see tariffs as taxes on the American public, but he kept saying "other countries" pay the tariffs, another bald-faced lie. The guy can't go two complete sentences (if he actually manages to form them) without tossing in a lie.

To bring it back to a /.-appropriate analogy: if Intel colluded with PC publications and "influencers" to publish false benchmarks showing it to be better than AMD, and this led to strong initial sales of a new processor compared to AMD's offering, does that really indicate that that's what consumers wanted? Is it fair to point to those sales numbers and say "see, Intel is more popular, it must be because it's better"? Is the correct solution to also have AMD lie so it's "fair"?

According to polling, it appears that there is significant buyer's remorse going on. What's depressing is that we already got a "test drive" 5-8 years ago. How many of those farmers who needed a bailout last time thought, "hey, we need more of that!"? How about those union workers who saw Trump laughing with a billionaire about firing striking workers? Did they really think, "hey, that's him looking out for the little guy!"? The mind still boggles at the amount of stupidity/willful ignorance it took to RE-elect this guy after plenty of examples of him acting directly against the interests of large portions of the population, in addition to being malicious, ignorant, and a general creep.

I do see that the Democrats didn't have "flashy" messaging. I'm not sure what they could have done against a "flood the zone" firehose of BS. Politics, when done correctly, is (supposed to be) boring, and what we had at the end of Biden's term was boring in a good way.

Comment Re:Should dumb people get degrees? (Score 1) 91

They strive to build The Everyman, but fail at the core mission more often than we'd like.

But... shouldn't they? I graduated from college in 2002. Today I run software development teams that build cloud-hosted machine learning models, pretty much none of which was a thing in 2002. Even if we want to imagine that college is essentially a trade school with ivy and columns, the fact remains that your average college graduate's degree far outlives the value of most of what they're taught, unless they majored in history.

We certainly want people to exit college with the skills they need to enter either the workforce or grad school, but those skills could be acquired for a whole heck of a lot less time and money than a four-year degree. The college degree is about critical thinking skills, managing complexity, proving a capacity for self-education and improvement, and so on. Or it should be, anyway.

Comment Digital signing cameras (Score 1) 66

I tossed this out on /. over a decade (or two) ago. I think the responses I got were either that it's somehow impractical or that it already existed. It's too far back to find, but would it be possible to create a camera with a private key in silicon that signs every image it creates? Any attempt to read the key out of the silicon would destroy it.

The image could then be verified with a public key for that specific camera, and the signature would be embedded in the image (as a watermark) in such a way that any alteration of the image destroys it. I'm not a cryptography guy by any means, so I'm not even sure what the correct terms are. In any case, is there a way to make something like this? At least for special cameras used by police/investigators, could this be done such that it is certain that the image is unaltered after it came from a specific camera?
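What's being described is basically a digital signature rather than encryption: the camera signs each image with a private key that never leaves the hardware, and anyone holding that camera's public key can verify the bytes haven't changed since capture. Here's a minimal sketch of that flow in Python using the "cryptography" package -- the in-memory key and the file name are stand-ins, since a real camera would do the signing inside a secure element:

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Generated once at manufacture time; a real device would keep this inside
# tamper-resistant silicon and expose only the signing operation.
camera_private_key = Ed25519PrivateKey.generate()
camera_public_key = camera_private_key.public_key()  # published per camera

def sign_image(image_bytes: bytes) -> bytes:
    """Signature over the raw image data (done inside the camera)."""
    return camera_private_key.sign(image_bytes)

def verify_image(image_bytes: bytes, signature: bytes) -> bool:
    """Anyone with the camera's public key can check the image is unaltered."""
    try:
        camera_public_key.verify(signature, image_bytes)
        return True
    except InvalidSignature:
        return False

image = open("photo.raw", "rb").read()       # hypothetical capture
sig = sign_image(image)
print(verify_image(image, sig))              # True: untouched original
print(verify_image(image + b"x", sig))       # False: any alteration breaks it

In practice the signature would probably travel in the image metadata rather than as a visible watermark, and the genuinely hard part is keeping the private key unextractable from the silicon -- which is presumably the "impractical" objection those earlier responses raised.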

Comment Did the Space Station put Pepper in the Radiator? (Score 1) 39

I'm reminded of all the BMW cars I've previously owned where it was often said "If there's no oil under it, there's no oil in it"...

Ahh, yes... German cars. If every decent car company does something with 6 parts, the Germans will find a way to make it require 27 parts. All of which are horribly expensive and require specialized tools to install. Or they'll put the timing system at the back of the engine so that a routine service item becomes an engine-out procedure. Garbage cars driven by people who don't know any better.

The space station leak reminds me of an old trick for a leaky cooling system in a car: put pepper into the radiator.

The little flecks of ground pepper get washed around the cooling system and eventually block tiny cracks in the radiator or other places. Putting a raw egg into a *cold* radiator will do the same thing; when the engine gets warm, it cooks and blocks the leak. Both of these tricks have saved me on the road; they do work. But they are temporary, and you need to thoroughly flush the cooling system after the repair.

I wonder if the Space Station has had the same sort of thing happen - airborne dust blocking a leak?

Comment Re:Nope. (Score 3, Informative) 93

Yes, the AI revolution will be hugely different from the various "revolutions" that came before, and not in a good way.

The industrial revolution saw manufacturing automated -- but the jobs that were eliminated were usually low-skill laboring ones. People could retrain and take on more skilled work with higher pay so the net earnings of the workforce actually increased.

The IT revolution once again saw relatively unskilled roles automated by computers, and once again people could retrain for the more skilled roles that grew thanks to the productivity improvements that IT systems offered.

However, the AI age is hugely different.

That's because the roles being displaced are what we already consider to be "skilled" ones. Programmers, artists, writers, musicians, management -- in fact a huge swathe of professional or semi-professional roles will be hugely affected by AI systems. There are no *new* jobs being created by AI (other than a handful of people to dust the server racks in the data centers), so unemployment will rise.

Rising unemployment means less money in the pockets of the average citizen so the economy as a whole will suffer - despite the vastly improved productivity of AI-enabled companies. Without a market for the products and services that AI-enabled companies make, their revenues and profits will also be negatively impacted, despite that higher productivity.

This downward economic spiral could be even worse than a bursting AI bubble and lead to huge socio-economic problems with massive destabilizing effects.

I'm pretty sure that during the Great Depression of the 1930s, lots of people were on four-, three- or even zero-day working weeks, and that didn't work out too well for them.

Comment An economic necessity (Score 3, Informative) 39

With the potential for the Kessler syndrome to kick in any time now, it's an economic necessity for Starlink to do whatever it can to reduce the risk.

As Anton Petrov points out in this video, a Kessler syndrome catastrophe could be just around the corner and the only way to reduce the risk is to reduce the levels of congestion in certain parts of LEO.

Were that to happen, Starlink would become worthless and SpaceX's valuation would fall through the floor, so I guess someone's crunched the numbers and figured they now have little option but to do everything they can to reduce the possibility.

Comment Public domain means nothing (Score 4, Insightful) 36

Sadly, simply placing a work into the "public domain" means nothing these days.

I regularly see YouTube creators who are hit with copyright claims/strikes for using public domain footage from the likes of NASA -- because broadcasters have used that same footage in their own productions, and YT's Content ID system automatically issues a claim/strike when anyone else uses it.

This wouldn't be a problem if YouTube's appeal process worked -- but it doesn't. It's so badly broken that even big creators like Scott Manley are being hit and having the revenue from their efforts stolen, simply because a broadcaster like Channel 4 in the UK has claimed one of his videos for using the very same PD NASA footage that the broadcaster used in its own.

Copyright is so easy to abuse and misuse that it is now almost laughable.
