
Comment Re:Why not? (Score 1) 134

Side mirrors almost always leave a large blind spot directly behind and close to the vehicle. There's a reason that when firefighters are reversing their appliances they always have at least one of the crew physically get out and watch the area behind the vehicle.

Even a rear window and rear view mirror almost always leave a significant blind spot low and close behind the vehicle, which is why reversing cameras became a thing. When they're done well, they really are significantly safer, as well as sometimes making it a lot more reliable for most people to park the vehicle in difficult spaces.

Comment Re:What's "eye-like focal length"? (Score 1) 134

One of the modern innovations I really would like to have is full AR on my windscreen. I want unexpected hazards highlighted in real time, particularly those that are more easily detectable by non-visual sensors, like big potholes or animals obscured by vegetation near the side of a country road. I want the actual driving line I need to take to follow my planned route through complex junctions subtly overlaid on my view of the road ahead. I want light amplification for night driving, ideally combined with some other technology that can reduce the glare from oncoming headlights to prevent dazzle.

Although I only want all of this if (a) it's implemented well and (b) any additional data it uses is reliably up-to-date and (c) there's an emergency shut-off that instantly clears everything off the windscreen in case anything goes wrong.

Comment Re:Mirrors (Score 1) 134

We don't need tech to replace something that works better than the tech.

Oh, don't be silly. Next you'll be making even more absurd claims, like that car theft was already a solved problem 20 years ago thanks to immobilisers, or that having separate physical controls for essential functions that you can find and use without taking your eyes off the road for several seconds to mess around with a touchscreen is safer, or that no-one ever hacked 100,000 cars at once from 1,000 miles away back when they didn't have always-on remote connectivity and allow OTA updates to their essential control systems.

Comment Re:Let me guess: new standard? (Score 2) 27

Google learned to embrace, extend and extinguish right out of Microsoft's playbook. They were excellent students and you can see the results in how email and web "standards" work today.

The difference is that when Microsoft did it the authorities eventually started getting in their way to promote more openness and competition again. So far there is little sign that anyone intends to challenge the way a few tech giants have recently been capturing long-established standards that we rely on for what have become vital services and effectively taking ownership for their own purposes. The governments and their regulators are either asleep at the wheel or, if you're a bit less trusting, bought and paid for.

Comment Re:Working with other people's code (Score 0) 150

Yes. So far, the LLM tools seem to be much more useful for general research purposes, analysing existing code, or producing example/prototype code to illustrate a specific point. I haven't found them very useful for much of my serious work writing production code yet. At best, they are hit and miss with the easy stuff, and by the time you've reviewed everything with sufficient care to have confidence in it, the potential productivity benefits have been reduced considerably. Meanwhile even the current state-of-the-art models are worse than useless for the more research-level stuff we do. We try them out fairly regularly, but they make many bad assumptions, and when told those assumptions are not acceptable, they completely fail to generate acceptable-quality code that is a complete and robust solution of the original problem, suitable for professional use.

Comment Re: sure (Score 2) 150

But one of the common distinctions between senior and junior developers -- almost a litmus test by now -- is their attitude to new, shiny tools. The juniors are all over them. The seniors tend to value demonstrable results and as such they tend to prefer tried and tested workhorses to new shiny things with unproven potential.

That means if and when the AI code generators actually start producing professional standard code reliably, I expect most senior developers will be on board. But except for relatively simple and common scenarios ("Build the scaffolding for a user interface and database for this trivial CRUD application that's been done 74,000 times before!") we don't seem to be anywhere near that level of competence yet. It's not irrational for seniors to be risk averse when someone claims to have a silver bullet but both the senior's own experience and increasing amounts of more formal study are suggesting that Brooks remains undefeated.

Comment Re:Please don't use Paramount+ Platform (Score 3, Interesting) 55

(+1, Truth)

Of all the major streaming platforms, Paramount+ stands alone in how often it just doesn't work. It doesn't work reliably on state-of-the-art streaming boxes. It doesn't work reliably on desktop PCs. In fact, of all the devices we have in our household, it works reliably on a total of zero of them.

We have several of the other commercial streaming platforms plus the apps or online services for several of our main national TV channels as well and almost all of them work almost all of the time. It's bizarre how bad Paramount+ manages to be compared to literally everyone else. It must be hurting their bottom line to some degree or surely will do soon if they don't get a handle on it, because why pay for something you literally can't watch?

Comment Re: Interesting Summary (Score 1) 58

There's a difference between not using AI tools at all and not using code generated by AIs.

The latter involves a lot of risks that aren't well understood yet -- some technical, some legal, some ethical -- and it's entirely possible that some of those risks are going to blow up in the face of the gung-ho adopters with existential consequences for their businesses.

I mostly work with clients in industries where quality matters. Think engineering applications where equipment going wrong destroys things or kills people and where security vulnerabilities are a proxy for equipment going wrong.

I know plenty of smart, capable people working in this part of the industry who are totally fine with blanket banning the use of AI-generated code on these jobs. A lot of that code simply isn't up to the required standards anyway, but even if it does produce something you could actually use, there are still all the same costs for review and certification that any other code incurs. That includes the need for at least one human reviewer to work out why the AI wrote what it did, which may or may not have any better answer than "statistically, it seemed like a good idea at the time".
