Comment I can see the point. (Score 2, Insightful) 42

Social media has become a toxic dump. If you wouldn't allow children to play in waste effluent from a 1960s nuclear power plant, then you shouldn't allow them to play in the social media that's out there. Because, frankly, of the two, plutonium is safer.

I do, however, contend that this is a perfectly fixable problem. There is no reason why social media couldn't be safe. USENET was never this bad. Hell, Slashdot at its worst was never as bad as Facebook at its best. And Kuro5hin was miles better than X. Had a better name, too. The reason it's bad is that politicians get a lot of kickbacks from the companies and the advertisers, plus a lot of free exposure to millions. Politicians would do ANYTHING for publicity.

I would therefore contend that Australia is fixing the wrong problem. Brain-damaging material on Facebook doesn't magically become less brain-damaging because kids have to work harder to get brain damage. Nor are adults mystically immune. If you took the planet's IQ today and compared it to what it was in the early 1990s, I'm convinced the global average would have dropped 30 points. Australia is, however, at least acknowledging that a problem exists. They just haven't identified the right one. I'll give them participation points. The rest of the globe, not so much.

Comment Re:Just shows he does not really understand hardwa (Score 2) 81

One major difference, assuming you've got full platform support (which should be the case on any server or workstation that isn't an utter joke, but can be a problem with some desktop boards that 'support' ECC only in the sense that AMD didn't laser it off the way Intel does, without really caring), is that ECC RAM can (and should) report even correctable errors, so you get considerably more warning than you do with non-ECC RAM.

If you pay no attention to error reports, ECC and non-ECC are both rolling the dice, though ECC has better odds. Proper ECC plus Linux EDAC support will let you keep an eye on worrisome events (normally with something like rasdaemon; I'm not sure what other options there are for aggregating the kernel-provided data) and, unless the RAM fails particularly dramatically and thoroughly, gives you much better odds of knowing you have a hardware problem while it's still at correctable levels. That way you can take appropriate action: either replacement, or, on the really fancy server systems, some 'chipkill'-like arrangement where the specific piece of DRAM that is failing gets cut out of use once deemed unreliable, without having to bring the system down.
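To make the "keep an eye on worrisome events" part concrete, here's a minimal sketch in Python of reading the per-controller error counters the Linux EDAC subsystem exposes under /sys/devices/system/edac/mc; rasdaemon builds on the same kernel-provided data. The paths are the standard EDAC sysfs layout; the messages and any implied thresholds are purely illustrative, not a recommended policy.

```python
# Minimal sketch: read the per-memory-controller error counters exposed by
# Linux EDAC. ce_count = errors ECC corrected, ue_count = errors it couldn't.
# What you do with the numbers (alerting, replacement policy) is up to you.
from pathlib import Path

EDAC_ROOT = Path("/sys/devices/system/edac/mc")

def edac_counts():
    counts = {}
    for mc in sorted(EDAC_ROOT.glob("mc[0-9]*")):
        ce = int((mc / "ce_count").read_text())
        ue = int((mc / "ue_count").read_text())
        counts[mc.name] = (ce, ue)
    return counts

if __name__ == "__main__":
    for mc, (ce, ue) in edac_counts().items():
        if ue:
            print(f"{mc}: {ue} uncorrectable errors -- data has already been lost")
        elif ce:
            print(f"{mc}: {ce} corrected errors -- RAM is misbehaving, plan a swap")
        else:
            print(f"{mc}: no reported errors")
```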

Comment Re:BSoD was an indicator (Score 2) 81

Sometimes you'd get a BSOD that was a fairly clear call to action, when the error called out something recognizable as the name of part of a driver; but that is mostly just a special case of the "did you change any hardware or update any drivers recently?" troubleshooting that people have been doing more or less blind since forever. It's admittedly slightly more helpful in cases where, as far as you know, the answer to those questions is 'no', but Windows Update did slip you a driver update, or a change in OS behavior means that a driver that used to work is now troublesome.

Realistically, as long as the OS provides suitable support for being configured to collect actual crash-dump material if you want it, it's hard to object too strongly to the idea that just rebooting fairly quickly is probably the better choice versus trying to make the BSOD a genuinely useful debugging resource. That's especially true given how rare it is for the person with useful debugging ability to happen to be at the console at the time of the crash, rather than an end user who is ill equipped to make sense of it, or a system that mostly does server stuff (quite likely not on actual physical hardware) where nobody has touched the physical console in months or years and it's more or less useless to display a message there, rather than rebooting and hoping that things come up enough that management software can grab the dump files, or giving up and leaving the system in EMS so that someone can attach to that console.
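As an aside on the "configured to collect actual crash-dump material" part: on Windows that setting lives under the CrashControl registry key. A rough, illustrative sketch of checking it from Python is below; the value-to-dump-type mapping reflects my understanding of the common settings and may not cover every Windows version.

```python
# Rough sketch (Windows-only): check whether the machine is set up to collect
# kernel crash dumps at all, using the standard CrashControl registry values.
import winreg

CRASH_KEY = r"SYSTEM\CurrentControlSet\Control\CrashControl"
DUMP_TYPES = {0: "none", 1: "complete", 2: "kernel", 3: "small (minidump)", 7: "automatic"}

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CRASH_KEY) as key:
    enabled, _ = winreg.QueryValueEx(key, "CrashDumpEnabled")
    try:
        dump_file, _ = winreg.QueryValueEx(key, "DumpFile")
    except FileNotFoundError:
        dump_file = r"%SystemRoot%\MEMORY.DMP"  # usual default location

print(f"Crash dump type: {DUMP_TYPES.get(enabled, f'unknown ({enabled})')}")
print(f"Dump written to: {dump_file}")
```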

Comment Re: Definition of car vs SUV (Score 1) 254

It's not an SUV unless it can comfortably go up an old logging road with exposed rocks embedded in the road and many potholes, on a mountain.

A Pathfinder was an SUV, as was a 4Runner or a Range Rover.

These new "SUVs" (grocery getters, kid soccer transporters) are a marketing joke that we all seem to have accepted. Those are cars. with a slightly more upright seating position.

Comment Re:study confirms expectations (Score 1) 201

That's actually a good question. Inks have changed somewhat over the past 5,000 years, and there's no particular reason to think that tattoo inks have been equally mobile across this timeframe.

But now we come to a deeper point. Basically, tattoos (as I've always understood it) are surgically engineered scars, with the scar tissue supposedly locking the ink in place. It's quite possible that my understanding is wrong - this isn't exactly an area I've looked into in any depth, so the probability of me being right is rather slim. Nonetheless, if that understanding were correct, you might well expect the stuff to stay there. Skin is highly permeable, but scar tissue less so. As long as the ink molecules exceed the size that can migrate, you'd think it would be fine.

That it isn't fine shows that one or more of these ideas must be wrong.

Comment Just shoddy... (Score 4, Interesting) 95

What seems most depressing about this isn't that the bot is stupid, but that something about 'AI' seems to have caused people who should have known better to ignore precautions that are old, simple, and relatively obvious.

It remains unclear whether you can solve the bots-being-stupid problem even in principle; but it's not like computing has never before dealt with actors that either need to be saved from themselves or are likely malicious, and between running more than a few web servers, building a browser, and slapping together an OS, it's not like Google doesn't have people on payroll who know about that sort of thing.

In this case, the bot being a moron would have been a non-issue if it had simply been confined to running shell commands inside the project directory (which is presumably under version control, so worst case you just roll back), not above it where it can hose the entire drive.
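For illustration, here's a minimal Python sketch of that kind of confinement: resolve every path the agent wants to touch and refuse anything that escapes the project root. The directory name and helper are made up for the example, not taken from any actual agent implementation, and a real sandbox would need more (containers, handling of symlinks created mid-session, and so on).

```python
# Toy sketch of path confinement: any file operation the agent requests is
# resolved to an absolute path and rejected unless it stays inside the project
# root. PROJECT_ROOT is a made-up example path.
from pathlib import Path

PROJECT_ROOT = Path("/home/user/my-project").resolve()

def confine(requested: str) -> Path:
    """Resolve a path the agent asked for; raise if it escapes the project."""
    target = (PROJECT_ROOT / requested).resolve()
    if target != PROJECT_ROOT and PROJECT_ROOT not in target.parents:
        raise PermissionError(f"refusing {target}: outside {PROJECT_ROOT}")
    return target

# confine("src/old_module.py")  -> allowed; worst case, version control restores it
# confine("../../..")           -> PermissionError instead of a hosed drive
```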

There just seems to be something cursed about 'AI' products (not sure if it's the rush to market, or whether mediocre people are the ones most fascinated with the tool) that invites a really sloppy, heedless, lazy failure to care about useful, mature, relatively simple mitigations for the well-known (if not particularly well-understood) faults of the 'AI' behavior itself.

Comment Re:Only part of the story... (Score 1) 126

What always puzzled me about Intel's...more peripheral...activities is that they seemed to fall into a weird, unhelpful gap between 'doing some VC with the Xeon money, rather than just parking it in investments one notch riskier than savings accounts' and 'strategic additions to the core product'. That normally meant the non-core stuff had limited synergies with Intel systems, carried the risks that come with being a relatively minor program at a big company with a more profitable division, and was thus subject to being co-opted or killed at any time.

Seemed to happen both with internal projects and with acquisitions. Intel buys Altera because, um, FPGAs are cool and useful and it will 'accelerate innovation' if Intel is putting the PCIe-connected FPGA on the CPU's PCIe root complex rather than a third-party vendor doing it? Or something? Even at the tech-demo level I'm not sure we saw a single instance of an FPGA being put on the same package as a CPU (despite 'Foveros' also being the advanced-packaging hotness that Intel assured us would make gluing IP blocks together easy and awesome). They just sort of bought them and churned them without any apparent integration. No 'FPGA with big fuck-off memory controller or PCIe root we borrowed from a Xeon' type part. No 'Intel QuickAssist Technology now includes programmable FPGA blocks on select parts' CPUs or NICs. Just sort of: Intel sells Altera stuff now.

On the network side, Intel just kind of did nothing with, and then killed off, both the internal Omni-Path (good thing it didn't turn out that having an HPC-focused interconnect you could run straight from your compute die would have been handy in the future... luckily NVLink never amounted to much...) and the stuff they bought from Barefoot; and at this point they barely seem to ship NICs without fairly serious issues. I'm not even counting Lantiq, which they seem to have basically just spent five years passing on to MaxLinear with minimal effect, unless that one was somehow related to the period where they sold cable modem chipsets that really sucked. It's honestly downright weird how bad the news seems to be for anything Intel dabbles in that isn't the core business.

Comment Re:Quality Work Can't Be Rushed (Score 1) 126

Not delivering on schedule is absolutely a symptom; it's just a somewhat diagnostically tricky one, since the failure can come from several directions and 'success' can be generated by gaming the system in several places, as well as by successful execution.

In the 'ideal' case, things mostly happening on schedule is a good sign because it means both that the people doing the doing are productive and reliable, and that the people trying to plan have a decent sense (whether personally, or by knowing what they don't know and where they can get an honest assessment) of how long things are going to take, whether there's something useful that can be added or whether forcing some mythical man-month on the people already working on it would just be a burden, whether anything in the critical path is going to disrupt a bunch of other projects, and so on.

If you start losing your grip on the schedule, that fact alone doesn't tell you whether your execution is dysfunctional, your planners are delusional, or some combination of the two; but it's not a good sign. Unhelpfully, the relationship between how visibly the Gantt charts are perturbed and how big the problem is isn't obvious: a company whose execution is robust but whose planners live in a world of vibes-based theatre, and one whose execution is dysfunctional and crumbling while its planners reuse estimates from the time before the rot set in, might blow a roughly equal number of deadlines, despite one having mostly a fluff problem and the other probably being in terminal decline. But it's never a good sign.

Comment Re:Seems reasonable (Score 2) 25

It seems reasonable; but also like something that should really spook the customers.

It seems to be generally accepted that junior devs start out as more of an investment than a genuine aid to productivity; so you try to pick the ones that seem sharp and with it, put some time into them, and treat them OK enough that they at least stick around long enough to become valuable and do some work for you.

If that dynamic is now being played out with someone else's bots, you are making that investment in something that is admittedly less likely to leave (whatever as-a-service you are paying for will happily continue to take your money), but which is much more likely to have its price regularly and aggressively adjusted based on its perceived capabilities, and to have whatever it learned from you immediately cloned out to every other customer.

Sort of a hybrid of the 'cloud' we-abstract-the-details arrangement and the 'body shop' we-provision-fungible-labor-units arrangement.

Some customers presumably won't care much, sort of the way people who use Wix because it's more professional than only having your business on Facebook don't exactly consider web design or site reliability to be relevant competencies; their choice is mostly between pure off-the-shelf software and maybe-the-vibe-coded-stuff-is-good-enough. But if your operation depends in any way on your comparative ability to build software, Amazon is basically telling you that any of the commercial offerings are actively process-mining you out of that advantage as fast as the wobbly state of the tech allows.

Comment Re:Wrong question. (Score 1) 197

Investment is a tricky one.

I'd say that learning how to learn is probably the single most valuable part of any degree, and anything that has any business calling itself a degree will make this a key aspect. And that, alone, makes a degree a good investment, as most people simply don't know how. They don't know where to look, how to look, how to tell what's useful, how to connect disparate research into something that could be used in a specific application, etc.

The actual specifics tend to be less important, as degree courses are well behind the cutting edge and necessarily grossly simplified, because at that stage it's still really only crude foundational knowledge. Students at undergraduate level simply don't know enough to know the truly interesting stuff.

And this is where it gets tricky. Because an undergraduate four-year degree is aimed at producing thinkers. Those who want to do just the truly depressingly stupid stuff can get away with the two-year courses. You do four years if you are actually serious about understanding. And, in all honesty, very few companies want entry-level hires who are competent at the craft; they want people who are fast and mindless. Nobody puts in four years of network theory or (Valhalla forbid) statistics for the purpose of being mindless. Not unless the stats destroyed their brain - which, to be honest, does happen.

The humanities don't make things easier. There would be a LOT of benefit in technical documentation written by folk with some sort of command of the language they're using. Half the time, I'd accept stuff written by people who are merely passing acquaintances of the language; a vague awareness that there is a language would sometimes be an improvement. But that requires taking a 2x4 to the usual cultural bias that you cannot be good at STEM and the arts at the same time. (It's a particularly odd cultural bias, too, given how highly Leonardo is held in esteem and how neoclassical universities are at or near the top in every country.)

So, yes, I'll agree that a lot of degrees are useless for gaining employment and a lot are useless for actually doing the work, but the overlap between the two is vague at times.
