Comment Re:Only part of the story... (Score 1) 83

What always puzzled me about Intel's more peripheral activities is that they seemed to fall into a weird, unhelpful gap between 'doing some VC with the Xeon money, rather than just parking it in investments one notch riskier than savings accounts' and 'strategic additions to the core product'. That normally meant the non-core stuff had limited synergies with Intel systems, carried all the risks of being a relatively minor program at a big company with a more profitable division, and was thus subject to being co-opted or killed at any time.

Seemed to happen both with internal projects and with acquisitions. Intel buys Altera because, um, FPGAs are cool and useful and it will 'accelerate innovation' if Intel is putting the PCIe-connected FPGA on the CPU's PCIe root complex rather than a 3rd-party vendor doing it? Or something? Even at the tech-demo level I'm not sure we saw a single instance of an FPGA being put on the same package as a CPU (despite 'Foveros' also being the advanced-packaging hotness that Intel assured us would make gluing IP blocks together easy and awesome). They just sort of bought them and churned them without any apparent integration. No 'FPGA with a big fuck-off memory controller or PCIe root we borrowed from a Xeon' type part. No 'Intel QuickAssist Technology now includes programmable FPGA blocks on select parts' CPUs or NICs. Just: Intel sells Altera stuff now.

On the network side, Intel just kind of did nothing with, and then killed off, both the internal Omni-Path (good thing it didn't turn out that having an HPC-focused interconnect you could run straight from your compute die would have been handy in the future... luckily NVLink never amounted to much...) and the stuff they bought from Barefoot; and at this point they barely seem to ship NICs without fairly serious issues. I'm not even counting Lantiq, which they seem to have basically just spent five years passing on to MaxLinear with minimal effect; unless that one was somehow related to the period when they sold cable modem chipsets that really sucked. It's honestly downright weird how bad the news seems to be for anything Intel dabbles in that isn't the core business.

Comment Re:Quality Work Can't Be Rushed (Score 1) 83

Not delivering on schedule is absolutely a symptom; it's just a diagnostically tricky one, since the failure can come from several directions, and 'success' can be generated by gaming the system in several places as well as by genuinely successful execution.

In the 'ideal' case, things mostly happening on schedule is a good sign because it means two things at once: the people doing the doing are productive and reliable, and the people doing the planning have a decent sense (whether personally, or by knowing what they don't know, where to get an honest assessment, and actually getting one) of how long things are going to take. That includes judging whether there's something useful that can be added, whether forcing some mythical man-month onto the people already working would just be a burden, whether anything in the critical path is going to disrupt a bunch of other projects, and so on.

If you start losing your grip on the schedule, that fact alone doesn't tell you whether your execution is dysfunctional, your planners are delusional, or some combination of the two; but it's not a good sign. Unhelpfully, the relationship between how visibly the Gantt charts are perturbed and how big the underlying problem is turns out to be non-obvious: a company whose execution is robust but whose planners live in a world of vibes-based theatre, and one whose execution is crumbling and whose planners are reusing estimates from before the rot set in, might blow a roughly equal number of deadlines, despite one having mostly a fluff problem and the other probably being in terminal decline. Either way, it's never a good sign.
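As a toy illustration of why a raw missed-deadline count can't distinguish the two failure modes, here's a small simulation. The distributions and all the parameters are invented purely for illustration; the point is only that two very different pathologies can produce nearly identical miss rates.

```python
import random

def missed_deadline_rate(actual_mean, estimate_mean, n=10_000, seed=1):
    """Fraction of tasks whose actual duration exceeds the planned estimate.

    actual_mean   : average real task duration (execution quality)
    estimate_mean : average planned duration (planning quality)
    Durations are drawn from exponential distributions, chosen
    arbitrarily for this sketch.
    """
    rng = random.Random(seed)
    misses = 0
    for _ in range(n):
        actual = rng.expovariate(1 / actual_mean)
        estimate = rng.expovariate(1 / estimate_mean)
        if actual > estimate:
            misses += 1
    return misses / n

# Robust execution, vibes-based planning: tasks take ~10 units, plans say ~6.
fluffy = missed_deadline_rate(actual_mean=10, estimate_mean=6)

# Crumbling execution, stale-but-once-honest plans: tasks now take ~16,
# plans still assume the old ~10.
rotten = missed_deadline_rate(actual_mean=16, estimate_mean=10)

print(fluffy, rotten)  # roughly similar miss rates, very different root causes
```

Both scenarios miss around 60% of deadlines, so the headline number alone can't tell you which company has a fluff problem and which one is rotting.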

Comment Re:what else is new? (Score 1) 111

No, it isn't. Neither a tiny collection of actual hermaphrodites and genetic sports, nor male fetish play, justifies the rewriting of common sense.

Humans are divided into two sexes, except when something goes WRONG.

Your view is complete bullshit: an ideology cloaked in "science", promulgated by John Money, a sadistic pedophile whose kinks ended up destroying at least one family, with BOTH sons eventually dying by suicide after his "therapies". But hey, at least he took ample photos when he compelled them to play sex games with each other, right?

Comment what else is new? (Score 0) 111

The population's lack of a grasp on reality is awful, but not surprising.

We just spent, what, 4 years with every official agency, every mainstream media outlet, even MEDICAL professionals saying, nay, insisting, that putting on makeup or a dress could make a man literally a woman. That chopping off a child's genitals and replacing them with a simulacrum would do the same.

Don't blame chatgpt for the general public's lack of grasp on reality. That has been the product of carefully crafted propaganda.

Oh btw, every angry downvote just emphasizes my point.

Comment Re:Decentralized services (Score 2) 206

Looked up details on the wording, and it may not be just a logistical nightmare but a legal impossibility. The law appears to only apply to specific platforms, and no Mastodon servers appear on the list. New instances wouldn't either, so there'd be no legal basis for trying to force them to ban teens.

Comment Decentralized services (Score 2) 206

I bet a large enough number of those kids know about Fediverse-based services like Mastodon to start spreading the word. Instead of a dozen large social media platforms, the government will be faced with thousands of bulletin-board-sized "services" networked together into a platform with no single place you can go to deactivate accounts. Controlling that would be a logistical nightmare.

Comment Re:Seems reasonable (Score 2) 23

It seems reasonable; but also like something that should really spook the customers.

It seems to be generally accepted that junior devs start out as more of an investment than a genuine aid to productivity; so you try to pick the ones that seem sharp and with it, put some time into them, and treat them OK enough that they at least stick around long enough to become valuable and do some work for you.

If that dynamic is now being played out with someone else's bots, you are making that investment in something less likely to leave (whatever as-a-service you are paying for will happily continue taking your money), but much more likely to have its price regularly and aggressively adjusted based on its perceived capabilities, and to have whatever it learned from you immediately cloned out to every other customer.

Sort of a hybrid of the 'cloud' we-abstract-the-details arrangement and the 'body shop' we-provision-fungible-labor-units arrangement.

Some customers presumably won't care much, sort of the way people who use Wix because it's more professional than only having your business on Facebook don't exactly consider web design or site reliability to be relevant competencies; their choice is mostly between pure off-the-shelf software and maybe-the-vibe-coded-stuff-is-good-enough. But if your operation depends in any way on your comparative ability to build software, Amazon is basically telling you that any of the commercial offerings are actively process-mining that advantage out of you as fast as the wobbly state of the tech allows.

Comment Re:Anyone still using IPv4 (Score 2) 40

Most consumers today aren't using IPv4 by choice, but by necessity. Every OS out there supports IPv6, as does every router made in the last 10 years, and they use it pretty much automatically when it's available. The main reason people still use IPv4 is that their ISP hasn't deployed IPv6 on the residential network, so IPv6 isn't available unless you're a techie and recognize the name Hurricane Electric. The next most common reason is that the site they're accessing only has IPv4 addresses assigned, so connections automatically fall back to IPv4. Consumers have control over neither of those things.
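The "automatic" part is just destination-address selection: when the resolver returns both A and AAAA records, the stack tries the IPv6 candidates first (RFC 6724 specifies the full rule set; Happy Eyeballs then races the connections). A deliberately crude sketch of just the family preference, using made-up example addresses and no real DNS lookup:

```python
import ipaddress

def prefer_ipv6(addresses):
    """Order candidate addresses IPv6-first.

    Real stacks apply RFC 6724's much longer rule list (scope matching,
    precedence table, longest matching prefix); this sketch captures only
    the 'try IPv6 before IPv4' preference the comment above describes.
    """
    parsed = [ipaddress.ip_address(a) for a in addresses]
    return [str(a) for a in sorted(parsed, key=lambda a: a.version, reverse=True)]

# A dual-stack site resolves to both families; the IPv6 address is tried first.
dual = prefer_ipv6(["93.184.216.34", "2606:2800:220:1:248:1893:25c8:1946"])
print(dual)

# An IPv4-only site gives the client no choice, which is the second
# reason consumers end up on IPv4.
v4_only = prefer_ipv6(["93.184.216.34"])
print(v4_only)
```

This is why the fix has to come from ISPs and site operators: the client-side preference for IPv6 has been sitting there, unused, for years.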

Comment Re:Wrong question. (Score 1) 186

Investment is a tricky one.

I'd say that learning how to learn is probably the single most valuable part of any degree, and anything that has any business calling itself a degree will make this a key aspect. That alone makes a degree a good investment, as most people simply don't know how. They don't know where to look, how to look, how to tell what's useful, how to connect disparate research into something that could be used in a specific application, etc.

The actual specifics tend to be less important, as degree courses are well behind the cutting edge and are necessarily grossly simplified, because it's still really only crude foundational knowledge at that point. Students at the undergraduate level simply don't know enough to know the truly interesting stuff.

And this is where it gets tricky, because an undergraduate 4-year degree is aimed at producing thinkers. Those who want to do just the truly depressingly stupid stuff can get away with the 2-year courses. You do 4 years if you are actually serious about understanding. And, in all honesty, very few companies want entry-level hires who are competent at the craft; they want people who are fast and mindless. Nobody puts in four years of network theory or (Valhalla forbid) statistics for the purpose of being mindless. Not unless the stats destroyed their brain, which, to be honest, does happen.

Humanities does not make things easier. There would be a LOT of benefit in technical documentation to be written by folk who had some sort of command of the language they were using. Half the time, I'd accept stuff written by people who are merely passing acquaintances of the language. Vague awareness of there being a language would sometimes be an improvement. But that requires that people take a 2x4 to the usual cultural bias that you cannot be good at STEM and arts at the same time. (It's a particularly odd cultural bias, too, given how much Leonardo is held in high esteem and how neoclassical universities are either top or near-top in every country.)

So, yes, I'll agree that a lot of degrees are useless for gaining employment and a lot are useless for actually doing the work, but the overlap between those two groups is vague at times.
