Comment Re:ChatGPT is not a chess engine (Score 1) 112

A lot of the 'headline' announcements, pro and con, are basically useless; but this sort of thing does seem like a useful cautionary tale in the current environment, where hype-driven ramming of largely unspecialized LLMs as 'AI features' into basically everything with a sales team continues apace. Meanwhile there's a steady drumbeat of reports of things like legal filings with hallucinated references; despite the fact that a post-processing layer that just slams your references into a conventional legal search engine, to see if they return a result, seems like a pretty trivial step to either automate or make the intern do.

Having a computer system that can do an at least mediocre job, a decent percentage of the time, when you throw whatever unhelpfully structured inputs at it is something of an interesting departure from what most classically designed systems can do. But for an actually useful implementation, one of the vital elements is ensuring that the right tool is actually being used for the job (which, at least in principle, you can often do, since you have full control of which system will process the inputs; and, if you are building the system for a specific purpose, often at least some control over the inputs).

Even if LLMs were good at chess they'd be stupid expensive compared to ordinary chess engines. I'm sure that someone is interested in making LLMs good at chess to vindicate some 'AGI' benchmark; but, from an actual system implementation perspective, the preferred behavior would be 'Oh, you're trying to play chess; would you like me to set "uci_elo" or just have Stockfish kick your ass?' followed by a handoff to the tool that's actually good at the job.
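For anyone curious what that handoff actually looks like on the wire: UCI is just line-oriented text, and UCI_LimitStrength / UCI_Elo are real options Stockfish exposes for exactly this 'don't kick my ass' case. The helper function below is my own sketch of the commands a front end would send, not anyone's actual implementation:

```python
def uci_handicap_commands(elo=None):
    """Build the UCI command lines a front end would send to an engine
    such as Stockfish when handing off a chess request.

    If elo is given, ask the engine to limit its strength via the
    standard UCI_LimitStrength / UCI_Elo options; otherwise let it
    play at full strength.
    """
    cmds = ["uci"]  # handshake: engine replies with its option list, then "uciok"
    if elo is not None:
        # Strength limiting must be switched on before UCI_Elo takes effect.
        cmds.append("setoption name UCI_LimitStrength value true")
        cmds.append(f"setoption name UCI_Elo value {elo}")
    cmds.append("isready")  # engine replies "readyok" once configured
    return cmds
```

Feed those lines to the engine process's stdin (python-chess wraps this whole exchange if you'd rather not do it by hand) and you've got the graceful version of the handoff.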

Comment Why is dueling CEO quotes a story? (Score 5, Insightful) 32

Why do we even consider it a story when there are a couple of CEO quotes to mash together?

Even leaving aside the nontrivial odds that what a CEO says is flat out wrong, and the near certainty that it's less well informed than what someone at least a layer or two closer to the technology or the product, rather than to vague, abstract 'management', would say; unless a C-level is being cleverly ambushed away from their PR handlers with a few drinks in them, or actively going off script in the throes of some personal upset, why would you expect their pronouncements to be anything but their company's perceived interests restated as personal insights?

Surprise, surprise, the AI-company guy is here to tell us that the very large, high barrier to entry, models are like spooky scary and revolutionary real soon now; even if you wouldn't know it from the quality of the product they can actually offer at the present time; while the AI-hardware guy is here to tell you that AI is friendly and doesn't bite but everyone needs even more than they thought they did, ideally deployed yesterday; because the AI-company people need to hype up the future value of throwing more cash and more patience at money-losing LLMs; and the AI-hardware people need to juice the total addressable market by any means necessary.

Comment Now we have a new problem... (Score 1) 19

This seems like it radically increases the (historically quite low) risk of steroid abuse within network engineering. We don't ask how "Tank Coreswitch" is preparing for the move from 100Kg/E to 400Kg/E; but apparently it involves more endocrinology and dodgy sports medicine than most other networking standards.

Comment Re:Can't Repair in Peace time? (Score 1) 133

I suspect that finding out the hard way would suck; but I'd honestly be a little curious what the breakdown would be between "it's been decades since we sold this stuff with the expectation of more than toy use; it's bad for margins to have more than bare minimum service techs and spares" where you'd basically be screwed; and "we jerk you around because we can; but if you just conscripted our contractors and Defense Production Act-ed our production priorities it would actually work fine".

If the problem is basically just 'because we can' contract fuckery a real war would probably sort it out; because the DoD can also 'because we can' in a pinch. It's if the system looks rotten because, deep down, it's been at least two generations of people selling cool toys that we all know are just going to be used against pitifully inferior non-state or pariah-state actors to people buying cool toys who know how to talk about 'peer adversaries' but can't forget that their entire career has been more or less discretionary and recreational uses of force that we barely bother to call wars.

There are definitely upsides to not having spent prolonged periods of time in hot wars with existential threats recently; but I suspect that it's hard to keep deep cynicism from creeping into the supply chain when it's so hard to pretend that you aren't just going through the motions.

Comment Re:Gaslighting writ large (Score 1) 90

There's an important thing to keep in mind about 'cultural diversity' in this context.

Under typical circumstances, valuing cultural diversity gets to be more than mere enthusiasm for novelty because it's also a desire to protect (at least some; you don't have to deem them all equally desirable) people from being leaned on, more or less aggressively, to stop doing what they are doing. That changes if you get too close to the line of advocating that more hosts be thrown at the problem in order to keep the show going so it remains available. It goes from being a matter of treating people as ends to treating them as means fairly sharply.

Comment Does anyone care? (Score 1, Insightful) 28

I realize that the herd animals of finance and the illustrious thought leadership of LinkedIn essentially assume that you are making coal-powered buggy whips by banging rocks together if you aren't doing nation-state levels of capex on chatbots; but is there any evidence that Apple is actually suffering from their alleged deficiencies?

The angriest reaction I've heard (though I'm only privy to anecdotes rather than significant amounts of buying information) is from people who were pissed that Apple went down the hypebeast route by pre-announcing a bunch of AI faff that wasn't ready to ship; that cut against their historical behavior of saying nothing about, or rubbishing, a category until they decided to enter it; plus the ongoing acknowledgement that Siri is useless.

Comment Re: Lol (Score 1) 44

I hope that you are incorrect, because they've been doing some pretty solid work in the area (albeit still at the stage where you are betting on them continuing to do driver support); but moves like "don't be a total dick about RAM allocations like Nvidia" definitely seem like they could end up on the chopping block if a myopic spreadsheet cruncher gets a look at them.

Comment The sort of 'progress' that makes you more nervous (Score 2) 44

"Some of our early testing with the components we've turned off in Windows, we get about 2GB of memory going back to the games while running in the full-screen experience." is one of those sentences where they guy specifically working on this project sounds like he has done his job; but it really makes you wonder what the hell MS is thinking with the standard setup. It's not like the win11 shell is especially compelling or feature rich; and games' expectations of the platform are weird and varied enough that this wouldn't work if it were some kind of 'disable legacy jank' mode.

Comment Re:Lol (Score 3, Insightful) 44

It seems like a potentially bigger threat to any adventures outside of their 'core' products they want to try.

If I'm just buying a CPU from them that's a fairly low risk bet. Very mature compiler target; more or less a known quantity once benchmarks are available barring the discovery of some issue serious enough to be RMA material. Even if they decide to quit on that specific flavor of CPU the ones I have and the remaining stock should continue to just work until nobody really cares anymore.

If it's something that requires a lot more ongoing investment, though, like targeting Intel for GPU compute or one of the fancy NICs with a lot of vendor-specific offload onboard, I'm going to have a bad day if my effort is only applicable for one generation because there's no follow-up product; and a really bad day if something goes from 'a little rough but being rapidly polished' to 'life support'.

Even back when they made money, Intel never had a great track record for some of that stuff; they've always got something goofy going on on the client side that they lose interest in relatively quickly; like that time when Optane cache was totally going to be awesome; or the more recent abandonment of 'Deep Link technology' that was supposed to enable some sort of better cooperation between integrated and discrete GPUs; but that stuff is more on the fluff side so it hasn't really mattered.

Comment Re:Nice work ruining it... (Score 1) 98

I hope I'm wrong; but my concern is that MS' decision here might end up being a worst-of-both-worlds outcome:

Devices restricted to type-C by mechanical constraints that require the smaller connector have a greater incentive to just skimp on ports; while devices big enough for type-A now have a greater incentive to retain mixed ports, because type-C ports now carry further mandated costs on top of the slightly more expensive connector. If you want to give someone a place to plug in a mouse (poster child of the 'even USB 1.1 was overqualified for this' school of peripherals) you'll either be keeping type A around or running DP, or DP and PCIe, to that port. Fantastic.

Comment Re:Nice work ruining it... (Score 1) 98

I specifically mentioned that case, "You want a cheap just-USB USB port? Either that's Type A so nobody can standardize on connectors; or it gets omitted to meet logo requirements", and noted it as undesirable because it encourages the perpetuation of dongle hell. I'd certainly rather have a type A than no USB port (and, at least for now, I've still got enough type A devices that the port would be actively useful; but that may or may not be true forever; and is less likely to be true for 'want to pack efficiently for travel' cases than for 'at the desk that has my giant tech junk drawer' cases).

As for the controller chip; that's a matter of...mixed...truth with USB-C. The USB part of the port will run from the USB controller, or an internal hub; but any alt mode behavior (like DP support) is related to the USB controller only in the sense that there's a standardized way for it to surrender most of the high speed differential pairs for the use of the alt mode signal. Actually running DP from the GPU to the port is a separate problem. For power delivery; I assume that at least some controllers will implement the negotiation for you (since it's mandatory even for devices that will neither request nor provide more than a relative pittance at 5v); but there is absolutely going to be a per-port cost difference, in terms of the support components and size of traces, between a port that is expecting to provide an amp, maybe 2, of +5v to peripherals and a port that is expecting to take a hundred watts at 20v and feed it to the power input for the entire device.
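To put rough numbers on that last point (the arithmetic is just Ohm's-law-adjacent illustration; 20V/5A is the standard PD profile for 100W, and a couple of amps at 5V is typical legacy-port territory):

```python
def port_current(watts, volts):
    """Current, in amps, that the connector, board traces, and input
    stage of a port have to be provisioned to carry: I = P / V."""
    return watts / volts

# A 'cheap' port supplying ~10 W to peripherals at 5 V:
legacy_amps = port_current(10, 5.0)    # 2 A

# A PD sink port taking a full 100 W contract at 20 V:
pd_amps = port_current(100, 20.0)      # 5 A
```

Copper cross-section scales with current, not wattage, so even though the voltage went up 4x, the 100W port still needs its connector contacts, traces, and input circuitry rated for 2.5x the current of the little one; that provisioning is paid per port whether or not anyone ever plugs in a charger.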

Comment Nice work ruining it... (Score 5, Insightful) 98

This seems both loaded with perverse incentives and like it doesn't even necessarily solve the problem that it claims to solve.

Most obviously, MS is saying that if a port doesn't support a display and device charging it's forbidden. So it's mandatory for all type-C ports to include the expense of power delivery circuitry capable of handling your device's potential load, and either a dedicated video out or DP switching between type-C ports if there are more ports than the GPU has heads. You want a cheap just-USB USB port? Either that's Type A, so nobody can standardize on connectors; or it gets omitted to meet logo requirements. Further, if a system supports 40Gbps USB4, all of its ports are required to do so; including higher peripheral power limits, PCIe tunneling, and TB3 compatibility. You think it might be nice to have a port to plug flash drives into without allocating 4 PCIe lanes? Screw you, I guess.

Then there's what the alleged confusion reduction doesn't actually specify: USB3 systems are only required to support a minimum of 1 display. They need the supporting circuitry to handle that one display being on any port; but just ignoring a second connected DP alt mode device is fine; no further requirements. Data rates of 5, 10, or 20Gbps, and accessory power supply of either greater than 4.5W or greater than 7.5W, are also all fine (except that 20Gbps ports must be greater than 7.5W). USB4 systems have higher minimum requirements, 2 4K displays and 15W power; but are similarly allowed to mingle 40 and 80Gbps ports; and it's entirely allowed for some systems to stop at 2 displays and some to support more, so long as the displays that are supported can be plugged in anywhere.
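The uniformity rules above are simple enough to state mechanically; here's a sketch of a checker for a proposed port layout, encoding my reading of the two headline requirements (every port does display + charging; 40Gbps is all-or-nothing), not the actual logo-program spec text:

```python
def ports_meet_logo_rules(ports):
    """Check a list of type-C port descriptors against the requirements
    as summarized above. Each port is a dict like:
        {"gbps": 10, "display": True, "charging": True}
    """
    # Every type-C port must support display out and device charging.
    if not all(p["display"] and p["charging"] for p in ports):
        return False
    # If any port does 40 Gbps USB4, every port must do so as well.
    if any(p["gbps"] >= 40 for p in ports):
        if not all(p["gbps"] >= 40 for p in ports):
            return False
    return True
```

Note what the checker can't express from the rules as written: nothing distinguishes a 1-display system from a 3-display one, or a 4.5W port from a 7.5W one, which is exactly the residual ambiguity complained about above.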

Obviously the tendency to do type-C ports that are just totally unlabeled, or labeled with a teeny cryptic symbol, was not unduly helpful; but this seems like taking what could have been a fairly simple distinction (like the one that existed all the way back in the firewire/USB 1.1 days, or in thunderbolt/USB systems, or slightly more informally on non-Intel systems without thunderbolt) between "the fast port that does the things" and "the cheap port that is in ample supply"; and 'reducing confusion' by just banning the cheap port that is in ample supply (unless it's type A, for space consumption and to prevent connector standardization).

Are you really telling me that there wasn't something you could come up with to just tell the user which ones are power/video/PCIe and which ones are normal random accessory USB ports? I hope you like docking stations; because it seems like there will be a lot of those in our future.
