
Comment Re: 4GB has been insufficient for many years now (Score 3, Informative) 110

I have not seen AI code that is *more* efficient than human code, yet. I have seen AI write efficient, compact code when pressed very, very hard to do so, but only then. Otherwise, in my hands and those of my developer colleagues, AI produces mostly correct but inefficient, verbose code.

Could that change? Sure, I suppose. But right now it is not the case, and the value system driving auto-generated code (i.e., the training set of extant code) does not put a premium on efficiency.

Comment Re:4GB has been insufficient for many years now (Score 5, Informative) 110

Web browsers are absolute hogs, and, in part, that's because web sites are absolute hogs. Web sites are now full-blown applications written without regard to memory footprint or efficiency. I blame the developers who write their code on lovely, large, powerful machines (because devs should get good tools, I get that), but then don't suffer the pain of running them on perfectly good 8 GB laptops that *were* top-of-the-line 10 years ago, but are now on eBay for $100. MS Teams is a perfect example of this. What a steaming pile of crap. My favored laptop is said machine, favored for its combination of ultra-light weight and eminently portable size; Zoom works just fine on it, but Teams is unusable. Slack is OK, if that's nearly the only web site you're visiting. Eight frelling GB to run a glorified chat room.

The thing that gets my goat, however, is that the laptop I used in the late 1990s was about the same form factor as this one, had 64 MB (yes, MB) of main memory, and booted up Linux back then just about as fast. If memory serves, the system took about 2 MB, once up. The CPU clock on that machine was in the 100 MHz range. Even without accounting for the massive architectural improvements, my 2010s-era laptop should boot an order of magnitude faster. It does not.

Why? Because a long time ago, it became OK to include vast numbers of libraries because programmers were too lazy to implement something on their own, so you got 4, 5, 6 or more layers of abstraction, as each library recursively calls packages only slightly lower-level to achieve its goals. I fear that with AI coding, it will only get worse.

And don't get me started on the massive performance regression that so-called modern languages represent, even when compiled. Hell in a handbasket? Yes. Because CPU cycles are stupidly cheap now, and we don't have to work hard to eke out every bit of performance, so we don't bother.

Comment Re: Why can't the pre-compiled ones be distributed (Score 1) 61

Oh, that's pretty neat. Microsoft is definitely the right level to address this at - they already have permission to enumerate the HW, own the hardware and software infra to tackle this, enjoy economies of scale other players are not privy to, and can deliver a solution in a vendor-agnostic way. Thanks for the heads up. It's the right thing to happen.

Comment Re: Why can't the pre-compiled ones be distributed (Score 1) 61

Of course there are. Tragedy of the commons. My point is that no single entity is likely to absorb the costs unless they're already enjoying economy-of-scale advantages and there are business experience/optics benefits to doing so. The poster above you pointed out that Microsoft seems to be addressing this, which makes a lot more sense to me than doing it at the 3D HW vendor level.

Comment Re:BitTorrent (Score 1) 61

Sure, but many people would opt in, especially if you explained that they would benefit.

Maybe. Maybe not. Before committing to developing such a thing, you'd have to at least do some research and analysis to find out if that's true and how the likely opt-in/out ratios would impact the business case. Remember, this is hosting content in a daemon on your machine... I think that'd be a non-starter for a lot of people, despite the upside of shorter shader updates. (I'm not super up on what the US ISP market/landscape is like these days, but aren't data caps still a thing on many plans there? I get the sense that hosting off a home line is not only a performance concern but one with actual possible financial ramifications.)

It can't be only when the game is open - this is when gamers are most sensitive to their computers doing other work, and the availability of such a network would be far more limited.
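To make the opt-in question concrete, here is a back-of-envelope sketch. All of the numbers and the independence assumption are hypothetical, not measured: suppose a fraction q of online players have your exact configuration and a fraction p of those opt in; then the chance of finding at least one usable peer among n players falls out of a simple miss calculation.

```python
# Back-of-envelope sketch (all parameters hypothetical): probability of
# finding at least one opted-in peer with a usable shader cache.
def p_hit(n: int, q: float, p: float) -> float:
    """n: players online; q: fraction with your exact config;
    p: opt-in rate. Each player independently qualifies with
    probability q * p, so a total miss requires all n to fail."""
    return 1.0 - (1.0 - q * p) ** n

# With a small population on a given patch, a rare config, and a modest
# opt-in rate, the hit chance stays low: 500 players, 0.2% config match,
# 10% opt-in gives a hit probability under 10%.
print(p_hit(500, 0.002, 0.10))
```

Plugging in different opt-in rates shows how sharply the usefulness of the network depends on them, which is exactly the analysis the business case would need.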

Comment Re:BitTorrent (Score 1) 61

Also, a torrent-like network would be absolutely loaded with cache misses. You need to fetch a shader from somebody who has the exact same hardware/driver/game version combination as you do, and they need to have opted in. I strongly suspect that for many players the common case would be a cache miss, ending in compiling locally anyway.
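As a sketch of why misses dominate (field names here are hypothetical, not from any real launcher), the reusability constraint amounts to an exact-match composite key over every compatibility-relevant field; anything short of a full match is a miss and a local compile.

```python
import hashlib

def shader_cache_key(gpu_model: str, driver_version: str, game_version: str) -> str:
    """Composite key: a cached shader binary is only reusable when every
    one of these fields matches the requesting machine exactly."""
    blob = f"{gpu_model}|{driver_version}|{game_version}".encode()
    return hashlib.sha256(blob).hexdigest()

def fetch_shader(peers: dict, key: str):
    """Look up the key among opted-in peers; None means a cache miss,
    i.e., fall back to compiling locally."""
    return peers.get(key)

# Toy illustration: one opted-in peer, and a game patch bump is enough
# to turn a would-be hit into a miss.
peers = {
    shader_cache_key("RTX 4070", "551.23", "1.0.3"): b"<compiled shader blob>",
}
hit  = fetch_shader(peers, shader_cache_key("RTX 4070", "551.23", "1.0.3"))
miss = fetch_shader(peers, shader_cache_key("RTX 4070", "551.23", "1.0.4"))
```

Every extra field in the key multiplicatively shrinks the pool of usable peers, which is why the opted-in population would need to be large before hits became common.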

Comment Re:BitTorrent (Score 1) 61

Asking people to host and serve a non-trivial amount of content to other players is a non-starter. (The compiled shaders for CoD alone can range from a couple of gigs to 10 gigs.) Participation in a torrent-like network would have to be opt-in, and many people would simply opt out (justifiably or not), undermining the point of such a network.

You can probably assume that if you've thought of something, they've thought of it too. They simply have constraints and considerations - both technical and business oriented - you don't need or want to account for.

Comment Re: Why can't the pre-compiled ones be distributed (Score 1) 61

It's worth noting that many game studios/engines do support shared shader caches in their local studio pipelines, but the hardware config spread is much more limited, and the costs for lost productivity waiting for shaders is far greater than hosting a shader cache on premises.

Comment Re:Seems pointlessly unsafe (Score 1) 185

A dummy load and some chemistry to use oxygen would do the same job with zero human risk.

If they're not putting boots on the Moon, they shouldn't have their asses in the rocket.

Remember kids, spaceflight is hard. Nature does not like us being in space, at all. She puts up serious, difficult barriers that we need to overcome. Just look how hard it was for a new program like SpaceX to start from scratch, even with all of the existing knowledge developed by NASA, ESA, etc. How many rapid unscheduled disassembly events did they suffer? I lost count. Even the Russians, who arguably have as much or more LEO experience than the US, continue to face challenges. Heck, so do we, as the current generation of engineers no longer has the direct experience from Gemini and Apollo to guide them. Space is deeply unforgiving of mistakes.

To the GP: if you think that your 5-second considered opinion is better than that of a fleet of talented folks, I'll wager that if you spent more time and did some research, you'd change your opinion. I hope you do.
