Comment Re:*nix systems are more stable? -- We know.... (Score 1) 171

Ever notice why no one celebrates Patch Tuesday in the pub? It's because they're all sitting at consoles, waiting for stuff to break.

Windows, client and server alike, is a house of cards, and this goes far back in history. Your challenges to those citations are each provably wrong. Ever wonder why the cloud isn't rife with Windows servers? There's a reason for that. "Cloud-native Windows" is almost an oxymoron; Linux and, to a lesser extent, BSD have taken over that space.

In so many ways, Windows is now a legacy OS in data centers, and for good reason. Its developer community has all but collapsed. Its backward compatibility with other house-of-cards platforms like .NET has put a ball and chain around its neck.

The addition of tawdry pre-release-in-production AI in the form of Copilot causes new train wrecks each and every day.

No serious services developer uses Windows as a new development platform. The metaphors you diss are a dodge; you know exactly what the remarks are about. Windows continues to be a sieve for security. Linux and BSD, correctly configured, are far more difficult to breach, and correct configuration doesn't take much.

As a developer, if you are one, Microsoft is putting you out to pasture. Have fun eating oats.

Comment Re:Could this all be solved (Score 1) 26

And rightfully so, IMHO. The IA has legitimate fair-use goals, but I don't think fair use extends to the bulk copying of an archive for profit.

It's also my belief that the Internet makes outright theft of others' intellectual property really convenient.

You can't have it both ways; kleptocracy is evil.

Comment Cheap AI is here to stay (Score 3, Interesting) 112

I disagree completely with the premise. Prices don't necessarily have to increase. We have yet to reach maturity in chips optimized for inference, on top of the long-running trend of computing work per dollar increasing over time. Moore's law might be "slowing down," but we aren't at the end of the road yet. Keeping today's features running will cost less to deliver in the future, in both infrastructure and electricity.
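
As a back-of-envelope sketch of that compounding (the 2.5-year cost-halving period and the $20/month baseline here are illustrative assumptions I picked for the example, not measured industry figures):

    # Back-of-envelope: cost to deliver a fixed feature set, assuming
    # the cost per unit of inference halves every `halving_period` years.
    # Both the halving period and the baseline are assumptions.
    def projected_monthly_cost(base_cost, years, halving_period=2.5):
        return base_cost * 0.5 ** (years / halving_period)

    for years in (0, 5, 10):
        print(f"year +{years}: ${projected_monthly_cost(20.0, years):.2f}/month")

Under those assumptions, a $20/month feature set costs about $5 to deliver in five years and $1.25 in ten, which is the whole argument in two lines of arithmetic.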

What will more likely happen is that features and functionality keep expanding to use more processing power. But where is the limit? I say it comes when we can render, in near real time at 8K/240 Hz, a video (or video game, if you prefer) with a procedurally generated world, characters, and storyline based on the user's input via whatever real-world data, UI, or sensors you want to use. This might even be possible now if you are a billionaire with access to million-dollar server farms. My imagination probably isn't broad enough to estimate the limit of "personal computing," but computing power beyond that seems pointless for any one individual.

In any case, the price of an AI product will depend on the features offered and how much hardware is needed. You can probably run a 500-billion-parameter LLM on a smart watch in 2055, but if you want that power today, it seems to cost about $20 a month. I doubt anyone will ever try to charge more for today's $20 feature set. The price of this stuff will absolutely decrease; the only unknowns are how companies roll out new functionality and how specific future features fit into the pricing tiers over time.
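
For scale, a purely illustrative look at the raw memory a 500-billion-parameter model needs at common weight precisions (the bytes-per-parameter figures are the standard ones; ignoring activations, KV cache, and runtime overhead is a simplifying assumption):

    # Illustrative only: raw weight storage for a 500B-parameter LLM.
    # Ignores activations, KV cache, and runtime overhead.
    PARAMS = 500e9

    for label, bytes_per_param in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
        print(f"{label}: ~{PARAMS * bytes_per_param / 1e9:,.0f} GB of weights")

Even aggressively quantized to 4 bits, that's roughly 250 GB of weights alone, which is why the smart-watch scenario reads as decades out rather than a few product cycles.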

Comment Re: Slashdot method (Score 1) 39

It's somewhat understandable, though. The $5 contributors have waited this long; they can wait a little longer for things to be fixed, if that is possible. Ultimately, though, they may need to verify identity, or at least a domestic phone number within the user's region, which may not be what the users were looking for.

Comment Re:Necessary Questions (Score 1) 86

I know who the maintainers are and what their responsibilities and trajectories are. While no one (Linus) was looking, lots of diverse platforms became supported, and provisions were made for very progressive (if often never, ever used) modularity.

On the app side, an enormous number of apps found their way inside, often stuff that users (remember users?) didn't ask for but got anyway: the Big Gulp treatment.

I was indeed writing operating systems before Linus Torvalds was born. I wrote about Linux in Byte long ago, and I've watched a ton of operating systems grow and fail for want of a practical purpose or momentum.

I've surfed the wave of FOSS and Linux (and various BSD) developments for a long time. I have an engineer's sense that less is better, and that unmaintained yet still-distributed JUNK anchors itself into the attack surface and invites fun attacks. Or watch users (remember users?) become dogged by sheer inode displacement.

Distros don't apply Darwinian selection to their set of choices, or even public demand for their content; they simply shovel stuff in. More is better. This policy is provably a poor one.

Enabling interesting stuff in the kernel and the bloat of distribution apps are highly intertwined: the kernel and the app payload enable each other.

Philosophically, and relating this to the OP: bloat is bad. It eventually serves too many masters at the price of integrity and TCO.
