
Comment Re:SUDO should not even be in Linux (Score 1) 100

Arguably it depends on whether you are expecting sudo to act as a rigid security barrier that you can use to create accounts of intermediate privilege; or whether you are treating it mostly as a tool that lets people you'd give root to anyway reduce the amount of stuff they actually run as root.

It's pretty tricky to use it as a security barrier, even when it works perfectly, because so many of the tools that you'd potentially want to use sudo to grant access to are not really designed to restrict the user: once you have a package manager running as root you can use it to do basically anything by installing a package that imposes the changes you want; all kinds of utilities can just pop a shell or be used to edit arbitrary files; etc. Even if sudo itself is free of holes, you'd really need a whole set of deliberately constrained utilities to prevent it from being used for privilege escalation. At that point it probably makes more sense to rethink the security model from the other direction, and focus on reducing the number of operations that are root-only in favor of ones that can be delegated to groups.
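To make that concrete, here's a sketch of the problem (the sudoers entry and the "deploy" user are hypothetical; the escape routes are just well-known examples of the general pattern):

```
# Hypothetical /etc/sudoers.d/deploy entry: let the "deploy" user run
# the package manager as root without a password.
deploy ALL=(root) NOPASSWD: /usr/bin/apt

# This is not a security barrier: apt runs maintainer scripts as root,
# so "deploy" can escalate by installing a package whose postinst does
# anything at all (add a sudoers entry, drop a setuid shell, etc.).
# Many other common delegation targets escape just as easily, e.g.:
#   sudo less /var/log/syslog   then  !sh   (shell escape from the pager)
#   sudo vi /etc/motd           then  :!sh
```

The rule grants exactly what it says; the problem is that the granted tool was never designed to stay inside that grant.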

Where it's much more useful is letting someone who is basically trusted with root avoid logging in as root and running giant chunks of software with high privileges that don't need (and probably shouldn't be trusted with) them, just because they logged in as root and so everything they do is running as root.

Comment Re:So crappy processes? (Score 4, Informative) 43

That's what amazes me.

Maybe I'm just old; but "Signature Authority List" is supposed to mean what it says (possibly blue pen if you really are old; cryptographic if you aren't); it doesn't mean "verbal authorization in a video chat that may or may not even be recorded somewhere with retention policies set".

I'd be more sympathetic if this were one of the low-value ones where someone impersonates the CEO and tells a random executive assistant or other fairly low-on-the-food-chain employee to make a relatively petty cash transfer to the scammers: you have to feel bad for the person who doesn't want to hassle the big boss, even if they have doubts. But someone with approval authority in the multiple millions is someone whose job description (implicitly or explicitly) is to be slightly prickly about actually approving things.

Comment I wonder why... (Score 1) 42

I'm curious whether the backend for hosting this is disproportionately complex (either following a design from when 19GB of data was still something of note; or perhaps quite literally a configuration that has been carried forward for a couple of decades with only minimal changes, though I'd assume it's not still running on literally the same FTP servers it started on); whether it was someone's passion project and they are retiring or have died; or whether the bean counters are looking so carefully and squeezing so tight that university IT isn't being allowed to throw a pittance at preserving a piece of history that it can't cross-charge to another cost center.

Coming from working on much more recent systems, it's a little hard to wrap my head around: we have Legal browbeating people and having us enforce retention policies specifically to keep vastly larger amounts of data from just being inadvertently retained, because it's actively a hassle to go through and weed things out; and while storing it doesn't cost nothing, it often compares favorably to the cost of determining who can give the OK to delete it and hassling them.

I can only assume that, given its age, this system has a lot more infrastructure complexity (possibly understood best by people who are leaving or gone) per GB; so it's not really about the disk space, or the brutal bandwidth load imposed by the tiny OS/2 enthusiast community, but about a comparatively fiddly backend.

Either that or someone in bean counting is being astonishingly petty.

Comment Re:The future (Score 1) 45

Aside from the potential effect on upgrades of "the lost decade" (or decades, sources differ) that started in the early 90s, it actually seems like a reasonably common pattern: technology buildouts that are impressive and functional for their time have a habit of becoming entrenched and (through some combination of relative adequacy vs. rev. 1 of the new stuff, and incumbents with investments they don't want to write off) remaining sticky longer than one would like.

We certainly saw a similar thing in the US with, say, wireline telco: you may not have loved the monopoly prices; but aggressive coverage levels were a national policy, reliability was high, and Bell Labs was doing all sorts of neat stuff. That all proved to be...unhelpful...when it came to cellular adoption at either reasonable prices or with reasonable handset features: stateside a Blackberry was the future; everyone else was dealing with carrier-locked BREW garbage and paying per-SMS (and paying more for WAP, except that it sucked so much that most people couldn't be bothered). Over in Europe pay-per-SMS was much less of a thing and Symbian-type arguably-smartphones were reasonably common; and Japan had i-mode and all the handsets built around its still-a-weird-proprietary-mess-but-way-the-hell-better-than-WAP capabilities.

Of course, that ended up being the same phenomenon again, in its turn: US carrier-based services (SMS, MMS, WAP, etc.) were expensive or hot garbage or both; which made the US market ripe for rapid adoption of 'contemporary' style smartphones that do support cellular standards, but are fundamentally oriented around doing as much as possible over TCP/IP with the carrier just acting as a pipe; because only Blackberries were even remotely non-garbage among the more telco-oriented 'smart' phones. In Europe and Japan the old style didn't last forever; but the relative quality and sophistication of pre-"It's all just TCP/IP on a small computer, right?" style designs actually gave the iPhones and Androids a run for their money. In some cases (like the ability to do contactless payments in certain subway systems and things from your phone) the new gear remained a regression in certain respects for years afterwards.

Comment Re: Am I missing something? (Score 1) 86

There is the additional complication of whether 'cloud' just means "VPS" or whether you are hitting the more abstracted tools that most of the cloud guys offer.

They'll all certainly sell you very classical VMs, just with more room to scale them up or down or bring more online than you probably have at home; but you are starting to look at architectural changes if you wander into the "managed instance" or "serverless" offerings (Cosmos DB or Aurora; AWS Lambda or Azure Functions; S3 buckets; etc.)

Obviously those are still someone else's computer under the hood; "serverless" just means that it's hidden and you can't touch it, not that it's truly absent. But if you start hitting those sorts of services you are making changes that mean "your datacenter vs. their datacenter" only remains true at a fairly high level: if you wanted to move back you'd either have to change how you are doing some things, or go with something like OpenStack that is dedicated to making your computers present AWS-style abstractions.
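The shape of the change is easy to see in a minimal sketch of the "serverless" style. The handler(event, context) signature follows the AWS Lambda Python convention; the event shape and names here are made up for illustration:

```python
import json

# Minimal sketch of a "serverless" function: the platform invokes this
# per request. You never see the server, the OS, or the process
# lifecycle around the call -- which is exactly the abstraction you'd
# have to recreate (or unwind) if you moved back on-prem.
def handler(event, context):
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"greeting": f"hello, {name}"}),
    }

# Locally it's just a function you can call with a fake event:
result = handler({"name": "cloud"}, None)
print(result["statusCode"], result["body"])
```

Nothing exotic; but note that the deployment unit is a function plus platform configuration, not a machine you administer, and that is where the architectural lock-in creeps in.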

Comment Innovation! (Score 2) 43

For too long Goodhart's Law has oppressed those who seek to reward some measurable behavior, by inspiring the measured to game whatever the target is.

Now, with Watson just-screw-the-employees technology, IBM has demonstrated a bold path forward: just make it real clear that hitting whatever measure you are treating as a target might actually result in no rewards whatsoever; so it's not worth gaming!

Comment Re:computers are reliable (Score 1) 96

They generally seem to be better behaved than the software that runs on them; but computers absolutely aren't reliable (especially the ones that skip things like ECC and storage-medium redundancy). What's even worse is that (unlike software, which at least in principle can be correct, even if it's generally uneconomic to write it at the level of formal verification and people don't bother) hardware fails unpredictably. Some particularly bad designs or defective components can make certain failures so overwhelmingly likely as to be good guesses (as during the capacitor plague era; or with certain laptops that are known to stress their internal display cables at the hinge); but sooner or later physical degradation catches up with them all in one way or another.

Comment Re:Jettisons Itanium? (Score 3, Informative) 52

There will probably be Itanium users for a while; but only the ones who have some legacy workload that absolutely cannot be touched for some reason or another.

Some of those people will probably be willing to pay for careful security backports; some will just firewall it and roll the dice; but neither are really of any interest to the mainline kernel.

Worse, there's no incentive whatsoever to use it outside of that handful of legacy cases: it was produced in fairly modest quantities; the last significant improvement in the architecture came with the Poulson chips in late 2012 (Kittson arrived in mid 2017 but was on the same process node and included no architectural improvements; the project was on life support at that point); and even at release Itanium was having a pretty good day when it traded blows with Xeons; so we're talking something between a Sandy Bridge Xeon on a bad day and maybe a Haswell one with a following wind, except really obscure. There's also no afterlife as an embedded instruction set or other niche application; it was only ever the one product line.

Comment Re:It's got nothing to do with AI (Score 1) 128

That computing power isn't 'backing'; it's a transaction cost. It may have taken a zillion hashing operations to bring a 'coin' to its current state; but it's not like you can convert it back into that amount of compute power for your use (even if it's one of the flavors that does something relatively general purpose, rather than one highly specific flavor of makework that has no other uses); it's all just expended.

It's like saying that a paper currency is 'backed by printing presses'. Do you need to do some printing to keep a supply of bills in decent condition in the field? Sure. Does that mean that bills are convertible into print services? Only very incidentally if there are printers who accept that currency as payment for jobs; the printing done to put the bills in circulation is just a sunk cost that cannot be converted into anything.

Comment Not sure the math works, or what is being asked. (Score 1) 93

"On another level, however, it's a disaster for about 99 percent of releases, which stand absolutely no chance of garnering any attention, no matter their quality. The solution: human storefront curation, which Valve has never shown any intention of doing."

I'm not quite sure how this is intended to work: if "human storefront curation" is intended to provide better recommendations it's quite possible that it would be successful, though the automated similarity/people-like-you-bought ones already aren't terrible; but that wouldn't really change the fact that the majority of releases die in obscurity. There are just so many that they cannot all be visible at once; only some relatively more visible than others.

If "human storefront curation" is intended to mean tougher reviews; then isn't the effect the same? Roughly the same games that today languish in obscurity will instead just not get listed.

I don't mean to defend Steam's curation and discovery as the gold standard; there's definitely room for improvement. But I just don't see how even an arbitrarily good curation and discovery mechanism, with downright omniscient understanding of what each buyer wants and what each game delivers, will substantially change the fact that there aren't enough man-hours available for 14,500 games/year to get enough attention to keep most of them from selling basically nothing. Especially when so many of them are just bad, or OK-ish but a direct clone of a strictly better game. There probably are some undiscovered gems that are tragically unknown and undersold because they fall into some sort of algorithmic blind spot; but there are also overdiscovered messes that probably deserve more obscurity than they get, so curation improvements would cut both ways.

Comment Re:Just in case... (Score 2) 45

If it actually works as advertised; that might be part of the appeal of the multichannel support:

Assuming that the link can fall back gracefully to a lower frequency when necessary, and remain reasonably stable, just slower; it becomes a lot easier to see the higher-frequency bands as just a nice bandwidth bonus that you get when you happen to have decent line of sight to an AP, rather than something you simply can't trust because having your link drop is deeply irksome.

Comment Seems like the wrong area... (Score 1) 32

It wouldn't be entirely surprising (though, given some of the...interesting...choices you get in Chinese phone vendors' Android skins, hardly assured) if a phone vendor can do a car infotainment system that at least comes out of the gate not feeling a decade old and badly broken, unlike some of the primarily-car guys; but that doesn't really seem like it's going to cut it unless the implied plan is to change the lifecycle pretty radically.

It's a pretty major failure if a car doesn't last long enough for its embedded systems to be a decade-plus old while it is still within its operating life; so (while, obviously, being broken from day 1 is not going to do you any favors) it seems like the problem you need to solve is ultimately one of compartmentalization rather than just freshening up: How are you going to keep the really low-level stuff (motor controllers, random sensors, power windows, and all the other CAN bus widgets), which is mostly fixed-function but probably not such a masterpiece of security that it should be allowed near anything internet-connected, suitably isolated? Is the user-facing UI intended to defer to external devices with shorter replacement cycles (Android Auto/CarPlay style)? Is it running under a hypervisor on reasonably overqualified hardware, such that you'll be able to target generic OS updates at it with a minimum of fuss more or less indefinitely, rather than being left behind the second the low bidder's BSP ages out? Is it a system-on-module intended to interface with the vehicle only at a few well-defined and standardized connection points?

Comment Re: Generally agree. (Score 1) 174

For some reason Microsoft did that thing they like to do and murdered the easy, user-visible option in favor of a more cryptic thing aimed at IT: the User State Migration Tool (https://learn.microsoft.com/en...). If memory serves the two have a lot in common (I think you can even get them to recognize one another's output files); but USMT is all command line and XML config, and intended for bulk use.

It's honestly kind of weird; sort of like the period where Time Machine was a gross, rather brittle, hack on top of HFS+, but at least it was there for users; while NTFS volume shadow copies were a robust feature, but one that was only ever used by IT or for the (always terrible) "system restore points".

Comment Re:It's the politics (Score 1) 228

"It puts the mother to be at risk every time it's done"

It's invasive and risky by the standards of first-line contraception; but it has an enviable safety record compared to childbirth. Over a factor of ten lower mortality rate in US statistics; probably not quite as dramatic in places with lower maternal death rates.
