
Comment He may be missing the quiet part... (Score 1) 141

Eberhart seems like he may be falling for the hype himself. He says "What's happening now isn't innovation; it's aspiration masquerading as disruption..."; but fails to note the fairly profound differences in results between the orbital delivery guys and the moonshot guys; and how neatly that maps onto what is aspiration and what isn't.

Putting satellites into orbit is kind of mundane at this point, too common, too obviously useful to be glamorous; but it's sufficiently useful that more or less anyone with nation-state aspirations wants at least a program that executes; and civilian and day-to-day operations want someone who executes, but cheaper. And that exists. Going to the moon is cool, and it's a nice prestige project for when the gerontocracy needs to show that they still have it, just like when they showed the commies what for; but it's unclear exactly what the point is or what the stakes are beyond that. The customer presumably would like to actually land something on the moon at some point, just to say that they did; but what they are buying is mostly aspiration on the cheap: we get to say that we have a lunar program for way less than Apollo money, you do some open-ended tinkering, honor satisfied.

He can talk about 'accountability'; but it seems like it's a fundamentally hard problem to sustain a lie about how serious you are, at an institutional level, in the long term. It's not like do-or-die projects are free of losers (especially because circumstances have a nasty habit of thrusting them on people whether they like it or not, rather than giving them the luxury of choosing whether to take on those stakes); but they tend to be animated by a sense of genuine urgency. Stuff that is, fundamentally, kind of optional tends, by contrast, to reflect that in bulk. Timmy Rockets may be genuinely more passionate about friction stir welding than you've ever been about anything; but, like his cousin who is a really passionate social worker, he will soon discover that going to the moon and fighting poverty are open-ended projects we do because they sound nice, not because anyone who matters is actually committing to a deadline.

Comment I'm skeptical. (Score 1) 52

I can think of some niche cases where this might be useful (mostly HDD/SSD wear data; though bad actors have been able to tamper with those values without much difficulty); but overall this seems like throwing an awful lot of identifying data and a whole 'trust me bro' shadow subsystem at a problem that the data is unlikely to actually help all that much with.

This will be very good at fretting about whether the refurbisher swapped out RAM or mass storage; but it's not like onboard diagnostics are all that good at picking up the difference between a machine that has had a fairly hard life and now has somewhat dodgy ports and a bit of uncomfortable flex vs. one that sat on a dock most of its life and got unplugged only a handful of times. And any issue that the embedded diagnostics can pick up can also be caught without any special recordkeeping, by just running the diagnostics when you receive the device and verifying that it doesn't throw any errors out of the box.
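For what it's worth, the 'run the diagnostics yourself on receipt' route doesn't need any vendor shadow subsystem at all. A rough sketch of the idea on Linux, assuming smartmontools 7+ is installed; the device path and the attribute names printed are just placeholder examples, and NVMe drives report a different JSON layout:

```python
# Minimal "check the drive yourself on receipt" sketch using smartmontools.
# Covers SATA/ATA drives; adjust the device path for your machine.
import json
import subprocess

def drive_health(device="/dev/sda"):
    # -H: overall health verdict, -A: vendor attributes (wear counters etc.),
    # --json: machine-readable output (smartmontools 7.0+).
    out = subprocess.run(
        ["smartctl", "-H", "-A", "--json", device],
        capture_output=True, text=True, check=False,
    )
    data = json.loads(out.stdout)
    passed = data.get("smart_status", {}).get("passed")
    attrs = {
        a["name"]: a["raw"]["value"]
        for a in data.get("ata_smart_attributes", {}).get("table", [])
    }
    return passed, attrs

if __name__ == "__main__":
    passed, attrs = drive_health()
    print("SMART overall:", "PASSED" if passed else "FAILED or unknown")
    for name in ("Reallocated_Sector_Ct", "Power_On_Hours", "Power_Cycle_Count"):
        if name in attrs:
            print(f"{name}: {attrs[name]}")
```

A drive with tampered firmware can lie to this exactly as easily as it can lie to the vendor's recordkeeping, which is rather the point.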

If you've already got the 'trust me bro' shadow subsystem, I assume it's relatively cheap to propose having it keep more records; but I'm not really convinced of how much value is being added.

Comment What's the core of the project? (Score 1) 23

Is there some problem particular to human DNA that they are looking to solve; or is this just an extension of the ongoing work on DNA synthesis (which, if you are OK with relatively short segments, has come down to being something you can just order, not nearly as exotic as it once was), but being hyped because there's some human-cell genetic engineering at the end, rather than just meeting more aggressive targets for achievable lengths?

Comment Re:Perpetual (Score 2, Interesting) 67

Having spent a whole hell of a lot of time lately on Gnome, configuring it and testing various configurations for rollout at the company I work for, all I can say is that it just works. There's a browser, and, bizarrely, printers just work on Linux now in the way they used to just work on Windows; it's now Windows, at least in an enterprise environment, where printing has become the technical equivalent of having your teeth filed down. Where work does need to be done is on accessibility, so we have one staff member who will stick with Windows 11 for now. LibreOffice's Calc is good enough about 90% of the time, and Writer about 95%. We remain open to Windows machines for special-use purposes, but most people, after mucking around for a bit, are able to navigate Gnome perfectly well, since once they're in the program they need to use, what's going on on the desktop is irrelevant.

On the enterprise back end, support for global authentication has been around a long time, and if you only have admins who know how to navigate a GUI, then you have idiots. The *nix home folder is infinitely superior in every way to the hellscape that is roaming profiles, so already you're ahead of the game.

Comment Re:So, yeah for microkernels? (Score 4, Interesting) 36

That just about sums it up. Moving drivers into user land definitely reduces the attack surface. As it stands, antivirus software in most cases is essentially a rootkit, just one we approve of because that low-level access allows it to intercept virus activity at the lowest level. With a microkernel, nothing gets to run at that level anyway, so microkernels are inherently more secure.

Traditionally the objection to microkernels was that they were slower, since message passing has a processing cost in memory, I/O bandwidth, and CPU cycles. In the old days, when maybe you had a couple of MB of RAM, or even 8 or 16 MB (like my last 486), with a 16-bit ISA bus and chips that at the high end might run at 40-60 MHz, a microkernel definitely was going to be a bit more sluggish, particularly when any part of that bandwidth was being taxed (e.g. running a web stack). So Windows and Linux both, while over time adopting some aspects of microkernel architecture (I believe Darwin is considered a hybrid), stuck with a monolithic design overall, because it really is far less resource intensive.

But we're in an age of 16 GB of RAM and pretty high-end CPUs, where even USB ports have more throughput than an old ISA bus, so I suspect it may be time to revive microkernels.
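To put a toy number on what 'message passing has a processing cost' means, here's a sketch (no resemblance to any real microkernel's IPC, just plain Python pipes) that services the same trivial 'driver' request by direct call and then by a round trip to a separate process:

```python
# Toy illustration of message-passing overhead: the same trivial "driver"
# request handled by a direct call vs. a round trip to another process.
# Absolute numbers are machine-dependent; only the gap is the point.
import time
from multiprocessing import Pipe, Process

def handle(request):
    # Stand-in for a trivial driver operation.
    return request * 2

def driver_process(conn):
    # "User-space driver": service incoming messages until told to stop.
    while True:
        req = conn.recv()
        if req is None:
            break
        conn.send(handle(req))

if __name__ == "__main__":
    N = 50_000

    # Monolithic style: direct function call in the same address space.
    t0 = time.perf_counter()
    for i in range(N):
        handle(i)
    direct = time.perf_counter() - t0

    # Microkernel style: every request becomes a message round trip.
    parent, child = Pipe()
    p = Process(target=driver_process, args=(child,))
    p.start()
    t0 = time.perf_counter()
    for i in range(N):
        parent.send(i)
        parent.recv()
    ipc = time.perf_counter() - t0
    parent.send(None)
    p.join()

    print(f"direct calls: {direct:.3f}s, message round trips: {ipc:.3f}s")
```

Real microkernel IPC is vastly cheaper than Python pipes, but the shape of the trade-off is the same: every driver interaction becomes a message round trip instead of a function call, and modern hardware makes that overhead much easier to swallow.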

Comment Re:dust (Score 4, Insightful) 87

The paper claims that the photochemistry of the particles is important to the process; it allows them to generate free oxygen at the target site when illuminated, without the downsides of just shoving hydrogen peroxide into your sinuses and oxidizing things indiscriminately.

I'm unclear on why small magnetic particles are being called 'robots' now; by that logic you could claim that laser printers use nanorobot swarms to produce text, or that paint is actually an aqueous suspension of visual-band signaling nanites. But it does sound like the surface chemistry of the particles is an important part of the process.

If you were using iron, you might be able to get similar effects by inductive heating once you delivered the particles to the target area; you absolutely could destroy cells in the immediate proximity that way; but it would come down to which option is easier to tightly control and, ideally, more discriminating between bacteria and local human cells. I assume that the actually-qualified people chose photochemistry and free oxygen over inductive heating for good reasons; but I don't know how they compare.

Comment Re:Is weird that LinuxSteam is still 32 bit (Score 1) 61

It's not that weird. Much of what Steam sells won't run as expected without some amount of 32-bit support (you are much less likely to find that the main game is 32-bit for anything that got a cross-platform release on an 8th-gen console, since all but the Switch had more than 4 GB of RAM; but absolutely no promises about every config launcher or random middleware component); and the Steam client itself has, thankfully, remained comparatively lightweight. Probably not as light as it could be; but on the speedier end of programs that are basically web browsers with some background extras.
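If you're curious which binaries in a given game's install are actually 32 bit, you don't need Steam to tell you; on Linux the fifth byte of the ELF header (EI_CLASS) says so directly. A quick sketch, with the fallback path purely an example:

```python
# Report whether a binary is 32- or 64-bit by reading the ELF header:
# bytes 0-3 are the magic number, byte 4 (EI_CLASS) is 1 for ELF32, 2 for ELF64.
import sys

def elf_class(path):
    with open(path, "rb") as f:
        header = f.read(5)
    if len(header) < 5 or header[:4] != b"\x7fELF":
        return "not an ELF binary"
    return {1: "32-bit", 2: "64-bit"}.get(header[4], "unknown ELF class")

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "/usr/bin/ls"
    print(path, "->", elf_class(path))
```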

Not being 64-bit is more of an issue in cases where your one program is the only reason an entire 32-bit support environment is being loaded up; but that's very unlikely to be the case on a Windows gaming system and fairly unlikely on a Linux gaming one; and it's no longer a terribly relevant consideration on macOS now that Apple just executed all the 32-bit stuff. Maybe more people than I think are running Steam on TV-connected PCs purely as a remote play client?

It's not desperately elegant; and if they were holding off because they viewed it as an actually hard problem, you'd be a bit worried about either the state of the codebase or the people working on it; but it's hard to make a strong case for a 64-bit Steam client being a particularly urgent priority, given the software that you normally use Steam to install and run.

Comment Re:Curious... (Score 1) 97

I just said Wi-Fi 6 because that's what all the stuff on their website was. That's what struck me as weirdly unambitious for someone who is pushing a wiring standard capable of substantially more.

I don't write marketing copy; but if I were emphasizing the superiority of fiber, I would have bulked out the list of models with at least a few blatantly 10 Gb or higher options, rather than a bunch of random undemanding APs.
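Rough numbers behind why that reads as unambitious; the AP uplink figures are typical examples, not anything off their spec sheets:

```python
# Back-of-the-envelope link rates; the 10 Gb fiber run is the thing being marketed.
link_rates_gbps = {
    "typical Wi-Fi 6 AP uplink (1 GbE)": 1.0,
    "nicer Wi-Fi 6 AP uplink (2.5 GbE)": 2.5,
    "Wi-Fi 6 theoretical aggregate PHY rate": 9.6,
    "single 10 Gb fiber run": 10.0,
}
baseline = link_rates_gbps["single 10 Gb fiber run"]
for name, rate in link_rates_gbps.items():
    print(f"{name:40s} {rate:5.1f} Gb/s  ({rate / baseline:.0%} of the fiber run)")
```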
