
Comment Re:Depends on the project (Score 1) 24

These doodles were far more common up to the mid-90s or so - basically right up until designs moved from LSI to VLSI. In the LSI and early-VLSI era and before, chips were generally laid out by hand - masks were hand-drawn (you might remember people mentioning Rubylith, a masking film: when it came time to do the photoreduction the Rubylith blocked the light, but under normal light it was a transparent red, so you could see what you were masking).

As such, doing such doodles was relatively easy since you had the time and space to do it.

But once the VLSI era arrived, the practice on digital chips was whittled down to initials and the like. It was also strongly discouraged, because those doodles had been known to interact with signals and cause issues.

Because analog IC design generally doesn't use VLSI techniques - and isn't on cutting-edge processes - analog chips often still carry doodles, since much of the layout is still done by hand.

These days you're most likely to just get initials, since those are relatively easy to import into the design and there's generally an area of the chip set aside for such text anyway (die ID codes are usually visible there).

Comment Excuses, excuses… (Score 1) 39

He's arguably not wrong that VMware's offerings outside of their core product are kind of inchoate (though, in fairness, it's not like the "hyperscale cloud" guys don't all have a stable of shit thrown at the wall to see what sticks surrounding the core services that people actually care about or trust). But that seems like a pretty shabby excuse in this context, where it would have been trivial to just not fuck with what people were using and liked while making the alleged investments in glorious future VMware, then letting the value proposition of that help sell it.

As it is, it's hard to read this as anything other than an awkward (and almost certainly temporary - nobody ever genuinely stops trying to boil the frog once they start) climbdown after recklessly spooking more customers, harder, than intended.

Comment Re:Funny how this is only for the EU (Score 2) 35

> Nope. Apple still requires ALL apps to be reviewed and notarized.
>
> I'm not sure why people keep having this misconception that Apple has opened anything up. Apple still has final say on approving an app, which means that this is nothing like how it works on Android.

No, Apple is NOT reviewing any apps not using the App Store. Apple is notarizing apps, but that just means Apple is signing them.

Basically you send your IPA through Apple's page, and it comes back to you signed.

Apple will sign anything, without reviewing it. If something turns out to be malware then Apple could potentially figure out whose app it is and ask the developer to fix it.

Comment Re:Why (Score 4, Informative) 115

> 8.3 came from Gary Kildall (who never worked for Microsoft) in 1974. Microsoft wasn't even founded until a year later and in 1980 chose to make the DOS file system compatible with Kildall's existing CP/M. None of this seems unwise or shows a lack of foresight on Microsoft's part.

Gary Kildall was the CEO of Digital Research. They created CP/M, a very popular operating system for the 8-bit microcomputers of the time built around the Intel 8080 and Zilog Z80 CPUs.

Microsoft at that point mostly made BASICs for these microcomputers - things like the FAT filesystem trace their origins to the disk BASICs from Micro-Soft (as it was styled then).

Now, IBM famously arranged a hush-hush meeting with Digital Research to get CP/M ported to 16-bit processors (i.e., the 8086) for their new Personal Computer. Unfortunately, Gary was out flying, and his wife refused to sign the NDA IBM wanted. The widely publicized version of the story is that Gary was flying for fun and didn't want to be disturbed, which irritated IBM, since the meeting had been mutually arranged. In reality, Gary's wife handled the business side (Gary was the technical guy/programmer), so Gary wasn't actually needed for what was expected to be a session hammering out legal agreements. And it wasn't only his wife declining to sign the NDA: the two sides also couldn't agree on a licensing arrangement. IBM wanted "free and clear" licensing - they'd pay Digital Research a lump sum for unlimited copies - and that was something Digital Research refused to do. So IBM left.

IBM then asked Micro-Soft (who was already providing the BASIC and language support) whether they could supply the OS as well, which is how Microsoft came to buy QDOS from Seattle Computer Products - a clean-room implementation of a CP/M-like OS for the 8086. (It was clean-room almost by definition, since CP/M for 16-bit processors didn't yet exist.) Digital Research threatened legal action against IBM and Microsoft, and IBM settled by also offering CP/M, once it was available, with the IBM PC. Incidentally, this was one of the first "look and feel" disputes - the complaint was that MS-DOS was a little too close to CP/M.

Of course, by then it was too late - MS-DOS and PC-DOS had been well established for years, and were cheaper at $99 vs. DR's 16-bit CP/M at $250.

And yes, 8.3 was ridiculous in the 80s, when even contemporary home computers supported longer file names - I think the Commodore 64 supported 16 characters for a filename, while the Apple II supported 30 characters (!) in DOS and 15 characters (aww...) in ProDOS. And the Macintosh supported 32 characters initially.

And we put up with this silliness until the mid-90s, when every other computer and OS on the market supported much longer filenames. Or, *gasp*, spaces.

It should be noted that FAT12 and FAT16 directory entries are hardcoded around 8.3 filenames - each entry holds 8 characters for the name and 3 for the extension, with shorter names padded out with spaces. Long filename support arrived with Windows 95 as the VFAT hack layered on top: the long name is stored across extra directory entries flagged so that older DOS tools ignore them, and FAT32 (introduced in Windows 95 OSR2 and made mainstream in Windows 98) uses the same scheme. Linux picked this up as the "vfat" filesystem, though you could still mount a volume as "msdos" to get back to plain 8.3 (and there was the umsdos extension to add UNIX permissions and ownership information).
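To make the 8.3 limit concrete, here is a minimal sketch of the classic 32-byte FAT directory entry in C (the field names are mine; the layout follows the on-disk format):

/* Classic 32-byte FAT12/FAT16/FAT32 directory entry (illustrative names). */
#include <stdint.h>

#pragma pack(push, 1)
struct fat_dirent {
    char     name[8];            /* base name, space-padded: "README  " */
    char     ext[3];             /* extension, space-padded: "TXT" */
    uint8_t  attr;               /* 0x0F (RO|hidden|system|volume) flags a VFAT long-name entry */
    uint8_t  nt_reserved;
    uint8_t  create_time_tenths;
    uint16_t create_time;
    uint16_t create_date;
    uint16_t access_date;
    uint16_t first_cluster_hi;   /* always zero on FAT12/FAT16 */
    uint16_t write_time;
    uint16_t write_date;
    uint16_t first_cluster_lo;
    uint32_t file_size;
};
#pragma pack(pop)

_Static_assert(sizeof(struct fat_dirent) == 32, "on-disk entry is exactly 32 bytes");

There are exactly 8 + 3 bytes for the name and nowhere else to put more - hence 8.3. VFAT smuggles long names in as extra entries with attr set to 0x0F (13 UTF-16 characters apiece), which older DOS tools simply skip.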

Incidentally, long filename support was covered by one of the patents Microsoft initially sued over, leading to Linux's vfat being made slightly incompatible (I believe it stopped generating the 8.3-compatible short name alongside the long one), but that was OK, as Windows would auto-create the short names on mounting.

Comment Re:It's not really a surprise (Score 2) 140

I disagree. Music can often be played in places where other forms of entertainment are impractical.

For example, background music is present almost everywhere because it occupies only your sense of hearing while you're doing other things. It's extremely difficult to read a book and drive, for example (though many people try), or to watch TV and read a book at the same time. But listening to music while driving, reading, or shopping happens all the time.

It's one of those things that can make taking exams difficult - if you study while listening to music (and many people do), going into a silent exam room can be disconcerting to the point of distraction.

Of course, the problem with music is radio. Clear Channel (now iHeartMedia) has basically destroyed music by turning it into something overplayed, bland, and identical in every market. Back when there were ownership limits (first you couldn't own more than six stations, then something like no more than three in a single market) you had more variety. Now every market has a KISS-FM, or a MOVE radio, or whatever else iHeartMedia runs.

Comment Re:Price of living (Score 1) 30

You missed a very important task - getting a job requires the Internet as well.

Take any retail job - the applications are all online, even for minimum wage entry-level positions. You can ask in person, but most places will just tell you to apply on the website.

Still, most jobs require applying over the Internet - I'm sure your next job application will be done online, either through some job portal or by emailing your resume around to people you know.

And those interviews are often done remotely rather than in person - again, over the Internet.

There are very few jobs you can apply for in person these days.

Comment Re:But not practical everywhere (Score 1) 164

> I live in rural America, and an EV charging infrastructure is largely non-existent. In concept, EVs have their merits, but in execution, they are not usable everywhere. And frankly, I can't afford to replace 2 ICE vehicles and a farm vehicle with EVs and the supporting charging infrastructure. And besides, when the power goes out, all of my vehicles can still run.

In the late 19th century, there was little infrastructure for ICE vehicles either. You had to run to a pharmacy and buy gasoline in single-gallon bottles, because that's how it was sold. Gas stations weren't always around - and people couldn't see how running to a pharmacy for gas was any more convenient than charging at home (if you were lucky enough to have electricity) or stuffing wood or coal into the firebox of a steam-powered car.

Infrastructure gets built - it doesn't appear magically overnight. The road trip is a recent invention - it only became routine by the middle of the 20th century, when the interstate system was built. Before that, traveling by road was basically impossible unless you had a caravan full of mechanics, fuel, and spares, because breakdowns were likely, AAA wasn't around, and not finding fuel at your destination was a real possibility.

EVs only need electricity to charge, and all but the most off-grid people have access to it. It may be inconvenient, but electricity is much more available than gas - you have electricity, yet you claim charging infrastructure is non-existent. Unless you're camping in the woods or something, you probably have access to charging infrastructure. The electrons are the same either way.

And during power outages, EVs work just as well as gas cars - perhaps better, because gas cars can't refuel (gas pumps require electricity). Many EVs also support V2L (vehicle-to-load), so you can be the house on the block that still has electricity.

And in case you think EVs use some sort of special electrons: malls are having to police their parking lots because some EV owners are running extension cords to ye olde bog-standard (and, for this job, dangerous as heck) 120V wall outlet. Yes, the same outlets people sit next to with a phone charger plugged in, people are plugging their EVs into.
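For a rough sense of what such an outlet buys you, here's a back-of-envelope sketch in C (the 12 A draw and 3.5 mi/kWh efficiency are assumed round numbers, not figures from anywhere above):

/* Back-of-envelope Level 1 (120 V outlet) charging estimate.
   Assumptions: 12 A continuous draw, 3.5 mi/kWh vehicle efficiency. */
#include <stdio.h>

int main(void) {
    double watts = 120.0 * 12.0;           /* ~1440 W from a standard outlet */
    double kwh   = watts / 1000.0 * 10.0;  /* ~14.4 kWh over a 10-hour night */
    double miles = kwh * 3.5;              /* ~50 miles of range regained */
    printf("%.1f kWh overnight -> roughly %.0f miles\n", kwh, miles);
    return 0;
}

Slow, certainly - but enough for a typical daily commute, which is why those parking-lot outlets are getting poached in the first place.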

Comment Re:Canonical has .5% of the desktop market (Score 2) 8

> Why would any company partner with them?

0.5% is a lot considering Linux as a whole has just over 4% of the desktop market. That implies Canonical has 1-in-8 Linux installs.

Qualcomm is a silicon vendor that mostly makes chips for phones running Android. They don't have much of a presence in standalone Linux or Windows.

Partnering up with Canonical likely means Canonical will offer Ubuntu for ARM machines built on Qualcomm chips. Great for my former employer - we put Qualcomm SoCs in embedded devices, and while most customers wanted Android, a few wanted Linux. Typically the Linux we shipped was based on OpenEmbedded, which is fine for embedded devices.

But Qualcomm is trying to break into the desktop market to compete with Apple. They're partnering with Microsoft to get Windows laptops running on ARM, and likely want Linux laptops running on ARM as well. Microsoft wants it because Apple is posting wild battery life figures, something extremely difficult to match with x86 without turning your laptop into a tank. And Qualcomm probably sees Linux on the desktop as a real market, given that Chromebooks and others have tried it. Get it working reasonably well, and maybe you end up with a cheap ARM gaming laptop thanks to Steam and Proton.

Comment This seems exceptionally stupid. (Score 1) 314

If you are trying to explain why we haven't detected any aliens, how is "they were massacred by even more advanced aliens" a remotely adequate answer? That just leaves you with "why haven't we detected the even more advanced aliens?". The question was never "why do we detect so many deathbots and so few little green men?"

If anything, superintelligences are presumably more capable of doing high-visibility things (if they want to) by virtue of being more advanced; and while they could all be carefully hiding because they're paranoid, that same explanation would hold for standard aliens as well.

Seems like an awful lot of hypothesis to explain nothing.

Comment Re:Hard to fathom (Score 3, Informative) 21

> It's hard to understand why speculative execution information and control mechanisms are available to user mode processes at all. It would seem like an obvious hole. Who would need this information other than the microcode?

It's nothing to do with user code getting at speculative execution information.

It's about the side channel attacks that speculation inadvertently makes possible.

For example, let's say you are trying to determine where some code is executing - perhaps it's an encryption operation. You could code up something that trashes the cache, then start timing accesses to certain functions in memory. If speculative execution touched a branch of that code, the time you measure to access it will be dramatically lower than if it had to be loaded from RAM.

This is because speculative execution loads code and data from RAM into the cache. If the speculation fails, the results are thrown away - but the fact remains that the code and data were still loaded into the cache, which means that if you time how long it takes to load them, the time suddenly drops because they're already cached.

It's all side channel attacks to get at the information, not something the processor directly reveals. That's pretty much why Spectre attacks are multi-architecture - they aren't limited to one vendor but affect all speculative, out-of-order execution designs: not just Intel, but AMD and even ARM processors. The latest one, attacking Apple's M-series processors, is the same thing again: the processor has a hardware block that snoops registers for values that look like pointers and pre-emptively loads them into cache. That can speed up execution, since the cache is pre-loaded with main-memory contents before the processor asks for them - but it's also a side channel, because timing can reveal what got loaded into cache.
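Here's a minimal sketch of the timing primitive described above - flush a cache line, then measure how long a load takes (x86-only, assuming GCC/Clang intrinsics; it demonstrates the cached-vs-uncached gap, not a full Spectre exploit):

/* Flush+reload probe: a cached load is measurably faster than one
   served from RAM - the signal a speculative gadget leaks through. */
#include <stdio.h>
#include <stdint.h>
#include <x86intrin.h>

static uint64_t time_load(volatile uint8_t *p) {
    unsigned aux;
    uint64_t start = __rdtscp(&aux);  /* serializing timestamp read */
    (void)*p;                         /* the load being timed */
    return __rdtscp(&aux) - start;
}

int main(void) {
    static uint8_t target[64];

    (void)target[0];  /* touch the line so it is cached */
    printf("cached:  %llu cycles\n", (unsigned long long)time_load(target));

    _mm_clflush((void *)target);  /* evict the line from every cache level */
    _mm_mfence();
    printf("flushed: %llu cycles\n", (unsigned long long)time_load(target));
    return 0;
}

On real hardware the two numbers typically differ by an order of magnitude, and that gap is all an attacker needs to infer which lines speculation touched.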

Comment Re:Dick (Score 1) 88

You could always do something using OBS's virtual camera. It's a built-in setting now - OBS can present itself as a virtual webcam for meetings. I think it became so popular during the pandemic that it's now one of the presets offered when OBS asks how to configure itself.

So use OBS and let the camera effects fly. You could make yourself really small in the frame, make your head bounce around the screen, or do other effects.
