Comment Re:Infinity rules baby. (Score 1) 77

The Triassic wasn't particularly "lush". With Pangaea barely getting started on its pre-breakup LIP and rifting, most of the interior of the Earth's continents was far from any ocean that could supply moist air and rainfall, making it, on average, a fairly arid period.

Also, most oil deposits are considerably younger than the Triassic.

Comment Re:Something I posted on Gary and CPM here in 2014 (Score 1) 77

I quote someone else saying: "The PC world might have looked very different today had Kildall's Digital Research prevailed as the operating system of choice for personal computers. DRI offered manufacturers the same low-cost licensing model which Bill Gates is today credited with inventing by sloppy journalists - only with far superior technology. DRI's roadmap showed a smooth migration to reliable multi-tasking, and in GEM, a portable graphical environment which would undoubtedly have brought the GUI to the low-cost PC desktop years before Microsoft's Windows finally emerged as a standard. But then Kildall was motivated by technical excellence, not by the need to dominate his fellow man."

That did happen. There was Concurrent DOS in the mid 80s, which was a multitasking DOS. GEM was used on several other computers, and it was also a competitor to Windows in the early days - because it was extremely lightweight and you didn't need to upgrade your 8088 PC to run it.

But low-cost licensing was the one thing they didn't have.

In the famous "IBM visits Digital Research" story, the narrative often heard is that IBM came to Digital Research (Kildall's house) and his wife said he was out flying. That was all true. However, IBM neither left nor demanded Kildall return - the reason is that Kildall's wife handled the business side of Digital Research while Kildall did the technical side. Thus, he wasn't needed for the meeting at all.

The main sticking points were that IBM wanted an NDA signed, which Kildall's wife refused, and that IBM wanted an all-in price to license CP/M. They didn't want a per-PC license, they wanted a flat fee. That was the biggest stumbling block, because DR did not offer an unlimited-seat license.

So after that they went back to Microsoft to get an OS from them (Microsoft was already tasked with producing and porting its suite of languages to the new computer). Microsoft did cheat Seattle Computer Products when it bought QDOS from them - because Microsoft wanted to pay per customer, and it sold QDOS to, well, one customer - IBM. That's why both PC-DOS and MS-DOS existed and why IBM kept shipping PC-DOS: IBM bought it from Microsoft and had Microsoft maintain it for a while.

In the end, Digital Research did sue IBM and Microsoft because MS-DOS/PC-DOS looked too similar to CP/M (one of the first "look and feel" lawsuits - DR was known to sue companies for making something look like CP/M). IBM settled out of court by offering to ship its PCs with CP/M as well. (The PC never shipped with an OS - that was something the IBM reseller added as a package deal.)

Of course, CP/M couldn't compete in the 16-bit world: MS-DOS was $99 while CP/M was $250, and it never really caught on, because by that time people had ported their CP/M applications to MS-DOS/PC-DOS. (Microsoft had a hand here too - they made some source translator tools. Thanks to the similar architectures of the 808x family (8080, 8085, 8086) and the Z80, it was possible to do a mechanical source translation to aid porting. And since MS-DOS/PC-DOS/QDOS was structured after CP/M's design, the system calls ended up being similar, which helped greatly.)

I don't know how much CP/M cost on 8-bit computers - it could be cheap, it could be expensive. Just on the PC side, MS-DOS was far cheaper than CP/M ever was.

CP/M begat MP/M, a multi-user version of CP/M, which in turn led to the 16-bit CP/M-86 and MP/M-86, which then became Concurrent CP/M-86 and evolved into Concurrent DOS. That eventually transformed into Multiuser DOS in the 90s (after Novell acquired Digital Research).

Incidentally, MS-DOS 1.x/PC-DOS 1.x was very CP/M-like. MS-DOS 2.x started adding more traditional operating system conventions that Microsoft adopted from its Xenix. In CP/M and DOS 1.x, files specified on the command line were opened for you by the operating system, and you basically manipulated pointers and asked the OS to bring data into memory. DOS 2.x acquired the more traditional open/read/write/seek/close style semantics and system calls.

Comment Re:do not want (Score 1) 200

In the UK, electricity is on average about 1/5th the cost of petrol per mile.

In California, PG&E raised prices so high that a Model 3 (~250 Wh/mi, $0.35/kWh ≈ 9¢/mi) is in the same ballpark per mile as a Prius (55 mpg, $5/gal ≈ 9¢/mi).
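
If you want to sanity-check those numbers yourself, here's a quick back-of-the-envelope in Python (the 250 Wh/mi, $0.35/kWh, 55 mpg and $5/gal figures are the ones quoted above, not anything more authoritative):

    # Quick sanity check of the per-mile figures quoted above.
    ev_cents_per_mile = 250 / 1000 * 0.35 * 100    # 250 Wh/mi at $0.35/kWh -> ~8.8 c/mi
    gas_cents_per_mile = 5.00 / 55 * 100           # $5/gal at 55 mpg       -> ~9.1 c/mi
    print(round(ev_cents_per_mile, 1), round(gas_cents_per_mile, 1))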

And that's assuming you only charge at home overnight on the lowest off-peak rate and don't get dinged too much by the exorbitant ($0.66/kWh) peak rate.

Makes a ton of sense in the rest of the country though. So much for promoting a clean environment :-(

Comment Re:$53,000 goal? (Score 1) 32

The Kickstarter goal was $53,000; it is over $75,000 now. I'm definitely not an expert, but that sounds like 1% of the money I would expect is needed to manufacture a tablet.

I guess they already make a phone, so maybe I'm wrong, but that's a suspiciously low number in my mind.

The tablet is already manufactured. You can probably buy it off AliExpress today.

Most of the hardware kickstarters you'll find are simple repackaging of AliExpress. For example, there was a walkman with Bluetooth - it already existed in a form you could buy immediately, the Kickstarter was just a slightly modified and rebranded version.

Likewise, this tablet is probably the same thing - they're just going to tell the company to make a few modifications and then ship it. So you're really just getting a cheap tablet resold at higher prices.

Quite a few of the hardware Kickstarters are really just a curated form of AliExpress.

Comment Re:This should be impossible (Score 5, Informative) 90

It's kind of like how vulnerable most of the world is to an EMP attack. Think it through. Someone sets off an EMP above North America. It fries everything imaginable, including our electrical grid. We don't have the spares in stock to fix it, and in fact we have to go to China to manufacture them, with an ungodly lead time. Meanwhile, the majority of the world's advanced semiconductor production is in Taiwan.

We're all fucked for years. Your new Teslas (or anything made after the early 90s, including ICEs) are bricks. I've been into places that have had a power failure and they literally had to shut down because the retards that they hired as cashiers can't do basic math and check people out.

Thankfully such an attack is extremely difficult because of two things - shielding and the inverse square law.

Even with the largest nukes known on the planet, the EMP area is rather small - an EMP large enough to take out everything in North America would basically destroy the globe, in which case I think we have bigger problems than our cars being dead.

The other problem is shielding - those metal cans on everything do a really good job of blocking EMPs as well, by orders of magnitude. If an unshielded device can be affected by an EMP at, say, 2 km, wrapping it in metal foil reduces the range to around 2 m or so.
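
As a rough illustration of how the two effects combine, here's a back-of-the-envelope sketch in Python; the 60 dB shielding figure is my own assumed round number, not a measurement of any particular enclosure, and I'm assuming coupled power falls off roughly as 1/r^2:

    import math

    # Illustrative only: assume coupled power falls off as 1/r^2 and that the
    # metal enclosure provides ~60 dB of shielding (an assumed round number).
    unshielded_range_m = 2000.0                       # upset out to ~2 km when bare
    shielding_db = 60.0                               # assumed attenuation of the can
    power_ratio = 10 ** (shielding_db / 10)           # 60 dB -> 10^6 in power
    shielded_range_m = unshielded_range_m / math.sqrt(power_ratio)
    print(shielded_range_m)                           # ~2.0 m, the ballpark above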

So your car's ECU, which generally lives in a very hostile environment, is already sealed inside a very thick metal box designed to keep out lots of electrical noise and other interference, which also protects it from an EMP. Chances are, if your car dies from an EMP, you won't have much of a car left anyhow.

Also, the most effective protection against an EMP is basically turning it off. With enough warning, it's possible to shut down the electrical grid to protect it.

Of course, let's also not forget that gas pumps require electricity, so if it ever should come to that kind of scenario, you're still pretty screwed unless you want to siphon gas endlessly.

The biggest threat to the electrical grid is a CME (coronal mass ejection), because those wobble the Earth's magnetic field, and because power lines are long, those wobbles induce currents in them which can burn out transformers if the protective equipment doesn't react fast enough. But your car will be just fine, other than the compass maybe being slightly messed up.

Comment So the 6502 wins! (Score 2) 79

The 6502 is still available brand new today, along with several of its companion chips as well as the 16-bit version.

Though, honestly, I don't see why those chips haven't been long discontinued - the Z80 and 6502 should basically be soft cores these days, implementable in a reasonably cheap FPGA and able to run at full speed. I know Western Design Center (the current provider of 6502 chips) already sells such a design, so you can implement a 6502 core in your product and achieve reasonable integration density.

I don't see many people implementing the 6502 or Z80 as a standalone chip other than retro enthusiasts. If you need a Z80 core, you probably already have one for your FPGA or will build it into an ASIC. Though, oddly, the 8051 has shown incredible robustness as an embedded controller - they're seemingly everywhere, especially in USB applications.

Comment Re: Why is this not easy? (Score 1) 21

Think of something like Linux. When you boot it up, it prints a banner which contains the version, a timestamp, and who built the kernel (Linux 6.7.1 built on date by blah). That timestamp is handy during development because the version number might stay the same, but the timestamp gives you a rough idea of which build you have and what changes might be in it. But timestamps are murder on reproducible builds.
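
One common mitigation, sketched here in Python under the assumption that the build embeds the banner itself (the version string below is just an illustration, not actual kernel code), is to take the timestamp from the reproducible-builds SOURCE_DATE_EPOCH variable instead of the wall clock:

    import os
    import time

    # Sketch: take the embedded timestamp from SOURCE_DATE_EPOCH (a
    # reproducible-builds convention) when it is set, so two builds of the
    # same source emit an identical banner; fall back to the wall clock.
    def build_timestamp():
        epoch = os.environ.get("SOURCE_DATE_EPOCH")
        seconds = int(epoch) if epoch else int(time.time())
        return time.strftime("%a %b %d %H:%M:%S UTC %Y", time.gmtime(seconds))

    print("version 6.7.1 built " + build_timestamp())   # illustrative banner only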

Another one is parallel builds - the build server may have 20+ cores to make builds speedy, so you build Linux using all 20 cores. But which cores get which source files is essentially random, and if you have an Intel performance/efficiency core split, the E cores take longer, and this can affect the link order of the kernel objects. Changing the link order may mean objects are laid out differently in the final binary, along with branch and jump addresses. It's especially tricky since the kernel partially links objects in stages first.
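
A minimal mitigation looks something like this Python sketch (the build/ tree, the combined.o name, and the bare ld invocation are made-up placeholders): sort the object list before handing it to the linker, so scheduling jitter can't change the link order.

    import glob
    import subprocess

    # Sketch: collect the objects and sort the list, so the link order no
    # longer depends on which compile job happened to finish first.
    objs = sorted(glob.glob("build/**/*.o", recursive=True))
    subprocess.run(["ld", "-r", "-o", "combined.o", *objs], check=True)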

The fun one is filesystem layouts. When you traverse a directory, you often get files in the order they were created in that directory. If your build system does parallel builds, the order the files are created may no longer be deterministic, because as the code is built, it's placed in the output directories in whatever order the jobs finish.

When the filesystem image is built, it's often done by pointing the tool at a directory: the tool enters the directory and traverses it, creating the image as it encounters files and directories. (Most tools use calls like opendir() and readdir() to walk the directory tree, so the order they add files to the final image depends on the order the files were created in the directory.) If the build system creates files and directories in a non-deterministic order, files get added to the image in a random order, which changes the image. Compression and encryption compound the problem. This is the hardest one to solve, but it can be done if the tool sorts the entries alphabetically before processing, ensuring files are handled in a deterministic order.
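
The usual fix looks roughly like this Python sketch (the staging/ path and the print standing in for the image-append step are placeholders):

    import os

    # Sketch: walk the staging tree in sorted order so files enter the image
    # in the same sequence no matter what order the build created them in.
    def files_in_order(root):
        for dirpath, dirnames, filenames in os.walk(root):
            dirnames.sort()                  # fixes subdirectory traversal order
            for name in sorted(filenames):   # fixes file order within a directory
                yield os.path.join(dirpath, name)

    for path in files_in_order("staging/"):
        print(path)                          # append to the image here, in fixed order

The same trick - sort before you consume - applies to anything that enumerates directory entries.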

That is usually the biggest source of non-determinism, especially since it can be caused by parallel builds: even if you start with exactly the same files, the final arrangement can vary. If you're unpacking 20 tarballs in parallel, it just creates chaos.

Of course, the easiest way is to make it deterministic and only allow one build step to proceed at a time, but that means what takes 20 minutes to build completely now takes hours. And even then you might get some randomization, because the disk cache might cause one file to be written ahead of another.

Comment Remind me to stay off Baloo Uriza's lawn (Score 1) 86

Funny thing is, the police are constantly ticketing and arresting people driving with suspended or revoked licenses where I live. Somehow I don't think that's going to solve the problem.

Self-driving automotive technology continues to improve, but human drivers hit their limits about a century ago. If anything, the trends are going backwards for human drivers, as U.S. highway fatalities have ticked upwards in recent years.

The politicians in British Columbia can rail against progress in the short term, but 20 years from now everyone will be looking back on arguments against autonomous vehicles and wondering what all the fuss was about.

Comment Re:Think Different (Score 3, Insightful) 107

And the problem is that women often don't know they can enter these fields.

There are tons of women who want to enter trades. However, all they see are men everywhere - and thus society has set up a stigma that "girls can't code" or "girls can't be electricians".

The assumption that "girls aren't interested" is about as true as saying "boys don't cry". It's a sexism thing - girls play with Barbie, boys play with GI Joe. Heaven forbid you have a boy who's interested in Barbie, or a girl who wants Transformers.

In fact, many trades have "girls do trades" type events whose whole purpose is to show that yes, if you want to sling a hammer, or do electrical work, or plumbing, or whatever, you can. Often there's no role model in the family to say "yes, you can!", so plenty of people assume that no, you might like working on the computer and have a great time at it, but only boys code, so go and be a nurse instead.

I'm sure there are plenty of people in unhappy careers because they "fit the stereotype" rather than actually exploring the options they were more interested in. I know people who did computer science in university because they didn't know what to do, so their parents told them to do computer science because it was popular. I'm sure you all know how that usually turns out.

And the reason I think it's cultural? Other cultures often have no such qualms. I've seen lots of women from Indian, Chinese, or Eastern European backgrounds in these fields. They were encouraged to do the sciences and they pursued them.

Also, take a look at advertising for video games prior to the video game crash - you'll find it's often shown as a family activity, with both a son and a daughter enjoying what little game there is. But afterwards, it's purely something the boys do. And it has something to do with Nintendo - because in order to sell the NES, Nintendo couldn't market it as a "video game", as that term was poison. So they marketed it as a toy (aided by R.O.B. the robot). But if you do that, you need to pick - does it go in the boys' section or the girls'? Heaven forbid a toy might actually be interesting to both sexes.

Comment Re:My best friend's company has done that for year (Score 1) 51

All I can say is that the race to the bottom to cut salaries on IT workers can't end in anything good.

You just have to look at other industries. You know, like say, manufacturing. Most things are made outside the US, because it's cheaper to make stuff there because pay is lower.

The same forces that pushed manufacturing overseas are the same forces pushing IT salaries lower.

Comment Re:People still use Windows? (Score 1) 59

>"Also there the issue of let's say I want to port my giant CAD app to Linux? Which distro? Which installer package? There's a lot of variables with that."

Yes, but it is not that difficult to overcome. Porting/coding it is mostly a single investment/code base and will essentially work on any Linux. The packaging of it is easy, in comparison. You just have to follow some reasonable practices of using conservatively-available libraries, or include your own. Probably target a generic deb and rpm for the major distros, and also flatpak. I have used numerous commercial Linux packages for decades and they figured it out :)

Porting is fairly easy. You port to the Linux most people will use.

If it's a CAD program and you expect people to use it in a corporate environment, you go RHEL and be done with it. Hell, the company is probably already paying thousands of dollars for support anyways (aka subscription) so using a commercial Linux isn't really too unusual.

If you're a little lower end and your package is used by people ancillary to the job, then you might target a more mainstream Linux, like Ubuntu, and let whatever that uses be what you support.

That's all you really have to support. Everyone else using something different can figure it out for themselves. Linux Mint, Arch Linux, Debian, etc., those users are experienced in Linux so let them figure out how to get it working.

Heck, if your application is that damn important, people will find a way. If it means having to have a whole new computer just for your application, so be it. Computers are cheap these days.

Comment Re:Pandemic Russian Roulette (Score 1) 65

But a bigger problem is risk of a "Mars pandemic". There could be microbes on Mars that Earth life has no immunity to.

Meh.

If there is life on Mars, and if it can survive a few centuries to millions of years in vacuum below the surface of a lump of rock, then samples have been raining down on Earth since the Hadean. Despite Fred Hoyle and Chandra Wickramasinghe's best efforts, nobody has yet been convinced of any cases of "Mars Flu", despite the constant (if thin) rain of such projectiles. We have found and identified hundreds, possibly thousands (I can't be bothered keeping count), of Martian meteorites, which means there are millions or billions out there on Earth's surface which haven't been identified. Yet.

If Mars had infective biota, and that can be naturally transferred between the planets, then it has already arrived here. Repeatedly.

Now, personally, I don't think it is very likely that such a transfer (of organisms) could happen in the Solar system. And it's even less likely to happen between stellar systems. But the possibility is just plausible enough that people don't get laughed at (much, in public) for suggesting that Earth life originated on Mars. It's about 99 times as likely as all Mars life having originated on Earth. But it's not quite an insane speculation - the mechanisms exist, even if the stack of probabilities against it is pretty daunting.

But if you grant that possibility, then there is no way you can have it without Earth having been repeatedly inoculated with Martian organisms. Therefore, the surviving life on Earth today is descended from survivors of, say, the last time that 96% of genera of life on Earth were wiped out. (Permo-Triassic "Great Dying" mass extinction, I'm looking at you!) And since that "Great Dying" didn't get us (our ancestors), the next delivery of "Death from Martian Skies" is pretty unlikely to either.

(There are very good terrestrial-only explanations for the "Great Dying" - but not for all the major extinctions in the Phanerozoic ("evident life") Eon, and I'm not claiming that was the cause of that mass extinction - I'm just arguing that IF Mars had Earth-infective biota, then our ancestors have already survived an encounter.)

I've also got a philosophical contempt for the concept of "panspermia" - but I grant that its physical mechanisms aren't impossible, just bloody unlikely.

Comment Re:Would it be cheaper to send a lab to Mars? (Score 1) 65

If you're doing sensitive, accurate measurements - such as isotope measurements to see if you've got biological processes going on - you need equipment that doesn't vibrate much, and whose parts don't move relative to each other.

That is not a recipe for launch from Earth, or the "Entry, Descent, Landing" phase of a mission, invariably described for Mars as "seven minutes of terror" (for the flight engineers).

JWST was on the drawing board for the thick end of a quarter century, and it only contains one class of instruments (optical). Now add isotope measurements (for dating), equipment for grinding and preparing thin-, thick- and polished sections of rocks. An XRF lab for heavy element analyses. An IR lab for your organic analyses (because light elements are pretty indistinct on XRF; not enough nuclear electric field to give narrow fluorescence signals). An XRD lab for potential bio-materials, and for understanding the clay minerals.

Then freeze the instrument designs in (say) 2025 for a 2038 launch (2040 EDL). And lose access to whatever advances in analytical techniques happen between 2025 and 2040. And the option of re-analysing the samples with the incredible new techniques that will be discovered in 2050.

Since Lowell's self-delusion of being able to see an "annual wave of greening" on Mars from Earth, we have known that any life on Mars is not as blindingly obvious as that. You don't hear much about it, but every single rock examined by a Mars rover in the last 19 years has been pored over by a considerable number of experienced geologists (and innumerable internet wingnuts), and the total number of fossils discovered remains at a big fat zero. And that is on missions aimed at, and steered onto, interesting areas with a relatively good chance of hosting fossils.

Do you really think a "basic" analysis is going to actually answer the question? We already know that nearly 20 years of effort have not detected any clear signs of life, and that's with some quite sophisticated tools being flown. (I'd love to have a hand lens with built-in XRF, even if it's only good to concentrations of a few hundred ppm.)

Comment Re: Starship to the rescue? (Score 1) 65

Try it, if it fails, send another one in 90 days with the problems fixed.

The cadence of orbital alignments between Earth and Mars is 2 years (actually, slightly over, IIRC), not 90 days. If you want to operate on a different cycle, you hugely increase your propellant costs, hence launch weights, hence overall costs.
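
For the curious, that cadence is just the Earth-Mars synodic period; a back-of-the-envelope version, taking Mars's orbital period as roughly 1.88 years:

    \[
      \frac{1}{T_\mathrm{syn}} = \frac{1}{T_\mathrm{Earth}} - \frac{1}{T_\mathrm{Mars}}
        = \frac{1}{1\,\mathrm{yr}} - \frac{1}{1.88\,\mathrm{yr}}
      \;\Longrightarrow\;
      T_\mathrm{syn} \approx 2.14\,\mathrm{yr} \approx 26\ \mathrm{months}
    \]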

You can fudge it to a 21-, 22-, [...], 26-, 27-month cycle at acceptable cost, but outside that, the planets, in a very literal sense, do not align for you.
