Comment: Re:Cuz Minix Dude Was A Old Guy (Score 2) 469

by metamatic (#49635137) Attached to: Why Was Linux the Kernel That Succeeded?

Yeah. I was there at the time, writing patches for the Minix kernel... Linus specifically wanted support for 386 protected memory and virtual memory. AST wouldn't do it, because it would mean Minix wouldn't run on 68000-based systems like my Atari ST. So Linus went away and hacked together his own 386-only replacement kernel over a weekend.

Comment: Re:SAVE US AND THE WEB FROM MOZILLA! (Score 1) 324

by metamatic (#49605983) Attached to: Mozilla Begins To Move Towards HTTPS-Only Web

Then there was the whole Eich debacle. Regardless of your stance, it's pretty disgusting that somebody had to lose his job merely because of his beliefs regarding same-sex marriages.

He didn't lose his job merely because of his beliefs regarding same-sex marriages.

He lost his job because he spent money attempting to get laws passed which would prevent people, including his employees, from getting married. That made it hard for him to be a leader for those employees, so he resigned his position.

If he had merely had opinions, there wouldn't have been an issue.

But hey, don't let the actual facts get in the way of a dishonest misstatement of the situation.

Comment: Re:Doomed to fail (Score 1) 68

by mugnyte (#49329441) Attached to: Ask Slashdot: What Happened To Semantic Publishing?

Better yet, if a semantic derivative of any web page is built by these powerful web crawlers, a channel for pushing a link to it back to the original web site would mean each crawler wouldn't need to start from scratch. Instead they could annotate and extend the semantic information and serve it from multiple locations, while the original site stayed largely out of the process, doing nothing beyond serving the link(s) or being amenable to a filtering proxy that decorates pages with them.

Reduced down: there would be a machine-friendly semantic version of the web that browser plugins could tap to annotate the existing human web, with crawlers constantly polishing this semantic version behind the scenes (plus curated fixups). The infrastructure of the current web wouldn't need to change, but the browsing user's experience would be greatly enhanced, largely by raising the signal-to-noise ratio of "related" links.
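As an illustration of the filtering-proxy idea, here's a minimal Python sketch that fetches a page and injects a link to a crawler-hosted semantic derivative into its <head>. The annotation-service URL and the rel="alternate" link convention are assumptions made up for the example, not an existing standard:

    # Minimal sketch: decorate fetched HTML with a link to a semantic
    # derivative hosted by a crawler/annotation service (hypothetical URL).
    import urllib.request

    ANNOTATION_SERVICE = "https://semantic.example.org/derive?url="  # assumed

    def decorate(html, page_url):
        # Inject a link to the semantic derivative right after <head>.
        # Simplification: assumes a literal lowercase <head> tag, no attributes.
        link = ('<link rel="alternate" type="application/ld+json" '
                'href="%s%s">' % (ANNOTATION_SERVICE, page_url))
        return html.replace("<head>", "<head>" + link, 1)

    def fetch_decorated(page_url):
        # Fetch a page and return it with the semantic link injected.
        with urllib.request.urlopen(page_url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        return decorate(html, page_url)

    if __name__ == "__main__":
        # Usage: show the start of a decorated page.
        print(fetch_decorated("https://example.com/")[:400])

A real proxy would do this on the response path instead of fetching the page itself, but the decoration step is the same either way.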

Comment: Re:Early fragmentation (Score 1) 492

by metamatic (#48900921) Attached to: Ask Slashdot: Is Pascal Underrated?

while there were various decent proprietary dialects that let you actually write code that did stuff, *standard* Pascal was as much use as a chocolate teapot

And that's still a problem today. There's no standard for OO Pascal, and the ANSI Pascal standards have been moribund since 1990.

That's why I abandoned Pascal (and Modula-2): I didn't want to get locked in to a single vendor.

