
Comment Re:Make (Score 4, Insightful) 327

This is really disappointing to me as well. I've been a subscriber since its inception, but I'm about to let it drop. I know which end of a soldering gun to hold. I don't have a desire to add a toggle switch to a toy to impress hipsters.

Where are the articles like:

- Build a high-quality mass spectrometer

- Convert a cheap Chinese milling machine to CNC

- Build a Tesla Turbine and reap geothermal energy.

It went from being "Make useful stuff" to "Make crap to impress dumb people"

Comment Re:Google is the competition.. (Score 1) 528

As a corporation, Microsoft needs to increase revenue, yet they have already saturated the OS and Office markets. They look at Google's profits and want to be Google's competitor.

So Microsoft probably views themselves as Google's biggest competitor, whereas Google perceives them as a has-been.

Microsoft desperately needs to be relevant in the Internet age. Yet their fanatical adherence to maintaining their monopolies restricts them from entering other markets.

A textbook case of 'The Innovator's Dilemma'.


Man Builds His Own Subway 174

jerryjamesstone writes "Everybody is into rail these days; it is the greenest way to get around next to a bike. Leonid Mulyanchik has been into it for years since before the Berlin Wall fell, since before the first Macintosh, building his own private underground Metro railway system. English-Russia says that he has been doing it with his pension, that it is all legal and approved and that he is still at it. Gizmodo calls it 'Partly the traditional, inspiring, one man against all odds type of persistence, but more the obsessive, borderline insane persistence.'" Update: 06/02 07:33 GMT by T : And if you're the type to visit Burning Man, you can actually ride a home-made monorail this summer, too.

Comment Re:Please let me use the same password (Score 4, Insightful) 497

Suppose that if an attempt to log into his account fails three times, his account is locked and requires a new password.

Or suppose that your security system notes which IP address such failures come from, and disables all access from that IP. Or it scores IP connections, giving more trust to addresses with a history of successful logins.
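The lockout-and-scoring idea above can be made concrete in a few lines. This is a minimal sketch; the class name, thresholds, and return strings are all hypothetical, chosen only to illustrate the policy:

```python
from collections import defaultdict

MAX_FAILURES = 3  # illustrative threshold

class LoginGuard:
    """Toy policy: lock an account after repeated failures and keep a
    per-IP trust score that rises on success and falls on failure."""

    def __init__(self):
        self.account_failures = defaultdict(int)
        self.ip_score = defaultdict(int)  # positive = more trusted

    def record_attempt(self, user, ip, success):
        if success:
            self.account_failures[user] = 0
            self.ip_score[ip] += 1   # successful logins build trust
            return "ok"
        self.account_failures[user] += 1
        self.ip_score[ip] -= 1       # failures erode trust
        if self.account_failures[user] >= MAX_FAILURES:
            return "account locked: new password required"
        if self.ip_score[ip] < -MAX_FAILURES:
            return "ip blocked"
        return "failure recorded"
```

The point is that the bookkeeping lives in the system, not in the user's head: nothing here asks the user to invent a stronger password.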

Whenever I see the onus forced on users, I see people who haven't learned the wisdom of the following quote:

"I object to doing things that computers can do." - Olin Shivers

Comment Re:Verifying hiring practices... (Score 2, Insightful) 223

Trust me, having Google or Amazon on the resume means a lot more than having Microsoft on the resume.

The former two guarantee you know something about scaling (insane scaling); the latter only guarantees you know how to develop for Windows.

The former two haven't drowned themselves in bureaucracy. The latter has.

The former two still have managers with technical chops. The latter has MBAs for managers.


Submission + - New Theory Explains Periodic Mass Extinctions

i_like_spam writes: The theory that the dinosaurs were wiped out by an asteroid impact, the K-T extinction, is well known and supported by fossil and geological evidence. Asteroid impact theory does not apply to the other fluctuations in biodiversity, however, which follow an approximate 62 million-year cycle. As reported in Science news, a new theory seems to explain periodic mass extinctions. The researchers found that oscillations of the Sun relative to the plane of the Milky Way correlate with changes in biodiversity on Earth. They suggest that an increase in the exposure of Earth to extragalactic cosmic rays causes mass extinctions. Here is the original paper describing the finding.

Submission + - Inside the Machine

Paul S. R. Chisholm writes: "Inside the Machine: An Illustrated Introduction to Microprocessors and Computer Architecture, written by Ars Technica's Jon "Hannibal" Stokes, talks about how CPUs work, and how they've evolved and advanced in the past fifteen years. The result is detailed, very up-to-date (including descriptions of Intel's Core 2), generally clear, and covers a lot of fascinating material.


How on earth have CPUs advanced as fast as they have? How have we gone from 60 MHz Pentiums in 1993 to 3.73 GHz Xeons (with two cores) and 2.66 GHz Core 2 Extremes (with four!) today? Sure, Moore's Law and competition pushed the chip makers, but how did they implement all that extra performance? In Inside the Machine, Jon "Hannibal" Stokes provides a thorough, exhaustive, nearly exhausting look at what's at the heart of your computer. If Stokes's name sounds familiar, he's a founder and long-time contributor to Ars Technica. Anyone who liked his work there, his comprehensive articles and brightly colored diagrams, will probably enjoy this book a lot.

The first two chapters cover the basics of CPU operation and machine language. These are pretty good, though you'll probably need some assembler language experience to really understand everything in these chapters. Even without such experience, you'll pick up enough to get through the rest of the book.

The next two chapters get more advanced, covering pipelined and superscalar execution. CPUs don't execute one instruction at a time. Instead, they break instructions into smaller operations, and work on those smaller operations in parallel. These two chapters begin to tell how CPUs do that. (The book also discusses caching, another huge performance booster. For some reason, Stokes doesn't get to that until chapter 11.)
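The payoff of pipelining is easy to see with back-of-the-envelope cycle counts. This is an idealized model with no stalls or hazards, not anything taken from the book:

```python
def cycles_sequential(n_instructions, n_stages):
    # Without pipelining, each instruction occupies the whole
    # datapath for all of its stages before the next one starts.
    return n_instructions * n_stages

def cycles_pipelined(n_instructions, n_stages):
    # With an ideal pipeline, one instruction completes per cycle
    # once the pipeline is full: fill time plus one cycle for each
    # remaining instruction.
    return n_stages + (n_instructions - 1)

# 100 instructions through a classic 5-stage pipeline:
print(cycles_sequential(100, 5))  # 500 cycles
print(cycles_pipelined(100, 5))   # 104 cycles
```

Real pipelines fall short of this ideal because of the stalls, mispredicted branches, and cache misses the book spends its middle chapters on.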

The rest of the book discusses specific CPUs. From Intel, we see the original Pentium, Pentium Pro, Pentium 4, Pentium M, Core, and Core 2. (Intel didn't release as much information about the Pentium II and III.) From the Apple/IBM/Motorola alliance, we learn about the 601 (the heart of Apple's first "Power Mac"), 603, 604, 750 (G3), 7400 (G4), and 970 (G5). In the middle of all that, there's also an excellent description of 64-bit computing, its advantages, and even its disadvantages.

Every buzzword you've ever heard about CPUs is covered: front end vs. back end, branch prediction, out-of-order execution, pipeline stalls, SIMD, direct-mapped vs. N-way set associative mapping. That sounds intimidating, but Stokes introduces the concepts one at a time, clearly and in detail. The next time an overclocking fanatic tries to tell you why his AMD CPU is so much better than your Intel CPU (or vice versa), you'll not only be able to follow the whole discussion, you'll be able to argue back.
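The direct-mapped vs. N-way set-associative distinction, for instance, comes down to how an address is carved into a tag, a set index, and a byte offset. A rough sketch, with an illustrative cache geometry that is not from the book:

```python
def cache_lookup_fields(addr, line_size, n_sets):
    """Split an address into (tag, set index, byte offset).
    A direct-mapped cache is just the N-way case with associativity 1,
    where n_sets = cache_size // (line_size * ways)."""
    offset = addr % line_size            # byte within the cache line
    index = (addr // line_size) % n_sets # which set the line maps to
    tag = addr // (line_size * n_sets)   # identifies the line within the set
    return tag, index, offset

# Example: a 32 KiB cache with 64-byte lines.
# Direct-mapped -> 512 sets; 8-way set-associative -> 64 sets.
print(cache_lookup_fields(0x12345678, 64, 512))
print(cache_lookup_fields(0x12345678, 64, 64))
```

Higher associativity means fewer sets (a shorter index, a longer tag) and fewer conflict misses, at the cost of comparing more tags per lookup.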

Stokes turns all this into a (highly technical) history of CPU development. One chip's virtue is its successor's vice; one generation's shortcoming is another's opportunity.

This book reinforced something I already knew but don't often enough live by: Portability depends on architecture (for example, x86 vs. PowerPC), but high performance depends on microarchitecture (for example, Pentium M vs. Athlon 64 X2). Today's Core 2 chips have many high-performance features missing from the original 1993 Pentiums. A good compiler like gcc can take advantage of those additional features. This is bad news if you're using a binary Linux distribution, compiled to a lowest common denominator. It's good news if you're building and installing Linux from source, with something like Linux From Scratch or Gentoo/Portage. It's also good news for just-in-time compilers (think Java, .NET, and Mono); they're compiling on the "target" machine, so they can generate code tailored for the machine's exact microarchitecture.

The full color diagrams were a big help in understanding Stokes's points. On the other hand, I'm not sure why the book was printed in hardcover. To make it look more like a textbook? Is that a good thing?

The text is packed with jargon, buzzwords, and TLAs (Three Letter Abbreviations). Most of that is unavoidable, but a glossary would have been nice. Each chapter builds on the previous ones, so most readers will want to read all the chapters in order, paying close attention the whole time. Even so, this book had a lot more forward references ("I'll define that shortly" or "We'll get to that later") than most technical books.

Don't expect much non-technical discussion. Exceptions: There is a (very good) description of the Pentium 4's obsession with higher and higher clock speeds, including marketing pressures, and the resulting performance increases and drawbacks. The occasional "Historical Context" sections are also quite nice. But you'll see nothing on Apple's decision to move from PowerPC to Core, or the competitive battle between AMD and Intel. For that matter, you'll see almost nothing at all about AMD or its products.

Personally, I think Stokes missed an important opportunity to talk in depth about multiprocessing. He spends only four pages on the subject, and that only as part of the description of the Core Duo. (You'd think there was never a multi-core G5.) There are only a couple of paragraphs on the difference between multiple CPUs and multiple CPU cores. ("Dual dual-cores" and the AMD 4x4, anyone?) He declines to discuss how caches interact with multiple CPUs or multiple cores. That's unfortunate, because anyone doing multi-threaded software development really needs to understand cache issues, at just about exactly the level this book covers. But you'll find nothing here about cache coherency, or about which out-of-order execution results might be visible only to multi-threaded software. Well, he spent three years of his life writing this; if I want a say in what gets said, I should write my own darned book.

Jon Stokes had an incredibly ambitious goal: to write an accessible book that covers much of the same ground as Hennessy and Patterson's Computer Architecture: A Quantitative Approach and Computer Organization and Design. I don't think he achieved that, but he came pretty close.

You can visit the book's home page or the author's blog.

Paul S. R. Chisholm has been developing software for 25 years. He's worked at AT&T Bell Laboratories, Ascend Communications / Lucent Technologies, Cisco Systems, and some small startups you've never heard of. His latest article, "'Pure Virtual Function Called': An Explanation," appeared in The C++ Source. He lives and works in New Jersey."
