
Comment: Re:But, but... (Score 1) 66

by hibiki_r (#49358589) Attached to: US Air Force Overstepped In SpaceX Certification

I've seen time factor heavily into some road construction. In St. Louis, the I-64 roadwork was going to wreak havoc on many people's commutes, as it required closing the most traveled highway in the city. The contracts took into account how long the work was going to take, offered bonuses for finishing early, and imposed severe, escalating penalties for delays. It was finished early.

Comment: Re:Check their work or check the summary? (Score 1) 481

by hibiki_r (#49337849) Attached to: No, It's Not Always Quicker To Do Things In Memory

It might have helped in this problem, but nowadays even assembly language is just an abstraction: you might think you are issuing in-order operations on an x86, but they are really being translated into out-of-order operations inside the processor - same result, very different performance. Branch prediction? Nah, the CPU can run the beginning of BOTH branches and just discard the computation it did not want, because that's actually faster. And don't get me started on the differences between what you tell a video card to do and what it actually does.
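
To see how little the source-level picture tells you, here's a classic demonstration you can run yourself (a sketch of my own; the array size and threshold are arbitrary choices): the same loop over the same values runs several times faster once the data is sorted, purely because the branch becomes predictable. Depending on the JIT, the gap may shrink or vanish if the compiler emits a conditional move instead of a branch - which only reinforces the point about layers.

    import java.util.Arrays;
    import java.util.Random;

    // Same code, same data, very different speed: only the branch pattern changes.
    public class BranchPredictionDemo {
        static volatile long sink; // keeps the JIT from discarding the loop

        public static void main(String[] args) {
            int[] data = new int[1 << 24];
            Random rng = new Random(42);
            for (int i = 0; i < data.length; i++) data[i] = rng.nextInt(256);

            System.out.println("unsorted: " + time(data) + " ms");
            Arrays.sort(data); // identical values and identical code from here on
            System.out.println("sorted:   " + time(data) + " ms");
        }

        static long time(int[] data) {
            long start = System.nanoTime();
            long sum = 0;
            for (int pass = 0; pass < 10; pass++) {
                for (int v : data) {
                    if (v >= 128) sum += v; // cheap when the CPU can predict it
                }
            }
            sink = sum;
            return (System.nanoTime() - start) / 1_000_000;
        }
    }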

The distance between what we write in practical, end-user-facing applications and what happens in the hardware is so large nowadays that it's hard to have any real control over what is going on. The best we can do is understand the performance characteristics of the layer right below ours, and hope things don't change too much.

Therefore, the problem with the original paper is that it fails to explain what we can actually learn from the experiment. It's not that disk is faster than RAM - that's just ludicrous. It's that we have to have some understanding of the libraries and VMs we use to get anywhere. It wouldn't be impossible for the JVM to realize that an immutable string is being appended to in a loop, prove that no references to it escape, and optimize the whole thing into a string-buffer implementation that is as good as calling the file writer: it just happens not to do that optimization for us. It's happened in Java before: code that was once considered terrible because it was very slow is not slow anymore, and it's easier to read than the old-school hand-optimized version.
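
Here's a minimal sketch of the transformation I mean (my own illustration, not anything the JVM actually does): the two methods below are semantically identical, but today you have to rewrite the first into the second by hand.

    // naive(): O(n^2), because each += copies the whole string so far into a
    // new immutable String. buffered(): O(n), one growable buffer - the shape
    // an escape-analysis-based optimization could in principle produce itself.
    public class StringLoop {
        static String naive(int n) {
            String s = "";
            for (int i = 0; i < n; i++) s += i;
            return s;
        }

        static String buffered(int n) {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < n; i++) sb.append(i);
            return sb.toString();
        }

        public static void main(String[] args) {
            long t0 = System.nanoTime();
            naive(20_000);
            long t1 = System.nanoTime();
            buffered(20_000);
            long t2 = System.nanoTime();
            System.out.printf("naive: %d ms, buffered: %d ms%n",
                    (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000);
        }
    }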

Comment: Re:Good grief... (Score 1) 681

by tambo (#49110555) Attached to: Bill Nye Disses "Regular" Software Writers' Science Knowledge

> The nuts and bolts of computer architecture aren't within the scope of computer science. Sure, you might want to know a little about how things work at an abstract level, but let's be clear: computer science and electrical engineering are two different disciplines.

There's also a big gap between traditional electrical engineering and computer science - computer engineering really is its own thing.

The standard EE curriculum covers a lot of topics: E&M, circuit analysis and design, signal analysis and information theory, wireless communication, VLSI and VHDL, linear electronics, control systems, FPGAs / MOSFETs / ASICs, etc. The most computer architecture that EE covers is the basics of digital logic, and *maybe* a selection of other topics, like memory addressing and a basic instruction set, but it's really just an introduction.

Computer engineering is where the rubber meets the road, so to speak. Consider all of the specialized things that a computer engineer studies: processor design, instruction sets, memory / storage / caching, buses / pipelines / wire protocols, networking, parallel processing, power management (especially conservation), GPUs, firmware and device drivers, multithreading and stack analysis, security systems...

My point is simply that a typical EE barely scratches the surface of CE, and a typical CE has only a modest overlap with both EE and CS.

It's frustrating that so many people don't appreciate just how deep, rich, and technically challenging these areas are. It's oddly stylish to diss CS and CE as being of lesser scientific caliber than the traditional sciences - to look at a computer as a commodity, a basic web browser wired to a basic keyboard. It's very disappointing in general, and it's culturally perpetuated by offhanded comments like Nye's.

Comment: Re:Sweet F A (Score 1) 576

> Even if they were at our level of technology, if they have starships, then they have nuclear weapons. They don't have to invade; they can simply drop rocks or nukes on us to accomplish the same thing, and there wouldn't be anything we could do about it...

Yeah, nukes aren't even necessary. A lump of any kind of matter, parked at the top of Earth's gravity well and possessing sufficient bulk / shape / cohesiveness to deliver a sizable mass to the surface, will be devastating. If you can multiply that by, I dunno, several thousand - you have a fairly low-tech and cost-effective means of civilization annihilation.
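
To put rough numbers on "devastating" (my own back-of-the-envelope figures, nothing precise): an object falling from the top of the gravity well arrives at roughly Earth's escape velocity, about 11.2 km/s, so even a modest rock carries nuclear-scale energy.

    // Back-of-the-envelope impact energy. All values are rough assumptions.
    public class RockDrop {
        public static void main(String[] args) {
            double v = 11_200;        // m/s: escape velocity at Earth's surface
            double mass = 1_000_000;  // kg: a boulder ~9 m across at ~3 t/m^3
            double energy = 0.5 * mass * v * v;  // kinetic energy, joules
            double tntTons = energy / 4.184e9;   // 1 ton of TNT = 4.184e9 J
            System.out.printf("%.2e J ~= %.0f tons of TNT%n", energy, tntTons);
            // ~6.3e13 J ~= 15 kilotons: one thousand-tonne rock ~= Hiroshima.
        }
    }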

This whole "gravity" thing is really a bummer. If mankind can ever conquer its internal existential threats - global war, nuclear proliferation, climate change, and the cultural-dumbness bomb called "the Kardashians" - then our own gravity well becomes our largest existential vulnerability.

Comment: Riiight (Score 1, Insightful) 85

by tambo (#49075971) Attached to: Algorithmic Patenting

"There is reason to believe that at least some of its computer-conceived inventions could be patentable and, indeed, patents have already been granted on inventions designed wholly or in part by software."

Right. And according to Fox News, "It is SAID BY SOME that Obama isn't a native citizen. Not that *we're* saying it, mind you, so we can't be held accountable. It's just, you know, THEY said it was true. Who's THEY? Well, we can't tell you, and we can neither confirm nor deny that we're using 'they' in place of 'we.' So we're just going to state that it's some number of unnamed experts, or the public at large, or whatever. You know... THEM."

Comment: Re: You sunk my battleship (Score 1) 439

by phoenix321 (#49061491) Attached to: Will Submarines Soon Become As Obsolete As the Battleship?

Never underestimate the speed and efficiency of China's construction industry. They complete a handful of skyscrapers before most other nations have finished designing just one; they remodel entire cities and build insanely large dams and dozens of nuclear power plants without any noticeable drop in GDP at all.

Not to mention that they own most of the world's high tech manufacturing plants.

As a side note, I would not recommend placing any long-term investments in Taiwan.

Comment: Re: Big Data (Score 0) 439

by phoenix321 (#49061461) Attached to: Will Submarines Soon Become As Obsolete As the Battleship?

Aircraft carriers were key to defeating the German and Japanese navies, which were more than peer-level opponents of the US in 1942. Their battleships were unable to do anything, and by the end of the war their subs were just easy targets for US aircraft.

Pummeling minor powers is done best by decapitation strikes and, if that doesn't work, saturation bombardment. The B-52 fleet can do both extremely well, and because of that it has had its service life and fighting role extended far beyond the wildest expectations of its designers. And B-52s can't easily operate from carriers.

Drones, however, can launch from carriers, and we will see how the drone war works out over the next few years.

Comment: Re:Big Data (Score 1) 439

by phoenix321 (#49061407) Attached to: Will Submarines Soon Become As Obsolete As the Battleship?

A few LiveLeak videos of collateral damage later and accurate missiles are back in fashion for double the price or more.

The general public has all but dissolved the notion of "the enemy": only the few on the other side who have actual weapons in their hands, ready to fire, are seen as even remotely acceptable targets. The public wants those few combatants and militants surgically removed, while the rest of the population is seen as sweet little innocent angels who would never harm anyone and are just praying for a miracle liberation courtesy of Raytheon and General Dynamics.

People truly believe that fighting will cease the moment all currently armed militants drop dead. The idea that the enemy's general population provides an indefinite supply of new militants, and will never change unless forced to, boggles their minds.

That's why no Western power has won a war in recent decades. They win battles left and right, topple dictators, kill bad guys, build schools and bridges, thwart endless terrorist plots, and sacrifice scores of their best young men, but accomplish nothing in the long run.

Maybe we should bring the battleships back. Maybe area bombing works. At least the people of Tokyo and Dresden believe in democracy and freedom now.

Comment: Re:Big Data (Score 1) 439

by phoenix321 (#49061303) Attached to: Will Submarines Soon Become As Obsolete As the Battleship?

Artillery is going to produce collateral damage. People with mobile phones will film all the gory details and upload them everywhere. The general public, and the American public in particular, does not like collateral damage and gory footage of the school that was hit instead of the enemy's ammo depot and training camp. Public opinion wins wars today, with ammunition a far distant second.

Firebombing and city-leveling area bombardment will never again be acceptable to a public that imagines soldiers as well-equipped, omniscient, omnipotent superheroes who kill only the bad guys, with minimal damage, in the cleanest possible way.

The cost of munitions (and of training individual soldiers) has become irrelevant, because a single error can lose a war. The accuracy the public now demands of warfare is at ridiculous levels, and the associated cost per shot can only be described as utterly insane.

So unless all its shells are GPS-guided with the same or better precision than current missiles, the battleship isn't making a comeback. And once shells are that accurate, they are as expensive as missiles and still have only a fraction of the range.

Comment: It's jetpack technology: always 10 years away. (Score 2) 248

by tambo (#49051337) Attached to: Smart Homes Often Dumb, Never Simple

Over the years, I've invested thousands of dollars in several home automation platforms. I've yet to have an experience that I'd call "good."

Candidate #1: X10. Future-tech, circa 1978.

  • Pros:
    • Drop-dead simple implementation - there's a physical dial on every receiver to specify a code, and a physical dial on every controller to specify which codes it controls.
    • Supported by a broad set of manufacturers back in the 1990s.
  • Cons:
    • Wildly unreliable protocol = don't count on your lights actually turning on. Flakes out at the drop of a hat.
    • Hardware had extensive quality issues. Devices spontaneously died without warning. Wonderful if you enjoy debugging your light switches; terrible for people with better things to do in life.
    • Even when working perfectly, the latency was unacceptable: waiting a full second for your lights to turn on becomes painful fast.
    • No centralized management. Communication was largely one-way - switches broadcast; receivers receive - so things like "reporting status" and "verifying connectivity" were impossible.
    • Protocol security? What's that?
    • Deprecated and dead.

Candidate #2: INSTEON: The Commodore Amiga of home automation.

  • Pros:
    • Designed with a lot more redundancy and reliability than X10. Something about mesh network communication and blahblahblah.
  • Cons:
    • Overpriced. Holy crap, overpriced. Starter kits that controlled a single lamp ran for like $500.
    • One vendor = extremely constrained range of products. Sure, some of the gear was backwards-compatible with X10, but mixing network gear was a great way to drive yourself insane fast.
    • Terrible business model = stunted growth and slow, painful death.

Candidate #3: Z-Wave: The People's Home Automation Platform.

  • Pros:
    • Totally open protocol! Anyone can make a Z-Wave-supported device!
    • Potential for built-in reliability through mesh communication, etc.
    • Hierarchical mesh architecture can be centrally managed by a hub.
  • Cons:
    • "Anyone can make a Z-Wave-compatible device" =/= "anyone can make a *good* Z-Wave-compatible device."
    • Entry-level devices are cheap, but inadequate. Fully-capable devices are reliable, but expensive. There are also expensive devices that are crippled, but no cheap devices that aren't. Have fun with that.
    • The architecture is both overcomplicated and poorly documented. Want to figure out how scenes work? Plan on setting aside an hour to scrape together bits and pieces of information from different vendors, and glue them together with guesswork and trial-and-error.
    • Lots of potential... not as many products. In theory, Z-Wave is great for motorized blinds. In practice... there's like one company offering an overpriced half-baked product, and an Instructable DIY video.
    • Hub architecture is feasible... but good luck finding a decent implementation:
      • SmartThings wants to be hip and polished, but feels like it was designed by ADHD-afflicted high school students as a summer project.
      • MiCasaVerde / MiOS / Vera is ambitious... i.e., overambitious, i.e., no support. Great for those who enjoy hacking a commodity-based Linux box and digging through log files to figure out why the kitchen lights won't turn on. The Facebook group is kind of surreal: it's a company rep posting happy-happy-joy-joy patch notes, and dozens of people asking why their Vera won't respond and why customer service won't get back to them.
      • Home Depot Wink is a subscription-based service. Let that sink in: you'll have to pay $x/month for the privilege of automating your light switches.
      • A handful of weird, little-known contenders exist (Staples Connect, ThereGate, the "Jupiter Hub," etc.), with virtually no buzz (and the bit that's there is typically poor).

Candidates #4-50,000: Belkin WEMO, ZigBee, MyQ, Ninja Blocks, IFTTT, etc.: The Home Automation Also-Rans.

All of these products and platforms map a consistent trajectory - from gee-whiz Kickstarter prototype to "whatever happened to..." entries on the technology scrap heap. None of them ever develops enough momentum beyond diehard hacker/maker cliques to warrant consideration above "free-time toy project" status.

Comment: Re:UI code is bulky (Score 1) 411

by tambo (#49037457) Attached to: Your Java Code Is Mostly Fluff, New Research Finds

> You will find that an MPEG2 decoder is substantially more complex than DeCSS.

First, "complexity" and "size" are completely different concepts, and we're only discussing codebase size here. Of course, extremely complex code can be very small (see also: MD5, RSA, etc.), and extremely bulky code can be very routine. The auto-generated code for instantiating all of the controls in a window can be thousands of lines long, but they're very bland and mundane instructions. ("Create a button at coordinates x1, x2, y1, y2; set its border, background color, font, and caption; hook it up to these event handlers...")

But just to reinforce the point: a complete MPEG-2 decoder weighs in at about 130 KB, uncompressed.

> VLC is the opposite of "pretty stripped-down". It does everything, including ripping DVDs and transcoding video.

First, it's certainly stripped-down as compared with other media rendering packages: Windows Media Player, WinDVD / PowerDVD, and of course iTunes.

Second, those features - ripping DVDs and transcoding videos - are not only expected in modern media player packages; they also use the same core functionality. Just like ordinary playback, ripping and transcoding are basically consumers of the codec's output - the software just delivers that output somewhere other than the screen, such as to a storage device or as the input to a different codec.
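
A sketch of that structure (hypothetical names of my own, nothing like VLC's real code): one decode loop, with playing, ripping, and transcoding differing only in the sink the frames are handed to.

    import java.util.List;

    public class CodecDemo {
        interface FrameSink { void accept(byte[] frame); }

        // The shared core: decode every packet, hand the frame to some sink.
        static void run(Iterable<byte[]> packets, FrameSink sink) {
            for (byte[] packet : packets) {
                sink.accept(decode(packet));
            }
        }

        static byte[] decode(byte[] packet) { return packet; } // stand-in codec

        public static void main(String[] args) {
            List<byte[]> stream = List.of(new byte[]{1}, new byte[]{2, 3});
            run(stream, f -> System.out.println("play: draw " + f.length + " bytes"));
            run(stream, f -> System.out.println("rip: write " + f.length + " bytes to disk"));
            // Transcoding would be: run(stream, f -> otherEncoder.encode(f));
        }
    }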

Comment: Re:UI code is bulky (Score 1) 411

by tambo (#49032967) Attached to: Your Java Code Is Mostly Fluff, New Research Finds

> Is that really 28MB of code, or is that 1MB of code and 27MB of bitmaps, sound files, and other crud?

That's true - media resources are bulky, and thanks to plentiful storage and bandwidth, we don't have nearly the pressure to constrain these sizes that we did in 2000.

However, if you review just the code base for WinZip, I can guarantee that well over 95% of it is UI code as I described above, and much less than 5% is the actual compression/decompression "core functionality."

Here's another example: Media rendering. The actual codec for DVD media is tiny - DeCSS can be printed on a T-shirt! - and the entire source code package for the LAME MP3 codec is like one megabyte - but media rendering apps tend to be huge. And I'm not even talking about bloated monoliths like iTunes; VLC is pretty stripped-down, and it's still 33 megabytes. The logic that it needs to *show* you the decoded video in a proper UI is extensive.

Comment: UI code is bulky (Score 1) 411

by tambo (#49032365) Attached to: Your Java Code Is Mostly Fluff, New Research Finds

PKZIP.EXE and PKUNZIP.EXE, together, are about 80 kilobytes.

The current version of WinZip for Mac is 26 megabytes, or 26,000 kilobytes - 325 times the size of the originals, for the same basic functionality.

However, I don't see a lot of people preferring the command-line versions. Why? Because it's easier to drag-and-drop a bunch of files into a dialog box and pick an output location than to type all of that crap into the command line WITH the right flags AND no typos.

Things like menus, options / configuration panes, and nicely formatted help documentation are also preferable to "pkunzip.exe -?", and then remembering that you have to pipe the output to MORE in order to read the six pages of help text spewed out to your terminal window.

UI code is bulky, because it's extraordinarily detail-oriented. Think of all of the operations that your application UI has to support: windows, and resizing, and hotkeys, and scrolling, and drag-and-drop, and accessibility features and visual themes and variable text sizes and multithreaded event loops and asynchronous event handlers and standard file dialogs and child window Z-ordering and printing and saving application configuration info... etc.

If our IDEs didn't include visual UI designers and auto-generate like 99% of that code for us, app development would be horribly stunted AND much more preoccupied with hunting down bugs in UI code.

But all of this UI code is bulky and verbose and nitpicky because the UI is extremely important for any modern app. Thousands of apps exist that feature excellent functionality that is impossible or painful to utilize because the UI sucks.

Comment: Re:Downvotes (Score 2) 467

by hibiki_r (#48988669) Attached to: Twitter CEO: "We Suck" At Dealing With Trolls, Vows To Kick Them Out

Downvotes without metamoderation just lead to downvoting mobs. Imagine the whole gamergate fiasco, with large groups of people downvoting each other. It's pretty terrible.

And with Twitter being as broad as it is, metamoderation is completely out of the question.

So ultimately, downvoting doesn't scale, and is only something you will like if you are the one with the popular opinions.

Comment: Re:Goodbye college football (Score 3, Interesting) 94

by hibiki_r (#48960233) Attached to: What Happens When the "Sharing Economy" Meets Higher Education

Schools will probably not go away quickly, as there is plenty of value in socialization, and kids will not learn that by sitting at home in front of a computer.

Schools are moving toward some of that kind of learning, though. Take, for instance, elementary school math. You have a bunch of kids coming in at kindergarten or 1st grade who have drastically different experience and skill levels. Some can barely count to 10 and read small numbers; others enter kindergarten already understanding multiplication and division. And yet traditionally we put them in the same class and teach them math together.

Now we have computer systems that can serve math exercises and lessons to kids, individualized to their skill level. So when a kindergartener who should be doing 4th-grade work never seems to miss at counting and number recognition, he just keeps getting more challenging material until he's quickly doing 4th-grade math.
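
The core loop of such a system is simple enough to sketch (a toy of my own, not any specific product): keep raising the difficulty while the student keeps succeeding, and back off when they stumble.

    import java.util.Random;

    public class AdaptiveDrill {
        public static void main(String[] args) {
            Random rng = new Random();
            int level = 1;  // 1 = counting ... 4 = multi-digit arithmetic
            int streak = 0;

            for (int question = 0; question < 50; question++) {
                boolean correct = simulateAnswer(level, rng);
                streak = correct ? streak + 1 : 0;
                if (streak >= 5 && level < 4) { level++; streak = 0; } // advance quickly
                else if (!correct && level > 1) { level--; }           // back off gently
            }
            System.out.println("settled at level " + level);
        }

        // Stand-in for a real student: succeeds more often at lower levels.
        static boolean simulateAnswer(int level, Random rng) {
            return rng.nextDouble() < 1.0 - 0.15 * level;
        }
    }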
