
Comment: Re:No H1-Bs for contractors (Score 2) 636

by hibiki_r (#49582985) Attached to: Disney Replaces Longtime IT Staff With H-1B Workers

It'd help in many ways, but it also makes the H1-B situation far more precarious. Modern abuse and the quota system mean that almost all H1-Bs come through those nasty companies, but even before that, many people chose contracting firms to handle their immigration because you are far safer from layoffs and the like. I remember when I was an H1-B, a long time ago, working directly for an employer that had round after round of layoffs. The moment I saw the pattern, I had to look for another job IMMEDIATELY, because getting hit by one of those layoffs meant a tiny window to find another employer or leave the country, and that new employer had to file for a visa. Through a contracting firm, a layoff might mean a job change, and maybe not getting paid for a few weeks, but it's far less onerous.

This gets even harder when also applying for a green card. It's not uncommon for companies to ask the immigrants they sponsor to sign an agreement to repay the immigration fees incurred in the green card process if they leave willingly before the green card is done, plus one year. With direct employment, that means you cannot leave without taking a major financial penalty. And if you are laid off mid-process, you'd better get a job extremely quickly, or your green card process might have to start all over again, and it can take many years. Getting my green card took a big weight off my shoulders.

So your proposed change to the H1-B program sounds like a wonderful idea, as long as it comes together with something to minimize the precarious conditions of H1-B workers who easily qualify for green cards and work in the US for 5 or 10 years while they wait for a visa number. This should help American workers too: the minute one of those workers gets a green card, their job mobility increases, and with it their negotiating power. I got a 30% raise with my first job change after the green card, and within 5 years of getting it my salary had more than doubled.

Having people as temporary workers for a decade? You've got to be kidding me.

Comment: Re:So how long before (Score 1) 181

by hibiki_r (#49457975) Attached to: Autonomous Cars and the Centralization of Driving

The problem with remote control of a car is that you cannot assume it'll only happen for good reasons, or only by the people you expect. It's the same issue as an encryption backdoor theoretically held by "the good guys".

If a vehicle can be controlled remotely, it's because there is some authentication mechanism that allows that to happen. Anyone that steals the keys can remotely control the car. Imagine how much fun kidnapping becomes when you can do it from the comfort of your own home.
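As a toy illustration (mine, not from the post; every name here is hypothetical, and this is not any real vehicle protocol), the car's side of such a scheme boils down to something like this. Note that the check cannot distinguish a legitimate key holder from a thief:

    // Toy sketch, not a real vehicle protocol. Whoever holds `key`
    // can produce commands this check will accept.
    import java.security.MessageDigest;
    import javax.crypto.Mac;
    import javax.crypto.spec.SecretKeySpec;

    class RemoteCommandCheck {
        static boolean accept(byte[] command, byte[] tag, byte[] key) throws Exception {
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(key, "HmacSHA256"));
            // The car only verifies the math; it has no idea whether the key
            // is held by the police, the manufacturer, or someone who stole it.
            return MessageDigest.isEqual(mac.doFinal(command), tag);
        }
    }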

Same problem when people talk about blockchain-based DNS. If the only thing protecting something is a digital key, then you have two problems: Making sure the key is not lost, and making sure nobody copies it. Stealing keys becomes more profitable than ever, as there is no more recourse: whatever you were protecting is compromised.

So the question is: Would it be a good idea to let a terrorist or a gang control everyone's car? Because that's the door you are opening the second you let police do it.

Comment: DBAs first? Strange (Score 1) 139

by hibiki_r (#49376137) Attached to: IT Jobs With the Best (and Worst) ROI

Around here, if anything, the DBA job is disappearing: there are a lot fewer openings, and most are at huge, extremely corporate places that you wouldn't even want to work for. And even in that world, they are switching to development models that don't need DBAs. So maybe the averages are high because only companies that pay well still hire DBAs?

They also talk about averages, but not about the high end. Around here, a programmer's high end is very high: I make 4x what my employer pays an entry-level developer. People realize how much value a top developer brings. DBAs, not so much.

Comment: Re:Crashes (Score 1) 167

by hibiki_r (#49369663) Attached to: At the Track With Formula E, the First e-Racing Series

Comparing the safety of a formula that is weeks old against one that is over 90 years old by counting total fatalities is laughable.

No driver has died in an actual F1 race/qualifying since 1994. Cars have changed, circuits have been taken off the calendar for safety reasons. Accidents still happen, but it's nonsense to compare the safety of today with how F1 was in the time of Senna or Lauda.

Comment: Re:But, but... (Score 1) 71

by hibiki_r (#49358589) Attached to: US Air Force Overstepped In SpaceX Certification

I've seen time factor heavily into some road construction. In St. Louis, the I-64 roadwork was going to cause havoc with many people's commutes, as it required closing the most traveled highway in the city. The contracts took into consideration how long the work would take, offered bonuses for finishing early, and imposed severe escalating penalties for delays. It was finished early.

Comment: Re:Check their work or check the summary? (Score 1) 486

by hibiki_r (#49337849) Attached to: No, It's Not Always Quicker To Do Things In Memory

It might have helped with this problem, but nowadays even assembly language is just an abstraction: you might think you are writing in-order operations in x86 assembly, but they are really being translated into out-of-order operations inside the processor that produce the same result with very different performance. Branch prediction? Nah, we can run the beginning of BOTH branches and just discard the computation we didn't want, because it's actually faster. And don't get me started on the differences between what you tell a video card to do and what it actually does.

The distance between what we write in practical, end user facing applications and what happens in the hardware is so large nowadays that it's hard to have any real control over what is going on. The best we can do is understand the performance characteristics that we see in the layer right below ours, and hope things don't change too much.

Therefore, the problem with the original paper is that it fails to really explain what we can learn from the experiment. It's not that disk is faster than RAM: that's just ludicrous. It's that we really have to have some understanding of the libraries and VMs we use to get anywhere. It wouldn't be impossible for the JVM to realize that the immutable string is being edited in a loop, that no references to it can escape, and then optimize the whole thing into a string-buffer implementation that should be as good as calling the file writer: it just happens not to do that optimization for us. This has happened in Java before: code that was once considered terrible because it was very slow is not slow anymore, and it's easier to read than the old-school way of optimizing it.
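To make the pattern concrete, here is a minimal sketch (my own example in plain Java, not the paper's actual benchmark) of the loop being discussed, next to the StringBuilder version the JVM does not rewrite it into:

    // Minimal sketch, my own example -- not the paper's benchmark code.
    public class ConcatSketch {
        public static void main(String[] args) {
            // O(n^2): each += copies the entire string built so far.
            String slow = "";
            for (int i = 0; i < 100_000; i++) {
                slow += "x";
            }

            // O(n): appends go into a growable buffer; one String at the end.
            StringBuilder buf = new StringBuilder();
            for (int i = 0; i < 100_000; i++) {
                buf.append("x");
            }
            String fast = buf.toString();

            System.out.println(slow.length() + " " + fast.length());
        }
    }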

Comment: Re:Good grief... (Score 1) 681

by tambo (#49110555) Attached to: Bill Nye Disses "Regular" Software Writers' Science Knowledge

> The nuts and bolts of computer architecture aren't in the scope of computer science. Sure, you might want to know a little about how things work at an abstract level, but let's be clear: computer science and electrical engineering are two different disciplines.

There's also a big gap between traditional electrical engineering and computer science - computer engineering really is its own thing.

The standard EE curriculum covers a lot of topics: E&M, circuit analysis and design, signal analysis and information theory, wireless communication, VLSI and VHDL, linear electronics, control systems, FPGAs / MOSFETs / ASICs, etc. The most computer architecture that EE covers is the basics of digital logic, and *maybe* a selection of other topics, like memory addressing and a basic instruction set, but it's really just an introduction.

Computer engineering is where the rubber meets the road, so to speak. Consider all of the specialized things that a computer engineer studies: processor design, instruction sets, memory / storage / caching, buses / pipelines / wire protocols, networking, parallel processing, power management (especially conservation), GPUs, firmware and device drivers, multithreading and stack analysis, security systems...

My point is simply that a typical EE barely scratches the surface of CE, and a typical CE has only a modest overlap with both EE and CS.

It's frustrating that so many people don't appreciate just how deep and rich and technically challenging these areas are. It's oddly stylish to diss CS and CE as being of a lesser scientific caliber than the traditional sciences - to look at a computer as a commodity, a basic web browser wired to a basic keyboard. Very disappointing in general, and it's culturally perpetuated by offhand comments like Nye's.

Comment: Re:Sweet F A (Score 1) 576

> Even if they were at our level of technology, if they have starships, then they have nuclear weapons. They don't have to invade; they can simply drop rocks or nukes on us to accomplish the same thing, and there wouldn't be anything we could do about it...

Yeah, nukes aren't even necessary. A lump of any kind of matter, parked at the top of Earth's gravity well and possessing sufficient bulk / shape / cohesiveness to deliver a sizable mass to the surface, will be devastating. If you can multiply that by, I dunno, several thousand - you have a fairly low-tech and cost-effective means of civilization annihilation.
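Rough back-of-envelope numbers (mine, not from the post): anything falling in from the top of Earth's gravity well arrives at roughly escape velocity, about 11.2 km/s, so its kinetic energy per kilogram is v^2/2 ≈ 6.3 x 10^7 J/kg, around fifteen times the energy density of TNT (~4.2 MJ/kg), and that's before counting any velocity the object started with.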

This whole "gravity" thing is really a bummer. If mankind can ever conquer its internal existential threats - global war, nuclear proliferation, climate change, and the cultural-dumbness bomb called "the Kardashians" - then our own gravity well becomes our largest existential vulnerability.

Comment: Riiight (Score 1, Insightful) 85

by tambo (#49075971) Attached to: Algorithmic Patenting

"There is reason to believe that at least some of its computer-conceived inventions could be patentable and, indeed, patents have already been granted on inventions designed wholly or in part by software."

Right. And according to Fox News, "It is SAID BY SOME that Obama isn't a native citizen. Not that *we're* saying it, mind you, so we can't be held accountable. It's just, you know, THEY said it was true. Who's THEY? Well, we can't tell you, and we can neither confirm nor deny that we're using 'they' in place of 'we.' So we're just going to state that it's some number of unnamed experts, or the public at large, or whatever. You know... THEM."

Comment: Re: You sunk my battleship (Score 1) 439

by phoenix321 (#49061491) Attached to: Will Submarines Soon Become As Obsolete As the Battleship?

Never ever underestimate the speed and efficiency of China's construction abilities. They complete a handful of skyscrapers before most other nations have finished designing just one; remodel entire cities, build insanely large dams and dozens of nuclear power plants without any noticeable drop in GDP at all.

Not to mention that they own most of the world's high tech manufacturing plants.

As a side note, I would not recommend placing any long-term investments in Taiwan.

Comment: Re: Big Data (Score 0) 439

by phoenix321 (#49061461) Attached to: Will Submarines Soon Become As Obsolete As the Battleship?

Aircraft carriers were key to defeating German and Japanese navies, which were more than peer-level opponents of the US in 1942. Their battleships were unable to do anything and by the end of the war, their subs were just easy targets for US aircraft.

Pummeling minor powers would best be done by decapitation strikes and, if that doesn't work, saturation bombardment. The B-52 fleet can do both extremely well now, and because of that it has had its service life and fighting role extended far beyond the wildest expectations of its designers. And B-52s can't exactly operate from carriers.

Drones, however, can launch from carriers, and we will see how drone warfare works out over the next few years.

Comment: Re:Big Data (Score 1) 439

by phoenix321 (#49061407) Attached to: Will Submarines Soon Become As Obsolete As the Battleship?

A few LiveLeak videos of collateral damage later and accurate missiles are back in fashion for double the price or more.

The general public has all but dissolved the notion of "the enemy": only the few on the other side who have actual weapons in their hands, ready to fire, are seen as even remotely acceptable targets. The public wants those few actual combatants and militants surgically removed, while the rest of the population is treated as sweet little innocent angels who would never harm anyone and were just praying for a miraculous liberation courtesy of Raytheon and General Dynamics.

People truly believe that fighting will cease the moment all currently armed militants drop dead. The idea that the enemy's general population provides an indefinite supply of new militants, and will never change unless forced to, boggles their minds.

That's why no Western power has won a war in recent decades. They have won battles left and right, toppled dictators, killed bad guys, built schools and bridges, thwarted endless terrorist plots, and sacrificed scores of people, including their best young men, but accomplished nothing in the long run.

Maybe we should bring the battleships back. Maybe area bombing works. At least the people of Tokyo and Dresden believe in democracy and freedom now.

Comment: Re:Big Data (Score 1) 439

by phoenix321 (#49061303) Attached to: Will Submarines Soon Become As Obsolete As the Battleship?

Artillery is going to produce collateral damage. People with mobile phones will film all the gory details and upload them everywhere. The general public, and the American public in particular, does not like collateral damage, or the gory details of the school that was hit instead of the enemy's ammo depot and training camp. Public opinion wins wars today, with ammunition a far distant second.

Firebombing and city-leveling area bombardment will never be acceptable again to a public that imagines soldiers as always well-equipped, omniscient, omnipotent superheroes who only kill the bad guys, with minimal damage and in the cleanest possible way.

The cost of munitions (and of training individual soldiers) has become irrelevant, because a single error can lose a war. The accuracy the general public now demands of warfare is at ridiculous levels, and the associated cost per shot can only be described as utterly insane.

So, unless all shells are GPS-guided with the same or better precision than current missiles, battleships aren't making a comeback. And once shells are that accurate, they are as expensive as missiles and still have only a fraction of the range.

Comment: It's jetpack technology: always 10 years away. (Score 2) 248

by tambo (#49051337) Attached to: Smart Homes Often Dumb, Never Simple

Over the years, I've invested thousands of dollars in several home automation platforms. I've yet to have an experience that I'd call "good."

Candidate #1: X10. Future-tech, circa 1978.

  • Pros:
    • Drop-dead simple implementation - there's a physical dial on every receiver to specify a code, and a physical dial on every controller to specify which codes it controls.
    • Supported by a broad set of manufacturers back in the 1990's.
  • Cons:
    • Wildly unreliable protocol = don't count on your lights actually turning on. Flakes out at the drop of a hat.
    • Hardware had extensive quality issues. Devices spontaneously died without warning. Wonderful if you enjoy debugging your light switches; terrible for people with better things to do in life.
    • Even when working perfectly, the latency was unacceptable: waiting a full second for your lights to turn on becomes painful fast.
    • No centralized management. Communication was largely one-way - switches broadcast; receivers receive - so things like "reporting status" and "verifying connectivity" were impossible.
    • Protocol security? What's that?
    • Deprecated and dead.

Candidate #2: INSTEON: The Commodore Amiga of home automation.

  • Pros:
    • Designed with a lot more redundancy and reliability than X10. Something about mesh network communication and blahblahblah.
  • Cons:
    • Overpriced. Holy crap, overpriced. Starter kits that controlled a single lamp ran for like $500.
    • One vendor = extremely constrained range of products. Sure, some of the gear had backwards compatibility with X10, but mixing network gear was a great way to drive yourself insane fast.
    • Terrible business model = stunted growth and slow, painful death.

Candidate #3: Z-Wave: The People's Home Automation Platform.

  • Pros:
    • Totally open protocol! Anyone can make a Z-Wave-supported device!
    • Potential for built-in reliability through mesh communication, etc.
    • Hierarchical mesh architecture can be centrally managed by a hub.
  • Cons:
    • "Anyone can make a Z-Wave-compatible device" =/= "anyone can make a *good* Z-Wave-compatible device."
    • Entry-level devices are cheap, but inadequate. Fully-capable devices are reliable, but expensive. There are also expensive devices that are crippled, but no cheap devices that aren't. Have fun with that.
    • The architecture is both overcomplicated and poorly documented. Want to figure out how scenes work? Plan on setting aside an hour to scrape together bits and pieces of information from different vendors, and glue them together with guesswork and trial-and-error.
    • Lots of potential... not as many products. In theory, Z-Wave is great for motorized blinds. In practice... there's like one company offering an overpriced half-baked product, and an Instructable DIY video.
    • Hub architecture is feasible... but good luck finding a decent implementation:
      • SmartThings wants to be hip and polished, but feels like it was designed by ADHD-afflicted high school students as a summer project.
      • MiCasaVerde / MiOS / Vera is ambitious... i.e., overambitious, i.e., no support. Great for those who enjoy hacking a commodity-based Linux box and digging through log files to figure out why the kitchen lights won't turn on. The Facebook group is kind of surreal: it's a company rep posting happy-happy-joy-joy patch notes, and dozens of people asking why their Vera won't respond and why customer service won't get back to them.
      • Home Depot Wink is a subscription-based service. Let that sink in: you'll have to pay $x/month for the privilege of automating your light switches.
      • A handful of weird, little-known contenders exist (Staples Connect, ThereGate, the "Jupiter Hub," etc.), with virtually no buzz (and the bit that's there is typically poor).

Candidates #4-50,000: Belkin WEMO, ZigBee, MyQ, Ninja Blocks, IFTTT, etc.: The Home Automation Also-Rans.

All of these products and platforms map a consistent trajectory - from gee-whiz Kickstarter prototype to "whatever happened to..." entries on the technology scrap heap. They never develop enough momentum beyond diehard hacker/maker cliques to warrant consideration beyond "free-time toy project" status.

Comment: Re:UI code is bulky (Score 1) 411

by tambo (#49037457) Attached to: Your Java Code Is Mostly Fluff, New Research Finds

> You will find that an MPEG2 decoder is substantially more complex than DeCSS.

First, "complexity" and "size" are completely different concepts, and we're only discussing codebase size here. Of course, extremely complex code can be very small (see also: MD5, RSA, etc.), and extremely bulky code can be very routine. The auto-generated code for instantiating all of the controls in a window can be thousands of lines long, but it's all very bland, mundane instructions. ("Create a button at coordinates x1, y1, x2, y2; set its border, background color, font, and caption; hook it up to these event handlers...")
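For illustration, here's a hand-written sketch (Java/Swing, with hypothetical names and values) of the kind of generated boilerplate I mean; every line is trivial, but there's one block like this per control:

    // Sketch of typical generated UI boilerplate; names and values are made up.
    import java.awt.Color;
    import java.awt.Font;
    import javax.swing.BorderFactory;
    import javax.swing.JButton;
    import javax.swing.JPanel;

    class GeneratedFormSketch {
        JButton buildOkButton(JPanel parent) {
            JButton okButton = new JButton();
            okButton.setBounds(320, 240, 80, 28);               // x, y, width, height
            okButton.setText("OK");
            okButton.setFont(new Font("Dialog", Font.PLAIN, 12));
            okButton.setBackground(Color.LIGHT_GRAY);
            okButton.setBorder(BorderFactory.createLineBorder(Color.GRAY));
            okButton.addActionListener(e -> parent.setVisible(false)); // event handler
            parent.add(okButton);
            return okButton;
        }
        // ...and dozens more near-identical blocks, one per control.
    }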

But just to reinforce the point: Here is a full MPEG 2 decoder. It's about 130kb, uncompressed.

> VLC is the opposite of pretty stripped-down. It does everything, including ripping DVDs and transcoding video.

First, it's certainly stripped-down as compared with other media rendering packages: Windows Media Player, WinDVD / PowerDVD, and of course iTunes.

Second, those features - ripping DVDs and transcoding videos - are not only expected in modern media player packages; they also use the same core functionality. Just like ordinary playback, ripping and transcoding are basically consumers of the codec's output - the software is just delivering that output somewhere other than the screen, such as to a storage device or as the input to a different codec.
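As a purely conceptual sketch (not VLC's actual architecture or API; every name below is invented), the shared structure looks something like this, with only the sink changing between use cases:

    // Conceptual sketch only -- not VLC's real API.
    // Playing, ripping, and transcoding all run the same decode loop and
    // differ only in where the decoded frames are sent.
    interface Decoder { byte[] decode(byte[] packet); }
    interface Sink { void accept(byte[] frame); }

    class MediaPipeline {
        void run(Iterable<byte[]> packets, Decoder codec, Sink sink) {
            for (byte[] packet : packets) {
                sink.accept(codec.decode(packet));
            }
        }
        // play:      sink renders frames to the screen
        // rip:       sink writes frames to a file
        // transcode: sink feeds frames to a second codec's encoder
    }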
