Comment: Re:Drone? (Score 2) 151

by cshotton (#46971433) Attached to: U.S. Passenger Jet Nearly Collided With Drone In March

The problem is that for a long time, radio-controlled aircraft were something of an esoteric hobby, and the people involved formed much more of a community, with self-policing through organizations like the Academy of Model Aeronautics. You used to have to spend weeks or months carefully crafting the aircraft, installing engines and controls, adjusting, tweaking, and then hopefully flying (and not crashing) in front of your friends on a Saturday afternoon. The AMA provided strict guidelines for operations near aircraft, structures, people, and so on.

Now any moron with a few hundred dollars can buy something off the shelf that performs perfectly well and is capable of ending up splashed across Internet headlines, because they AREN'T part of a self-policing community anymore. But it still raises the question of why the mainstream media is so glaringly stupid about anything to do with aviation. A "drone" has a very specific definition, though apparently to the uneducated journalists of the world, anything that leaves the surface of the earth without a human on board is now considered a drone.

Comment: Perils of Outsourced QA (Score 1) 573

by cshotton (#46172563) Attached to: HTML5 App For Panasonic TVs Rejected - JQuery Is a "Hack"

Having submitted apps to almost every major TV manufacturer on behalf of some prominent global brands, I can tell you there is one common thread through all of this. Outsourced QA.

These electronics manufacturers are devoid of any meaningful internal software organization, and almost all of them outsource their app evaluation to external firms. Those firms are paid to find defects and are in no way incentivized to help you get your app into the marketplace. Not only is this incentive structure counterproductive, it also makes the process entirely random and inscrutable: the same app can be submitted 3 or 4 times with absolutely no changes and receive wildly differing failure reports for completely unrelated reasons, because it gets reviewed by 3 or 4 different analysts with different skill sets and arbitrary acceptance criteria.

In order to get some apps accepted by Samsung's "QA" process, we literally had to resort to threats in email that if they didn't pass the app, we were going to turn over their completely random QA reports to Samsung and get them fired. Good luck!

Comment: We can't organize that well (Score 3, Insightful) 266

by cshotton (#42869791) Attached to: 71 Percent of U.S. See Humans On Mars By 2033

What those 71% of people fail to realize is that we can no longer organize ourselves well enough to accomplish this sort of task. NASA, as an institution, long ago stopped being about technical success and exploration. During my years working with NASA, I discovered that a NASA manager's career success is measured solely by the number of people they manage and the size of the budget they control. Not by how many successful missions they achieved, not by the technology breakthroughs they fostered, and not by any other rational measure beyond org-chart success.

So we have no government agency capable of focusing on a goal as complicated as landing humans on Mars. They immediately get distracted by project management issues and politics. If private industry were to try to undertake this effort, there would have to be some financial incentive for our largest private spacefaring corporations to cooperate, since none has the resources alone to achieve the goal within 20 years. And the only model they have for organizing themselves is today's NASA. No one still working in the industry knows how the NASA of the 1960s worked, and society has changed to the point that the technical people required for such an effort are no longer motivated to make the selfless sacrifices needed to achieve it. All the good engineers left aerospace for the dot-com world in the '90s. The remaining few are motivated by commercial and personal financial success, which requires a much shorter planning and gratification cycle than 20 years.

Sorry, we won't be going to Mars. We're a bunch of greedy, self-absorbed, small-minded apes that have reached the pinnacle of our organizational skills at the bottom of our gravity well.

Comment: Secret ARM opcodes can execute JVM bytecode (Score 1) 63

by kriegsman (#41772345) Attached to: Red Hat Devs Working On ARM64 OpenJDK Port

Many ARM-based chips include the "Jazelle DBX" (Direct Bytecode eXecution) hardware CPU extension, which lets "95% of [JVM] bytecode" be executed directly by the CPU -- without need for recompilation to 'native' ARM/Thumb instructions. (See http://en.wikipedia.org/wiki/Jazelle )

However, unlike most of the ARM universe, which is fairly open, Jazelle DBX's specs, implementation, and operating details are apparently a closely held secret, shared by ARM only with select JVM implementors. It'll be interesting to see if ARM decides to help out the OpenJDK people. Googling around, it looks like so far the answer has been "No."

Comment: Re:I work at Veracode; here's how we test. (Score 4, Informative) 103

by kriegsman (#38295894) Attached to: Study Shows Many Sites Still Failing Basic Security Measures
That is a GREAT question, and the full answer is complicated and partially proprietary. But basically, you've touched on the problem of indirect control flow, which exists in C (calls through function pointers), C++ (virtual function calls), and in Java, .NET, ObjC, etc. The general approach is that at each indirect call site, you "solve for" what the actual targets of the call could possibly be, and take it from there. The specific example you gave is actually trivially solved, since there's only one possible answer in the program; in large-scale applications it is what we call "hard." And yes, in some cases we (necessarily) lose the trail; see "halting problem" as noted. But we do a remarkably good job on most real-world application code. I've been working with this team on this static binary analysis business for eight or nine years, and we still haven't run out of interesting problems to work on; this is definitely one of them.
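The actual Veracode analysis is proprietary and operates on binaries, but the core idea of "solving for" an indirect call's targets can be sketched in a few lines. This toy Python version does a flow-insensitive pass over a made-up program representation; all names here (`handler`, `process_request`) are hypothetical:

```python
def resolve_indirect_targets(assignments, pointer):
    """Collect every function ever assigned to `pointer` anywhere in the
    program. This is a flow-insensitive over-approximation: if only one
    target exists, the indirect call is trivially resolved; if many exist,
    the analysis must consider all of them (or give up the trail)."""
    return {func for var, func in assignments if var == pointer}

# Hypothetical program with exactly one assignment to the pointer,
# i.e. the "trivially solved" case described above:
program_assignments = [
    ("handler", "process_request"),   # handler = &process_request;
]

targets = resolve_indirect_targets(program_assignments, "handler")
assert targets == {"process_request"}   # exactly one candidate target
```

In real binaries the hard part is building `assignments` from machine code in the first place; once several candidate targets survive, the call graph grows and precision drops, which is where the "halting problem" caveat comes in.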

Comment: Plenty of Prior Art (Score 1) 154

by cshotton (#37078086) Attached to: What If Tim Berners-Lee Had Patented the Web?

Assuming he'd filed for his patents before the USPTO went insane, odds are low that they would have survived a prior-art test. As others have mentioned, Gopher, FTP, and even several BBS systems would have served as prior art for the HTTP component. HTML was really just a bastardized version of SGML. And the entire concept of a hypertext page was predated by HyperCard and a body of work at Xerox PARC.

Why are we even having this discussion?

Comment: Re:Not a Jet Fighter (Score 3, Interesting) 119

by cshotton (#35145384) Attached to: Robot Jet Fighter Takes First Flight

As it turns out, the real problem on these platforms is power generation. With synthetic aperture radars, flight control systems, on-board mission management systems, laser designators, EO sensors, and LOS and BLOS/satellite comms gear on board, the problem of supplying electricity for all the systems becomes critical.

I worked on the original J-UCAS program, which transitioned from DARPA to the Navy, and designing the autonomous flight and mission management systems was the easier part of the problem. Creating the comm infrastructure (software-defined radios), the operational procedures, the peer-to-peer cooperation, and mundane stuff like dealing with air traffic control turned out to be much harder in practice.

Definitely one of the coolest projects I have ever worked on and I'm glad to see one of the J-UCAS derived UAVs finally getting into the air.

Comment: Re:Shallow (Score 4, Insightful) 470

by cshotton (#34815146) Attached to: Is Mark Zuckerberg the Next Steve Case?

It's just as shallow to compare a desktop software application's success to the much more transient, ephemeral, and difficult to quantify success of Facebook. Better to look at the Internet as a whole and ask a couple of simple questions.

First, name one network-wide, user-oriented application level service that was present when the commercial Internet opened for business in 1991 that is still in operation and use today.

Discounting Usenet and email as infrastructure, the answer is likely "nothing." It's instructive to consider why. Early community plays on the Internet (The Well, The Globe, AOL, WebTV, and even MySpace) fell in succession, not because there weren't plenty of users and not because they weren't good services. They fell because, by definition, something newer and better comes along. It's the same reason we don't drive horse-drawn wagons to work. Supporting the infrastructure and feature set of an existing system means, by definition, that you will never be able to change and adopt new technologies as fast as someone else starting with a clean slate.

Second, what is so special about Facebook that it will avoid being obsolesced by the next cool fad? The answer, again, is "nothing."

Facebook's only advantage is the depth of its social graph. And as many posters have noted, the average Facebook user has a pretty static social graph and no need to add to it in any significant way now. Once you are fully connected, it becomes trivial to notify your graph that you are moving elsewhere, and then Metcalfe's Law kicks in. Once the infrastructure becomes distributed and you are no longer locked into a single service, people will be free to move their social graph and associated applications wherever they'd like.
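For reference, Metcalfe's Law values a network by its number of possible pairwise connections, n*(n-1)/2, which is why defections compound: each departing user shrinks the old network's value superlinearly. A quick illustrative calculation (the user counts are made up):

```python
def metcalfe_value(n):
    """Metcalfe's Law: a network of n users has n*(n-1)/2 possible
    pairwise connections, so value grows roughly with the square of n."""
    return n * (n - 1) // 2

# Halving the user base costs roughly three quarters of the network's value,
# which is why a migration, once started, tends to snowball.
full = metcalfe_value(1000)   # 499500 possible connections
half = metcalfe_value(500)    # 124750 possible connections
assert half < full / 2
```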

Extrapolating the past lifecycles of similar, successful social sites to Facebook, it seems logical to conclude (as the author did) that Facebook's days are numbered. Maybe in the thousands, but numbered nonetheless.

Comment: What a crock! (Score 1) 717

by cshotton (#34079746) Attached to: VLC Developer Takes a Stand Against DRM Enforcement

The developer is at fault, not Apple. Apple is not party to the GPL terms just because a developer with a political agenda chooses to upload an application to the App Store. The developer is at fault for choosing a distribution medium for his software that is incompatible with GPL terms. It's not any more complicated than that.

Comment: Designers != Decision Makers (Score 1) 510

by cshotton (#32434712) Attached to: HTML5 vs. Flash — the Case For Flash

The fallacy in this article is that designers are the ultimate arbiters of how a web site gets built. It's just not the case. If I am paying the bills and I want to reach the largest possible audience, my first instruction to the designer working for *me* is going to be to avoid, like the plague, Flash, Java, and anything else that is plug-in dependent. Flash sites are inaccessible to the vision-impaired, they are not generally searchable, they break when plug-ins are out of date or missing, and they are generally not maintainable by the organization they were created for once delivered.

Sorry, but sticking with standards based solutions and tools that manipulate open formats is the way to go if you, Mr. Designer-boy, are working for me. In a market-driven economy, designers are a commodity, not an industry force. They're gonna do what they are paid to do.

Comment: Re:Space without astronauts (Score 1) 145

by cshotton (#31954110) Attached to: USAF's Robotic X-37B Orbiter Launched For Test Flight
Actually, the real reason is the pyrotechnics used to deploy the landing gear, and the drogue chute as well. Both have to be armed and deployed manually via a series of buttons on the glare shield. It has been a long-standing rule in manned spaceflight that anything that can explode on command like that is operated manually unless that is impossible for some reason. The fact that neither system can be re-stowed after deployment is also problematic.

Comment: HTTP-NG Revisited (ten years later!) (Score 4, Informative) 406

by kriegsman (#30079022) Attached to: HTTP Intermediary Layer From Google Could Dramatically Speed Up the Web
HTTP-NG ( http://www.w3.org/Protocols/HTTP-NG/ ) was researched, designed, and even, yes, implemented to solve the same problems that Google's "new" SPDY is attacking -- in 1999, ten years ago.

The good news is that SPDY seems to build on the SMUX ( http://www.w3.org/TR/WD-mux ) and MUX protocols that were designed as part of the HTTP-NG effort, so at least we're not reinventing the wheel. Now we have to decide what color to paint it.
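To make the multiplexing idea concrete, here is a minimal Python sketch of the core technique SMUX/MUX and SPDY share: interleaving several logical streams over one connection by prefixing each chunk with a (stream id, length) frame header. The field widths and framing here are invented for illustration and are not the actual SMUX or SPDY wire format:

```python
import struct

# 2-byte stream id + 4-byte payload length, network byte order.
HEADER = struct.Struct("!HI")

def mux(frames):
    """Serialize (stream_id, payload) pairs into one interleaved byte stream."""
    return b"".join(HEADER.pack(sid, len(p)) + p for sid, p in frames)

def demux(data):
    """Recover the (stream_id, payload) pairs from the byte stream."""
    frames, offset = [], 0
    while offset < len(data):
        sid, length = HEADER.unpack_from(data, offset)
        offset += HEADER.size
        frames.append((sid, data[offset:offset + length]))
        offset += length
    return frames

# Two logical requests share one connection without blocking each other.
wire = mux([(1, b"GET /index.html"), (2, b"GET /style.css")])
assert demux(wire) == [(1, b"GET /index.html"), (2, b"GET /style.css")]
```

The win over plain HTTP/1.x is that a slow response on stream 1 no longer holds up stream 2, since frames from different streams can be interleaved on the wire in any order.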

Next up: immediate support in Firefox, WebKit, and Apache -- and deafening silence from IE and IIS.

