A neighbor down the street is apparently getting married, because we keep receiving their wedding RSVPs. The RSVP postcards are addressed in a fancy script, and the Post Office routinely misreads a "1" as a "4". This has definitely made this year's count spike.
The source code is zipped. For a 254-byte program. This just tickles me for some reason.
Military still seems "proprietary" to me. If they meant "commercial", I could see a difference. I also considered "embedded" or "firmware" style code that, while software, is more closely tied to a physical hardware implementation. All of those still seem either "proprietary" or "open source", though, and you're right (@stillnotelf) that these would raise rather than lower industry averages.
I wonder if they meant a "standard" as in a target or accepted limit that is somewhat arbitrary, rather than an "average" (which is the word the article actually uses) of real-world code. This would make sense, since the average they cite is exactly 1.
Wow, yeah, I posted an almost identical sentence myself. Eerie. (Although I didn't have a Wobegon reference... sorry.) But yeah, it seems like an odd sentiment. Internal-use software is still either "proprietary" or "open source"... isn't it? But good point. If someone counted the bugs in my Excel macros as if they could be used for general-purpose computing, I'd be in sad shape. (ObNote: I use Excel macros as rarely as possible, and normally only at gunpoint.)
and both [proprietary and open-source software] continue to surpass the industry standard for software quality
I agree with most posters that the logic in the post is hugely flawed (predicting something about the future by arguing that we can't know enough to predict it is inane). But more constructively: our constant access to information hasn't sated our desire for more of it. Information collection is driving the recent knowledge boom as much as, if not more than, ease of access. Besides, no matter how much time passes, if we haven't visited another world we won't have information about that world available to us. You have to collect information before you can use it... that's WHY further exploration will always be a goal (unless, you know, we obliterate ourselves somehow in the meantime, or find something similarly more important in any given short term).
I said it before and I'll say it again: taking something that exists and making it digital (or putting it on the internet, or on your phone, or any other "just-add-technology" change) doesn't make it new. Prior art should still apply... I refer you to my comment on Tuesday.
I completely agree, but I think that was my point (and I reply largely because I took umbrage at the "Um, no" when really I think we agree)... design patents are intentionally very narrow in scope, and I'm arguing that software patents should be similarly narrow, specifically to avoid the problems with the current patent system that we're both describing.
I used the shopping cart example because of recent news that such a patent has been awarded, and upon reading it, the patent does not appear to cover a specific implementation at all; rather, it covers a very generic implementation that closely parallels an existing physical concept (a shopping cart). It's not that specific implementations shouldn't be patentable (though I might argue that copyright, rather than patents, should apply in most cases where the differences are aesthetic rather than functional; hence my comparison to design patents). I'm just trying to find a reasonable, testable limit on what should qualify.
As you say, patenting the entire store is not what patents are about. I think patenting shopping carts is similar (unless they have a "novel" feature). Things that work well in the physical world are quickly gaining internet-based analogues.
You're also right about inevitability, of course, but I think there's a difference between inevitable and obvious or novel. Lending digital objects (the case in point) is an interesting example. A narrow patent on a particular combination of encryption and centrally controlled tracking, limiting how many times something could be shared, probably could be patentable, and that makes sense. But in this case the patent has grown to include practically any conceivable implementation, which seems wrong. The limiting factor I suggested was whether there is a physical analog or not. There IS a physical analog for lending, so no matter how long it takes for the first person to build and market code for it, that should, IMHO, remain unpatentable. I'm not aware of a physical analog for lending something only a specific number of times, so I'm OK with that part (although I'd be glad to be corrected).
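To make that distinction concrete, here's a toy sketch (in Python, with invented names, and no encryption shown) of just the "lend only N times" piece -- the centrally tracked counter is the part I can't find a physical analog for:

```python
# Hypothetical illustration only -- not any vendor's actual system.
# A central service tracks how many times each purchased item has been
# lent, and refuses further lends once a lifetime cap is reached.

class LendingTracker:
    def __init__(self, max_lends=3):
        self.max_lends = max_lends
        self.lend_counts = {}  # item_id -> times lent so far

    def try_lend(self, item_id):
        """Record a lend if the item is still under its lifetime cap."""
        count = self.lend_counts.get(item_id, 0)
        if count >= self.max_lends:
            return False  # cap reached: no more lends, ever
        self.lend_counts[item_id] = count + 1
        return True

tracker = LendingTracker(max_lends=3)
assert tracker.try_lend("ebook-123")      # 1st lend: OK
assert tracker.try_lend("ebook-123")      # 2nd lend: OK
assert tracker.try_lend("ebook-123")      # 3rd lend: OK
assert not tracker.try_lend("ebook-123")  # 4th lend: refused
```

A narrow claim on a specific mechanism like this (plus the actual encryption and tracking details) seems reasonable to me; a claim covering "lending digital objects" in general does not.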
Definitely an interesting conversation; I think we agree on the salient points.
The patent office needs to adopt a simple fact: doing something digitally that has been done physically before (like lending purchased objects just like a used book and music store, or having a digital "shopping cart" like, you know, a shopping cart) is "obvious". Someone will eventually get around to implementing it, so it is not novel and should not be patentable. At best maybe the site should get design patent coverage, or some very specific encryption algorithms should be protected in some way if in fact they are proprietary, but the idea of patenting an entire store concept should be ridiculous.
It's been a long time since I've read Kuhn (it was required in college along with a counter position by someone whose name escapes me), and I remember not fully agreeing with every nuance that he wrote. But I completely agree with parent that Biology has gone through paradigm shifts, and I'd say it's clear that other things have as well. We live in a world where cloning is a reality -- this is a paradigm shift from when it was science fiction. Microbiology changed the way everyone interacts with the world right down to the soap we buy.
Obviously economists are having some identity crises lately that I suspect will be looked back upon as something of a paradigm shift. Chemistry has moved quite far from alchemy, and nuclear chemistry was unthinkable, let alone understood, not long ago. Manufacturing went from fully manual through the assembly line and is now in a phase of robotics and desktop printing. Spectrum disorders are fairly recent diagnoses (as a specific category, at least) in the psychological sciences. Triceratops were recently reshuffled, and we're far more likely to imagine dinosaurs with feathers now than when I was a kid. Don't get me started on the Brontosaurus. Warfare is changing; in some cases it's barely important for a person with a weapon to see their opponent directly. Water on other planets is far more accepted, and direct evidence is changing common thinking on the topic. We have a near-permanent presence in space. Gender views are changing... gay rights are probably at a social tipping point, at least in some places, and every social science is strongly affected.
All of these things are dramatic changes. Are they paradigm shifts? Some of them... one of the problems with paradigm shifts is that they're nearly impossible to see from the inside. And yes, they may appear more subtle than "the earth is round - no it's not" debates, but remember that even those took quite a long time for large populations to accept. It's difficult to predict how any new way of looking at something will affect the future, but some of these changes will eventually be looked back upon as paradigm shifts.
Ah, good, I'm glad someone already mentioned this. Big did not deserve to be italicized there, not only because 260 million minutes of video isn't "that much" (!) in terms of internet streaming, but also because the statistics aren't really based on the number of minutes of video analyzed... the main statistics are about viewership and certain events (video startup, video freezing), which could be surrounded by hours of uninteresting video time that didn't really contribute to some of the metrics.
Netflix has, what, 20+ million individual viewers per month? 10 hours apiece isn't hard to imagine. As the parent pointed out, YouTube is much larger than that.
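Quick back-of-envelope to put the number in context (my Netflix figures above are rough guesses, not measured data):

```python
# Back-of-envelope only: both inputs are my own rough assumptions.
netflix_viewers = 20_000_000   # ~20M monthly viewers (guess)
hours_per_viewer = 10          # ~10 hours apiece per month (guess)
netflix_minutes = netflix_viewers * hours_per_viewer * 60

sample_minutes = 260_000_000   # the article's "big" number

print(f"Netflix alone: ~{netflix_minutes / 1e9:.0f} billion minutes/month")
print(f"The sample is {sample_minutes / netflix_minutes:.1%} of that")
# Netflix alone: ~12 billion minutes/month
# The sample is 2.2% of that
```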
It's still very interesting analytics. It's not always the size that matters with "big" data. But let's not get carried away with the italics now, people... that way madness lies.
To my dismay, I worked on this project. The project started with controversy -- the Oracle bid that beat out SAP some seven years ago was surrounded by complaints. The article skips some details. CSC (Computer Sciences Corp, which is quoted) was the main driver of about $800 million of that spending. It is accurate to say that this change didn't affect them, but that's because hundreds of people had already been laid off or moved off the project between last September and last March.
There's enough blame to go around. Years were spent on requirements that never turned into code; time was spent passing blame back and forth across development teams so large and segregated that they rarely communicated properly -- within the Air Force, within CSC, and between the various teams. At its peak I believe the project had roughly 800 people on it. I don't know what the maximum size of a development project should be, but it's got to be smaller than that. That number includes everyone -- trainers, managers, and some key initial users and testers -- but it's still a very high number.
The Air Force tried several times to realign the project, but there were contractual disputes and then, once those were over, difficulty deciding what to keep and what to scrap. That led to a death spiral where everything went back onto the drawing board, and I think ultimately leadership just lost hope.
It wasn't a complete loss, though. A few small teams, including the one I was previously on, have survived. We built a robust data quality system and are working on some enterprise data dictionary and master data tools, which will help the systems that are left behind. With hundreds of systems supporting half a million users, $1 billion probably isn't off the chart -- at least it wouldn't have been if the project had succeeded. The worst part is that there's still much work to be done, and now someone will have to start over... again.
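For flavor, the data-quality work is conceptually this kind of thing -- a toy sketch with invented field names and rules, nothing like the real rule set:

```python
# Toy illustration of a data-quality rule check -- invented fields and
# rules, not the actual system described above.

RULES = {
    "unit_id": lambda v: v is not None and len(str(v)) == 6,
    "status":  lambda v: v in {"ACTIVE", "INACTIVE", "RETIRED"},
}

def validate(record):
    """Return the fields in `record` that fail their rule."""
    return [field for field, rule in RULES.items()
            if not rule(record.get(field))]

print(validate({"unit_id": "12345", "status": "ACTIVE"}))
# ['unit_id'] -- five digits instead of six
```

Multiply that by hundreds of systems and you get an idea of the scale involved.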
I was a fan of the InfoMagic packages, which IIRC included Slackware's entire distro and a huge number of other files. Does anyone else remember Elfos fondly? I have a picture of the CD around here somewhere... I even had a T-shirt.
I was Slackware -> RedHat -> Ubuntu for my primary, but I tried CentOS, Fedora, and Knoppix in there for non-trivial amounts of time. Does Android count?
Also, I played with FreeBSD but that doesn't count for multiple reasons.
I've been working on the Heritage Health Prize, which Kaggle has been running, for over a year now. It's a fantastic way to learn data science and tackle real-world problems with real data and a co-opetitive spirit. The forums and winning solutions are great for learning the art, and if you've never used R, it's a great opportunity to learn it and talk to people who have a ton of experience in the area.