I said it before and I'll say it again. Taking something that exists and making it digital (or on the internet or on your phone or any "just-add-technology" change) doesn't make it new. Prior art should still apply...I refer you to my comment on Tuesday
I completely agree, but I think that was my point (and I reply largely because I took umbrage with the "Um, no" while really I think we agree)... design patents are very narrow in scope intentionally, and I'm arguing that software patents should be similarly narrow to specifically avoid the problems with the current patent system about which we're both talking.
I used the shopping cart example because of recent news that such a patent has been awarded, and upon reading the patent it does not appear to describe a specific implementation at all; rather, it describes a very generic implementation that closely parallels an existing physical concept (a shopping cart). It's not that specific implementations shouldn't be patentable (although I might argue that copyright, rather than patents, should apply in most cases where the differences are aesthetic rather than functional -- hence my comparison to design patents); I'm just trying to find a reasonable, testable limit to what should qualify.
As you say, patenting the entire store is not what patents are about. I think patenting shopping carts is similar (unless they have a "novel" feature). Things that work well in the physical world are quickly gaining internet-based analogues.
You're also right about inevitability, of course, but I think there's a difference between inevitable and obvious or novel. Lending digital objects (case in point) is an interesting example. A narrow patent on a particular combination of encryption, centrally controlled tracking, and methods of limiting how many times something can be shared probably could be patentable, and makes sense. But in this case the patent has grown to include practically any conceivable implementation, which seems wrong. The limiting factor I suggested was whether there is a physical analog or not. There IS a physical analog for lending, so no matter how long it takes for the first person to build and market code for it, lending should, IMHO, remain unpatentable. I'm not aware of a physical analog for lending something only a specific number of times, so I'm OK with that (although I'd be glad to be corrected).
Definitely an interesting conversation; I think we agree on the salient points.
The patent office needs to adopt a simple fact: doing something digitally that has been done physically before (like lending purchased objects just as a used book and music store does, or having a digital "shopping cart" like, you know, a shopping cart) is "obvious". Someone will eventually get around to implementing it, so it is not novel and should not be patentable. At best maybe the site should get design patent coverage, or some very specific encryption algorithms should be protected in some way if they are in fact proprietary, but the idea of patenting an entire store concept should be seen as ridiculous.
It's been a long time since I've read Kuhn (it was required in college along with a counter position by someone whose name escapes me), and I remember not fully agreeing with every nuance that he wrote. But I completely agree with parent that Biology has gone through paradigm shifts, and I'd say it's clear that other things have as well. We live in a world where cloning is a reality -- this is a paradigm shift from when it was science fiction. Microbiology changed the way everyone interacts with the world right down to the soap we buy.
Obviously economists are having some identity crises lately that I suspect will be looked back upon as something of a paradigm shift. Chemistry has moved quite far from alchemy, and nuclear chemistry was unthinkable, let alone understandable, not long ago. Manufacturing went from fully manual through the assembly line and is now in a phase of robotics and desktop printing. Spectrum disorders are fairly recent diagnoses (as a specific category, at least) in the psychological sciences. Triceratops was recently reshuffled, and we're far more likely to imagine dinosaurs with feathers now than when I was a kid. Don't get me started on the Brontosaurus. Warfare is changing; in some cases it's barely important for a person with a weapon to see their opponent directly. The concept of water on other planets is far more accepted, and direct evidence is changing common thinking on the topic. We have a near-permanent presence in space. Gender views are changing... gay rights are probably at a social tipping point, at least in some places, and every social science is strongly affected.
All of these things are dramatic changes. Are they paradigm shifts? Some of them... one of the problems with paradigm shifts is that they're nearly impossible to see from the inside. And yes, they may appear more subtle than "the earth is round -- no it's not" debates, but remember that those took quite a long time for large populations to accept. It's difficult to predict how any new way of looking at something will affect the future, but some of these changes will eventually be looked back upon as paradigm shifts.
Ah, good, I'm glad someone already mentioned this. "Big" did not deserve to be italicized there, not only because 260 million minutes of video isn't "that much" (!) in terms of internet streaming viewership, but also because the statistics aren't really based on the number of minutes of video analyzed... the main statistics are more about viewership and certain events (video startup, video freezing), which could be surrounded by hours of uninteresting video time that didn't really contribute to some of the metrics.
Netflix has, what, 20+ million individual viewers per month? 10 hours apiece isn't hard to imagine. As the parent pointed out, YouTube is much larger than that.
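Just to sanity-check that claim (using the rough figures above, which are my guesses, not official Netflix stats), a quick back-of-envelope:

```python
# Assumed round numbers: ~20 million viewers at ~10 hours apiece per month.
viewers = 20_000_000
hours_each = 10
netflix_minutes = viewers * hours_each * 60   # minutes streamed per month

analyzed_minutes = 260_000_000                # the italicized "big" figure

print(netflix_minutes)                        # 12,000,000,000 minutes
print(netflix_minutes / analyzed_minutes)     # ~46x the analyzed sample
```

So even by this crude estimate, one service alone streams dozens of times more minutes per month than the entire analyzed sample.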
It's still very interesting analytics. It's not always the size that matters with "big" data. But let's not get carried away with the italics now, people... that way madness lies.
To my dismay, I worked on this project. The project started with controversy -- the Oracle bid that beat out SAP like seven years ago was surrounded by complaints. The article skips some details. CSC (Computer Sciences Corp, who is quoted) was the main driver of about $800 million of that spending. It is accurate to say that this change didn't affect them, but that's because hundreds of people had already been laid off or moved off the project between last September and last March.
There's enough blame to go around. Years were spent on requirements that were never turned into code; time was spent passing blame back and forth across development teams so large and segregated that they rarely communicated properly -- within the Air Force, within CSC, and between the other teams. At its peak I believe the project had roughly 800 people on it. I don't know what the maximum size of a development project should be, but it's got to be smaller than that. That number includes everyone (trainers, managers, and some key initial users and testers), but it's still a very high number.
The Air Force tried several times to realign the project, but there were contractual disputes and, once those were over, difficulty deciding what to keep and what to scrap, which led to a death spiral where everything went back on the drawing board. I think ultimately leadership just lost hope.
It wasn't a complete loss, though. A few small teams, including the one I was previously on, have survived. We built a robust data quality system and are working on some enterprise data dictionary and master data tools, which will help the systems that are left behind. With hundreds of systems supporting a half million users, $1 billion probably isn't off the chart -- at least it wouldn't have been had this been a successful project. The worst part is that there's still much work to be done, and now someone will have to start over... again.
I was a fan of the InfoMagic packages, which IIRC included Slackware's entire distro and a huge number of other files. Does anyone else remember Elfos fondly? I have a picture of the CD around here somewhere... I even had a T-shirt.
I was Slackware -> RedHat -> Ubuntu for my primary, but I tried CentOS, Fedora, and Knoppix in there for non-trivial amounts of time. Does Android count?
Also, I played with FreeBSD but that doesn't count for multiple reasons.
I've been working on the Heritage Health Prize that Kaggle is running for over a year now. It's a fantastic way to learn data science and tackle real world problems with real data and a co-op-etitive spirit. The forums and winning solutions are great for learning the art, and if you've never used R, it's a great opportunity to learn it and talk to people that have a ton of experience in the area.
Someone with more recent experience should answer this since, as I said somewhere up above, I haven't used one of these in a while. That said, several of the USB Microscopes I've used work as fairly standards-compliant webcams, particularly (IIRC) with GSPCA and V4L support.
What you won't get on linux is support for computer controlled focus, enabling and disabling lights or any other features that the microscopes have, unless they have explicit linux tools bundled, which some may have, but I have not worked with any so I can't say specifically.
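For what it's worth, here's a minimal sketch of checking whether such a microscope has been picked up as a standard V4L2 webcam. This only enumerates device nodes; actual capture would need a client like guvcview, ffmpeg, or OpenCV:

```python
# Minimal sketch: a GSPCA/UVC-compliant USB microscope shows up as a
# V4L2 device node (/dev/video0, /dev/video1, ...) once the kernel
# recognizes it. This just lists whatever nodes exist.
import glob

devices = sorted(glob.glob("/dev/video*"))
if devices:
    print("V4L2 video devices found:", ", ".join(devices))
else:
    print("No V4L2 video devices present")
```

If the microscope appears in that list after you plug it in, any webcam-capable application should be able to read from it; the extra features (focus motors, lights) are a separate story, as noted above.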
Hear hear. I had one of the old Intel QX microscopes and loved it. Any USB microscope will let a child see things on a nice big screen and many of them, like the QX3, are ruggedized for kids. It's pricey, though, compared to other desktop USB microscopes of which there are many, as other posters are mentioning (search Amazon), but the design should make it worthwhile. I haven't used the software in a while; it was a bit buggy last time I played with it, but hopefully it's improved.
If you really want eyepieces, there are some that do double duty -- Celestron makes entry-level microscopes that have replaceable eyepieces (so you can use it like a traditional microscope or mount a camera on it, but not both at the same time), or ones with LCD screens directly built in instead of traditional viewfinders.
As you near the $250 barrier you can get scopes that start to do double-duty, and your options increase. These are more lab-ready and may need more oversight for a 7-year-old, and frankly I find them harder to use for kids, but the ability to look through an eyepiece to set up and focus a shot and then discover the results on a computer make for good two-person work in parent-child fashion.
Out of points or I'd mod you up for mentioning Future Crew. I still have a 3.5" floating around with some great old demos from that era.
I'm not surprised that people haven't heard about the demoscene any more, but it's easy to describe: fit the most impressive graphics and sound you can into a very small memory footprint -- 64 KB is the common limit now, apparently, but I seem to recall some good "anything you can fit on a disk" rules and some impressive 4k demos.
Other than raw coolness, the point is to "do more with less" -- push creativity, efficiency, and algorithm design by artificially limiting resources. It's very impressive stuff, and having been out of touch for a long time, I'm AMAZED at the quality these vids are putting out. Has anyone been able to run the
I love science. From the article:
"a projection from 1981 for rising temperatures [...] has been found to agree well with the observations since then, underestimating the observed trend by about 30%."
Most predictions that are off by 30% aren't really considered "remarkably accurate".
I'm uncertain about this one (i.e., I'm not trying to troll). It's nice that the prediction does seem to chart well, and since the weather can't be accurately predicted 7 days in advance, any climate model that outperforms naive ones over 30 years is an achievement. Still, the point of the (recent) author is clear -- they're trying to emphasize that a warming trend is occurring and that people predicted it 30 years ago using "science" that is still sound -- but at least some discussion of why a 30% error rate is acceptable should take place.
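To put the 30% figure in concrete terms -- the observed-trend value below is an assumed round number for illustration, not taken from the article:

```python
# Illustrative only: 0.17 °C/decade is an assumed observed warming
# trend; the article says the 1981 projection undershot it by ~30%.
observed_trend = 0.17              # °C per decade (assumed)
underestimate = 0.30               # "underestimating ... by about 30%"

projected_trend = observed_trend * (1 - underestimate)
print(round(projected_trend, 3))   # 0.119 °C/decade

# Over 30 years the shortfall compounds into an absolute gap:
gap_over_30_years = (observed_trend - projected_trend) * 3
print(round(gap_over_30_years, 3))  # 0.153 °C
```

Whether a tenth-and-a-half of a degree over 30 years counts as "remarkably accurate" is exactly the kind of question the article should address.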
Well, you certainly won't find a shortage of opinions on Slashdot.
If you think the software is good enough, then a non-commercial version with limited registration information (e-mail, name), and some very privacy-thoughtful reporting (maybe to ensure that the registered serial numbers are only being used by one machine at a time), should only be a good thing. Getting your software into the hands of the people that might buy it will get them used to it, relying on it, and eventually make them customers. But (as others here have posted), don't abuse the "spying"... if you start to make money by pilfering the free registrations for ancillary information you're just going to annoy your users and they'll be more apt to pirate the software or use fake registration information. Giving them something in return, like forum access for very limited support, is helpful.
Other possible models include giving the software away for free and charging for support -- nearly all profitable Open Source companies do this, and even if you leave the source closed the business model isn't terribly different. You could publish a "crippleware" version, which I find rather annoying unless the limits are such that home and non-commercial users' needs are really satisfied, and the only people who need to pay $10k for the software are those to whom it's worth it. I'll give a nice shout-out to Andrea Mosaic for doing this correctly (at a lower price point).
Lastly an option you may have missed may be to ignore it because it isn't a problem. A pirated version by a customer that wouldn't have paid anyway probably doesn't hurt you. A pirated version by a customer that would have paid may actually turn into a sale if they need assistance. When you upgrade, if the pirates liked it, they'll want the next version, so they may buy. It may be pirated by employees or students who years later may remember it and decide to buy it. You never can tell.
In those cases, you're getting your software out there and used; you could take an "all exposure is good exposure" attitude. The fact that you didn't list the name of your software in the original post here means that you may not think that way, or you may outright disagree.
Still, piracy is going to happen. At least you're asking the right questions. Don't let yourself get dragged into a fight with the anonymous masses on the internet, though -- you'll probably lose.
Robert Jordan's books redefined the level of crazy that I will accept from an author. It's fantastic writing, with a wonderful, deep, involved storyline, but come ON, the length is way too self-indulgent and unnecessary. The story is nowhere near as complicated (or worthy) as, say, FOUR Lord of the Rings trilogies, but it's substantially longer. The sad part is that it is comparably well written -- length notwithstanding.
I'm currently using four of the books as monitor stands (I actually won't go so far as to use them as doorstops).
More importantly, though, this has changed the way I'll read connected books or watch TV shows. I fear the abandoned story line too much now, and I blame Robert Jordan. "Heroes", the TV show, was a similar letdown... I waited until "Lost" was finished, for fear of it falling into the same pit as "Heroes", and nearly did the same thing with "Battlestar Galactica".
Is there a name for this? Can we call it the "Robert Jordan" effect? -- the situation where you get too involved with an author or storyline and they just go on forever or (no disrespect) die?
And the expanding-storyline theme is amazing. Eight Harry Potter movies? Really? Five Twilight movies? I love a good trilogy, and (the quality of the prequels aside) I appreciate that the Star Wars trilogies are built so that you can watch the original without needing the rest to complete the story. Many authors have interwoven stories and worlds... How many books did Terry Pratchett write? Many of them reference one another, but at least each has an individual story arc. The Ender's Game series is similar... Terry Brooks' series can be read in myriad configurations of trilogies and tetralogies.
ugh... the Jordan series is fantastic in many ways and I'm very glad to see it completed -- I hope the finale lives up to the series -- but please, no one ever do this again, or at least give good warning so that we can avoid going down the path until it's complete.
If all the world's economists were laid end to end, we wouldn't reach a conclusion. -- William Baumol