
Comment Playing along with the ridiculousness... (Score 1) 629

I agree with most posters that the logic in the post is hugely flawed (predicting something about the future by arguing that we can't know enough to predict it is inane). But more constructively: our constant access to information hasn't sated our desire for more of it. Information collection is driving the recent knowledge boom as much as, if not more than, ease of access. Besides, no matter how much time passes, if we haven't visited another world we won't have information about that world at our disposal. You have to collect information before you can use it... that's WHY further exploration will always be a goal (unless, you know, we obliterate ourselves somehow in the meantime, or find something more important in any given short term).

Privacy

Submission + - Giving away personal info byte by byte for "security"? 1

ZahrGnosis writes: "Through work, and bad planning, I've been subscribed to a lot of trade magazines. I don't generally hate them; most come electronically now, and some are actually useful. What I despise is the annual calls to renew my "free" subscription and update my info. OK, giving away my work e-mail, title, and line-of-business info isn't that bad; it's practically good marketing. But at the end of each call the phone people say "in order to verify that we talked to you, can you tell us..." followed by something tiny and seemingly innocuous: the last digit of the month you were born in, or the first letter of the city you were born in. Today it was the country you were born in. I've started lying, of course, because no one needs to aggregate that sort of personal information byte by byte, but now I'm convinced that's exactly what they're doing. Does anyone have any details on this? Can someone confirm or deny that these seemingly innocuous "security" questions are being aggregated? Yes, I tried some Google searches, but my fu seems weak in this area... or no one is talking about it yet. Bueller?"
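A rough back-of-the-envelope sketch of why those "tiny" answers matter: each one leaks a few bits of identifying information, and you only need about 28 bits to single out one person in 300 million. The per-answer bit estimates below are my own illustrative guesses, not measured data.

```python
import math

# Illustrative entropy estimates (my guesses, not measured data) for each
# "harmless" verification answer, in bits of identifying information:
leaks = {
    "last digit of birth month":   3.2,   # 12 months collapse to ~10 digit classes
    "first letter of birth city":  3.5,   # letter frequencies are far from uniform
    "country of birth":            2.0,   # most subscribers come from a few countries
    "employer (already on file)":  12.0,  # narrows things down a lot
    "job title (already on file)": 6.0,
}

population = 300_000_000                 # roughly the US population
bits_needed = math.log2(population)      # ~28.2 bits to single someone out

total = 0.0
for question, bits in leaks.items():
    total += bits
    print(f"{question:30s} +{bits:4.1f} bits (cumulative {total:4.1f})")

print(f"\nBits to uniquely identify one of {population:,}: {bits_needed:.1f}")
print("A handful of 'tiny' answers per year gets there surprisingly fast.")
```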

Comment Re:Obvious, Novel, and Prior Art aren't just digit (Score 1) 240

I completely agree, but I think that was my point (and I reply largely because I took umbrage at the "Um, no" when really I think we agree)... design patents are intentionally very narrow in scope, and I'm arguing that software patents should be similarly narrow, specifically to avoid the problems with the current patent system that we're both describing.

I used the shopping cart example because of recent news that such a patent had been awarded, and upon reading the patent it does not appear to cover a specific implementation at all, but rather a very generic one that closely parallels an existing physical concept (a shopping cart). It's not that specific implementations shouldn't be patentable (although I might argue that copyright should apply instead of patents in most cases, when the differences are more aesthetic than functional -- hence my comparison to design patents); I'm just trying to find a reasonable, testable limit on what should qualify.

As you say, patenting the entire store is not what patents are about. I think patenting shopping carts is similar (unless they have a "novel" feature). Things that work well in the physical world are quickly gaining internet-based analogues.

You're also right about inevitability, of course, but I think there's a difference between inevitable and obvious or novel. Lending digital objects (the case in point) is an interesting example. A narrow patent on a particular combination of encryption, centrally controlled tracking, and methods of limiting how many times something can be shared probably could be patentable, and makes sense. But in this case, the patent has grown to include practically any conceivable implementation, which seems wrong. The limiting factor I suggested was whether there is a physical analog or not. There IS a physical analog for lending, so no matter how long it takes for the first person to build and market code for it, lending itself should, IMHO, remain unpatentable. I'm not aware of a physical analog for lending something only a specific number of times, so I'm OK with that being patentable (although I'd be glad to be corrected).
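To make the distinction concrete, here's a toy sketch of what a narrow "lend at most N times, centrally tracked" implementation might look like. Everything here is invented for illustration; it describes no actual patent or product.

```python
import secrets

class LendingServer:
    """Toy central tracker: each item may be lent at most max_lends times,
    and only one copy may be out on loan at any moment."""

    def __init__(self, max_lends=3):
        self.max_lends = max_lends
        self.lend_counts = {}   # item_id -> times lent so far
        self.active = {}        # item_id -> one-time loan token currently out

    def lend(self, item_id):
        if item_id in self.active:
            raise RuntimeError("item is already out on loan")
        if self.lend_counts.get(item_id, 0) >= self.max_lends:
            raise RuntimeError("lend limit reached for this item")
        token = secrets.token_hex(16)    # borrower needs this to unlock content
        self.active[item_id] = token
        self.lend_counts[item_id] = self.lend_counts.get(item_id, 0) + 1
        return token

    def return_item(self, item_id, token):
        if self.active.get(item_id) != token:
            raise RuntimeError("bad return")
        del self.active[item_id]         # token revoked; content re-locks

server = LendingServer(max_lends=3)
t = server.lend("ebook-42")
server.return_item("ebook-42", t)
```

A patent on a specific scheme like this (particular token format, particular tracking method) would be narrow; the problem is patents that sweep in every conceivable variant of the idea.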

Definitely an interesting conversation; I think we agree on the salient points.

Comment Obvious, Novel, and Prior Art aren't just digital (Score 3, Insightful) 240

The patent office needs to accept a simple fact: doing something digitally that has been done physically before (like lending purchased objects, just as a used book and music store does, or offering a digital "shopping cart" like, you know, a shopping cart) is "obvious". Someone will eventually get around to implementing it, so it is not novel and should not be patentable. At best, maybe the site should get design patent coverage, or some very specific encryption algorithms should be protected in some way if in fact they are proprietary, but the idea of patenting an entire store concept is ridiculous.

Comment Re:Biology, Economics, Chemistry, Politics... (Score 1) 265

It's been a long time since I've read Kuhn (he was required reading in college, along with a counterpoint by someone whose name escapes me), and I remember not fully agreeing with every nuance he wrote. But I completely agree with the parent that biology has gone through paradigm shifts, and I'd say it's clear that other fields have as well. We live in a world where cloning is a reality -- that's a paradigm shift from when it was science fiction. Microbiology changed the way everyone interacts with the world, right down to the soap we buy.

Obviously economists are having an identity crisis lately that I suspect will be looked back upon as something of a paradigm shift. Chemistry has moved quite far from alchemy, and nuclear chemistry was unthinkable, or at least not understood, not long ago. Manufacturing went from fully manual through the assembly line and is now in a phase of robotics and desktop printing. Spectrum disorders are fairly recent diagnoses (as a specific category, at least) in the psychological sciences. Triceratops was recently reshuffled, and we're far more likely to imagine dinosaurs with feathers now than when I was a kid. Don't get me started on the Brontosaurus. Warfare is changing; in some cases it's barely important for a person with a weapon to see the opponent directly. The existence of water on other planets is far more accepted, and direct evidence is changing common thinking on the topic. We have a near-permanent presence in space. Gender views are changing... gay rights are probably at a social tipping point, at least in some places, and every social science is strongly affected.

All of these are dramatic changes. Are they paradigm shifts? Some of them... one of the problems with paradigm shifts is that they're nearly impossible to see from the inside. And yes, they may appear more subtle than "the earth is round -- no it's not" debates, but remember that those took quite a long time for large populations themselves to accept. It's difficult to predict how any new way of looking at something will affect the future, but some of these changes will eventually be looked back upon as paradigm shifts.

Comment Re:BIG data? (Score 3, Insightful) 155

Ah, good, I'm glad someone already mentioned this. "Big" did not deserve to be italicized there, not only because 260 million minutes of video isn't "that much" (!) in terms of internet streaming, but because the statistics aren't really based on the number of minutes of video analyzed... the main statistics are about viewership and certain events (video startup, video freezing), which could be surrounded by hours of uninteresting video time that didn't contribute to some of the metrics.

Netflix has, what, 20+ million individual viewers per month? Ten hours apiece isn't hard to imagine. As the parent pointed out, YouTube is much larger than that.
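A quick sanity check on the arithmetic (the viewer and hours figures are my rough guesses from above, not published numbers):

```python
# Why 260 million minutes isn't "BIG" for streaming video:
viewers = 20_000_000        # rough guess at monthly Netflix viewers
hours_each = 10             # "ten hours apiece isn't hard to imagine"
minutes_watched = viewers * hours_each * 60

analyzed = 260_000_000      # minutes of video in the study
print(f"Monthly viewing:  {minutes_watched:,} minutes")   # 12,000,000,000
print(f"Study analyzed:   {analyzed:,} minutes")
print(f"The study covers ~{analyzed / minutes_watched:.1%} of one month's viewing")
```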

It's still very interesting analytics. It's not always the size that matters with "big" data. But let's not get carried away with the italics, people... that way madness lies.

Comment Re:Ouch. (Score 5, Informative) 362

To my dismay, I worked on this project. It started with controversy -- the Oracle bid that beat out SAP some seven years ago was surrounded by complaints. The article skips some details. CSC (Computer Sciences Corp, which is quoted) was the main driver of about $800 million of that spending. It is accurate to say that this change didn't affect them, but that's because hundreds of people had already been laid off or moved off the project between last September and last March.

There's enough blame to go around. Years were spent on requirements that were never turned into code; time was spent passing blame back and forth across development teams that were so large and segregated that they rarely communicated properly, both within the Air Force and within CSC and between the other teams. At its peak I believe the project had roughly 800 people on it. I don't know what the maximum size of a development project should be, but it's got to be smaller than that. That number includes everyone -- trainers, managers, and some key initial users and testers -- but it's still a very high number.

The Air Force tried several times to realign the project, but there were contractual disputes and, once those were over, difficulty deciding what to keep and what to scrap, which led to a death spiral where everything went back on the drawing board; I think ultimately leadership just lost hope.

It wasn't a complete loss, though. A few small teams, including the one I was previously on, have survived. We built a robust data quality system and are working on some enterprise data dictionary and master data tools, which will help the systems that are left behind. With hundreds of systems supporting half a million users, $1 billion probably isn't off the chart -- at least it wouldn't have been had the project succeeded. The worst part is that there's still much work to be done, and now someone will have to start over... again.

Comment Re:Slackware on floppies (Score 1) 867

I was a fan of the InfoMagic packages, which IIRC included Slackware's entire distro and a huge number of other files. Does anyone else remember Elfos fondly? I have a picture of the CD around here somewhere... I even had a T-shirt.

I was Slackware -> RedHat -> Ubuntu for my primary, but I tried CentOS, Fedora, and Knoppix in there for non-trivial amounts of time. Does Android count? ;-)

Also, I played with FreeBSD but that doesn't count for multiple reasons.

Comment I freakin' love Kaggle (Score 3, Interesting) 19

I've been working for over a year now on the Heritage Health Prize that Kaggle is running. It's a fantastic way to learn data science and tackle real-world problems with real data and a co-op-etitive spirit. The forums and winning solutions are great for learning the art, and if you've never used R, it's a great opportunity to learn it and talk to people who have a ton of experience in the area.

Comment Re:Important question (Score 1) 118

Someone with more recent experience should answer this since, as I said somewhere up above, I haven't used one of these in a while. That said, several of the USB microscopes I've used work as fairly standards-compliant webcams, particularly (IIRC) those with GSPCA and V4L support.
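For the curious, here's roughly what that looks like in practice -- a minimal capture sketch assuming the scope enumerates as a standard V4L2/UVC webcam and that you have OpenCV installed (pip install opencv-python); the device index is a guess, so check /dev/video* on your box:

```python
import cv2

cap = cv2.VideoCapture(0)        # the microscope shows up like any webcam
if not cap.isOpened():
    raise SystemExit("no V4L2 device found at index 0")

ok, frame = cap.read()           # grab one frame, exactly as with a webcam
if ok:
    cv2.imwrite("microscope.png", frame)
    print("saved microscope.png", frame.shape)
cap.release()
```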

What you won't get on Linux is support for computer-controlled focus, enabling and disabling lights, or any other extra features the microscopes have -- unless they come with explicit Linux tools bundled, which some may; I haven't worked with any, so I can't say specifically.

Comment Re:Intel QX3 (Score 2) 118

Hear, hear. I had one of the old Intel QX microscopes and loved it. Any USB microscope will let a child see things on a nice big screen, and many of them, like the QX3, are ruggedized for kids. It's pricey, though, compared to other desktop USB microscopes, of which there are many, as other posters have mentioned (search Amazon), but the design should make it worthwhile. I haven't used the software in a while; it was a bit buggy the last time I played with it, but hopefully it's improved.

If you really want eyepieces, there are some that do double duty -- Celestron makes entry-level microscopes with replaceable eyepieces (so you can use one like a traditional microscope or mount a camera on it, but not both at the same time), as well as ones with LCD screens built in instead of traditional viewfinders.

As you near the $250 barrier you can get scopes that do double duty, and your options increase. These are more lab-ready and may need more oversight for a 7-year-old -- frankly, I find them harder for kids to use -- but the ability to look through an eyepiece to set up and focus a shot, then see the results on a computer, makes for good two-person work in parent-child fashion.

Comment Re:The reason you haven't heard about it (Score 3, Interesting) 207

I'm out of mod points or I'd mod you up for mentioning Future Crew. I still have a 3.5" floppy floating around with some great old demos from that era.

I'm not surprised people haven't heard of the demoscene any more, but it's easy to describe: fit the most impressive graphics and sound you can into a very small memory footprint -- 64KB is apparently the common limit now, but I seem to recall some good "anything you can fit on a disk" rules and some impressive 4K demos.

Other than raw coolness, the point is to "do more with less" -- to push creativity, efficiency, and algorithm design by artificially limiting resources. It's very impressive stuff, and having been out of touch for a long time, I'm AMAZED at the quality these vids are putting out. Has anyone been able to run the .exes to verify? My rig keeps failing to run them -- I imagine they require specific hardware and software versions.
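To give a flavor of the "do more with less" idea, here's a toy plasma effect generated purely from math, with no stored image data -- the same basic trick (procedural generation) that lets real demos fit in 4K or 64KB. This is a throwaway Python sketch, nothing like production demo code:

```python
import math

# Classic plasma: sum a few sine fields and map the result to ASCII shades.
W, H = 60, 24
for y in range(H):
    row = ""
    for x in range(W):
        v = (math.sin(x / 6.0)
             + math.sin(y / 4.0)
             + math.sin((x + y) / 8.0)
             + math.sin(math.hypot(x - W / 2, y - H / 2) / 5.0))
        row += " .:-=+*#%@"[int((v + 4) / 8 * 9)]   # map [-4,4] onto 10 shades
    print(row)
```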

Comment While 70% isn't a failing grade exactly... (Score 1) 371

I love science. From the article:

"a projection from 1981 for rising temperatures [...] has been found to agree well with the observations since then, underestimating the observed trend by about 30%."

Most predictions that are off by 30% aren't really considered "remarkably accurate".

I'm genuinely uncertain about this one (i.e., I'm not trying to troll). It's nice that the prediction does chart well, and since the weather can't be accurately predicted even 7 days in advance, any climate model that outperforms naive ones over 30 years is an achievement. Still, the point of the (recent) author is clear -- they're trying to emphasize that a warming trend is occurring and that people predicted it 30 years ago using "science" that is still sound -- but at least some discussion of why a 30% error rate is acceptable should take place.
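For concreteness, here's what a 30% underestimate works out to, with made-up numbers (these are not the actual figures from the paper):

```python
# Hypothetical illustration of "underestimating the observed trend by ~30%":
observed = 0.17                 # made-up observed warming, degC per decade
predicted = observed * 0.70     # a projection 30% below the observed trend
print(f"observed:  {observed:.3f} degC/decade")
print(f"predicted: {predicted:.3f} degC/decade")
print(f"shortfall over 30 years: {(observed - predicted) * 3:.2f} degC")
```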

Comment You've come to the right place (Score 5, Insightful) 635

Well, you certainly won't find a shortage of opinions on Slashdot. :-)

If you think the software is good enough, then a non-commercial version with limited registration information (e-mail, name) and some very privacy-thoughtful reporting (maybe just enough to ensure that registered serial numbers are only being used on one machine at a time) should only be a good thing. Getting your software into the hands of the people who might buy it will get them used to it and relying on it, and eventually make them customers. But (as others here have posted), don't abuse the "spying"... if you start to make money by pilfering the free registrations for ancillary information, you're just going to annoy your users, and they'll be more apt to pirate the software or use fake registration information. Giving them something in return, like forum access for very limited support, helps.
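For what it's worth, that "one machine at a time" check can be done without collecting anything personal -- store only a hash of a machine fingerprint against each serial. A hypothetical sketch (all names and the protocol invented for illustration):

```python
import hashlib

class ActivationServer:
    """Toy check-in server: binds each serial to at most one machine,
    keeping only a hash of the machine fingerprint (no personal data)."""

    def __init__(self):
        self.bindings = {}   # serial -> hashed machine fingerprint

    def check_in(self, serial, machine_fingerprint):
        machine_hash = hashlib.sha256(machine_fingerprint.encode()).hexdigest()
        bound = self.bindings.get(serial)
        if bound is None or bound == machine_hash:
            self.bindings[serial] = machine_hash
            return True       # first use, or the same machine as before
        return False          # serial is already live on another machine

server = ActivationServer()
assert server.check_in("ABC-123", "host=alice-laptop;mac=aa:bb")
assert not server.check_in("ABC-123", "host=bob-desktop;mac=cc:dd")
```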

Other possible models include giving the software away for free and charging for support -- nearly all profitable open source companies do this, and even if you leave the source closed, the business model isn't terribly different. You could publish a "crippleware" version, which I find rather annoying unless the limits are such that home and non-commercial users' needs are really satisfied, and the only people who need to pay $10k for the software are those to whom it's worth it. I'll give a nice shout-out to Andrea Mosaic for doing this correctly (at a lower price point).

Lastly, an option you may have missed: ignore it, because it isn't a problem. A pirated copy in the hands of a customer who wouldn't have paid anyway probably doesn't hurt you. A pirated copy in the hands of a customer who would have paid may actually turn into a sale if they need assistance. When you upgrade, if the pirates liked it, they'll want the next version, so they may buy. It may be pirated by employees or students who, years later, remember it and decide to buy it. You never can tell.

In those cases, you're getting your software out there and used; you could take an "all exposure is good exposure" attitude. The fact that you didn't list the name of your software in the original post here suggests you may not think that way -- or you may outright disagree.

Still, piracy is going to happen. At least you're asking the right questions. Don't let yourself get dragged into a fight with the anonymous masses on the internet, though -- you'll probably lose.
