
Comment: Re:Automatically means no control (Score 1) 66

Not necessarily. Machine-learning algorithms like k-means clustering are designed for exactly this sort of problem. In principle, the algorithm can figure out that there are two different Shirley Temple images: Shirley Temple the person, and shirley temple the bright-red drink. Of course, depending on *how* you reduce the image data into something that can be analyzed, it could produce unexpected categories like "Shirley Temple, black & white" vs. "Shirley Temple, mostly red" vs. "everything else".

Comment: Re:Yeesh. How cheap do people expect things? (Score 1) 180

by hythlodayr (#44313509) Attached to: Why Microsoft Shouldn't Worry About Cannibalizing Their Userbases
Seeing how the competition is free--ChromeOS, Android, and Ubuntu--and consumer-level users aren't bound to Windows, Microsoft can't afford to charge for a consumer version of Windows. Not unless they're giving up the consumer space. Enterprise is a different story, where users are locked in for the foreseeable future.

Comment: Re:Hmm. (Score 1) 55

by hythlodayr (#42457995) Attached to: Cassandra NoSQL Database 1.2 Released
Not the AC, but you seem quite knowledgeable about the NoSQL side of things. Even if it's just for HBase or Cassandra, if you happen to have a write-up I would love to read more (I'm sure others would too). Coming from an RDBMS background, being able to tack on columns and use them to glean information is especially interesting to me, since schemas are a sacred cow. Warehouse products like Sybase IQ are column-oriented, but I don't think that's the same thing as what you're describing.

Comment: Re:NoSQL? Then what? (Score 2) 55

by hythlodayr (#42455923) Attached to: Cassandra NoSQL Database 1.2 Released
"NoSQL" is a highly-misleading name; the SQL language is really besides the point.

The important parts of NoSQL really boil down to:
1. Very high performance.
2. The ability to handle extremely large data sets (on the order of tens or hundreds of terabytes).
3. A natural way of dealing with non-flat, non-BLOB data.
4. Better integration with OO languages.

#1 and #2 both come with trade-offs, which is perfectly fine. Not all problems need ACID compliance.

#3 and #4 really go back to the 90s, though nothing ever stuck (e.g., object-relational databases).

Comment: Re:A couple things that kept me from upgrading... (Score 1) 791

by hythlodayr (#42448131) Attached to: Windows 8 Even Less Popular Than Vista
Windows already supports two ways of interacting with the computer: the original command-line shell (discounting PowerShell for discussion's sake) and the newer Explorer shell. And if memory serves, it wasn't until Windows 95/NT 4.0 that users could really do almost everything from the GUI shell, so I'm not surprised that we're seeing a similar evolution with Windows 8.

I also don't think it's a bad thing that the Windows 8 desktop is pushing for three input interfaces rather than two; in fact, I suspect most of the critics who've tried the Windows 8 desktop and disliked it have only tried it with a keyboard+mouse setup. I definitely fall into this category.

And this is where I think Microsoft made a misstep: a mouse and keyboard only ever cost $40-$80, even back in the 1990s. But a small touchscreen monitor costs $250 to $400 today, whereas a non-touch monitor of the same size costs $80 to $150. With dual screens being very popular, double that price. I'm not aware of any mainstream Windows app requiring touch yet, and most people don't make their living trying out new OS features, making it hard to justify splurging on a touchscreen monitor.

Comment: Re:I'll believe it when I see... (Score 1) 867

by hythlodayr (#41373747) Attached to: Warp Drive Might Be Less Impossible Than Previously Thought
That only applies when traveling faster than light within flat spacetime. A consequence of high-speed travel is time dilation, where the elapsed time differs noticeably between, say, the Earth and a traveler moving at 1/10th the speed of light.

While the math is beyond me, the Alcubierre drive apparently works around this mess (while introducing others) by contracting spacetime itself. The most notable mess is the requirement of negative-mass exotic matter, which we have no way of proving or disproving even exists. This is the same type of exotic matter that could also be used to open wormholes.

That said, I would happily fund this sort of research out of my own pocket for the rest of my life. Maybe it's because I've watched/read too much sci-fi, but I'd like to think humanity can accomplish far more than the Internet and combustion-based travel.

Comment: Re:Parents are already "designing" their kids (Score 1) 840

by hythlodayr (#41035951) Attached to: Genetically Engineering Babies a Moral Obligation, Says Ethicist
Of course, wait until a global famine hits and see how many six-foot individuals last.

In the big scheme of things, is being a CEO so important? What do you get out of it? Perhaps the chance to associate with (or spawn) individuals like the "Rich Kids of Instagram"?

Comment: A flaw today may be a requirement tomorrow (Score 1) 840

by hythlodayr (#41035675) Attached to: Genetically Engineering Babies a Moral Obligation, Says Ethicist
Psychopathy and other behavioral "problems" (binge eating also comes to mind) may have been survival traits during our hunter-gatherer times; these behaviors only become problems in the context of civilized life. Given that civilization goes back only 10k years of our 200k-year history--which apparently included a number of ice ages--and given how fragile our civilization is today, I'm not ready to agree that we have a moral obligation to start modifying and engineering behavior. Quite the contrary: I personally believe we have an obligation to let things run their course, even if it's cruel to individuals 99 times out of 100. Note that on a practical level, and as a parent, my feelings and beliefs are just the opposite!

+ - Patent problems plague pharmaceutical industry too->

Submitted by hythlodayr
hythlodayr (709935) writes "When we think of "patent trolls" we think of the tech industry. A similar problem plagues the pharmaceutical industry, as some generic manufacturers threaten to invalidate patents and distribute their own (generic) product.

Technically this is the reverse of patent trolling but the end outcome is the same: Legitimate companies forking over money to avoid litigation.

It also highlights the more general problem of creating and enforcing only valid patents."


Comment: Working as intended (Score 2) 347

By offering the fact that a certain patent becomes crucial before it expires as a reason for it to be shared, Google is basically shooting down its own argument. Barring patent trolling, this is exactly what the patent system was designed to do: grant a limited monopoly--a short-term disadvantage to everyone else, but a high-risk/potentially high-return investment--to spur constant innovation, which is a long-term benefit to society. Sure, the owner of the patent can choose to share it (for a fee of their choosing); but they can also use it as an exclusive seed to build a thriving business. Or do nothing at all.

There are many things wrong with the patent system (terms too long? too easy to write spurious patents? too hard/expensive to be a lone inventor?), but this isn't one of them, and I'm disappointed in Google for voicing a short-term view like this.

Comment: Re:Wanting to abandon C: It's not about control (Score 1) 793

by hythlodayr (#40526273) Attached to: What's To Love About C?
Sure there is: if a block of memory doesn't contain a null terminator--as expected by the library functions--then the item in question isn't a C string. If you're saying C strings don't exist because they're purely a standard-library concept, then we'll still have to disagree: the standard library defines the language as much as the syntax and base types do, and that's even more true for established languages.

Comment: Wanting to abandon C: It's not about control (Score 1) 793

by hythlodayr (#40523815) Attached to: What's To Love About C?
Don't get me wrong, I think C will have a very long and robust life: it's the one and only bootstrap language of choice offered by just about every OS and CPU maker. And I think that's too bad.

Many of the reasons for wanting to discard C have nothing to do with its low-level nature. Dependencies in C are a nightmare to trace and probably bloat compile times tremendously. Then there are deadly designs such as C strings, and even worse, the inconsistencies within the string functions (think null terminator).

A better-designed systems language is certainly possible (e.g., Google Go, which is dying a slow death), and it wouldn't be the worst thing to happen if one were to catch on.

"A car is just a big purse on wheels." -- Johanna Reynolds

Working...