
Comment: Re:Can we get a summary of that excerpt, please? (Score 1) 138

I weep for humanity...

"weird writing style" apparently means "written beyond a 6th grade reading level" (which, incidentally, is the level USA Today is written at, by and large -- a good reason to aspire to better news periodicals, even though editorial standards are slipping across the board).

For what it's worth, the writing style is not that uncommon, and some of the techniques he uses can be found in other great novels across multiple genres (e.g., detective novels). At least one review describes Starfish as a thriller.

I assure you the excerpt makes sense. If you have trouble understanding this, perhaps you should go read more, and in greater variety. They used to teach reading comprehension in schools, but I'm starting to think programs like No Child Left Behind may have de-emphasized that in exchange for teaching kids how to pass more and more standardized tests that focus on bare essentials. I guess you don't need to have excellent English skills to be a good consumer.

Comment: Re:Type 1 vs type 2 diabetes (Score 1) 92

And I would be one of those skinny type 2 diabetics. Rail thin, never got above 200 pounds (just under 91 kg) at a height of 6'2" (about 1.88 m), and am now comfortably around 160 pounds (about 73 kg). I'm currently taking metformin and insulin; I've tried Actos, but it just didn't help that much (especially considering bladder cancer is a potential side effect), and I discontinued it after starting the insulin.

I've had nurses look at me and innocently say, "Well, you don't look diabetic!" Which is code for, "You don't look fat!"

I struggle to keep my sugar levels in control, even with moderate exercise and diet. I spill ketones at the drop of a hat. (My urine smelling like paint thinner was one of the first clues there was a problem, along with neuropathy.)

I still get neuropathy and other annoying symptoms. So far, no vision changes.

This is a matter of genetics. Almost everyone on my father's side of the family has diabetes; most of the blame can be placed squarely on the family tree not being an acyclic graph.

Comment: Re:Type 1 vs type 2 diabetes (Score 1) 92

I get the impression Metformin is more of a block slowing up sugar uptake which reduces the amount of insulin needed to cope with the sugars and carbs.

That's only one of the things that metformin does. Metformin primarily suppresses gluconeogenesis in the liver, and secondarily increases insulin sensitivity. Far down the list is decreased absorption of glucose in the GI tract.

Comment: Where did that come from? (Score 2) 470

by LionMage (#40158053) Attached to: Windows 8: More EULA, Fewer Rights.

Jefferson was our second most intelligent president (estimated IQ of 160). We should listen to him.

That's pretty interesting, since Jefferson lived before IQ tests were invented. Furthermore, I have to wonder whether this "estimate" (based on what, exactly?) takes things like the Flynn effect into account.

In point of fact, Nixon had a tested IQ of 143, the highest of any president who was actually tested. That is in no way an endorsement of Nixon as the smartest president.

Considering that many biographies of Jefferson place him as the most intelligent of any sitting US president, I have to ask who you think tops the list, besting a Jefferson you yourself estimate at an IQ of 160.

You also conveniently forget that Jefferson advocated periodic revolutions, whether bloody or bloodless, and by logical extension a new constitution would have to be ratified after each such revolution. (Think of the French, who routinely adopt new constitutions.) So while Jefferson may have been a strict Constitutionalist by our modern reckoning, he likely did not expect our current constitution to last as long as it has.

On a different tack, it's pretty obvious that your response to smooth wombat was meant to be snarky, but you haven't really addressed his valid point: laws must stay relevant to their times. If you're going to invoke Jefferson to advocate a system of governance where nothing ever changes and technical progress is impeded for the sake of preserving an antiquated social and legal framework, I'm afraid Jefferson is a poor champion for that cause, precisely because of his cyclical view of governments and their founding documents.

Comment: Re:And? (Score 1) 185

by LionMage (#37473058) Attached to: OnStar Terms and Conditions Update Raises Privacy Concerns

Although siride did a good job of rebutting your nitpicking, I just wanted to point out that "relies on electricity" is a poor definition of electric.

Here are some of the definitions I pulled from dictionary.com for electric as an adjective:

  • pertaining to, derived from, produced by, or involving electricity ("pertaining to" is pretty broad)
  • producing, transmitting, or operated by electric currents
  • electrifying; thrilling; exciting; stirring

All of those senses of the word have examples showing common and accepted usage. The "pertaining to" sense would seem to apply to an "electric bill."

Comment: Define "synthetic" (Score 1) 91

by LionMage (#36686426) Attached to: Spanish Surgeon Performs First Synthetic Organ Transplant

Someone already brought up the artificially grown bladder, which was covered earlier this year, so this surgery already seems dubious as a "first synthetic organ" transplant. The BBC article title says first synthetic windpipe, but the subtitle says first synthetic organ. I call shenanigans (and suspect a bit of nationalism at work).

However, what about the Jarvik artificial hearts? Those were developed and transplanted years ago. Don't those qualify as synthetic organs, since they are artificial yet perform a similar function to a real heart?

Comment: Re:Document, document, document (Score 1) 96

by LionMage (#36686298) Attached to: Ask Slashdot: Open Patent Licenses?

IANAL, but my understanding is that even if/when the U.S. switches to a "first to file" system, prior art will always remain relevant... it just complicates things, because you have to establish that the prior art existed before the filing date of the patent, not before the date the "inventor" claims to have invented it. One could argue this standard is easier to meet, since the act of filing a patent typically comes well after the process of inventing something, except maybe in the case of so-called "submarine" patents.

That said, I'm really unhappy about the U.S. seriously considering moving away from the "first to invent" system. Yes, our system is more litigious, and therefore arguably more costly, but "first to file" also seems less fair if someone legitimately did invent something first yet couldn't afford to beat the other guy to the patent office. I'm hoping someone kills the switchover before it goes into effect.

I have personal memories of documenting every little thing I did at a start-up company in engineering notebooks and composition books whose sheets were bound with sewing and adhesives, which makes it easy to see if a page has been added or removed. Spiral-bound notebooks are not good for documenting your work, and loose sheets of paper aren't much better. Digital records are easy to forge. Old school is the best way to document your work should there be a patent challenge.

Comment: Re:That's because SciFi sucks (Score 1) 292

by LionMage (#35910048) Attached to: Revolution of the Science Fiction Authors

I liked your observation better when Theodore Sturgeon made it: 90% of everything is crap. Of course, you seem to be claiming that the value is more like 99%, but we all know that 95% of all statistics are made up on the fly.

News flash: Most other genre fiction is crap, too. For that matter, most mainstream fiction doesn't pass the test of time and is quickly forgotten, if it ever was considered "literature" in the first place.

As for filters, I would suggest that you start paying attention to book reviews. Analog still does reviews of science fiction novels, for example. Amazon posts both user reviews and reviews by established periodicals. If you need something to inform your selection process, book reviews are a good place to start.

If you don't have reviews to go by, or enough reviews to go by, there's always the reputation of the author himself. (If you happen to not like an author who is otherwise well-regarded, that's fine, but authors tend to work hard to earn a reputation.) And with older authors, some improve with age, while others fall prey to the Hemingway syndrome (writing their best work first). Find out which category an author falls into, then consume either the back catalog or the latest works accordingly.

Comment: Re:That's because SciFi sucks (Score 1) 292

by LionMage (#35909978) Attached to: Revolution of the Science Fiction Authors

Frank Herbert and Heinlein were both proverbial 800-pound gorillas. Both were at their best when their work was heavily edited. Later in life, both got full of themselves and started pumping out works that no editor dared touch for fear of losing a rock-star author.

The first Dune was good, after the first 20 pages or so. It took me three or four attempts before I finally got "into" it. I wasn't impressed enough to tackle the rest of the series. Just because you happen to think the entire series is great doesn't mean everyone does, or even should, agree.

Heinlein always struck me as preachy, and his books were a platform for preaching to his audience. Most of Heinlein's followers (and I use that word deliberately) strike me as being very similar to cultists. It could have been Heinlein instead of Hubbard who founded a religion, after all...

As for Asimov... while he's a well-loved author of SF who published over 600 books (Wikipedia claims "over 500," but I have read various estimates from 600 to 800+), it's worth noting a few things. First, Isaac Asimov's best form was the short story, not the novel; the man couldn't do characterization, and it showed in his longer works. Indeed, ideas seem to be the central characters in many of his works. (I have to say, though, The Gods Themselves was a great novel and had OK character development. Not stellar, but not awful either.) Second, many of those published works were non-fiction. I don't see this as a detriment, since Asimov is a very entertaining writer with a gift for making complex ideas seem simple. Third, I believe it is because Asimov was so prolific that we have so many examples of his work we can point to as "good science fiction." After all, most of us know he pumped out that many books, but few of us can cite the titles of more than a half dozen to a dozen of them.

Comment: Re:Stupid (Score 1) 298

by LionMage (#35909730) Attached to: AT&T Admits Network Can't Handle iPhone, iPad Traffic

You do realize that US$70 is approximately equal to UK£35? (I know the exchange rate fluctuates, but...)

Also, which provider did you try in Orlando (assuming you mean Orlando, Florida, and not one of the other Orlando cities/towns in America)? Because most locales in the States currently have at least 2 competing ISPs. Generally, that's local cable (typically slow upstream and bursty downstream) and the local telco (providing some form of ADSL or, in my case, VDSL). Depending on which one of those you have, and what their policies are, the performance can vary quite a lot. Saying "when I went to America... the internet [was] incredibly slow" isn't really meaningful without further qualification.

Comment: Re:And this is news? (Score 1) 270

by LionMage (#33052798) Attached to: Java IO Faster Than NIO

There haven't been any algorithmic breakthroughs in many years for most of the computer science field? I find that hard to believe. Back in 1993, I was taking a graduate-level course in algorithms, and the professor told us that at least one algorithm for multiplying ridiculously large matrices had been developed and published within the prior year or two by a Russian PhD. Granted, that particular algorithm didn't provide a speed benefit over other techniques until you hit matrix sizes on the order of a million by a million. But that's not the point.
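To make the crossover point concrete, here is a toy sketch of Strassen's algorithm, the best-known sub-cubic matrix multiply (not necessarily the algorithm the professor mentioned). It trades the eight recursive block products of the naive method for seven; this minimal Python version only handles square matrices whose size is a power of two, and the names and cutoff are illustrative:

```python
def naive_mul(A, B):
    """Textbook O(n^3) multiply of square list-of-lists matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def _split(M):
    """Return the four n/2 x n/2 quadrants of M."""
    n = len(M) // 2
    return ([row[:n] for row in M[:n]], [row[n:] for row in M[:n]],
            [row[:n] for row in M[n:]], [row[n:] for row in M[n:]])

def _add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def _sub(X, Y):
    return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def strassen(A, B, cutoff=2):
    """Strassen's O(n^2.81) multiply; n must be a power of two.
    Below `cutoff` the naive method is cheaper, so fall back to it."""
    n = len(A)
    if n <= cutoff:
        return naive_mul(A, B)
    a, b, c, d = _split(A)
    e, f, g, h = _split(B)
    # The seven recursive products that replace the naive eight.
    p1 = strassen(a, _sub(f, h), cutoff)
    p2 = strassen(_add(a, b), h, cutoff)
    p3 = strassen(_add(c, d), e, cutoff)
    p4 = strassen(d, _sub(g, e), cutoff)
    p5 = strassen(_add(a, d), _add(e, h), cutoff)
    p6 = strassen(_sub(b, d), _add(g, h), cutoff)
    p7 = strassen(_sub(a, c), _add(e, f), cutoff)
    # Reassemble the quadrants of the result.
    c11 = _add(_sub(_add(p5, p4), p2), p6)
    c12 = _add(p1, p2)
    c21 = _add(p3, p4)
    c22 = _sub(_sub(_add(p1, p5), p3), p7)
    top = [r1 + r2 for r1, r2 in zip(c11, c12)]
    bot = [r1 + r2 for r1, r2 in zip(c21, c22)]
    return top + bot
```

In practice the naive method wins on small matrices because of Strassen's bookkeeping overhead, which is exactly the crossover phenomenon described above; real libraries pick the cutoff empirically.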

You'd also be amazed at the effect seemingly insignificant implementation choices can have. The most extreme case I ever saw was something like a factor of 2 difference in speed. You might chalk that up to bad coding, but when that code is locked away inside libraries that ship with the language, rank-and-file developers get stuck with a suboptimal implementation. So it's not just new algorithms that yield wins; careful analysis and improvement of older implementations does too.
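As an illustration of constant factors (my example, not one from the article), here are two Python functions computing the same O(n) sum; one pays interpreter dispatch overhead on every iteration, while the other runs its loop inside the C-implemented builtin:

```python
def sum_loop(xs):
    """Hand-rolled accumulation: the same O(n) algorithm, but every
    iteration pays Python bytecode dispatch overhead."""
    total = 0
    for x in xs:
        total += x
    return total

def sum_builtin(xs):
    """Identical algorithm and result, but the loop runs in C,
    so the constant factor is typically several times smaller."""
    return sum(xs)
```

On a typical CPython build, timeit shows the builtin winning by a healthy multiple even though the algorithm is identical; a developer stuck with the slow variant inside a library would see exactly the kind of unexplained gap described above.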

Getting back to the topic of this article, I want to point out that I actually used NIO in a project in a corporate environment, and it seemed to give us wins in stability, thread utilization, and memory consumption, among other things. For the environment, it was probably the right choice. Had we been dealing with a newer Linux environment, or a less heavily loaded server, I suppose going with the "old" pre-NIO way of doing socket I/O would have been better.

Had we been on a Solaris system, I'm told the NIO way would have been the best choice hands down, but the company was moving away from Solaris. Still, this raises a valid point -- in the end, you need to tune your code for the environment in which it's running. So if the OS can't do threads well, the whole thread-per-socket approach will stink compared to select-based semantics. I haven't fully read the PDF yet, but it seems like most of this testing was done in Linux. Results for other OSes are not guaranteed to be the same.
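For readers unfamiliar with the select-based style, here is a minimal sketch in Python, whose standard selectors module wraps the same epoll/kqueue/poll machinery that Java NIO selectors use. One thread services any number of sockets; the function and its names are illustrative, not from the project described above:

```python
import selectors
import socket

def echo_once(sel, pairs, messages):
    """Register the receive end of each socket pair, write one message
    to each send end, then service every readable socket from a single
    thread until all messages have been collected."""
    for recv_side, _ in pairs:
        recv_side.setblocking(False)           # required for select-style I/O
        sel.register(recv_side, selectors.EVENT_READ)
    for (_, send_side), msg in zip(pairs, messages):
        send_side.sendall(msg)
    results = {}
    while len(results) < len(pairs):
        # One select() call tells us every socket that is ready.
        for key, _ in sel.select(timeout=1):
            results[key.fileobj] = key.fileobj.recv(1024)
    return [results[recv_side] for recv_side, _ in pairs]
```

The thread-per-socket alternative would dedicate a blocking read to each connection, which is exactly the approach that stops scaling when the OS handles threads poorly.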

Comment: Re:And this is news? (Score 2, Interesting) 270

by LionMage (#33052658) Attached to: Java IO Faster Than NIO

Java may not be "sexy" anymore (or "all the rage," as you put it), but it is not exactly a niche language. It still runs in surprisingly many places, like cell phone apps (yes, a lot of us still use regular cell phones, and Android is Java-ish with some tweaks), and, more importantly, just about every corporate data center uses Java. That last "niche" is pretty huge, and the only thing that threatens Java in that space is .NET, the Java platform clone.

Java, like it or not, has become the COBOL of the 21st century. It's ubiquitous.

I agree that Perl makes code hard to maintain (especially in the sense that one developer won't necessarily readily understand another's Perl code, since everyone has his own favorite idiom), but you make a lot of claims that I don't see supported by facts. Perl CGI might be frowned on these days in some circles, but there are plenty of sites that use Perl as a basis -- including this one, Slashdot. So saying Perl is no longer used for CGI scripts is probably false, as there are plenty of folks who clearly think it's "good enough."

You're trying to make Perl and Java both sound like fads, but the truth is neither language is going away anytime soon, as each is too useful for too large a segment of the developer population.

Comment: Re:Does anyone really prefer 16x9 instead of 16x10 (Score 1) 646

by LionMage (#32970596) Attached to: Does Anyone Really Prefer Glossy Screens?

It's only funny if you didn't grow up decades ago watching 4:3 TV sets. (The first TV I remember my family owning was black and white, and it had vacuum tubes. One of the best Zenith TVs we ever owned.) For some of us, this new 16:9 screen format isn't "TV." For me, it's "high def TV," and the old format will always be plain old "TV" to me.

I should also point out that digital vs. analog doesn't enter into it, because 480i/480p content still dominates many networks -- and I'm saying this as a DirecTV subscriber. Those black side bars have become a common sight on my HDTV screen, since I absolutely can't stand seeing 4:3 content stretched horizontally unless some kind of "smart stretch" algorithm is applied (where the distortion increases toward the left and right edges of the display, but is zero at the center).
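The "smart stretch" idea reduces to remapping the horizontal sample coordinate through a curve whose slope at the center matches the aspect-ratio mismatch, so central content is shown at its true proportions and all the distortion is pushed to the edges. This cubic is one hypothetical mapping, not any vendor's actual algorithm:

```python
def smart_stretch_sample(x, ratio=4 / 3):
    """Given a display coordinate x in [-1, 1] (0 = screen center),
    return the source coordinate to sample from the 4:3 frame.
    The curve s(x) = ratio*x - (ratio - 1)*x**3 maps [-1, 1] onto
    [-1, 1]; its slope at the center is `ratio`, so content there
    keeps its true aspect, while the edges absorb the stretching."""
    return ratio * x - (ratio - 1) * x ** 3
```

With ratio = 4/3 (the mismatch between 4:3 content and a 16:9 display), the middle of the picture looks undistorted while the outer edges are stretched roughly 4x, which is why faces drifting toward the sides of the screen look briefly odd.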

I should also point out that many laptop manufacturers deliberately choose a 16:10 aspect ratio for their screens because their customers demand the extra real estate, and because such designs allow widescreen HD content to be rendered while leaving room at the top or bottom for a player HUD, controls, or other uses. Apple uses 16:10 for the MacBook Pro, for example.

Furthermore, Samsung widescreen monitors are often 16:10 (I own one for a desktop machine, purchased within the last year). I could go on. There's nothing preventing the display of 16:9 content on a 16:10 screen -- those extra scanlines are either going to be letterbox area, or they'll be used for controls or information. It's not the huge swath of black that letterboxing on a 4:3 screen causes.
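The difference in wasted area is easy to quantify. This quick sketch (the resolutions in the examples are just typical panel sizes) computes the fitted content size and the unused pixels when showing content of one aspect ratio on a screen of another:

```python
def letterbox(screen_w, screen_h, content_aspect):
    """Fit content of the given aspect ratio (width/height) on the
    screen without distortion; return (content_w, content_h, unused_px)."""
    if screen_w / screen_h > content_aspect:
        # Screen is wider than the content: pillarbox (bars at the sides).
        h = screen_h
        w = round(h * content_aspect)
    else:
        # Screen is taller than the content: letterbox (bars top/bottom).
        w = screen_w
        h = round(w / content_aspect)
    return w, h, screen_w * screen_h - w * h
```

On a 1920x1200 (16:10) panel, 16:9 content leaves only 120 spare rows, about 10% of the screen, which can hold controls or information; 16:9 content letterboxed on a 1600x1200 (4:3) panel wastes 25% of the pixels, the "huge swath" in question.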

I don't think your supposition on the market size for 16:10 monitors (or "non-16:9 widescreen," as you describe them) is supportable, considering how many 16:10 devices are on the market right now.
