
Comment: Re:Okular print support (Score 1) 116

by Colonel Sponsz (#34852556) Attached to: Interview With KDE On Windows Release Manager Patrick Spendrin

Addendum: I downloaded the latest version of Sumatra and fed it a sample document (a handbook of medical triage chosen as a representative use case: it's fairly large (290 pages), and contains text, photographs and vector diagrams). I went to the first diagram I could find (on page 2), started zooming in to see if performance had improved, and presto: crash! From download to first crash within a minute. If that's not poor stability, I don't know what is.

(And if you're wondering, zooming was still horribly slow)

Comment: Re:Okular print support (Score 1) 116

by Colonel Sponsz (#34852292) Attached to: Interview With KDE On Windows Release Manager Patrick Spendrin

I'm surprised to hear you say Sumatra has "poor rendering, performance and stability." Could you explain some of what you're referring to?

  1. Poor rendering.
    It simply fails at rendering too many PDFs out there to be useful to me as a main PDF reader. It's not that it doesn't support the bloat features of PDF (embedded video, 3D, etc), it's that it fails at any complex enough layout, and roughly half of all the other stuff I've thrown at it.
  2. Poor performance.
    Even simple documents have noticeable hiccups when switching pages, and complex PDFs (basically, "anything more than a pure-text, non-fancy-layout OO.o document") are simply a pain to read. And you can just plain forget about zooming - zoom in or out enough and it can take seconds to render the page you're on (and this is on an i7 machine with tons of RAM).
  3. Poor stability.
    It crashes. A lot. It can happen at any time on any PDF, but attempting to read a large enough document (take your pick of e.g. these) is a surefire method - within 5 minutes of active reading, Sumatra will die a horrible death.

I wonder what the last version you tried was. The underlying MuPDF engine has come a long way since Krzysztof Kowalczyk decided to drop the option of using Poppler as a backend and focus on MuPDF back in version 0.9 two years ago.

The one I still keep around (since every PDF reader has its quirks, it's usually a good idea to have several) seems to be v1.1, from 2010-05-20. To which all I said above applies.

Comment: Re:Okular print support (Score 2) 116

by Colonel Sponsz (#34846282) Attached to: Interview With KDE On Windows Release Manager Patrick Spendrin

Or you could just run Evince, which surprisingly works great under Windows. Both Evince and Okular use Poppler as the PDF backend, so the rendering should be the same, but Evince doesn't require the bloat of the entire KDE on Windows package.
I've used the official Adobe reader (yech!), Sumatra (poor rendering, performance and stability), Foxit (nag nag nag) and Evince. Evince is the best one by far.

Comment: Re:Does it make it too easy? (Score 5, Insightful) 253

by Colonel Sponsz (#34793622) Attached to: Honeywell To Sell Miami-Dade Police a Surveillance Drone

I'm not saying one is right and one is wrong, but I find the contrast confusing... Is it simply the case that surveillance is OK provided it's difficult? If that's the case, why do we allow helicopters at all? Or in the case of manned surveillance, why are the police allowed to use radios? Shouldn't they have to use call boxes? Either we're OK with the concept, or we're not.

The difference is that "difficult" surveillance can't be mounted on a massive scale - the police actually have to be frugal with it. They can't go around tracking everyone; they have to be pretty sure they have the right people to follow before committing the resources to it.
"Easy" surveillance OTOH, can be used to simply monitor everyone. Well, actually, that should read "will" instead of "can". It's basically Murphy's law as applied to surveillance: if the opportunity exists to misuse a law or technology, it will be misused. Surveilling everyone is way easier than bothering with all that pesky "probable cause" nonsense.

Comment: Re:Google may fail, but it has a lot of momentum (Score 1) 238

by Colonel Sponsz (#34754932) Attached to: Google's Next Challenge, Spam Results

I've noticed Google getting less and less effective all the time. I do a search, and 3/4 of the sites are 'fake' results that send me to ad pages with their own (totally useless) search results.

On important searches, I often spend 10-15 minutes tuning my query to help eliminate those sites so I can get to the real results.

For me it's the same, but for a different reason: it's Google breaking their own searches. Specifically, how they silently replace your actual keywords with what they think you mean.
Remember Altavista? How you had to do +foo -bar +baz +frotz to get any decent results? That's Google today for me. Keywords have gone from "require all" to "require some... maybe", so if I search for foo bar baz I'll get tons of results with just one or two of the keywords (and sometimes none!). So I have to prefix them all: +foo +bar +baz.

Or I search for just foo, and Google might decide that I actually want to search not just for foo but also for foob and fooc. But I wasn't interested in any of those! I was interested in what I searched for. So I have to search for +foo instead. And even that might not be enough: depending on Google's mood, I might still have to override more false positives with +foo -foob -fooc (which means I start introducing false negatives as well). And sometimes even that isn't enough - occasionally it gets overridden by a "this is what we think you mean, and we know better than you" search, where Google searches for some completely unrelated keyword.

Hey Google: the "Did you mean..." prompt was great. Silently replacing all my search terms and making me jump through hoops to use your search is not. It's fucking awful interaction design.
Oh, and while we're at it: you apparently have time to spend on introducing completely unnecessary, CPU-draining shit like that awful DHTML fade (the reason I no longer whitelist google.com in Noscript), but none to spend on fixing your completely broken non-English searches (they work for some altogether different alphabets, but a whole bunch of extended Latin alphabet searches are completely impossible)...

Comment: Re:Not news. (Score 5, Informative) 58

by Colonel Sponsz (#34749290) Attached to: Radiation Detection Goes Digital

Mod Parent Up.

Beta particles (electrons ejected from the nucleus, basically) have a mean free path of about a foot in air. Place anything else in between, like a thin sheet of aluminum or a little bit of plastic, and it sucks up the betas real quick.

The other big problem is that gammas are quantized, beta particles are not. When something radioactively decays, it gives off gamma rays of distinct, unique energies -- very useful for determining the radioactive isotope you're looking at. Not so for betas; they're emitted over a wide range of energies, and it can be very difficult (but not impossible) to tell what you're looking at by betas alone. I don't mean to downplay what this accomplishes, in a nice, small form factor. But this doesn't revolutionize the world of radiation detection. To date, no one has really been crying for a combined, digital, gamma and beta detector. Maybe if you build it, they will come, but I don't see a large market for this.

The (now grand)parent should be modded up, but one thing in your post is just plain wrong: the range of beta radiation is not "about a foot"; it depends on the energy. Betas from a low-energy nuclide like 35S, for example, do have a range of almost exactly a foot (32 cm) in air, but the high-energy beta radiation emitted by 90Sr/90Y OTOH has a range of slightly above 10 meters in air. And as for the quantization, beta emitters too have very distinct energy distributions (which you can look up in any good data sheet).
But you're right about portable beta spectrometry being pretty "meh". If it's high enough energy to worry about, it's easier to just look at the bremsstrahlung, really.
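For the curious, the energy dependence above can be ballparked with the empirical Katz-Penfold relation, a standard textbook fit for the beta range (in g/cm²) at endpoint energies of roughly 0.01-2.5 MeV. A quick sketch - the function name is mine, and the endpoint energies are the usual tabulated values:

```python
import math

def beta_range_air_cm(e_max_mev, air_density_g_cm3=1.205e-3):
    """Approximate beta range in air (cm) via the empirical Katz-Penfold
    relation, which gives the range in g/cm^2 for endpoint energies of
    roughly 0.01-2.5 MeV; divide by air density to get a distance."""
    r_areal = 0.412 * e_max_mev ** (1.265 - 0.0954 * math.log(e_max_mev))
    return r_areal / air_density_g_cm3

# Endpoint (maximum) beta energies from standard nuclide tables:
print(f"35S (0.167 MeV): ~{beta_range_air_cm(0.167):.0f} cm")     # ~26 cm
print(f"90Y (2.28 MeV): ~{beta_range_air_cm(2.28) / 100:.1f} m")  # ~9.1 m
```

That lands in the same ballpark as the data-sheet figures quoted above (the fit gets crude at the low-energy end, hence ~26 rather than 32 cm for 35S).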

BTW, this was a really horrible article. There were no technical details whatsoever (well, just enough to realize that someone had been trying to explain scintillation to the very obviously non-techie journalist), and they seem to mix up radiation spectrometry with plain radiation detection...

Comment: But why have a catapult at all? (Score 1) 314

by Colonel Sponsz (#34649044) Attached to: Navy Uses Railgun To Launch Fighter Jet

What I'm curious about is why they're using catapults at all - the Russians and the Brits, for example, use a "ski jump" instead. And I read somewhere (unfortunately, I can't remember where - damn you, source blindness!) that that approach is actually better, in terms of aircraft launch rate, as you don't have a complex catapult system that has to be reset for every plane.

So... why are US carriers using catapults, when they seem to me to be just another point of failure? Can someone enlighten me?

Comment: Re:Of course it's under fire (Score 5, Insightful) 152

by Colonel Sponsz (#34485884) Attached to: NASA's 'Arsenic Microbe' Science Under Fire

If a scientist other than themselves didn't make the discovery, it's obvious the other guy's methods are flawed!

Scientists can be such whiny, arrogant assholes...whatever happened to science being done for science, rather than recognition?

You do realize that criticizing research is a crucial part of the scientific method, right? Letting claims go unchallenged is the domain of religion, not science.
People are ripping apart this paper because it makes grand claims based on a potentially flawed methodology. If the results can be replicated with those flaws fixed, then the NASA team's research receives further validation. If not, hey, I guess they jumped the gun. Either way, you have to identify the potential flaws, which is what people are doing here.

Also, to once again quote Rosie Redfield:

There's a difference between controls done to genuinely test your hypothesis and those done when you just want to show that your hypothesis is true.

Comment: Re:Papers and Questions (Score 4, Informative) 152

by Colonel Sponsz (#34485752) Attached to: NASA's 'Arsenic Microbe' Science Under Fire

So my question is basically what does it matter what they grew or washed the bacteria with when, in one of the many investigations, they found that gel purified genomic DNA had elevated levels of arsenic in them? Unless I'm misunderstanding what 'gel purified genomic DNA' means, I would assume that there's still several pieces of data in these experiments that point toward an organism that uses arsenic in place of phosphorus -- even if only somehow partially. Would this sort of spectrometry reveal any arsenic at all in my gel purified genomic DNA?

From Rosie Redfield's critique:

Could 400 atoms of arsenate per genome be due to carryover of the arsenate in the phenol-chloroform supernatant rather than to covalent incorporation of As in DNA? The Methods describes a standard ethanol precipitation with no washing (and no column purification which would have included washing), so I think some arsenate could easily have been carried over with the DNA, especially if it is not very soluble in 70% ethanol. Would this arsenate have left the DNA during the gel purification? Maybe not - the methods don't say that the DNA was purified away from the agarose gel matrix before being analyzed. This step is certainly standard, but if it was omitted then any contaminating arsenic might have been carried over into the elemental analysis.

Comment: Re:Horrible article (Score 2, Informative) 165

by Colonel Sponsz (#34352810) Attached to: FedEx Misplaces Radioactive Rods

Apparently, it was 684 MBq of Germanium (which should mean it's 76Ge). Unfortunately, that isotope is not in any of my data sheets, so I can't tell you what that means in terms of dose rate...

Correction: it was 68Ge. As I stated, I couldn't find it in my data sheets, so I just looked at a list of germanium isotopes - which only listed naturally occurring ones. Silly me!

I do however have data for the next step in the decay chain, 68Ga (68Ge decays by electron capture, so let's just disregard that first decay). The first sheet I found put it at 0.103 mSv/h/MBq beta skin dose and 0.173 mSv/h/MBq gamma at 30 cm. At 684 MBq, that means a dose rate of about 70 and 120 mSv/h at 30 cm, respectively.
So no, these sources weren't particularly dangerous. Even at that close a distance (if you don't speak metric, 30 cm is about a foot), it would take half a day of exposure to become acutely ill (radiation sickness starts setting in at around 1 Sv). And as radiation sources don't tend to be that big, you can probably consider these rods point sources, which means that the inverse square law applies: at double the distance - only 60 cm - it would take four times as long.
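The arithmetic above is simple enough to sketch in a few lines of Python (a toy illustration; the function and variable names are mine, the numbers are the ones quoted from the data sheet):

```python
def dose_rate_mSv_h(activity_mbq, dose_const_msv_h_per_mbq, d_cm, d_ref_cm=30.0):
    """Point-source dose rate (mSv/h) at distance d_cm, scaled from the
    reference distance by the inverse square law."""
    rate_at_ref = activity_mbq * dose_const_msv_h_per_mbq
    return rate_at_ref * (d_ref_cm / d_cm) ** 2

gamma_30cm = dose_rate_mSv_h(684, 0.173, 30)  # ~118 mSv/h gamma at 30 cm
hours_to_1_sv = 1000 / gamma_30cm             # ~8.5 h to accumulate 1 Sv
print(gamma_30cm, hours_to_1_sv)
print(dose_rate_mSv_h(684, 0.173, 60))        # ~30 mSv/h - a quarter at double the distance
```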

Comment: Horrible article (Score 1) 165

by Colonel Sponsz (#34352482) Attached to: FedEx Misplaces Radioactive Rods

This is supposed to be "News for Nerds" - so why link to a Fox News article with almost no technical information whatsoever? For example: what nuclide was involved? How high was the activity?
After some searching on Google News, I found this article. Apparently, it was 684 MBq of Germanium (which should mean it's 76Ge). Unfortunately, that isotope is not in any of my data sheets, so I can't tell you what that means in terms of dose rate...

Comment: Re:at least the public tranist sucks in the US (Score 1) 890

by Colonel Sponsz (#34331608) Attached to: Next Step For US Body Scanners Could Be Trains, Metro Systems

Despite the fact that you're using "lol" as a word and thus deserve to be banned from all means of communication, I'm going to reply to this.

Thankfully the US doesn't have (m)any widely-used metro systems. How about implementing this on a bus as well ... lol Europe will never go for this, and this is another reason that I have no interest in returning to the states.

O RLY? And don't forget that the EU is for absolutely anything that violates privacy or decreases freedoms in any other way, so they will probably mandate it all across their territory.

Comment: Re:There's only one upgrade needed for Google (Score 4, Interesting) 252

by Colonel Sponsz (#34178478) Attached to: Google Give Searchers 'Instant Previews' of Result Pages

Yep, that does cure most of what ails it. But 1) most people don't know how to do this, and 2) it's a damned nuisance even tho I can do it with one tick of a checkbox.

And then you've got to turn it back on to get any useful behaviour from Google Maps, tho they've become so cumbersome of late

Ah, but you don't!
If you're using Noscript, whitelist maps.google.com (by default, Noscript whitelists the entire domain - but you can whitelist subdomains manually) and gstatic.com. There's no need to whitelist all of google.com.

Comment: Re:Alternatives? (Score 1) 403

by Colonel Sponsz (#33962812) Attached to: US Elections Dominated By Closed Source. Again.

Some ten year old hanging chads would like to have a word with you.

My local district still uses a paper ballot, but let's not pretend it doesn't have its own limitations, too.

And hanging chads are an artefact of - guess what! - voting machines, in this case mechanical ones. Pure paper ballots work just fine all over the world - voting machines are needless automation, and they can and will fuck things up.
