Submission + - Nmap team releases 5 gigapixel favicon map 1

iago-vL writes: From the creators of Nmap comes the largest survey of its kind ever performed: the favicon.ico files of over a million Web sites were scanned, compiled, and sorted to create a 5 gigapixel image, blowing their 2010 survey out of the water! It's searchable, zoomable, and incredibly fun to play with! Can you find Slashdot without cheating? (Hint: it's near Facebook)

Comment Re:An O'Scope (Score 2) 215

The MHz number on the box is the bandwidth, not the sample rate. The sample rate is measured in samples per second (e.g. GSps, gigasamples per second). A 100MHz scope is probably adequate for analog signals up to 100MHz. However, if you're debugging a digital signal, you want a scope with at least 3x the bandwidth of your signal's fundamental frequency, because square waves are composed of the fundamental plus an infinite series of odd harmonics. If you only have bandwidth for the fundamental, your square wave will be distorted into a sine wave and you won't be able to accurately see ringing, glitching, and other artifacts.
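To make the harmonic argument concrete, here's a minimal NumPy sketch (the 10MHz signal and the cutoff frequencies are made-up numbers, not from any particular scope). It sums the Fourier series of a square wave up to a given bandwidth; with only the fundamental left, the "square" wave is literally a sine:

import numpy as np

def bandlimited_square(f0, bandwidth, t):
    # A square wave is (4/pi) * sum over odd n of sin(2*pi*n*f0*t)/n;
    # keep only the odd harmonics that fit within the bandwidth.
    wave = np.zeros_like(t)
    n = 1
    while n * f0 <= bandwidth:
        wave += (4 / np.pi) * np.sin(2 * np.pi * n * f0 * t) / n
        n += 2  # square waves contain only odd harmonics
    return wave

t = np.linspace(0, 2e-7, 1000)                 # two periods of a 10MHz signal
sine_like = bandlimited_square(10e6, 10e6, t)  # fundamental only: just a sine
squarish = bandlimited_square(10e6, 100e6, t)  # harmonics up to the 9th: recognizably square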

I have a 1GSps, 100MHz scope. I wouldn't use it for serious digital signal debugging above 30MHz (roughly 1/33 of the sample rate), due to the bandwidth constraint. It's adequate for seeing whether stuff up to 100-150MHz is "there" (and for reading the bits out if you just want to use it as a poor man's logic analyzer); just don't expect to diagnose signal integrity and timing issues at those speeds.

Submission + - Google Books case dismissed on Fair Use Grounds

NewYorkCountryLawyer writes: In a case of major importance, the long-simmering battle between the Authors Guild and Google has reached its climax, with the court granting Google's motion for summary judgment and dismissing the case on fair use grounds. In his 30-page decision (PDF), Judge Denny Chin (who has been a District Court Judge throughout most of the life of the case but is now a Circuit Court Judge) reasoned that, although Google's own motive for its "Library Project" (which scans books from libraries without the copyright owners' permission and makes the material publicly available for search) is commercial profit, the project itself serves significant educational purposes, and actually enhances, rather than detracts from, the value of the works, since it helps promote their sales. Judge Chin also noted that the scanned material cannot practically be used to make full copies of the books or to read them, so it does not compete with the books themselves.

Comment Re:Corrective lenses adaptation? (Score 1) 55

You can correct for chromatic aberration in software, to varying degrees. For arbitrary inputs, e.g. a photograph captured with an imperfect lens, you can approximately correct it by aligning the centers of the primary color channels, which reduces the aberration to roughly 1/3 of what it would otherwise be (image editing software can do this). You can do it on the output side with perfect accuracy if you're displaying an image using three monochromatic light sources (e.g. a laser display), since the three wavelengths involved are then distorted by three discrete amounts that are perfectly correctable. For RGB panels like LCDs and OLED displays, the primaries aren't monochromatic, but they are more concentrated around the dominant wavelength than a natural light source with a uniform frequency distribution, so you get a result that's somewhere in between. This is what the Rift does to correct for chromatic aberration in software.
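As a rough sketch of the "align the centers" idea (the scale factors below are invented placeholders; a real implementation like the Rift's would use the lens's measured per-channel distortion profile rather than a single linear scale):

import numpy as np

def correct_lateral_ca(img, scale_r=0.998, scale_b=1.002):
    # Radially rescale the R and B channels about the image center so
    # their magnification matches G (nearest-neighbor sampling for brevity).
    h, w, _ = img.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    yy, xx = np.mgrid[0:h, 0:w]
    out = img.copy()
    for ch, s in ((0, scale_r), (2, scale_b)):
        sy = np.clip(((yy - cy) * s + cy).round().astype(int), 0, h - 1)
        sx = np.clip(((xx - cx) * s + cx).round().astype(int), 0, w - 1)
        out[..., ch] = img[sy, sx, ch]
    return out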

Uneven pixel density is only a problem if the pixel density at the sparsest point is too low. Today's displays already exceed visual acuity when viewed at a reasonable distance (e.g. a Nexus 10 or an iPad with a Retina display at a normal operating distance), though of course that is without covering a large fraction of the FOV. Give it a few more years and it'll only get better; once we have 8K phone-sized displays this will probably be a non-issue.
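For a back-of-the-envelope check of the acuity claim (20/20 vision is conventionally about one arcminute per feature, i.e. ~60 pixels per degree; the viewing distance below is just an assumption):

import math

def pixels_per_degree(ppi, distance_inches):
    # Angular pixel density for a flat display viewed head-on.
    pixel_size = 1.0 / ppi  # inches per pixel
    rad_per_pixel = 2 * math.atan(pixel_size / (2 * distance_inches))
    return (math.pi / 180) / rad_per_pixel

# A ~300ppi tablet at 12 inches gives ~63 pixels/degree, just past 20/20 acuity
print(pixels_per_degree(300, 12))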

Comment Re:Some numbers for reference. (Score 1) 164

Obviously you were concerned enough to measure if there was any imminent danger

I wasn't concerned. I'm just a curious geek who happens to own a logging Geiger counter.

The issue is not the radiation emitted, it's the radionuclides emitting it.

That is true. Ingesting radionuclides is definitely a much bigger problem than external exposure.

That's great, but it's likely that Japan now has very high concentrations of radionuclides in specific places on land or in the sea, and some of those areas will be producing food. The likelihood of encountering them in the food chain is higher now than right after the accident, because the radionuclides have propagated further up the food chain, so if you eat food in Japan, the likelihood of ingesting them has increased. The longer you stay there, the more you increase your chances of a permanent dose in your body, and each such dose raises the probability of some sort of cancer. A big problem for the locals, but not really a worry for you.

It's hard to get real data about these issues, as there is a ridiculous amount of fearmongering in the media. For example, there are plenty of articles talking about the spread of radiation in the Pacific Ocean from Fukushima to other countries, but a simple dilution argument shows that any claim of danger from that effect is nonsense - the ocean is ridiculously bigger than the quantity of radioactive water released, and even if you can measure the effect, it's going to be negligible in practice.
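To put rough numbers on the dilution argument (both figures are order-of-magnitude assumptions, not measured data):

# Pacific Ocean volume is roughly 7e8 km^3; assume, generously, that a
# million cubic meters of contaminated water were released.
pacific_volume_m3 = 7e17
released_water_m3 = 1e6

print(f"dilution factor: {released_water_m3 / pacific_volume_m3:.0e}")
# ~1e-12: even before decay, full mixing dilutes to parts-per-trillion levels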

Locally produced food is another issue, and yes, the possibility of concerning contamination exists. Supposedly, food is tested in Japan, and the limits are stricter than in the US. Converting that into the probability that you will eat something that exceeds the limit (and by how much) is tricky. If you know of any serious studies attempting to calculate this, please do let me know.

FWIW, I do plan on moving to Japan in the not too distant future.

Comment Re:Some numbers for reference. (Score 3, Interesting) 164

Interesting. I didn't stop at Fukushima station, but I went past it on the Shinkansen with my Onyx in the outer pocket of my backpack (obviously it won't be picking up any alpha radiation there, but it's still useful data). Looking at the logs, it's possible that one spike correlates with roughly the time I'd have been in that area, though I'd have to check the timestamps carefully. The Onyx was set to log every 10 minutes, so it's also possible that it just missed the interesting times. The peak reading was about 0.2 uSv/h, and that wasn't near Fukushima. Tokyo averaged somewhere around 0.11 uSv/h, while Hakodate (where I stayed a couple of days) was around 0.07 uSv/h.

Interestingly, my return flight hit 3.0 uSv/h, higher than the first flight (I just dumped the last chunk of the log, which I hadn't done yet).
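For scale, converting those logged averages into annual doses (the ~2-3 mSv/year figure for typical background exposure is the commonly cited worldwide range):

HOURS_PER_YEAR = 24 * 365

# Logged averages in uSv/h; the flight rate obviously only applies for the
# hours you're actually at altitude.
for place, rate in (("Tokyo", 0.11), ("Hakodate", 0.07), ("return flight", 3.0)):
    print(f"{place}: {rate * HOURS_PER_YEAR / 1000:.2f} mSv/year if sustained")
# Tokyo works out to ~0.96 mSv/year, well within typical background levels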

These readings use the default calibration of the Onyx. I haven't delved into the details yet (the firmware is still a work in progress as far as I can tell), but as I understand it, the units are supposed to come calibrated. So either the default calibration is spot on, or the firmware isn't using the calibration data, or my firmware upgrade wiped the calibration data, or the calibration data was never there. In any case, I assume the default conversion factor is good enough for rough measurements of background radiation.
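For reference, a Geiger counter like the Onyx just counts pulses and divides by a per-tube conversion factor. A minimal sketch (334 CPM per uSv/h is the figure commonly quoted for the LND 7317 pancake tube under Cs-137 calibration; treat it as an assumption, not the Onyx firmware's actual stored calibration):

# Hypothetical count-rate to dose-rate conversion.
CPM_PER_USV_H = 334.0

def dose_rate_usv_h(counts, minutes):
    cpm = counts / minutes
    return cpm / CPM_PER_USV_H

print(dose_rate_usv_h(counts=367, minutes=10))  # ~0.11 uSv/h, like my Tokyo average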
