
## Comment: Re:more simplifications and fewer cats, please (Score 1) 195

by Keybounce (#48642611) Attached to: Quantum Physics Just Got Less Complicated

The question becomes: before you open the box, is the cat alive or dead? Or is it somehow...both?

Your gut instinct is to say, "That's stupid. Of course it's either alive or dead. How the fuck could it be both?"

But the thing is, there are certain non-cat-related experiments that we've done that REQUIRE the answer to be BOTH. Perhaps the simplest (and certainly the one we physicists learn about first) is the double-slit experiment. The basic idea is, you shoot a beam of something (light, gold atoms, DNA, doesn't really matter) at a slit, and it forms a pattern on a wall. It'll form this pattern even if you shoot your particles one at a time. Then, you close that slit and open another one, and fire your beam again. It forms a different pattern.

The problem is, the double-slit experiment doesn't tell you a thing about the cat.

Any single run of the cat experiment will have the cat either alive or dead before you open the box. 50% of the runs will give one outcome, and 50% the other.

Firing electrons at slits -- 1 or 2 -- does not change the fact that the electrons do have a location. We may not be able to measure it -- measuring requires an interaction, and the interaction will change what happens. There's a number that represents that ultimate limit -- Planck's constant.

Toss electrons, one at a time, through a slit, see one pattern. Fine.
Toss electrons, one at a time, through another slit, see a different pattern. Fine.

In each case, you have electrons with a location. Different electrons have different locations. You don't get random spots, you get slit patterns -- but slit patterns made of one spot, then another spot, then another, etc. Each electron hits the screen at one point. Each electron has one location. Each cat is either alive or dead.

Toss electrons through a pair of slits, with detectors measuring the electrons: See a pair of slits (no interference) on the screen.
Remove the detectors: see the interference.
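The detectors-versus-no-detectors difference comes down to one line of arithmetic: with which-path information you add probabilities; without it, you add amplitudes first and square afterward. A toy numerical sketch (arbitrary assumed units and geometry, not a real physical simulation):

```python
import cmath

wavelength = 1.0   # assumed, arbitrary units
d = 5.0            # slit separation, arbitrary units
L = 100.0          # distance from slits to screen
xs = [x / 10 for x in range(-400, 401)]   # screen positions

coherent = []      # no detectors: add amplitudes, THEN square -> fringes
which_path = []    # detectors at the slits: add probabilities -> no fringes

for x in xs:
    # Small-angle path-length difference between the slits is about d*x/L.
    phase = 2 * cmath.pi * (d * x / L) / wavelength
    amp_a = 1.0                      # amplitude through slit A
    amp_b = cmath.exp(1j * phase)    # slit B, shifted by the path difference
    coherent.append(abs(amp_a + amp_b) ** 2)
    which_path.append(abs(amp_a) ** 2 + abs(amp_b) ** 2)

print(min(coherent), max(coherent))      # fringes: swings between ~0 and ~4
print(min(which_path), max(which_path))  # flat: 2 everywhere
```

Same two slits, same electrons; the only change is whether the two contributions are combined before or after squaring.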

But in each case, you see spot, spot, spot. Only the final location differs. The electron's path may not be linear; it may be doing quantum tunneling from point A to B as it "moves". In the process, it may pass near the second slit before coming back. Hence the ability to "detect the non-local slit" and change the path.

But it has a single spot when you measure it.
The cat has a single state.

We may have no tools to describe it other than "We don't know, but there's a 40% chance of here, a 10% chance of there, a 10% chance of over there, etc.". We may have no tools to describe the cat other than "We don't know, but there's a 50% chance of it being alive".

It is not both.
It is "we don't know".
It is "we cannot possibly know -- the universe does not let us know without changing the outcome".

But "Cannot know" is not the same as "Does not exist".

====

What is the "width" of an electron?

Since the location of an electron has uncertainty, there is a concept of "width" -- the area in which an electron might be found if you measured it.

If you have a single slit, then you are filtering out the "wide" electrons, that are too far off.
If you have a double slit?

If the two slits are close enough that the "width" of the electron includes both slits, what does the result look like? If the slits are far enough apart that the width does not include both, what does it look like?

My understanding is that if the two slits are far enough apart, you do NOT get any interference patterns.

The basic idea is, you shoot a beam of something (light, gold atoms, DNA, doesn't really matter) at a slit, and it forms a pattern on a wall.

But tossing bigger things at the slits means that the slits have to be closer to see the interference, and the slits have to be bigger to let the things through. Eventually, the "closeness" of the slits and the "wideness" of the slits means that you have one slit, not two.

Toss an electron at a single slit, it behaves one way.
Toss it at two slits, far enough apart that they are not within its "width", it behaves the same way.
Toss it at two slits within its width, and it "sees" both, and behaves differently.
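One way to make the "sees both slits" idea quantitative is a toy coherence model. This is purely an assumed illustration, not a derivation: let fringe visibility fall off as a Gaussian in the ratio of slit separation to the electron's "width".

```python
import math

def fringe_visibility(separation, coherence_width):
    # Assumed toy model (Gaussian falloff): fringes wash out as the slit
    # separation exceeds the "width" over which the electron is coherent.
    return math.exp(-0.5 * (separation / coherence_width) ** 2)

print(round(fringe_visibility(1.0, 10.0), 3))    # slits well inside the width
print(round(fringe_visibility(50.0, 10.0), 3))   # slits far outside: no fringes
```

Slits well inside the width give near-full-contrast fringes; slits far outside give visibility indistinguishable from zero -- matching the claim above.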

None of this is odd. How that behavior changes when the second slit is there is odd.
None of this has anything to do with the cat.

Each cat is alive or dead.
We can't tell ahead of time, only by opening and measuring.

Each electron goes to a location.
We can't tell ahead of time, only by putting a screen there and measuring.

## Ask Slashdot: What's the best wireless headset for around \$50-\$75?

Keybounce (226364) writes: "What are good wireless headsets for around \$50-\$75?

This may seem strange, but the truth is, trying to find good descriptions of headphones is hard enough, let alone actual product reviews. The one that I found, that looked promising, turned out to only be stereo over a wire, and was mono over bluetooth.

Requirements:
1. Stereo audio playback.
2. Microphone (can be mono)
3. Wireless.

Desired: Sits on top of ear (so I can still hear the world around me)
Slightly negative: Envelops the ear.
Absolutely not: inserts inside ear."

by Keybounce (#48584085) Attached to: The Case For Flipping Your Monitor From Landscape to Portrait

Agreed.

The examples given in the article (yes I read it) show badly designed web sites that assume you have a certain fixed width display.

I use Stylish, so I can override bad CSS on the remote side. Bad, as in "Should be sued if they are a commercial business for violating user accessibility for people with poor eyesight". Bad, as in "If I make the text displayed by my browser big enough to read, then the website breaks and has text on top of text". Bad, as in "Since we can't tell what size width someone has, we assume everyone has 800 / 1150 / NNN pixels and count by pixels since everyone has the exact same eyesight, monitor quality, and everything else that our programmers have".

(I am not a lawyer. But there is a law in the United States about taking reasonable measures to ensure access for handicapped people, and eyesight poor enough to require glasses and/or larger text is recognized as a handicap. Working with a system -- a computer -- that can trivially handle and display larger text *is* taking reasonable measures. Taking that system and forcing it to work properly only with small text should be actionable. Doing this as a business, and then claiming "We have rights, you cannot sue us", is just plain wrong.)

I have gotten used to having to patch bad CSS, and then update/maintain those patches. I have a large collection of forum / message board CSS overrides, and most forum sites for me now are "figure out which of my existing templates this is using, and set that template to include this site".

I have my browser using the full width of its window, whether that window is the full width of my screen or not.

And, resolution is the other issue that this FA gets wrong. I would love -- *LOVE* -- to have a higher resolution monitor give me -- wait for it -- Ta DAH! -- *Better Resolution!*.

Instead, almost universally, "higher monitor resolution" == "smaller dots, and more of them".

Do you have any idea how hard it is to work at 72 points per inch on the screen? There are only two ways that I know of, and both break a lot of software:
1. Set my monitor resolution to 72 DPI. Never mind that my monitor can do 92, or 120, DPI, and that would give me a higher quality display of that 72 points per inch. Oh yeah -- I forgot that some systems (hello, microsoft windows brand graphical operating system) think that 96 DPI on screen is the norm, and actually think that 120 DPI is bigger text.

2. Get a retina display, at 2-to-1 scaling. Most "aware" programs will see 144 DPI, and display something that is the right size; most "un-aware" programs will see 72 DPI and still work, but with transparently sharper text. Screen recorders seem to be the only thing that gets the display wrong.

Now, what about someone that -- surprise, surprise -- needs larger text? I actually want a 25% magnification to read stuff on-screen. I can read print at 12 points (*) just fine -- but that probably has to do with print being around 400-1200 DPI. I can't read 12 point at 72 DPI on-screen well at all.

(*): And, it does not help that "points" is not "points" at low point levels. I can change my font size, and below about 25 point fonts the change is not consistent. In some fonts, 12 and 13 are identical except for being "darker"; in others, it's 13 and 14. The same "point size" is significantly different character size in different fonts. Etc.
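The arithmetic behind the readability complaint: a typographic point is defined as 1/72 inch, so the number of pixels a glyph gets to use scales directly with DPI. A quick sketch:

```python
def point_size_in_pixels(points, dpi):
    # A typographic point is 1/72 inch, so a glyph's pixel budget is
    # points * dpi / 72. At 72 DPI a 12pt glyph gets only 12 pixels of
    # height; at print resolutions it gets a hundred or more.
    return points * dpi / 72

for dpi in (72, 96, 144, 600):
    print(f"12pt at {dpi} DPI -> {point_size_in_pixels(12, dpi):.0f} px")
```

That is why the same 12-point text that is marginal at 72 DPI on screen is effortless on a several-hundred-DPI printed page: the glyph is the same physical size, but it is drawn with many more dots.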

Even the question of "How do you handle better resolution" for text isn't easy. If I am displaying a 12 point font at 92 DPI screen resolution, do I take the outline produced by 12 points, and scale it to fit the higher resolution display, or do I take the outline from the higher point-sized characters, and display that unaltered? Since I get different glyphs (patterns of dots) in both cases, what's the best way?

====

TL;DR:
1. TFA basically says "Badly designed websites cannot use wide monitors, so don't fix the websites, wreck your display setup."
2. TFA basically says "Monitors only come in low resolution now". Fix operating systems so that better resolutions give *better resolutions*.
3. Require OS's to distinguish between resolution and display size / scaling.

4. Well-designed websites will work with user-specified font sizes and browser widths. Laws of some countries actually require this. None (as far as I know) enforce it.
5. Well-designed programs will have "display scale" or "zoom" features to make things look bigger on the inside of the window. Really well-designed programs can set this on a per-window basis.

## Comment: Re:First taste of Mac OS X (Score 1) 305

by Keybounce (#48183303) Attached to: OS X 10.10 Yosemite Review

Full-screen was just implemented badly in OS X, to the point that I much prefer "maximize" to full-screen. In fact, I hate full-screen.

1. I don't do single-tasking on my computer. Even if I want to have the full screen for a document I'm working on, I am using other apps, or other documents, or other terminal windows, etc., at the same time.

I can switch between maximized windows easily enough. Even if they are in different apps.
I cannot switch between two different full-screen windows easily, whether they are in the same app or not.

2. I use two monitors. Full-screen should mean "This window is the full size of this monitor", or "This window is the full size of my display, both monitors". There are times that I want A, and times that I want B. Let me select it.

*NOT*: "This window is the full size of this monitor, and the second monitor is unusable."

Now, I know what you are going to say: Starting in 10.9, it's possible to have the two monitors separate, giving me two different full-screen windows at the same time.

But that's no good either. The point of two monitors is to show big things in two places. I can work at 72 DPI (sorry, these are *NOT* 25-year-old eyes, they are 50-year-old eyes) and still have enough screen space for a window. My main window is 1024x640, and I alternate the second monitor between 960x640 and 1024x600, depending on whether I need to extend down or to the right (or, in the case of iMovie, to the left -- keeping the movie on the main monitor at normal size, something that is not possible with iMovie without this "split over monitors" behavior).

3. A full-screen window is not a desktop. It's a window on a desktop, it's just the size of the desktop.

Apple does not understand the concept of "This desktop is for project X".
Apple wants to say "This desktop is for application Y".

Even if a window is full-screen in size, it's still part of project X, and that project involves other apps.

This can be made to work well enough with maximized windows -- a hidden dock plus a maximized window gives almost as much screen space as full-screen, and is much easier to work with.

## Comment: What happens to the bitcoins? (Score 1) 40

by Keybounce (#47800855) Attached to: Hal Finney, PGP and Bitcoin Pioneer, Dies At 58

We are now seeing the start of the death of Bitcoin.

As people die, their coins -- protected by passwords not available to anyone else -- will be taken out of circulation.

So what happens to the bitcoins of the dead? What is the future of a currency that has to suffer a hard decline in total units as generations go by?

What is the future of a currency where only corporations can live long enough to use it -- and they cannot prevent theft? (If the corporation has a way to spend it, then at least one person must have the same way to spend it.)
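The decline compounds. Under the entirely assumed model that some fixed fraction of coins is irrecoverably lost with its holders each year, the spendable supply decays geometrically:

```python
supply = 21_000_000   # Bitcoin's eventual hard cap on total coins
loss_rate = 0.01      # assumed: 1% of coins lost with their holders per year

for years in (10, 50, 100):
    remaining = supply * (1 - loss_rate) ** years
    print(f"after {years} years: ~{remaining:,.0f} coins still spendable")
```

Even a small annual loss rate removes most of the supply within a few generations; the 1% figure is a placeholder, but the shape of the curve is the point.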

## Comment: DOS: XCom. Amiga: Titans of Steel, Killing Game Show (Score 1) 382

by Keybounce (#47796677) Attached to: Ask Slashdot: What Are the Best Games To Have In Your Collection?

For any platform?

XCom, the original. Can be made better with XComUtil (a tool to work around some of the shortcomings of the game's UI). Yes, this is an old-school, low-res DOS game.

If you like Magic: The Gathering, then the original M:tG game (based on 5th edition cards) is great on old, slower computers, but runs too fast to be playable on modern ones. (Microsoft Windows; it worked on 3.1 and 95, if I recall.)

On the Amiga: Titans of Steel was a Mech Warrior game with actual time-based physics and time-based heat (instead of turn-based heat). No more "I have generated 15 units of heat, and sink 10, so I'm only heat 5, and take no penalty". Now it takes time to sink that heat -- and all the old FASA mech designs turn out to be horrible when you have real heat issues to worry about.

Finally, Killing Game Show (Amiga) -- you need the original disk, the crack did not work properly -- is a platformer with a "timer/clock" to keep you going. But it has one wonderful feature -- when you die, it replays the level you died on. So you can see where you made your mistake. And, *** You can take over during the replay***. So you can take over *before* you make your mistake and avoid having to do all the drudge work over and over again.

## Comment: Third party pass through (Score 1) 91

by Keybounce (#47716575) Attached to: Research Unveils Improved Method To Let Computers Know You Are Human

And how will even the best, most foolproof CAPTCHA protect you from a spam-bot system that passes that game, or any other CAPTCHA, to some people farm in a foreign country? Or just to visitors of some other website that gets enough traffic for the spammers to post a sufficient volume of spam?

This, by itself, cannot solve the issue.

The issue is not "Prove that there is a human there".

The issue is "Prove that you, right there, right now, are a human, and not being passed to someone else, elsewhere".

## Comment: Re:Microsoft (Score 1) 267

by Keybounce (#47627655) Attached to: Skype Blocks Customers Using OS-X 10.5.x and Earlier

Seriously?

The old code works. My PPC machine can actually still log into Skype -- apparently they only "retired" the Intel programs. There is no reason to force people to upgrade, except perhaps to steal more information, spy on more things, and serve you ads where the older versions did not.

Apple, good or bad, right or wrong (I call wrong), has chosen to dismantle features of the OS with approximately every version.

10.5 could be made to run EOF and WebObjects, but they were really only happy on 10.4. 10.6, as I understand it, cannot run them at all.

10.6 has support for PPC code, but sadly, does not include running your old version in a VM -- heck, you are not even legally allowed to load 10.6 into a VM unless you have the server version of it. 10.7 can be put into a VM, fine, but that doesn't help anyone with legacy apps.

Apple's whole "We have the technology for your business" has turned out to be three busts now, as I recall: EOF is dead; WebObjects became EOF's master, took it over, moved it to the Java side, and then was abandoned completely. The toll-free bridging has gone poof. Java as a first-class language has gone poof. Etc.

That's a lot more than 3, actually.

The point is: it is very easy to be in need of a specific older version of the OS, because none of the newer ones support what you want/need to do.

And: "The new version is free"?

** THE NEW OS'S FROM APPLE STINK. **

Let's dumb it down. Everyone knows that a computer only works with one window at a time, so your Finder is now single-window (10.9). Everyone knows that grey on white is much easier to read than black on white, so let's make the Finder displays of your files grey (10.9). Heck, 10.10 has this wonderful new feature: if you want to maximize a window, you must mean "full screen" -- after all, you never actually want to multitask between full-screen windows and other things, right?

The full-screen app behavior is broken unless you are looking at a single-window app, like iMovie. And the new, free, upgrade to this new version of iMovie has lost features and become little more than a clip assembler with no smarts or power.

I'm on 10.7, only because I can't upgrade to 10.8 -- I cannot buy it. The horrors of "iOS is taking over the desktop", while at least partially true in 10.8, have workarounds; 10.9 and 10.10 are disasters.

## Comment: Re:Where is IPv*8*? (Score 1) 250

I don't know about V7. But at the same time that V6 work was started, there was also work on a V8 system.

V8 was based on master regional gateways, if I remember correctly, called stargates. The central assumption in V8: Throw away the assumption that a single IP address always refers to the same machine everywhere. Within a single region, there is a mapping from name to IP to machine. But that mapping does not hold across stargates.

If I want to talk to a machine in my region, then getHostByName() returns an IP address that maps and routes just as normal.

If I want to talk to a machine in a different region, then getHostByName() returns a special 4-byte magic token that routes to the stargate that sits between me and whatever it takes to get to the destination.

It is another level of routers. Just as now, I can work within my regional area network -- perhaps I'm a Comcast customer talking to another Comcast customer -- or I can go out over the "backbone" of routers that talk to routers with the Border Gateway Protocol (BGP) to reach the final region for "last mile" delivery.

This extends it another layer. But at the same time, each region now has an independent IPv4 space.

Want to really enforce a "firewall of china"? V8 would actually permit it. If something like this was in place, then any attempt to talk to someone outside of china would have to send that hostname to a central authority router, which could then return either an accepted "cookie" (looks like an IP address, but treated special by the routers), or "no such host", or "here's the government re-education website".

What is the major compatibility problem? It is suddenly impossible to cache the outcome of "getHostByName()" across runs -- the cookie returned only has a lifespan as determined by the gateways.
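The scheme, as remembered above, could be sketched like this. Everything here is hypothetical -- the names, the magic-token range, and the five-minute lifespan are all made up for illustration:

```python
import time

LOCAL_HOSTS = {"printer.example": "10.0.0.7"}   # hypothetical region-local table

def get_host_by_name(name):
    # Region-local names resolve to plain addresses that map and route as
    # normal, and may be cached freely (expiry of None).
    if name in LOCAL_HOSTS:
        return LOCAL_HOSTS[name], None
    # Cross-region names come back as a stargate cookie: it LOOKS like an
    # IPv4 address, but is only valid until its expiry, so it must never
    # be cached across runs.
    cookie = "240.0.0.1"                   # hypothetical magic-token range
    return cookie, time.time() + 300       # hypothetical 5-minute lifespan

addr, expires = get_host_by_name("printer.example")
print(addr, expires)   # a plain local address, with no expiry
```

The compatibility break is visible right in the return type: half the results are cacheable addresses, half are short-lived cookies, and a program written for classic getHostByName() cannot tell them apart.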

===

There is a much better, much deeper question to ask.

** WHY THE BLEEP ** are we still using things like getHostByName()?

Why the bleep do we still expose struct sockaddr to programs? It's an OS internal.

Way, way, way back, before there was an Internet, when ARPANET was just one of many networking protocols in use, networking changed far faster than BSD releases could come out. So a bunch of stuff that should have been OS internals was exposed, so that network drivers could talk to application programs without an out-of-date kernel in the way.

Today, that should be gone.
Today, there should be a simple open() call that returns a network connection -- and for simple TCP streams, that's all you would need. Message-based protocols (UDP, etc.) would probably take a flag on open; just as we have "read only" and "create if not found", there might be "best effort only". Heck, imagine a file system that could recover from "out of disk space" by automatically eliminating old "best effort" files. Sure, put up a warning on the console -- but programs can keep running.
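For what it's worth, the sockets API already half-supports this: getaddrinfo() lets a program ask for "a stream to this host and port" and take whatever address family comes back, never building a struct sockaddr by hand. A sketch of that open()-like call:

```python
import socket

def open_stream(host, port):
    # The caller names a service; the resolver picks the address family
    # (IPv4 today, IPv6 tomorrow, whatever comes after that). The program
    # never inspects the address itself -- it just gets a connected stream.
    last_error = None
    for family, socktype, proto, _canon, addr in socket.getaddrinfo(
            host, port, type=socket.SOCK_STREAM):
        try:
            sock = socket.socket(family, socktype, proto)
            sock.connect(addr)
            return sock
        except OSError as exc:
            last_error = exc
    raise last_error or OSError(f"cannot reach {host}:{port}")
```

(Python even ships this pattern as socket.create_connection(). The point stands: nothing v4- or v6-specific leaks into the program.)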

All the issues we see from "How do we re-write all those programs from v4 to v6", all those "how do we migrate X from 4 to 6", etc. -- all come down to "Why do we even care?"

This is serious.

Why worry about the program ever knowing what address to talk to?
Why worry about the program ever knowing that port X is the destination?

Why would you ever want to say "This program/server can only run once on this machine because the port number must be reserved and known ahead of time to the users"?

Why not just say "Give me a channel to service X running on machine Y", and not worry about "I want a TCP channel over 4 bytes of address"?

Why worry about "hey, TCP fails spectacularly over networks with a very high bandwidth-delay product", and "well, we'll fake TCP with a new protocol that looks sort of like TCP and can handle those links, but can be reset by a random packet from an attacker with a 1-in-4 chance of success"? (High bandwidth-delay product == satellite links. TCP has a packet size and a window size; put them together, and a bare TCP connection over satellite has to sit and wait for ACKs a large fraction of the time.)
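The satellite complaint is just window size divided by round-trip time. With the classic pre-window-scaling 64 KiB maximum window and a geostationary round trip, the ceiling is under a megabit no matter how fast the link is (numbers assumed for illustration):

```python
def max_tcp_throughput_bps(window_bytes, rtt_seconds):
    # Without window scaling, a TCP sender can have at most one full
    # window in flight per round trip, so throughput <= window / RTT.
    return window_bytes * 8 / rtt_seconds

# Assumed: 65535-byte maximum window, ~600 ms geostationary round trip.
print(max_tcp_throughput_bps(65535, 0.6) / 1000)   # roughly 874 kbit/s
```

Cut the round trip by a factor of ten and the ceiling rises by the same factor -- which is exactly why the protocol, not the wire, is the bottleneck on those links.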

Get rid of a program's need to even care what the underlying network protocol / address family is.

Then see if there is any need to worry.

===

Last I checked, there was no way for a V4 only machine to talk to a V6 only machine -- while the original V6 address space map included the V4 address space as a special segment, that was removed in a later revision. If this is true (I don't know, it's been years since I've checked), then since there are lots of v4-only hardware devices out there, V6-only systems are DOA -- and we'll never be rid of the V4 legacy.

Is that still the case? Has that changed?

## Comment: New test: The spam bot (Score 1) 432

by Keybounce (#47198225) Attached to: Turing Test Passed

The real questions are:
1. Can it convince 33% of the readers that it really is a rich Norwegian businessman wanting to send you money?
2. Can it solve CAPTCHAs 33% of the time on the first attempt?
3. Can it co wreck spelling properly at least 33% of the time?

## Comment: Helium 3 (Score 1) 206

Someone several posts back mentioned that even getting gold from space was too expensive to be worth it. Well, there's one very good resource out there, in space, worth getting: Helium 3. Produced by the sun. Found in tiny quantities on the Earth. Found in large quantities on the moon, but the cost of shipping it back from the moon -- all return fuel must be carried to the moon -- makes it unprofitable.

But on Mars? The return fuel can be harvested on Mars itself. So you don't need the fuel to ship fuel. That is the key difference that makes Mars worth using as a base.

Mars has Helium 3. How much does it have? I don't know.

What's involved in sending people to Mars? Well, you need a habitat for them to live in, and you need return vehicles.

We've got plans for that already. Unmanned modules sent out to Mars that can set up mining and fuel production, a return vehicle, and a habitat. Send them off to Mars; check out a location. Every two years, send something off to a different place on Mars.

What happens eventually? You find a place with resources worth sending people to. And you've got a fueled return ship. What? Something went wrong? OK, send another set of survival/return resources to that same place.

Eventually, you have living space, and a return trip, and fuel production, all ready to go. You can now send people and another return ship, just in case. And some rovers -- you've got resupply points on Mars, and now you can have people do their own driving.

This is how you get people to Mars -- every two years, a care package, until you've got something sufficient.

The why of Mars? Two good reasons:

1. Helium 3.
2. You cannot mine an asteroid in the asteroid belt profitably. You have to move an asteroid someplace where you can mine it. There are four choices:
A: Earth orbit
B: Moon orbit
C: Mars orbit
D: Lagrange point.

We don't have the technology for D yet.
Attempting to move an asteroid into Earth orbit ... let's just say that would be a political nightmare bigger than any technical challenge.

That leaves moon orbit -- with all the fueling problems -- or Mars orbit, with much easier fueling and working conditions.

So the bottom line: sending people to Mars is not beyond our technology. There are reasons to do so. It is the only currently known stepping stone to the next stage, and the first way we can get off this rock and eliminate the single point of failure that could wipe out humanity.

## Comment: Lotus Improv Models vs Spreadsheets (Score 1) 422

by Keybounce (#47131211) Attached to: Why You Shouldn't Use Spreadsheets For Important Work

Many years ago, there was a program on the Pizza Box (aka the NeXT machine) called Lotus Improv. It even came out on early Windows systems.

It did something wonderful to spreadsheets.
It moved the formulas out of the cells, into a formula plane.
It got rid of the 2-D system -- where you put different sections of data on different parts of the plane and try to keep things straight as your database grows -- and instead used a collection of n-dimensional spaces. Each one of the collections dealt with one subject; each could track as many as 8 dimensions, and tracking 3-5 was typical.

It worked. It worked well.

Code was readable -- nothing was duplicated in every cell. Or rather, nothing was almost-duplicated with slight variations in each cell, where you had to hope and pray that each copy got the same, correct slight alteration (and god forbid you needed to change the template spread across everything). Instead, you defined clear statements once, and they automatically adjusted for each cell.
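A rough sketch of the difference in plain Python (hypothetical data): the formula lives once, against named dimensions, instead of being pasted with per-cell tweaks into every cell.

```python
# Hypothetical data over two named dimensions: region and month.
units = {("East", "Jan"): 10, ("East", "Feb"): 12,
         ("West", "Jan"): 7,  ("West", "Feb"): 9}
price = {"East": 4.0, "West": 5.0}

# One rule -- "revenue = units * price" -- stated once and applied over the
# whole space, instead of a slightly-different formula in every cell.
revenue = {(region, month): n * price[region]
           for (region, month), n in units.items()}

print(revenue[("West", "Feb")])
```

Add a month or a region to the data and the rule covers it automatically; there is no cell to forget to update.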

Improv was wonderful.

Why did it die?

## Comment: Re:Use firefox ESR (Score 1) 688

by Keybounce (#46882273) Attached to: Firefox 29: Redesign

** Mod Parent Up **

I was just about to post this same thing. I have been using 24 ESR (and 17 ESR, for PPC compatibility) for a while now. I was fortunate enough to NOT get this bleep today.

My mother, on her Microsoft Windows-based system, was wondering, "What has happened? Where has my Gmail gone?"

Massive change, just for the sake of change, with no warning, with no user awareness, with no customization? I used to think that only Microsoft could pull such bleep on us.
