
Comment: On snobbery and serving temperature: (Score 1) 840

by Cordath (#36608840) Attached to: With regards to beer, I prefer it to be:

Snobs *exclude* rather than include (Thanks Ebert!). By refusing to drink anything other than domestics you have proven yourself as much of a snob as the parent poster.

Solution: Get together and swap beer!

But seriously, get out there and try some different brews. You'd be amazed at what's available these days. Beer is undergoing a veritable renaissance right now. Take advantage!

P.S. Serving beer ice cold hides a lot of its taste. For some beers this is a good thing and entirely recommended. There are plenty of beers out there that benefit from being served a little warmer. Not necessarily room temperature, but somewhat above freezing.

Comment: Investing in the Future won't get you votes today! (Score 5, Insightful) 760

by Cordath (#34596696) Attached to: 'YouCut' Targets National Science Foundation Budget

Private companies typically do not engage in long-term research that isn't likely to lead to directly commercializable results. I know this flies in the face of red-blooded 'merican "all socialism is evil" doctrine, but public sector research, funded by tax-payer money, is needed to build the foundations for tomorrow's industries. Quantum computing, like many other bleeding-edge fields, is too immature, too high-risk, and too distant in its pay-offs for the private sector.

Research and education are both investments that can yield fantastic returns, but they are long-term investments that require steady commitment rather than periodic outbursts of zeal punctuating long periods of apathy. A minor cut now might help balance the books today, but the lost opportunities down the road will more than negate that. Top researchers don't hang around after you cut the funding they run their labs and pay their students and post-docs with. They won't wait a few years until times are good again. What they will do is go where the money they need to work is, and if they can't find that in the U.S., they'll likely find it in Canada, China, Australia, etc.. The U.S. is far from the only country doing quality research in QC these days.

Unfortunately, some U.S. politicians are of the opinion that they can make political hay by screwing over those "pinko" scientists. They're smart enough to know what they're sacrificing, but votes for them are a worthier cause! The only way to fight this kind of thinking is to call up your local representative/senator/etc. and let them know you're not buying it. The only way to make them stop this kind of thing is to make them think they'll lose votes today, because that's all they care about.

Comment: Got relays, beyatch? (Score 5, Informative) 520

by Cordath (#34310882) Attached to: Do You Really Need a Discrete Sound Card?

Sound quality matters, but sometimes the small features that one might usually overlook matter even more.

For example, say that you have a nice speaker setup and a good amp, but an aging pre-amp that can no longer decode the latest audio formats. If you run things with a PC, the pre-amp is basically a very expensive DAC. If you can find a sound card with good DACs on it you can, in theory, just toss the old pre-amp and connect your computer directly to your amp.

Problem! When a computer boots up, a large voltage spike goes through its various components including the audio card. With many audio cards or audio chipsets this spike goes right out the line to your amp, which dutifully amplifies it into a very large CRAWHOOMP!!! Besides causing your cat or dog to projectile defecate on whatever it happens to be near at the time, this can also damage your speakers and/or amp!

How do other components like pre-amps get around this problem? Good audio components all have some way of electrically isolating their outputs from the rest of the device so that these power-up CRAWHOOMPs don't happen. This usually means electromechanical relays. This is why your expensive amp or receiver usually makes some clicking noises moments after being powered up. That's the relays clicking into place once voltage levels have normalised.

Good audio cards, like the Asus Xonar series, also have these now. On-board chipsets usually do not, since it would add a few dollars to the price of the board and most people don't plug their computer's output directly into an expensive amp and speakers.

Long story short, what audio components you hook up to your computer and how you hook them up both have a large impact on the features you need in your computer's audio card. For a long time, computers had zero chance of replacing pre-amps because almost all audio cards lacked the small features that good audio gear almost universally possesses. That's changing, and about time too!

Comment: The markets need to be forcibly civilized. (Score 5, Interesting) 525

by Cordath (#32358840) Attached to: Sudden Demand For Logicians On Wall Street

Margaret Atwood once described civilization as the judicious trading of "freedoms to" for "freedom from". e.g. You trade the freedom to murder anyone you like for freedom from being murdered yourself. While a rather distressingly large percentage of Americans would scream "COMMIE PINKO!!!" at me for daring to suggest this, I feel that the stock markets could stand to be civilized a tad.

What is the purpose of the stock markets? Are they meant to be a video game played by A.I.'s for big cash prizes, or a way of facilitating investment and trade? It's time to find ways of restricting high frequency traders. While cumbersome regulations are one option, perhaps a per-trade tax or user fee would be better. A tiny one, percentage-wise, that will only have a significant impact on high frequency traders. Cuts to other taxes could be made to offset them for average frequency traders and perhaps even benefit low frequency traders.
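Toy arithmetic (every number below is a made-up assumption, not real market data) for why a flat per-trade fee stings high frequency traders and nobody else:

```python
# A flat 0.01% fee on each trade's value, applied to traders of very
# different frequencies. All figures are illustrative assumptions.
fee_rate = 0.0001          # 0.01% of trade value (assumed)
trade_value = 10_000       # dollars per trade (assumed)

for trades_per_year in (10, 1_000, 10_000_000):
    annual_fee = trades_per_year * trade_value * fee_rate
    print(f"{trades_per_year:>10,} trades/yr -> ${annual_fee:,.0f}/yr in fees")
```

A buy-and-hold investor pays about ten bucks a year; a shop making ten million trades pays ten million dollars. Same rate, wildly different bite.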

There are, naturally, many other ways to approach this. All it takes is resolve and, in the U.S. at least, thick skin.

Comment: Info can't propagate faster than speed of light. (Score 4, Informative) 389

by Cordath (#32309062) Attached to: Quantum Teleportation Achieved Over 16 km In China

Don't feel bad, this is a pretty common mistake. People read about non-locality and how what happens to one half of an entangled pair affects the other half instantly no matter how far away it is. There does remain some philosophical debate over what entanglement and non-locality really are, but one thing has been supported very well by both theory and experiment: You can't transmit information or power faster than c. In the case of entangled pairs, actions on one half can have a non-local effect that propagates faster than c, but it's not possible to transmit information or power using that effect. In order to make sense of the results and actually observe the effects of non-locality, you typically need to send additional information classically.
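To make the no-signalling point concrete, here's a tiny numpy sketch (my own toy illustration, not from the article): whatever basis Alice measures her half of a Bell pair in, Bob's unconditional state is the same maximally mixed state, so her basis choice alone carries zero information to him.

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>)/sqrt(2); qubit A is Alice's, B is Bob's.
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi.conj())              # 4x4 density matrix, order |A B>

def bob_state_after_alice_measures(theta):
    """Bob's average (unconditional) state after Alice measures in the basis
    {cos(t)|0>+sin(t)|1>, -sin(t)|0>+cos(t)|1>} and does NOT send the result."""
    up = np.array([np.cos(theta), np.sin(theta)])
    dn = np.array([-np.sin(theta), np.cos(theta)])
    rho_b = np.zeros((2, 2), dtype=complex)
    for v in (up, dn):
        P = np.kron(np.outer(v, v), np.eye(2))       # project Alice's qubit only
        post = P @ rho @ P                           # unnormalised conditional state
        # partial trace over Alice's qubit leaves Bob's 2x2 state
        rho_b += post.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)
    return rho_b

# Whatever basis Alice picks, Bob sees the maximally mixed state I/2.
for theta in (0.0, np.pi / 4, 1.234):
    assert np.allclose(bob_state_after_alice_measures(theta), np.eye(2) / 2)
```

Only once Alice classically tells Bob her basis and outcomes do the correlations become visible, which is exactly the "additional information sent classically" mentioned above.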

So, this will not lead to lag-less communication over vast distances. What it will lead to is quantum crypto networks. Long-distance entanglement swapping and quantum teleportation are key ingredients for building a scalable network.

Comment: Don't standardize on one OS, or lock it down. (Score 1) 434

by Cordath (#32303152) Attached to: Most Useful OS For High-School Science Education?

It's not OS's your students need experience with. It's software and programming languages they need experience with. If they're going to go into experimental science, they would also benefit from building hardware and interfacing it with their computers. (i.e. Some basic electronics)

I do experimental quantum physics work in a university. We use everything. OSX, XP, Windows Server (no Windows 7 or Vista installs surprisingly!), and a few distros of Linux. Sometimes we are forced to use a specific OS (usually Windows) because some piece of equipment we're interfacing with only has drivers for one OS. When that isn't the case, it's personal preference. (I gravitate towards Linux distros with decent KDE environments.) Really, you shouldn't worry about what OS your students use. Ideally, give them a chance to try out a variety of OS's.

The applications are what's really important and the big ones tend to be mostly the same across platforms. If you're doing basic (or not-so-basic) simulations or analysis, you're probably going to use Matlab or Mathematica. Something requiring higher performance will probably be written in a low level language like Fortran. (Yes, Fortran. It's surprisingly good for Physics work. Try doing linear algebra in Java or C and you'll just waste a lot of time writing tools.) If you're running an experiment you might do a little driver work in C or C++, but odds are you'll tie things together with something like Labview. Origin also gets used a fair bit for plotting and curve fitting even though it has a pretty horrible interface. Excel, gnumeric, etc. just aren't as good at fitting. For writing papers it's LaTeX and nothing else. Many people use LaTeX add-ons like Beamer to make presentations as well instead of PowerPoint. I'm sure other people can suggest software to get, but it's going to get expensive fast unless you can get some free educational samples, which you should probably try asking for. You might be surprised by what you get!
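To illustrate the "waste a lot of time writing tools" point: here's the kind of linear algebra that's one line in Matlab or Mathematica (or numpy, a free option in the same spirit) versus an afternoon of tool-writing in plain C or Java. A quick sketch with a toy 2x2 "Hamiltonian":

```python
import numpy as np

H = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # a toy symmetric 2x2 matrix

energies, states = np.linalg.eigh(H)  # eigenvalues/eigenvectors, one line
assert np.allclose(energies, [1.0, 3.0])

b = np.array([1.0, 0.0])
x = np.linalg.solve(H, b)             # solve H x = b, also one line
assert np.allclose(H @ x, b)
```

The same operations in a low level language mean hand-rolling (or binding to) LAPACK before you can even start on the physics.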

Here's my ideal environment for your students:

-Not one OS, but many. They should be exposed to something they don't use at home. This will help them become adaptable.
-These OS's should not be locked down. Locking them down will stifle your students' ability to learn. Heck, encourage them to try breaking and fixing things. You should probably, however, create disk images so you can easily restore the machines to a usable state if they are wrecked. Your sysadmin will hate this idea and would probably prefer to lock things down tightly. Just remember that if sysadmins had their way nobody would ever use their systems.
-Get as much far-out scientific software as you can. Let your students play with it. Encourage them to try checking their calculus assignments with Mathematica or Matlab, or perhaps write them up in LaTeX.
-Get some hardware to hook up to the computers. Find basic sensors like thermocouples or photo-diodes. Get some USB-interface chips, prototyping breadboards, and misc components and put your students to work interfacing those sensors with a computer. They might find it impossible, or they might surprise you. Being able to tackle tasks they're not prepared for with minimal guidance is one of the most useful skills you can teach them.
-Don't make boring lessons like "Today we're going to learn how to print 'Hello World' in Java!" Give your students projects. Ambitious projects. The sort you don't know how to do. Give them lots of class time to work on it. Even if they're doing stuff you don't know anything about, talk to them about it. Ask them what they've done, what their current problems are, and what they plan to do. It's their problem to solve, but you're the coach who helps keep them on track.

If you do even a fraction of the above, your students will be well ahead of 99% of the students coming into University.

Space

Big Dipper "Star" Actually a Sextuplet System 88

Posted by kdawson
from the toil-and-trouble dept.
Theosis sends word that an astronomer at the University of Rochester and his colleagues have made the surprise discovery that Alcor, one of the brightest stars in the Big Dipper, is actually two stars; and it is apparently gravitationally bound to the four-star Mizar system, making the whole group a sextuplet. This would make the Mizar-Alcor sextuplet the second-nearest such system known. The discovery is especially surprising because Alcor is one of the most studied stars in the sky. The Mizar-Alcor system has been involved in many "firsts" in the history of astronomy: "Benedetto Castelli, Galileo's protege and collaborator, first observed with a telescope that Mizar was not a single star in 1617, and Galileo observed it a week after hearing about this from Castelli, and noted it in his notebooks... Those two stars, called Mizar A and Mizar B, together with Alcor, in 1857 became the first binary stars ever photographed through a telescope. In 1890, Mizar A was discovered to itself be a binary, being the first binary to be discovered using spectroscopy. In 1908, spectroscopy revealed that Mizar B was also a pair of stars, making the group the first-known quintuple star system."

Comment: Just try and take my Espresso Stout away!!! (Score 4, Insightful) 398

by Cordath (#30103526) Attached to: Caffeinated Alcoholic Drinks May Be Illegal
There's a pretty huge problem with banning alcoholic beverages containing caffeine. The worst offenders are not drinks that come in a can from Coors, but mixed drinks, like vodka Red Bulls. You can make laws telling people not to mix their vodka and Red Bull together, but good luck enforcing them! (Honestly, you'd think common sense and a sense of taste would be enough...)

The truly awful thing is that, if this kind of law was enacted, the drinks it would actually kill would be wonderful, rich microbrew espresso stouts and imperial coffee stouts. Outlaw Coors Light if you must, but DO NOT FUCK WITH GOOD BEER.

Finally, the most damning argument of all against this sort of law is that stupid frat boys and girls will still wind up doing stupid things no matter what they're drinking. So what's the point, eh?

Comment: Re:Why this could be useful: (Score 4, Informative) 74

by Cordath (#29386621) Attached to: Australian Researchers Demo Random Access Quantum Optical Memory
Depends on the type of network. For plain ol' BB84 systems relying on sending single qubit states, absolutely. You wouldn't use that over a quantum repeater network though. You'd likely use one of several quantum key distribution schemes relying on shared entanglement. (e.g. Ekert 91)

Here's the principle on which quantum repeater networks will operate:

Alice (----- Entangled Photon Pair Source -----) Bell State Measurement (------ Entangled Photon Pair Source -----) Bob

What we want is for Alice and Bob to each wind up holding half of an entangled pair of photons. The two sources create two pairs of entangled photons and send the halves in opposite directions. Alice and Bob initially receive photons that have nothing to do with each other. However, when the other halves of Alice and Bob's pairs are annihilated together in the Bell State Measurement in the middle, the entanglement of the annihilated photons is swapped to Alice and Bob's photons such that they wind up being entangled together. The nice thing about this is that Alice and Bob can verify that they share entangled pairs and there's no way for anyone in the middle to fool them, provided Alice and Bob authenticate each other and there are no real-world deficiencies in their apparatus. In essence, Alice and Bob don't have to trust the man in the middle even though he's handling their photons.

To build a quantum repeater network, you just expand this out in a giant daisy chain with many, many steps. Quantum memory is necessary for caching photons at each node in the chain so that you can wait for all nodes to be ready before proceeding with the Bell state measurements. Caching is necessary because the probability of photons reaching each of the stations in the network simultaneously is no better than the probability of one photon going from end-to-end. i.e. Not bloody likely over long distances.
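If you want to see one link of the swap happen on paper, here's a toy numpy sketch (my own illustration, not from the paper; a real Bell state measurement distinguishes four outcomes, and I'm only post-selecting one of them):

```python
import numpy as np

# Photon order: |A, 1, 2, B>. Source 1 entangles A with 1, source 2
# entangles 2 with B; both emit |Phi+> = (|00> + |11>)/sqrt(2).
phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
state = np.kron(phi_plus, phi_plus).reshape(2, 2, 2, 2)  # indices a, 1, 2, b

# Bell state measurement on the two middle photons: project them onto
# |Phi+>_{12} and keep only photons A and B.
bsm = phi_plus.reshape(2, 2)
ab = np.einsum('aijb,ij->ab', state, bsm.conj())         # unnormalised A-B state

prob = np.vdot(ab, ab).real     # this particular outcome occurs 1/4 of the time
ab = ab / np.sqrt(prob)

# A and B never interacted, yet they now share |Phi+> themselves.
assert np.isclose(prob, 0.25)
assert np.allclose(ab.ravel(), phi_plus)
```

The caching point is just probability: if each link delivers its photons with probability p, then n links all succeeding simultaneously happens with probability p**n, which is exactly the end-to-end loss you were trying to beat. Memory at each node breaks that exponential.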


P.S. Funny aside: The first BB84 system built by Bennett and Brassard (the first quantum crypto system ever built) had some rather noisy Pockels cells controlling measurement bases, such that you could tell what basis Alice was measuring in from the sound of the cell. Additionally, Alice and Bob were on the same lab bench, so an eavesdropper in between them would necessarily be inside the room. It was therefore famously joked that the first quantum crypto system was only secure if any potential eavesdropper was stone deaf! This is an example of a side-channel attack that can occur when reality doesn't quite live up to theory, and is the sort of thing people building any kind of crypto system, quantum or otherwise, have to worry about.

Comment: Why this could be useful: (Score 5, Interesting) 74

by Cordath (#29386163) Attached to: Australian Researchers Demo Random Access Quantum Optical Memory
While light can be bounced around, absorbed and re-emitted fairly well in a classical sense, it gets tricky when you start trying to store single photons that have been intentionally "dicked with" to encode quantum information. (i.e. Quantum bits, or qubits) What this paper is talking about is one way of implementing quantum memory for successfully storing and recalling photonic qubits. (i.e. light)

Now, the computer geeks out there probably heard "qubits" and immediately thought "OooOOOooo... Quantum Computers!". Not so fast. Photonic qubits are generally too quick to decohere (even when stored in memory such as this) and too difficult to interact with to be good candidates for quantum computing. It's certainly not impossible, and perhaps even probable in the long-run, but atomic qubits are currently more promising and more widely pursued for quantum computing. What a photonic quantum memory is immediately useful for is communications. i.e. Quantum cryptography. More specifically, building quantum repeater networks.

If you know a little about computer networks, you know that signals traveling over long distances have to be boosted by repeaters every so often or loss humps your data. Optical networks are exactly the same. After a few hundred kilometers of fiber you have a lot of loss. Unfortunately, unlike classical bits, which can simply be copied, qubits cannot be reliably copied. (Google the "no cloning" theorem if you care.) The work around is a little complex to explain (it's essentially a daisy chain of entanglement swapping), but requires quantum memory to work.
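For the curious, the no-cloning proof really does fit on the back of a napkin. Suppose some unitary U could copy arbitrary unknown states; then for any two states |psi> and |phi>:

```latex
U\,|\psi\rangle|0\rangle = |\psi\rangle|\psi\rangle ,\qquad
U\,|\phi\rangle|0\rangle = |\phi\rangle|\phi\rangle
\;\;\Longrightarrow\;\;
\langle\phi|\psi\rangle
  = \langle\phi|\langle 0|\,U^{\dagger}U\,|\psi\rangle|0\rangle
  = \bigl(\langle\phi|\psi\rangle\bigr)^{2}
```

A number equal to its own square is 0 or 1, so only mutually orthogonal (i.e. effectively classical) states can be copied. Arbitrary qubits are out of luck, and so are repeaters that try to amplify them the classical way.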

The short of it is, this sort of quantum memory will allow us to build longer distance quantum encryption networks than currently exist. (Quantum crypto is currently being used by some European banks.) At first, this might allow banks in North America to jump on the quantum bandwagon. It's hideously expensive at the moment, naturally, and probably less economical than running Volkswagens full of hard drives with one-time pads on them back and forth, but in principle nothing about this tech is any more expensive than the repeaters the internet currently runs on. Economy of scale should eventually kick in, and these quantum crypto networks will be pretty handy if quantum computers manage to toast public key encryption. (Authentication, of course, is another issue entirely...)
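For fun, the Volkswagen comparison holds up on the back of an envelope (every number below is a made-up assumption, just to get the order of magnitude):

```python
# Back-of-envelope "sneakernet" bandwidth for one-time pad delivery.
drives = 500             # hard drives in the Volkswagen (assumed)
tb_per_drive = 1.0       # terabytes of pad material per drive (assumed)
trip_hours = 8           # one-way driving time (assumed)

bits = drives * tb_per_drive * 1e12 * 8       # total pad material in bits
seconds = trip_hours * 3600
gbps = bits / seconds / 1e9
print(f"effective bandwidth: ~{gbps:.0f} Gb/s of pad per trip")
```

Over a hundred gigabits per second of information-theoretically secure key, which is a bar current quantum key distribution hardware doesn't come close to. Hence "less economical... at the moment".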

Now, I haven't had a chance to read the Nature paper yet. I've read this group's past papers though, and they really are world leaders in experimental CRIB implementation. Last I checked, they still didn't have adequate efficiency to make their tech usable (must be greater than 50% recall to be practical). Still, CRIB is one of the more promising methods out there.

Comment: Apple is already familiar with the other side... (Score 3, Interesting) 276

by Cordath (#29328123) Attached to: A Different Perspective On Snow Leopard's Exchange Support
This might initially appear to be an odd move for Apple, because they already operate on the other side of the fence. e.g. iTunes, or Quicktime. The version of Quicktime that Apple releases for Windows is typically feature-deprived (unless you pay for Pro), buggy, and horrendously inefficient. (It's always great watching 1080p stutter along on a freakin' quad-core with a $400 video card.) It's reached the point where the deficiencies of Apple's Quicktime for Windows have spawned "Quicktime Alternative", just like Realvideo spawned "Real Alternative". "Quicktime Alternative", when it's fully caught up in the arms race with Apple, is entirely superior to Apple's Quicktime, offering smooth playback on modest hardware and all the features of Pro for free. Naturally, Apple frequently "tweaks" things to break functionality on the open alternatives to their software. (This happened to Palm rather recently, w.r.t. iTunes.)

Now, I would assume that Apple has some agreement with MS to keep them in the loop on the updates to Exchange. The financial entanglement of Apple and MS and their workplace symbiosis is such that MS probably will not benefit as much as one would think from dicking Apple around the way Apple dicks open sourcers around. Also, MS knows they would have no chance in the court of public opinion if they tried to do so, while Apple can make a somewhat believable case against open sourcers reverse engineering Apple software and providing, for free, some of the pro features that are supposed to be paid for.

Comment: Western vs Eastern RPG's - W vs E MMORPGS (Score 5, Insightful) 256

by Cordath (#29062293) Attached to: On Transitioning To an Asian-Style MMO, Such As <em>Aion</em>
One thing that I think the article is absolutely wrong about is the idea that Western RPG's or MMO's are in any way behind Eastern ones. From Baldur's Gate to Planescape: Torment to the KOTOR series, single player western RPG's have really pushed the boundaries and given us compelling and unique experiences. While the West churns out fewer RPG's than the East, they tend to be much more varied and innovative, especially in terms of characterization and plot. When a good Western RPG comes out I can look forward to a fresh experience, while most Eastern RPG's feel annoyingly familiar. Playing them, I always experience deluges of deja vu and have to carefully switch off parts of my brain. (e.g. The part that doesn't want to play a bitchy adolescent male prodigy saving the universe... again.) The things that appeal to Eastern audiences, like those fucking chocobos, aren't what float my boat. And it's hard to say the West is behind in the MMO department when WoW is absolutely stomping Eastern MMO's in their own bloody markets...

Aion looks like a solid eastern MMORPG, but nothing compelling enough to dethrone WoW. Its artwork also feels distinctly Eastern, which means it will flop in the West. Lots of people in the West love anime, love Kurosawa, love Chan-wook Park, but they're still a very small minority. The majority of people will not go for something that feels too Eastern, just as Eastern audiences flocked to Lineage but not to western MMOs. Cultural barriers definitely do exist between the East and the West, and Aion doesn't look like a MMO that transcends them. It really is extraordinary that WoW has somehow managed to appeal to both the East and West, and I'm not sure even Blizzard knows how they managed it.

So, what's going to dethrone WoW? Slap me silly with a mackerel if I have a clue. Probably WoW2. It's not really a terribly interesting question. What is an interesting question is when we're going to see hugely popular MMO's on the scale of WoW in genres other than fantasy. There are a lot of people out there who love sci-fi and not fantasy, or who love historical settings and not sci-fi or fantasy. These are largely untapped markets. There is probably room for several big MMO's to do well at the same time, provided they target different genres. (another reason why Aion is probably doomed.)

Bioware's KOTOR MMO looks promising. It's sci-fi, which hasn't really been done well in a MMO sense except possibly for Eve Online, but the space-sim market is arguably a different genre from what KOTOR targets. Bioware has a long track record of excellent single player RPG's, but it remains to be seen if they have what it takes to put out a MMO, especially now that they're under EA's umbrella and have their own sort of "imperial entanglement" predicament. (You can bet there will be pressure to release early coming from EA, no matter how much Bioware claims they are the master of their own domain!) A lot of single player RPG fans are up in arms over KOTOR being turned into a MMO, since KOTOR's strength was its compelling stories, which are remarkably hard to do in a MMO that is more about player dynamics. Bioware claims they've found the holy grail of MMO's though, a way to bring single player plots to massive online environments. That's a bold claim, if ever there was one. I wish them luck.

Comment: Are we talking Pr0n or Tax Receipts here? (Score 2, Interesting) 564

by Cordath (#28588149) Attached to: RAID Trust Issues &mdash; Windows Or a Cheap Controller?
If you just want a convenient backup of your music collection, porn collection, musical pr0n collection, or your pr0n musical collection then RAID is not a horrible thing. However, if you're backing up important files, like the only existing scans of the now-burned dossiers William Mark Felt left you, then you should not stop at RAID. Statistically speaking, if something happens to one HD in your machine, like a massive power surge or being confiscated by tight-lipped men in black suits and black sunglasses, it has a pretty high probability of happening to the other HD. Offsite backups are, therefore, prudent. Leaving a HD in a box at the bank and giving the key to your lawyer is one of the safer things you can do, but not terribly convenient. There are a variety of online backup services available that are decent. I'll leave it to others to speculate on which ones are least likely to be fronts for the NSA. If you feel that your data might actually be interesting to more than one human being on Earth, don't forget to encrypt it. (Be honest with yourself. You are posting to /. after all.) I'm rather fond of emailing moderate risk files to my gmail account. (Stupid, I know, but very low effort and they're available anywhere you feel safe enough to check your email.)

As for motherboard RAID chipsets... Keep in mind that your motherboard has a non-zero probability of frying, having its caps go bad, being peed on by irate government agents, etc. I once had a RAID 0 array that was hooked up to one of those things. After the mobo died I had to do without letters K through P of my Japanese horror-comedy-porno-game-show collection until I was able to find a used computer with the same RAID chipset. (I don't know if it's changed, but at the time each different RAID chipset made RAID 0 arrays that were not compatible with anything else on this lump of rock.) If data portability rather than performance is a priority for you, my advice would be to avoid hardware RAID entirely.
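To see why the chipset lock-in bites: RAID 0 striping itself is trivial, just round-robin chunks across disks, but reassembly only works if you know the exact chunk size and disk order the controller used, which is precisely what proprietary formats don't tell you. A toy Python sketch (my own illustration, not how any particular chipset actually lays things out):

```python
# Toy RAID 0: stripe data across disks in fixed-size chunks, round-robin.
def stripe(data: bytes, n_disks: int, chunk: int) -> list[bytes]:
    disks = [bytearray() for _ in range(n_disks)]
    for i in range(0, len(data), chunk):
        disks[(i // chunk) % n_disks] += data[i:i + chunk]
    return [bytes(d) for d in disks]

# Reassemble by interleaving chunks back; needs the SAME chunk size and order.
def reassemble(disks: list[bytes], chunk: int) -> bytes:
    out, pos = bytearray(), 0
    while any(pos < len(d) for d in disks):
        for d in disks:
            out += d[pos:pos + chunk]
        pos += chunk
    return bytes(out)

data = b"The quick brown fox jumps over the lazy dog."
disks = stripe(data, 2, chunk=4)
assert reassemble(disks, chunk=4) == data     # right parameters: data recovered
assert reassemble(disks, chunk=8) != data     # wrong chunk size: scrambled
```

Guess the controller's parameters wrong and every stripe boundary lands in the wrong place, which is why a dead mobo can hold your data hostage.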

Comment: Infra-red is a color, you nitwit. (Score 1, Interesting) 303

by Cordath (#28307157) Attached to: Why Natal Is a Big Deal
"Natal does not track players by colour (although we know from Milo commenting on my blue shirt that it can if it wants to); it tracks them by infra-red "

Gah! Willful, unthinking ignorance like this really yanks my chain. When you get things like this wrong it makes me want to ignore you completely, because you're probably an idiot. There are several mistakes like this in that article, and the continual invoking of "magic" is particularly bad. There is quite a fair bit known about how Natal works. If you somehow missed it at E3, look it up, slacker! If you really want to persuade people, sound like you were paying attention at E3, for chrissake.

As for addressing the claim that hardcore gamers don't want to jump around in front of their couch rather than sitting on it and twiddling their thumbs...

Fail.

There's something truly awesome about sitting back, taking a piece of clunky plastic in your hands, and gaming the night away with some good ol' fashioned haptic feedback. Maybe Natal is precise enough to read your finger positions without needing a controller, but it still can't give you tactile feedback. Incidentally, spinning your hands in the air to control a car is actually a step further away from total immersion than spinning a steering wheel in the air, because real cars have steering wheels!

Don't get me wrong. Natal is still an epic achievement, but hardcore gamers should probably realize it's more for their mothers than them. Take a look at the Wii. It's sort of a gimmick. A lot of people get one, play it for a couple months, and then pull it out only for parties. Why just parties? For one thing, it has a lot of games that are easy to learn and offer little advantage to the master, meaning almost anyone can win by the end of a session. It's also fundamentally enjoyable watching people spaz out in front of the thing before they realize it's all in the wrist. Given the Wii's relative lack of depth, why has it outsold the PS3 and Xbox360 combined several times over? Broad appeal. Your mom doesn't see the point of yet another game about saving Master Chief's undies from alien zombies, but air-golfing? Score.

Look at how MS operates and you'll figure out what's going on here pretty quick. Natal isn't the next generation of hardcore gaming. It's the Internet Explorer of casual gaming, come to dethrone Nintendo's Netscape. The Wii showed Sony and MS how monolithically massive the casual gaming market is, and now they want in. Natal is a slash aimed squarely at Nintendo's jugular, and they're going to have to innovate our pants off and then fellate us to stay in business.

So, will Natal ever do anything for hardcore gaming? I don't know. Natal, or something like it, will someday. It really is in the hands of the software developers though. I applaud MS for giving us a whole new bag of tricks, but I honestly don't expect a hardcore gaming Nirvana to come out of the mist like the author of that article does. I expect Wii-type gimmicky crap that will be a whole lot of fun at parties, and for your mom, but not that fantastic for late night fragging. Emotional AI and speech recognition are bloody impressive, but Turing-test-passing AI is still very bloody hard stuff. They can put this stuff into games, but at some point you'll probably feel like you're trapped in a world full of Dr. Sbaitsos. Scripted dialogue trees (Mass Effect is a great example of doing those well) aren't going to go away for quite some time. In reality, the tools MS is giving us will take years or decades to refine on the software side of things. Existing input methods, like mice or gamepads, have been around for several decades and are heavily optimized. They're not going to be replaced in one generation.

