So if you're primarily a scientist there to do zero-g experiments on the ISS, are you still an astronaut? Why, because you're a professional, even if not really in space flight? If we ever get to airplane-like conditions, is the steward(ess) an astronaut, like the rest of the crew? Or do you actually have to have a part in flying the spaceship (is the cook on a big sailboat a sailor)? Not that it really matters, but...
As for the price of flying to space, I can't really comment since I wouldn't be buying tickets at all. Maybe one day when we have colonies somewhere to actually travel to, but not as things currently are.
Real zero-g (not the Vomit Comet or theme park rides) would be pretty damn cool. Right now I'm looking at SpaceX, and I really don't see a good reason why Dragon doesn't take more than 7 passengers. It seems they have plenty of space, and it's supposed to be able to return 2500 kg of pressurized cargo, so from what I can tell they should be able to fit more like 20 people in that cabin if they stack the seats nicely. At $140m/flight that'd bring it down to $7 million a seat, and that's for a genuine LEO flight. If they're just going for 101 km with a supersized capsule, I'm guessing the rocket is good for shooting up 140 people at a time at $1 million/seat.
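The back-of-the-envelope math above is easy to check. Note that the flight cost is the ~$140m figure quoted, and both seat counts are my guesses, not SpaceX numbers:

```python
# Rough per-seat pricing from the figures above (all assumptions).
flight_cost = 140_000_000  # claimed ~$140m per Dragon flight

leo_seats = 20          # guess: stacked seating in the pressurized cabin
suborbital_seats = 140  # guess: supersized capsule on a ~101 km hop

per_seat_leo = flight_cost // leo_seats
per_seat_suborbital = flight_cost // suborbital_seats
print(per_seat_leo)         # 7000000, i.e. $7 million per LEO seat
print(per_seat_suborbital)  # 1000000, i.e. $1 million per suborbital seat
```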
That's why I drew comparison to the moon landing. Because it was pointless. There was no commercial reason to go. No military reason to go. Minimal scientific reason to go. There was no reason at all, beyond raising the national middle finger at communism. And yet, we went anyway. That's the kind of reckless stupidity it would take to make manned space exploration or settling possible: Screw the rationality, we go because it's cool, and because we can't let the other superpower steal the prestige. It's happened once, so there is always the possibility it will happen again.
Sure if you disregard the Cold War, the thousands of warheads pointed at each other and the Cuban missile crisis then there was no military benefit. NASA was the velvet glove around the iron fist but I think everyone except you saw what the real message was: "Our rocket technology is so advanced, don't you f*cking try anything." The moon is of course of no military significance, but the Apollo program was.
The alternative would have been a military program under the DoD, but pushing those kinds of amounts into the military budget would have looked aggressive and militant. Instead they got all the essential technology, plenty of opportunity to show off and talk to the media, good old-fashioned heroes, the honoring of a dead president's great visions, and all under a formally civilian authority. The drive was the military need; the moon was just a convenient rallying flag.
If you get a cataract, spend the extra money on a CrystaLens. Unlike a 45-year-old natural lens or the implants available before 2003, it will actually focus. Of course they're under patent, so they're about a thousand dollars more expensive each than other implants. I'm sure I'll have a cataract in the other eye before too long; the last eye doctor I saw said "a couple of years," and it's been longer than that.
I think I'll wait until 2023 when the patent runs out and everybody makes them; the ones like my mom has will be obsolete. I only use that eye to look at tiny things, anyway.
Insurance paid for all but the extra thousand, it was the best thousand dollars I ever spent. The device inside my eye is my favorite device of all.
As I understand it 30p is okay for photo work, but a pretty big compromise for general desktop use so I wouldn't do it. I have a 3840x2160@60p 28" monitor hooked up over DisplayPort 1.2 using SST (single stream transport). It works very well, I can also hook it up to my 1080p TV at the same time on my GTX 670. Just bought dual GTX 970s to replace it though.
There are three ways to support 4K content:
DisplayPort 1.2+ over SST (single stream transport)
DisplayPort 1.2+ over MST (multiple stream transport)
HDMI 2.0
Avoid MST (multiple stream transport); it's not worth the issues. DisplayPort 1.2 has been around for a while, and the screen is usually what determines whether you can use SST. My screen (a Samsung UD590) can, so I do, and it works great. HDMI 2.0 is brand new (the GTX 970/980 are the first graphics cards to support it), but I suppose it's the only way to hook up 4K to a UHDTV, since as I understand it most of those don't have a DisplayPort. That's what it's designed for, anyway, but if you jump on HDMI 2.0 now you'll really be among the first to test it. For me it's not even an option: I route everything through the sound system, and that doesn't support HDMI 2.0 pass-through. I find 4K isn't that essential at couch distance anyway; it's when you're sitting up real close that you notice it most.
The R9 280 certainly doesn't count as low power (250W); the R9 285 is considerably better in that department (190W) and has some newer features to boot, and with a $249 MSRP it should just barely squeeze inside your budget. The nVidia alternative within your budget is the GTX 760, but I wouldn't buy a Kepler card today: too hot and too noisy. Unfortunately there's no Maxwell card to match your specs; there's a gap between the GTX 750 Ti (which wouldn't be a performance upgrade) and the GTX 970 (which blows your budget at $329).
Personally I was very surprised by the GTX 970 launch price, though: the GTX 980 at $549 was as expected, but the 970 delivers 13/16ths of the processing power with the same memory size and bandwidth for over $200 less. I bought two for an SLI setup; in games that scale nicely it's kickass value. I suspect that by December this will have had some market effect at the $250 price point too, so I'd say check again then. Asking for advice 2-3 months out in a market that changes this quickly doesn't really make much sense.
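As a quick sanity check on that value claim, using SM count as a crude proxy for raw processing power (13 vs 16 SMs is where the 13/16ths comes from) and the launch MSRPs quoted above:

```python
# Perf-per-dollar check: SM count as a rough proxy for processing power,
# prices are the launch MSRPs from the text.
gtx980_sm, gtx980_price = 16, 549
gtx970_sm, gtx970_price = 13, 329

perf_ratio = gtx970_sm / gtx980_sm        # 13/16 = 0.8125
price_ratio = gtx970_price / gtx980_price  # ~0.60
print(round(perf_ratio, 4), round(price_ratio, 4))
# the 970 gives ~81% of the shader power at ~60% of the price
```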
While you do have a point, I'd counter that there's no way to do anything with a computer unless there's an interface for it. For example, at Burger King 90%+ of customers order as-is from the menu, but there are all sorts of simple instructions like "no onions" you can tell a clerk that you can't tell a computer. If you go to a Whopper Lab you can see all the options of buns, patties, dressings and toppings, available in none, light, normal and extra quantities and so on, which would totally overwhelm the average customer. If the interface didn't exist, the option wouldn't exist, but any given option will be left at its default something like 99.9% of the time.
I like being able to manage my computer; I don't like having to micromanage it unless there's a specific reason to. I consider having obvious buttons that lead to more advanced controls to be discoverable; it's not that you need to throw every option in my face to say "hey, you could change this behavior if you wanted to." If it's possible to set a sensible default and I haven't seen a reason to go looking for it, then I don't need to know. Non-discoverable features are things like hot corners with no hint that they have actions, buttons with no obvious function or that don't look like buttons, shortcuts you can't find without looking them up, type-to-search with no hints, and so on.
That said, I generally prefer an expanding/alternate dialog over a multi-step dialog. If I know I need to go into the advanced settings every time because I'm in the 1% using that function, I'd rather have the ability to pin it so the advanced dialog is expanded/used by default, meaning it should be a superset of the basic dialog, not just the extras. Since we're already in an advanced dialog, having a checkbox "Use advanced display by default" at a standard location wouldn't hurt: go into the advanced dialog once, check that box, and next time you go straight to where you want to be. This is usually far more user-dependent than situation-dependent, so I think that'd work well for most everybody.
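A minimal sketch of how that sticky checkbox could work, assuming a hypothetical per-user JSON settings file (the file name and key are made up for illustration):

```python
import json
from pathlib import Path

PREFS = Path("dialog_prefs.json")  # hypothetical per-user settings file

def load_use_advanced() -> bool:
    """Which dialog variant to open first; defaults to the basic one."""
    if PREFS.exists():
        return json.loads(PREFS.read_text()).get("use_advanced", False)
    return False

def set_use_advanced(value: bool) -> None:
    """Called when the user toggles 'Use advanced display by default'."""
    PREFS.write_text(json.dumps({"use_advanced": value}))

# Tick the box once in the advanced dialog...
set_use_advanced(True)
# ...and every later launch opens straight into the advanced dialog.
print(load_use_advanced())  # True
```

The point of keeping it a single persisted boolean is that the choice follows the user, not the situation, which matches the observation above.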
The whole "needles in the eyeball" thing is just a stepping stone to something truly amazing.
Indeed. I was severely nearsighted all my life, after the cataract surgery I no longer need corrective lenses at all, not even reading glasses and I'm 62. My vision in that eye went from 20/400 to 20/16. Truly a miracle.
BTW, my retina surgeon said that my retinal detachment was a result of being so nearsighted; a nearsighted eyeball isn't perfectly round like a normally sighted person's eyes.
One of the big differences between archiving and backup is that with archiving I want to keep this exact version intact; if it changes on me, it's an error. A backup takes a copy of whatever is there now (maybe I wanted to edit that file). Unlike backups, I think it's not about versioning; it's about maintaining one logical instance of the archive across different physical copies. Here's what I'm thinking: you create a system with three folders:

to_archive
archived
to_trash
The archive acts like a CD/DVD/BluRay and is read-only. So far, nothing but a really awkward way to create a WORM(-ish) drive, but the real point comes next in distribution and synchronization.
When you put a file in "to_archive", a job picks it up, wraps it in AES (with AES-NI the cost of on-the-fly encryption/decryption is very slim), creates a torrent-like file for it and moves it to "archived". If you want to delete it from the archive, you drag the file to the "to_trash" folder, maybe with some kind of lock/freeze/undo timer on that function. Files in "archived" are synced to other computers, still encrypted, which means you can shop around for storage/bandwidth: maybe you've got multiple locations yourself (home/cabin), maybe you swap backup space with friends or family, or you can buy it on the open market, and they'll all mingle and share data because it's based on basic torrents.
They can all enforce basic limits on size/bandwidth so you can have pricing plans and caps, and you can have one-way "leeches" that download and archive to tape and physically deliver it to you. If you build it fairly smart you can also have local, offline backups, and on restore it'll notice that 95% is the same as last week and sync up the rest. Basically a "Redundant Array of Inexpensive Archive Locations." It will leak a little metadata about the size and number of files, but not file or directory names, and you can probably muddle that metadata with padding and dummy files if you want.
Of course you can choose to have the AES key on several computers so you can access your media from any of them. And as a free bonus a device that has the AES key like say your cell phone can use this as an online library, it doesn't have to auto-sync everything. With many locations = many peers it won't matter if one is down and you aggregate up the bandwidth, just like in any other torrent swarm. Through the seed/peer numbers you can at any time watch the state of your backup in progress as you add files. If your computer goes to shit, tell it the archive key and it'll hook up and start syncing. Just like a torrent client you can set priorities on what to download first.
It's not for all your data, but I think a lot of common user data is that way. Those RAW photos or video or audio you took? Archive them, "single" everlasting master copy. It doesn't replace backup of say documents you're working on or source code you're developing but it complements it.
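As a rough sketch of the "create a torrent-like file" step described above: once the blob is encrypted (the AES step would need a crypto library and is stubbed out here), you hash it in fixed-size pieces, torrent-style, so peers can verify and sync chunks independently. The file names and piece size are illustrative, not part of any real design:

```python
import hashlib
import json
import os
from pathlib import Path

PIECE_SIZE = 256 * 1024  # 256 KiB pieces, a typical small torrent piece size

def make_manifest(path: Path, piece_size: int = PIECE_SIZE) -> dict:
    """Hash a file in fixed-size pieces, torrent-style, so peers can
    verify and sync individual chunks of the (already encrypted) blob."""
    pieces = []
    with path.open("rb") as f:
        while chunk := f.read(piece_size):
            pieces.append(hashlib.sha256(chunk).hexdigest())
    return {
        "name": path.name,
        "length": path.stat().st_size,
        "piece_size": piece_size,
        "pieces": pieces,
    }

# Demo: pretend this blob just came out of the to_archive -> encrypt step.
blob = Path("demo.aes")
blob.write_bytes(os.urandom(PIECE_SIZE + 1000))  # one full piece plus a tail
manifest = make_manifest(blob)
print(json.dumps({k: manifest[k] for k in ("name", "length", "piece_size")}))
```

Since only piece hashes and sizes leave the machine, remote peers can store and verify the archive without ever seeing the content, which is the metadata-leak property described above.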
I want Microsoft to stop making awful operating systems. We know they can do it, because XP was excellent and W7 almost as good. (...) I want Microsoft to change their "We know what's best for you, dammit!" attitude and their habit of ignoring feedback. Both Vista and W8 had people begging them not to go there.
Maybe there's a hint there? Conservative, experimental, conservative, experimental... As long as people keep arguing about whether the old or new version of Windows is better, I don't think Microsoft worries. You are free to skip a version, you know.
If you've read enough of Slashdot, you'll have noticed that every complaint about MSFT is attacked by "energetic fans" shouting that the complaint is invalid, that the person complaining is an idiot. How long is that supposed to work?
Do a s/MSFT/Linux/g and there are plenty of OSS apologists too. Particularly because you've got one team saying "Linux is so free and great, it's totally ready for the desktop and you should try it out," but when you have a problem the other team says "Yeah, well, you got it for free, so STFU and be grateful." I'm on Windows 7 now, and I'm guessing Microsoft will soon need to release another "classic" desktop for conservative enterprises so they can plan their migration before the 2020 EoL. Having Linux around as a plan B is nice, but for gaming Windows rules supreme, regardless of whether Linux has a Steam client or not.
I think Norway is fairly unique in that we actually voted to have a king (on the 12th and 13th of November 1905) after the end of the union with Sweden. Also, our king's power has been reduced to a democratic emergency brake: he can only delay a law being passed until there's been an election, and if it is passed again, it becomes law regardless. Formally he's the sovereign though: the one signing all the laws, head of all military branches, and the one formally leading the King's Council with the prime minister as his first advisor. And his person has total blanket immunity in Norwegian law, though it was settled that he could be sued in a property dispute.
What I find quite appealing at times is that he's not a politician: he's not looking for reelection or to further his own career, nor is he trying to represent just the 51% who voted him in. In the US I have the impression that if a Republican is in office all the Democrats hate him, and if a Democrat is in office all the Republicans hate him. Our king represents the nation of Norway, not whatever political party happens to hold the reins at the moment. Other nations have a form of ceremonial leader, for example Germany with the Bundespräsident as opposed to the Bundeskanzler, but that's a retirement home for politicians: you have to campaign to win it, and it's not for life, so there's self-interest to it.
Our king is pretty relaxed about his right to rule; or rather, I feel he thinks of it more as a privilege. No blue blood, no divine right to rule, and I think he, like pretty much every western monarch, knows he sits at the parliament's mercy. Like the US, we do have a constitution and a process to amend it, and like I said he couldn't block that. If he were losing the people's support, I think he'd resign gracefully long before it came to that. And apart from the coronation I don't think I've ever seen him with a crown and all that; it's more a ceremonial rite when you take the job.
You can of course say he's not needed, that the US is a nation independent of the President in office, and I suppose that's true, but it's a very abstract and silent existence. For example, during WW2 the radio broadcasts from the king in exile in London rallied the nation. When people use archaic expressions like "for king and country" we're not talking about saving one man's divine ass anymore, but about the king serving the country and us following him as our leader. It's not a perfect system, but honestly I feel it works well. It's good for tourism. Sure, they live in a castle with solid upkeep costs, but I know we'd keep it for historical reasons anyway; we'd no more tear it down than old churches.
I'm fine with the chip; that protects me, the bank, and the retailer. I am NOT fine with the PIN. My signature can't be stolen; if someone steals my card, the signature on the sales slip proves it's not me. But if someone steals my PIN they have my every penny.
It happened to me with a debit card. I welcome the chip, but if they add a PIN I'll cancel all my cards and go back to cash and checks, even though they're nowhere near as convenient.
I no longer had any of the accounts I'd used, either, and wasn't sure which one it was. Still got the account back; give 'em a try.
I had cataract surgery on that eye two years before the retina came loose. I did know a couple of guys who had vitrectomies followed by cataract surgery, but the needles don't go through the lens; they go in through the whites (there are photos on Wikipedia). I suspect that a vitrectomy involves steroids; it was steroid eyedrops for an eye infection that caused my cataract.
It wasn't worth the effort; I just cut and pasted from the manuscript. It's bad enough posting journals; I take a bit more care with them.