The output of DVD player code does not determine whether or not you go to jail.
Assuming your number is right, and they need to bleed off 7.5km/s, that would be 24,606 feet/second,
or a 12,303 second burn,
or a 205.1 minute burn,
or a 3.4 hour burn.
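The unit conversions are easy to sanity-check (assuming the implied constant 2 ft/s/s OMS acceleration):

```python
dv_ft = 7.5 * 1000 / 0.3048   # 7.5 km/s expressed in ft/s
burn_s = dv_ft / 2.0          # burn duration in seconds at 2 ft/s^2

print(round(dv_ft), "ft/s")           # total delta-v to shed
print(round(burn_s), "s,",
      round(burn_s / 60, 1), "min,",
      round(burn_s / 3600, 1), "h")
```

That's about 24,606 ft/s, a 12,303-second burn.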
I'm really glad you bothered to post it.
That may be true, but 3.4 hours is much more time than a standard deorbit, and they are decelerating that entire time.
So let's see...
Hubble's orbit is at 560 km (approx).
The height of the atmosphere is officially 100 km, though it actually tapers off asymptotically to nothingness; there may be a stray nitrogen molecule or two even at Hubble's altitude.
The standard deorbit maneuver (to take off that tiny 500 ft/s) is performed about an hour before landing. They hit the atmosphere about 30 minutes before landing, so a very tiny deorbit burn will drop them 460 km (on a shallow parabolic path) in a mere 30 minutes. The remaining 30 minutes is filled with hellfire and gliding to the runway.
30 minutes into your proposed 2 ft/s/s OMS burn, they will have only removed about 3,600 ft/s. Of course, since that's roughly seven times their deorbit burn, they will be falling much faster (steeper parabola), and it will take less than 30 minutes to hit the atmosphere, at which point they will still have a tangential velocity of over 21,000 ft/s = 6.4 km/s.
The hideous numerical integration calculation required to find out exactly when they encounter atmospheric effects with a constant burn is left as an exercise for the reader.
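For the genuinely curious, here's a crude version of that exercise: a simple Euler integration of my own, assuming a point-mass Earth, a 560 km circular starting orbit, a constant 2 ft/s/s retrograde burn, and no drag above the 100 km line. All the constants are standard textbook values, not anything from this thread.

```python
import math

MU = 3.986004e14          # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0     # mean Earth radius, m
ALT0 = 560_000.0          # Hubble-ish starting altitude, m
ALT_ATMO = 100_000.0      # "official" top of the atmosphere, m
THRUST = 2 * 0.3048       # the 2 ft/s^2 figure, converted to m/s^2

# Start in a circular orbit: position on the x-axis, velocity along y.
r0 = R_EARTH + ALT0
x, y = r0, 0.0
vx, vy = 0.0, math.sqrt(MU / r0)
t, dt = 0.0, 1.0

# Integrate until we cross the 100 km line.
while math.hypot(x, y) > R_EARTH + ALT_ATMO:
    r = math.hypot(x, y)
    v = math.hypot(vx, vy)
    # Acceleration = gravity plus a constant retrograde burn
    # (thrust always points opposite the velocity vector).
    ax = -MU * x / r**3 - THRUST * vx / v
    ay = -MU * y / r**3 - THRUST * vy / v
    vx += ax * dt
    vy += ay * dt
    x += vx * dt
    y += vy * dt
    t += dt

print(f"hit atmosphere after {t/60:.1f} min "
      f"at {math.hypot(vx, vy)/1000:.2f} km/s")
```

With these assumptions it reaches the 100 km line well inside an hour while still carrying most of its orbital speed, which is exactly the problem described above.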
So yes, they might have enough OMS fuel on board to slowly burn away 7.5 km/s, if they were way out in the middle of the solar system somewhere. At 560 km off the surface, though, they need a whole lot more thrust to remove all that in the short time before they hit atmo.
"We dinnae have the POW'R, Captain!"
Maybe they'll consider it in the future.
They could save a lot of weight if they're just riding it down.
Cargo bay doors? Ejected.
Nav computers (only 3 of 5 required for normal missions)? Ejected.
Extra seats? Ejected.
Storage lockers? Ejected.
2 basketballs? Ejected.
Canadarm? Ejected.
Post-flight checklist? Nah, don't need that any more. Ejected.
Landing gear? Ejected.
Parachutes? Nah, I'm taking those along. Ride it down to 20k feet, and jump out of the cargo bay.
Hmmm.. They DO have parachutes now, don't they? They're used in case of low altitude failure, where they can jump out the lower door.
Yeah, even with all that ejected, they still wouldn't be able to get to 7.5 km/s delta-v.
The fuel required to do that weighs about the same as 4 fully loaded shuttle orbiters.
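That's straightforward to ballpark with the ideal rocket equation. The Isp and dry mass below are my own assumed round numbers (roughly OMS-class hypergolic engines and an empty orbiter), not figures from this thread, and the answer is very sensitive to both:

```python
import math

ISP = 313.0         # specific impulse of an OMS-class engine, s (assumed)
G0 = 9.80665        # standard gravity, m/s^2
DRY_MASS = 78_000   # orbiter mass without propellant, kg (assumed)
DV = 7_500.0        # delta-v to cancel orbital velocity, m/s

# Tsiolkovsky: dv = Isp * g0 * ln(m0 / mf)
mass_ratio = math.exp(DV / (ISP * G0))
fuel = DRY_MASS * (mass_ratio - 1)

print(f"mass ratio {mass_ratio:.1f}, "
      f"propellant needed {fuel/1000:.0f} tonnes")
```

That works out to several hundred tonnes of propellant, i.e. multiple loaded orbiters' worth; the exact multiple depends heavily on which Isp and dry mass you plug in.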
If the facts in that article had appeared in a science fiction magazine in the 60s people would have said: wow, I wish I could live in that world where science even makes food taste better! But that was an era when the atom was your friend and we were going to conquer space.
Why don't you read it that way now? I saw none of the snarkiness you implied in the article. I doubt it was written with a smirk, but it could be interpreted that way if read with one.
Perhaps your cynicism has made it impossible for you to read that and say "wow, that is a fascinating science," and believe that Schlosser thinks that way, too.
I haven't read the rest of the book, so maybe it is cynical in context. But that excerpt contains none of the "OMG CHEMICALS!!!1!!one!!" that you seem to think it implies.
I don't know that there's enough fuel on the shuttle to kill its orbital velocity and bring it straight down. They have OMS thrusters, good for changing altitude on a mission and maintaining their orbit, but not for dropping so much speed.
Low Earth Orbit velocity is approximately 7.8 km/s. The Hubble's orbit is slightly higher, with a slower velocity of 7.5 km/s.
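Those figures fall out of the circular-orbit speed formula v = sqrt(mu/r); a quick check using standard values for Earth's gravitational parameter and radius (200 km stands in for a generic low orbit here):

```python
import math

MU = 3.986004e14       # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0  # mean Earth radius, m

# Circular orbital speed: v = sqrt(mu / r)
for alt_km in (200, 560):
    v = math.sqrt(MU / (R_EARTH + alt_km * 1000))
    print(f"{alt_km} km: {v/1000:.2f} km/s")
```

This gives roughly 7.8 km/s at 200 km and about 7.6 km/s at Hubble altitude, close to the quoted figures (the exact value depends on the altitude you assume).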
The delta-v capability of a space shuttle after successfully completing a launch is approximately 600 mph (0.27 km/s), depending on the weight of the payload it's carrying. Dumping all their non-essential items out the airlock before the burn might gain them something, but not nearly enough. Remember that it takes two extra rockets and a full bolt-on fuel tank to achieve that 7.8 km/s in the first place (actually 9.3 km/s with atmospheric effects).
Even if a second rocket with a payload of nothing but fuel was launched to rendezvous with the shuttle, it would still not be enough to slow down the shuttle to zero tangential velocity. Nothing as big as the Stage 1 tank has ever been boosted into orbit in a single launch. It would take many launches and lots of complicated orbital rendezvous maneuvers to refuel the shuttle enough on-orbit to achieve a 7.5 km/s burn.
And even then, Main Engine Cutoff (MECO) during launch is at T+8 minutes; the shuttle engines can't burn a full tank of fuel much faster than that (they throttle down right at the end to keep the acceleration to 3g or less, but before that it's balls-to-the-wall). I'm not sure how long it would take the orbiter to reach the atmosphere during the deorbit maneuver--the shuttle would start to fall to the Earth immediately, in an arc that would end up perfectly vertical with respect to the ground--but 8 minutes seems like a long time. If the shuttle hit the atmosphere before the full 7.5 km/s delta-v was achieved, it would still have some tangential velocity, making for a bumpy ride, and the possibility of heating effects on unprotected surfaces.
In any case, there isn't nearly enough fuel up there to do this, and any secondary launch to bring them fuel may as well just bring a rescue capsule for the astronauts.
Thanks for the fun thought experiment, though.
Most users are scared by what they don't recognise, and an awful lot of them still insist on learning to do things by rote, memorising a set of steps rather than taking the (short) time needed to get an understanding of what they're doing.

One of the things that always gets me is when someone asks (for example) how to print something. I don't blame them for not knowing how to perform a certain task, but nine times out of ten they could have worked it out for themselves simply by bringing up each menu in turn and reading them until they saw 'Print'. Not only that, they'd then remember for next time, no help needed.

As it is, though, they'll ask and more often than not someone will just tell them rather than gently directing them to think about it; subsequently they remember "Click the menu at the top left then hit 'Print' near the bottom", which is OK in itself. Then when they come to the next task, say making a new document, they ask someone else, and remember the rough location on the menu...
I don't understand this mindset.
Sit your average consumer down in a rental car where every single knob, button, and dial on the dashboard audio and environmental controls is completely different, and they will have no trouble taking a moment to work out how to turn on the AC and find something on XM radio. Even the displays in the dashboard are completely non-standard, and it often takes me a minute to work out which dial is the speedometer and which is the gas gauge. No one screams that every company must build their cars with exactly the same user interface. They just expect differences and deal with it.
But place this same consumer in front of a computer where a single icon has changed color and they flip out. They complain that it's too different, that they will never be able to find what they need.
Is this a function of the Microsoft monoculture, or is it based on the level of complexity of the system? Honest question.
There is a Cracked.com article in there, somewhere.
I wouldn't be surprised. This article is very old and exists all over the internet.
I have no idea where it originated, but I would love to find out.
I say, "Hey power company. I'm paying you guys to deliver me some kilowatt-hours. Nothing in my contract limits how I suck up those kWh: if I do it in a way you're not expecting, it's your job to install equipment to handle it."
We tried saying that to the internet service providers, too, and they responded by throttling bittorrent and changing their contracts.
DNS names became so important because early browsers had an address bar that showed the URL and let users type in DNS addresses directly. That UI has become fossilized as the method for end users to reach content. But it could quite easily be replaced with something other than DNS, and hopefully it will be.
It has been done. There is a search bar right there next to the address bar in all modern browsers.
Search engines are the next layer of routing protocol. They are so high level that the hierarchy system seems quaint (web directories are *so* 1996), and has been replaced instead with a sophisticated popularity/usefulness algorithm, navigated by semantic expressions, not addresses. This is the evolution of the internet toward (dare I say) the cloud, where entry and exit locations are meaningless, and everything is interconnected not just by links but by semantic similarity. Two pages may be "next to" each other in Google's rankings, even though the servers are in Shanghai and Sheboygan.
The early internet used (and still does, to a certain extent) a "location" metaphor for finding information, based on the idea that the information was on a server that physically existed in space, and we think of domain names and IP addresses as places. But as the internet grows more interconnected with things like aggregators, with more high-level routing protocols like search engines tossed on top, the "where" of information starts to matter less and less, and the location metaphor breaks down. People trying to grasp these decidedly unintuitive concepts make up buzzwords like "cloud" and "virtual" and "mesh" to try and make sense of an information system that is several orders of magnitude removed from our evolutionary experience, as navigation on the internet moves from a physical location metaphor to a semantic navigation system.
The best way to predict the future is to create it. -- Richard Bandler
Given the topic of this article, your sig just got rather scary.
I am actually disappointed that Google settled at all. I was hoping this issue would end up going to trial, where we could have had a frank examination of the absurd copyright system. If the purpose of copyright is to provide incentive to creators, then orphaned works should logically fall into the public domain, since their creators no longer need incentive from them. This is actually no different from the property model we use for copyright: abandoned property (such as a shipwreck) no longer has ownership and can usually be claimed by someone else.
But instead, Google opted to settle. This was the smart business move, since they can now move forward with their database, and it may yet prove valuable as a public perception precedent, but it did nothing to solve the underlying problem of unfair copyright enforcement.
The people complaining are the university libraries, who previously had the monopoly on this material. These books will not only be available in Google's database, they will also be available in dead tree format at your friendly neighborhood library.
Google is not creating a monopoly, they're breaking one up.
There is something stopping another entity from creating their own anthology: The fact that Google has a license now, and nobody else does.
That may be true, but the hope is that this settlement can be pointed to in future cases, streamlining the process of getting a license. A new company XYZ, Inc. can negotiate a license from the publishers more easily by pointing to Google's license as a template, allowing XYZ, Inc. to go scan orphaned works. Or university libraries, who have paper copies of the books in question, can more easily negotiate licenses to scan them and make them available on the university website.
Google is big enough, with enough muscle, that they can push the legal system in this country pretty hard. Considering where they stand on copyright and making information available to the unwashed masses, I would say only good can come out of their efforts in the long run.
While I am typically a CLI mplayer guy (even if opening a movie from a graphical file manager to fullscreen), I have to say that SMPlayer is a fantastic front end GUI. I recommend SMPlayer to even my most clueless Windows user friends.
Sysadmins not doing backups is one thing, but how is surrendering all your data because it's convenient any better?
You're not "surrendering your data" any more than you would be if you hired Acme Outside Contractors, Inc. to run your infrastructure.
Businesses using GMail would actually be using Google Apps, which operates contractually the same as any other IT contractor, with all the legal requirements for non-disclosure that entails and an enterprise-level SLA.
This is not a free service, because of the aforementioned legal/SLA requirements. But it is certainly cheaper than running your own Exchange server and gives your employees more features and better usability than Outlook. If your only reason for opposing it is some vague aversion to storing your data on iron you don't own, then you need to come join the rest of us in the 21st Century, where outsourcing, contracting, and offsite storage are the norm and contractual requirements for proprietary data storage are built into every vendor agreement. Google Apps is no different in this respect; it's just another contracted vendor, albeit a vendor with kick-ass software and 3 nines uptime.
Real programs don't eat cache.