For the last 26 years or so, I've been making electronic copies of my records. The media changes, the location does not. My current scheme is to burn financial records onto two pieces of archival CD-ROM media. One goes into my local at-home fireproof safe. One goes into my safe-deposit box at my neighborhood bank.
Work backup is a little trickier. For a long time I was using tape backup, upgrading to larger capacities as new drives came out. Then I started burning multiple-DVD disc sets. Once the cost came down, I was able to switch to a single pair of DVD+R(DL) discs. Again, one set goes into the at-home fireproof safe, the other to the safe-deposit box.
I also use USB hard drives for in-office backup. I use Linux, so I formatted a 3-TB drive as ext4. I then use rsync to update the drive during projects at regular intervals.
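For the curious, here is a minimal sketch of what that rsync routine amounts to, wrapped in Python; the paths are hypothetical, so substitute your own project tree and mount point:

```python
import subprocess

# Hypothetical paths -- adjust to your own setup.
SRC = "/home/me/projects/"          # trailing slash: sync the contents, not the dir
DST = "/mnt/usb-backup/projects/"   # the ext4-formatted USB drive's mount point

def backup_cmd(dry_run=True):
    """Build the rsync command: archive mode preserves permissions and
    timestamps; --delete mirrors removals so DST matches SRC exactly."""
    cmd = ["rsync", "-a", "--delete"]
    if dry_run:
        cmd.append("--dry-run")     # preview the changes before committing
    cmd += [SRC, DST]
    return cmd

# When you're satisfied with the dry run, actually execute it:
# subprocess.run(backup_cmd(dry_run=False), check=True)
print(backup_cmd())
```

Running the dry run first is cheap insurance against a mis-typed destination wiping out a good backup with `--delete`.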
The cloud? I have some people who insist I use GitHub and Dropbox. GitHub is fine for working projects, but I wouldn't depend on them keeping stuff forever -- regular backups of the working projects are the rule for me. Dropbox was going just fine until it broke completely when I upgraded my systems to CentOS 7.0 (and now 7.1). Almost useless. I'm hoping Dropbox will get a fix for this soon.
Life tip: Keep your financial records on media separate from your other backups. You can then pitch the media after the statute of limitations expires (seven years in the US).
Looks like the memory card in the black box has been "lost". Is this true? How is it possible, if the black box is designed to withstand 3,500 g?
Would the data on the memory card contain information on the door status (locked / unlocked / open / closed)?
Also, why isn't data streamed to ground stations nowadays? And why don't black boxes float?
In short, together with the door design, it all looks like amateurish design.
1. Door-lock status: Don't know, but you can't record everything -- there are already plenty of captured channels that are far more important.
2. Streaming to ground: The NTSB has been working with other air-safety bodies to make recommendations to do just that. One issue is bandwidth: there just isn't enough of it available, so the amount of information that could be transmitted would be limited.
3. Floating black boxes: As with the downlink scenario, breakaway recorders that float are being looked into. More important, though, are better crash-locator beacons, so the crash debris field can be found more quickly.
Perhaps they could video the cockpit (and the fuselage for that matter) and destroy the footage once the plane has safely landed.
In the case of the FDR and CVR, that already happens, sort of. The devices are only able to handle a finite amount of data, and new data overwrites the old. So eventually you effectively get what you are suggesting by normal operation.
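That overwrite-the-oldest behavior is essentially a circular buffer. A toy sketch (the capacity here is made up; real recorders hold on the order of the last couple of hours of audio and roughly a day of flight data):

```python
from collections import deque

# Toy flight recorder: keeps only the most recent N samples,
# silently discarding the oldest as new data arrives.
class Recorder:
    def __init__(self, capacity):
        self.buf = deque(maxlen=capacity)  # deque drops the oldest when full

    def record(self, sample):
        self.buf.append(sample)

    def dump(self):
        return list(self.buf)              # surviving data, oldest to newest

r = Recorder(capacity=3)
for t in range(5):                         # record samples 0..4
    r.record(t)
print(r.dump())                            # only the last 3 survive: [2, 3, 4]
```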
And there is a good reason not to dump the recordings. During the investigation of a crash where wake turbulence was suspected to be the main culprit, investigators had the FDR of the plane ahead of the accident plane pulled to see exactly where it was in relation to the accident plane. As I recall, the data showed the leading plane was much closer than anyone had suspected, and its wake turbulence would have thrown the accident plane around violently. Without the additional data, investigators would not have been able to confirm a hypothesis about a contributing factor in the crash.
Of course, the lone scientist would be backed by billions from polluters who object to clean water and air.
Do polluters object to clean water and air? I'm sorry, my father's experience on the Illinois Pollution Control Board says otherwise. The object of the board was specific: clean up Lake Michigan. The original estimates were that all efforts to clean up the lake would take 33 years. (Indiana had a similar project.)
During the first five years, the Board concentrated on identifying and quantifying the worst polluters on the Chicago lake shore. In many instances, the companies who were cited were able to put corrective action in place quickly. Part of the reason they didn't do it on their own dime is that their competition a couple of miles up the coastline didn't do it, which put the polluting company at a competitive disadvantage. So the company (1) put in control measures, and (2) snitched on their polluting competition.
In some instances, the management of the company was not aware just how bad they were, and cleaned up. That may sound stupid to you, but those companies just didn't realize the effect their outflow was having, until it was pointed out to them. In many cases, these were companies built in the 40s and 50s, when the amount of total pollution was orders of magnitude lower, and the ecosystem could handle it. This included smokestack pollution, as well as lake pollution.
The result? Significantly measurable improvement in less than five years, not the 33 years originally estimated. The eco-system started to recover once the worst of the ongoing industrial pollution was removed. A success story.
Where environmentalists and industry get crosswise is the former's idea that clean water and air should be obtained "at all costs" and "everything today". Industry wants those phrases to be "at all reasonable costs" and "scheduled to match the capital-spending timing."
The EPA of today, according to the reports I see in the media, is more of the first class of people than the second. The EPA thinks the environment is so fragile that everything possible -- and then some -- has to be done right now. Lake Michigan proves that our environment may be more robust than the EPA gives it credit for.
The most telling part is that the legislation would, to quote, "bar academic scientists on the panels from talking about matters related to research they're doing." WTF? How is the EPA supposed to make decisions? By ignoring the advice of scientists who work on the matter and taking advice from people who are completely clueless?
Who says the EPA would be ignoring advice from people who work on the matter? All the law does is bar the people judging the applicability of the data from judging their own contributions -- that's a conflict of interest. The EPA holds hearings, where it can solicit the opinions of anyone it wants. So your complaint is a red herring.
The EPA's work has always been based on publicly available, rigorous science. The Repubs are just raising an issue to squeeze in something else.
Then why is the raw data so hard to get? Why are people "adjusting" the raw data? The adjustments come after the raw data is published, as part of the method of analyzing it. What about the work done by people who don't have "climate scientist" after their name? Is that data considered? As I recall, some journals rejected articles submitted by authors in other disciplines, such as aerospace.
Ah, so by the rules in this law, Global Warming can never be proven. Just like it's never been proven that smoking causes cancer. No study on that is "reproducible", because anything that would prove a link by exposing humans to smoke is unethical (thus illegal). It's illegal to prove smoking causes cancer, and thus illegal to reproduce any proof to that effect, so the EPA couldn't regulate smoke, because nobody can replicate a study proving smoke (or lead, or whatever) causes problems in humans.
If one publishes the raw, unmanipulated data, others can evaluate it using other methodologies to see if they come up with the same result. No "adjustments" or "fudge factors" in the first dataset, just the raw measurements. As for proving that smoking causes cancer, there are indeed reproducible studies done with lab animals. There are also reproducible studies using surveys of patient history and outcome to determine what effect smoking has on overall health outcomes. With proper stripping, the raw data is easily collected and published without affecting patient privacy. And you are an idiot for using the term "Global Warming", because that bugaboo has already been debunked. Look at the snow levels in Boston, for example. Now the scientists erase "Global Warming" and replace it with "Climate Change". You really need to update your arguments.
You provide the raw data collected by whatever means, plus the methods used to translate that raw data into a temperature measurement. Very straightforward. The raw data goes out without any adjustment or hedging, so that it is accurate and as complete as you can make it. You then explain very carefully any assumptions behind your transformations, without any handwaving, "here a miracle occurs," or "I just know this means that."
Subject privacy? The first step in any data collection would be to remove identifying information from the incoming data, so that the subject's privacy is maintained. By doing the stripping as the very first step, then publishing the stripped data as the raw data used for the rest of the research, you maintain transparency without compromising subject privacy.
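A minimal sketch of that stripping-first pipeline; the field names here are entirely made up for illustration:

```python
from itertools import count

# Hypothetical record layout; which fields count as identifying
# would come from the study's privacy protocol.
PII_FIELDS = {"name", "address", "ssn"}
_ids = count(1)

def strip_pii(record):
    """First ingestion step: drop identifying fields and assign an
    anonymous subject ID before anything else touches the data."""
    cleaned = {k: v for k, v in record.items() if k not in PII_FIELDS}
    cleaned["subject_id"] = next(_ids)
    return cleaned

raw = {"name": "J. Doe", "ssn": "000-00-0000",
       "smoker": True, "outcome": "emphysema"}
published = strip_pii(raw)
print(published)  # no name/ssn; this is the "raw data" that gets published
```

Because the stripping happens before any analysis, everything downstream (including what gets published) is already privacy-safe by construction.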
The next step, when you want to coerce people to spend money, is to design a model that will predict what will happen, and measure that model against raw data of what actually happens. From that, you can validate your model, so that recommendations made can be measured against the model to determine the cost/benefit ratio.
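Measuring a model against what actually happens is just an error metric; here is a toy validation step with made-up numbers, using root-mean-square error as one reasonable choice:

```python
import math

def rmse(predicted, observed):
    """Root-mean-square error: how far, on average, the model's
    predictions land from the later raw measurements."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(observed))

# Made-up numbers: model forecasts vs. subsequent raw observations.
model_out = [14.1, 14.3, 14.2, 14.6]
actual    = [14.0, 14.4, 14.1, 14.5]
print(round(rmse(model_out, actual), 3))  # 0.1
```

A model that validates this way against held-back raw data is one you can then argue cost/benefit numbers from; a model that doesn't, isn't.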
Guarantee? What guarantee? Both DSL and cable Internet service are provided on a "best effort" basis. If you want an SLA, you have to pay through the nose for it. Guaranteeing an SLA means the provider has to provision dedicated circuit capacity, instead of letting you compete for channel space on a first-come, first-served basis.
With DSL, the uplink and downlink rates depend on the DSLAM-to-CO channel capacity, because DSL is implemented using ATM and virtual circuits over fiber rings. The differing up/down rates are a design decision, based on how many of the sub-carriers are assigned in each direction. Oversubscription is the carrier's choice.
True cable service is another story. The downlink is managed by the head-end, so the feed onto the cable can run at the top rate. Yes, the more users on the subnet in your neighborhood, the slower things can go. The uplink, however, is a single channel shared by a number of sources, so the upstream channel acts like ALOHAnet back in the early '70s: a fractional load can saturate the uplink because of contention. (ThickNet and ThinNet suffered from the same congestion problem... which is why most people use twisted-pair star networks, even in our homes.)
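The "fractional load" point falls straight out of the classic pure-ALOHA analysis: with offered load G (frames per frame-time), the successful fraction is S = G·e^(−2G), which peaks at a dismal ~18.4% of channel capacity.

```python
import math

def pure_aloha_throughput(G):
    """Classic pure-ALOHA result: S = G * e^(-2G), where G is the
    offered load in frames per frame-time. Collisions eat the rest."""
    return G * math.exp(-2 * G)

# Throughput peaks at G = 0.5, and even there only ~18.4% of the
# channel does useful work -- everything beyond that is collisions.
peak = pure_aloha_throughput(0.5)
print(round(peak, 3))  # 0.184
```

So well before the uplink is "full" in the naive sense, contention has already thrown most of its capacity away.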
POTS is 64 kilobits/s in the ideal case, 56 kilobits/s when the path is all-digital, and about 48 kilobits/s when there are analog diplexing amps and such in the way (which continue to go away, thank goodness). But let's not get caught up in nits...
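Those POTS numbers aren't arbitrary; they fall out of standard digital telephony (G.711): 8 kHz sampling at 8 bits per sample, minus robbed-bit signaling.

```python
# Standard digital telephony (G.711): voice sampled at 8 kHz, 8 bits each.
sample_rate_hz = 8000
bits_per_sample = 8

ideal_kbps = sample_rate_hz * bits_per_sample / 1000
print(ideal_kbps)    # 64.0 -- the ideal case

# Robbed-bit signaling steals the low bit of some samples for in-band
# signaling, so an end-to-end digital path only guarantees 7 usable bits.
usable_kbps = sample_rate_hz * 7 / 1000
print(usable_kbps)   # 56.0 -- the all-digital-path case
```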
When you talk about video, you are assuming a single stream of high-quality 1080p video. How many American homes have only one television? (Especially when there is such a glut of analog-only TVs available for a song with the switch to over-the-air digital.) (Or as large-format laptops continue to hit the previously-leased used computer stores.) You can easily have two streams in the poorest of homes, one for the alleged "grown-ups" and one for the kids.
When you start talking about VoIP, you need roughly 100 kilobits/s to handle a single voice conversation and side-channel control, considerably more if you have side-channel "whiteboard" traffic. That's per phone conversation. It adds up when your household has a number of people, and more so in SOHO.
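Back-of-envelope, those streams add up fast. Here is a sketch with hedged per-stream figures (assume roughly 5 Mbps for a decent 1080p stream and the ~100 kbps per VoIP call mentioned above):

```python
# Assumed per-stream figures -- ballpark, not measured:
KBPS_PER_VIDEO = 5000   # ~5 Mbps for a decent 1080p stream
KBPS_PER_CALL = 100     # voice plus side-channel control

def household_kbps(video_streams, voip_calls):
    """Aggregate downstream/upstream demand for a household."""
    return video_streams * KBPS_PER_VIDEO + voip_calls * KBPS_PER_CALL

# Two TVs going plus two simultaneous phone conversations:
print(household_kbps(2, 2))  # 10200 kbps -- already past a 10 Mbps link
```

And that's before anyone opens a whiteboard session or the SOHO traffic kicks in.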
And the cable companies in particular want to keep 1990s pricing as much as they can, because Internet is a cash cow for them once they get CCIEs to maintain the network gear -- an absolute necessity when the cable companies sell 100/100 fiber to larger businesses.
It's about profit and rate of return. And, unlike the other parts of their business, the rate of return on Internet is (for now) unregulated.
3. Not everybody streams HD video. If you don't stream HD video then 25/3 is more than adequate. I watch TV shows from Hulu on my laptop over a 6 Mbps DSL connection.
I don't stream anything, because the short-term packet loss I suffer all the time would clobber streaming. I have "business" cable service, which is fine for mail servers, web browsing, and file transfers, but not VoIP or any real-time applications such as gaming. Skype is just...painful. Even VPN access can be dicey...and that's talking to a 100/100-fibre-connected-through-same-cable-company site.
Instead, I find DVDs/Blu-rays at pawn shops and used-"record" stores, and, for things I just can't wait for other people to discard (or movies people tend to hold onto forever), at Amazon and Barnes & Noble.
"Normal cable companies don't need $100/month for Internet, consumer lobby says.
"The consumer lobby is opposed to a cable industry plan to keep sub-standard Internet server at or above $100/month. Cable companies do just fine with lower rates, the Internet Consumer Association wrote on SlashDot this morning. It wasn't that long ago that Internet access was available for one-fifth the rate, and the cost burden to the cable companies to provide service continues to drop as the Internet access piggy-backs on existing cable infrastructure, especially in the face of cable company promotion of so-called 'triple-play' products: television, telephone, and Internet.
"Notably, no party provides any justification for adopting increased tarriffs for providing service. All the companies provide bogus justifications for charges for service that go well beyond the 'current' and regular' amounts that were in place during the dial-up and DSL days."
(I wonder how the NCTA would respond to such an article, were a parody like this ever to appear in print.)