Same as how upkeep is done in submarines. They also have very low oxygen, but enough for a person not to die. Of course, there are side effects... your thinking is slower, and wounds take a lot longer to heal, but it does work, and the low O2 in the air does keep fires from spreading.
I hope the RAM that is installed is replaceable. If not, 8GB was acceptable back in 2008... but a laptop should have 16, if not 32 gigs of RAM. This is the biggest turnoff of the MacBook Airs. Disk space can be worked around using a third-party SSD that goes in the SD card slot. Network connectivity can be augmented via a Thunderbolt or USB NIC. The CPU is good enough for most tasks, but RAM is the biggest bottleneck.
The paucity of RAM is my biggest complaint. For a lightweight laptop, the other stuff is acceptable. It would be nice if Dell and other PC vendors would hop on the Thunderbolt bandwagon, which would allow for an external GPU (assuming enough PCIe lanes are available to make it worthwhile.)
Of course, it would be nice to have a dock connector, but Dimensions are consumer-level models, and it would likely never get used.
I would guess it would be cheaper in most cases for an attacker to black-bag the hardware (evil maid attack), or just use xkcd.com/538 and a wrench.
TEMPEST attacks are very low on my worry list. If I were running an organization that dealt with data that sensitive, it would be well tucked away in a building designed from the ground up to keep cameras and detectors quite a ways from the juicy stuff. However, before I even bothered with that, I'd be working on physical security, network security, various encryption levels, and having pentesters in to verify that the stuff in place is actually doing the job versus just looking cool.
Cooling costs come to mind as well. SSDs are one thing, as they can be powered off and not used. However, HDDs have to be either spinning (which creates a lot of heat, especially at 10k+ RPMs that enterprise disks spin at), or spun up/down, and spinning enterprise disks up and down isn't good for them, and might even cause array faults unless the array firmware is designed to deal with it.
There is also expense. If I have five hard disks worth of data, I need (5*4)/2, or ten HDDs by the OP's metrics. However, I've had batches of hard drives all fail at once. If I get multiple failures, even RAID 6 isn't going to help. If HDDs popped at random times, I might be OK, but not in this case.
Of course, I've ranted about this before... RAID is solid for protecting data against disk failure... but that is just one of -many- failure scenarios. I have seen disk controllers fail and write garbage to the entire array. One goober doing an rm or a dd command will toss the array. If you want serious backups, you need to not just focus on disk. Tape isn't perfect, but done right, after the initial cost of the drive, the cartridges are inexpensive, take zero watts (other than climate control), last decades, have innate encryption (LTO-4 and newer), and can have hardware write protect enabled, as well as WORM media. This is great for people with the "keep it forever" mindset. Just set a password, stream the data off to a pile of WORM tapes, and stuff those in a closet somewhere. If the tapes vanish, since they were encrypted, and assuming only a few people have the password, it can be written off as "just" a hardware loss.
: It is boneheadedly easy to set encryption on LTO media via SPIN/SPOUT, so might as well set something, even if it is a variant of "correct horse battery staple". Ideally, the password should change every year or so... but just setting -something- is better than nothing.
You hit the nail on the head. The first thing any leader will do is try to get the aliens onto their side, be it by duplicity, cunning or whatnot.
I wouldn't be surprised if there were something like the Prime Directive out there, just because the lesson of an advanced civilization bumping into a lesser one usually means the lesser one is immediately eradicated. Our history shows this to be evident over and over again. (Aztecs, anyone?)
I would suspect any advanced power that can visit the earth undetected likely has its own agents and such, so they have an in-depth knowledge of the world and its goings-on, at least on the level of the CIA's World Factbook, if not more.
So, when would the aliens step in? I would say only after an XK-level event where 90-99% of the population is gone and the forms of government we have now are destroyed. Maybe they wouldn't even get near us unless the human race got to the point of having too few people left to effectively sustain itself with DNA mixing. Something far greater than the Black Plague, which changed the West from a dukedom-fighting-duchy backwater where a peasant was almost certainly dead before age 20 into one of the top dogs (mainly because kings just didn't have the physical backs to break to keep themselves in power, coupled with the fact that farms could grow things other than basic food items to keep people fed.) A CK-level event where power fundamentally shifts would, at best, just get aliens betting on which nation/religion will win the year's pissing contests. (Think Command & Conquer Generals, but with ISIS+AQ replacing the GLA faction, and Russia part of China's team.)
If someone gets physical access to my machine while I'm away and the screen locker has not activated, regardless of OS I am on, I am screwed. Be it Windows where a utility can be run to hook into the keyboard, OS X and a
Realistically, X-Windows authentication and running rogue clients has been a non-issue since the late 1990s. By default, X is locked down quite tightly, taking an explicit "xhost +" to undo those measures. Even when SSH-ing into a remote machine, by default, the X-Windows port is not authorized or forwarded unless both the client and server are explicitly configured to permit this. These days, relatively few applications are X-Windows clients, other than legacy stuff. Most enterprise-level items (be it an Isilon, VNX, VMware vSphere, tape silo, and so on) either have a dedicated client, allow SSH in, or have a web page for their configuration. The last time I used an X-Windows client from a remote machine was running the NetBackup administrative client from a master server, because it was the most reliable way I could watch what was going on.
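For illustration, these are the opt-ins described above as they appear in stock OpenSSH configuration (shown as a sketch; paths and defaults match typical installs, but check your distribution's files):

```
# Server side (/etc/ssh/sshd_config): X11 forwarding is off by default
X11Forwarding yes

# Client side: request forwarding per-connection...
#   ssh -X user@host
# ...or per-host in ~/.ssh/config:
ForwardX11 yes

# The foot-gun mentioned above -- never run this; it disables
# X access control entirely:
#   xhost +
```

Unless both sides are changed like this, a remote X client simply has no authorized path to your display.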
One cannot make light of security holes, but there are holes worth working on, and holes so difficult to exploit that an attacker will ignore them. It takes explicit commands to force X-Windows to allow clients from anywhere other than the local machine to connect (including disabling the kernel packet filter or actively allowing the connections through it.) So, someone connecting remotely to an X server before xlock activates can be a hole... but it is one that is extremely hard to take advantage of.
It does have its appeal. For the average user who isn't that technical, and who doesn't know/care how to use PGP or gnuPG, this phone is a step up. At least a user who bought this will get better fixes with regards to security issues than with a lot of smartphones.
My biggest complaint is that it is a closed ecosystem. It would be nice if other devices that are not BlackPhones can run the apps so there can be a wider customer base. Otherwise, the device's acceptance will be hindered because everyone has to have that specific maker's phone. Plus, for every closed application, there is an open alternative.
Maybe the ideal would be to get PGP working independently and transparently with text messaging, mail, voice, video, and other items. That way, the metadata can be protected via one layer, but the actual contents are protected no matter what, even if the protocol is completely broken wide open.
: An ideal would be something where the sender's device would check if the receiver had the ability to receive (likely by having the app poll a server every so often), and if so, send it over the Internet (mainly so it can be acknowledged as received). If not, send it via SMS/MMS. Unlike iMessage, it would fall back and not assume that a specific app was installed and running.
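A minimal sketch of that fallback logic, with the transport functions (check_receiver_online, send_internet, send_sms) as stand-ins for illustration, not any real messaging app's API:

```python
# Hypothetical delivery logic: prefer the acknowledged Internet path,
# fall back to SMS/MMS rather than assuming the app is installed.

def deliver(message, recipient, check_receiver_online, send_internet, send_sms):
    """Return which transport actually carried the message."""
    if check_receiver_online(recipient):
        if send_internet(recipient, message):  # True = receipt acknowledged
            return "internet"
    # Unlike iMessage, never assume the receiving app is running:
    send_sms(recipient, message)
    return "sms"
```

The key point is the unconditional fallback: a failed or unacknowledged Internet send still results in an SMS, so the message is never silently dropped.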
This is one reason why I have hedged on buying one. How are they better than CyanogenMod plus open-source tools, be it apg, K-9, EncFS (so files can be secured on both SD cards and cloud providers), RedPhone, TextSecure, and other apps that have their source available if one wants to manually look at it?
I respect PRZ incredibly, but one of the reasons why I continue to use PGP even though he states that it is obsolete is that PGP (and GnuPG) are open source... and they are platform and transport mechanism independent. I can send an OpenPGP ASCII-armored packet via E-mail, texting, XMPP, Facebook, or any other messaging protocol. I do respect PRZ for founding a security company in an era where most "security" is PR, but I prefer to pack my own parachute and use the tried and true.
The problem is that a company that has security as part of their mindset is hard to find. Most at best have it as an afterthought, something strapped on at the last moment.
Security takes R&D, just like everything else. Would I expect a v1.0 product to be secure, especially from focused attack by people who want to bypass it? No, and not even in a v1.0.10 product. Breaches will happen for the first few years.
However, I will state one thing about BlackPhone: they fixed the issue. Other vendors would just tell their customers to buy a new smartphone or go pound sand. Where the rubber meets the road is how security flaws are handled. Are they acknowledged and patched, or are they covered up, flagged as FNR (fixed in next release), with only threats of litigation actually getting the vendor to make a patch? There will -always- be flaws. However, part of a company selling security is how it responds to issues, and here, BlackPhone has performed quite well. There was a problem, they fixed it, and that is what matters.
I do have one hope -- the USB bus seems to still have devices that interoperate at USB 1.1 speeds, even now, almost 15 years later. This is a good thing. If those devices are still usable on modern systems, then a floppy drive, or a CD drive are usable and would continue to be usable. USB 3 definitely is different, but there will be adapters so that people's mice and other items will continue to operate.
The parent is correct though. Critical data can't just be tossed on some media and forgotten. Ideally, every year or two, it should be copied onto something new. At least every five years, it should see a new medium.
What comes to my mind are software products like TrueCrypt. Who would have thought that TC, something one had as a utility for over a decade, would be sunsetted with multiple, incompatible forks out there? Now is a good time to move data stored in that format to another secure format.
Tape poses two problems -- not just finding a physical drive, but figuring out what software was used. This is a bit easier with LTFS (put the tape in, and it has a filesystem that is mountable), but in general, is the data stored using tar, or some vendor-specific utility? AFAIK, NetBackup uses cpio, IBM TSM uses its own specific format, and so on. If handed a tape, it becomes a matter of guessing to find out what is stashed on it, and with some formats like DLT, one also has to factor in blocksizes. However, if one documents and keeps the backup programs around, this shouldn't be a major issue, although it seems to be often overlooked.
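The guessing game can be sketched as checking the first block read off the media against well-known magic numbers. The offsets below come from the POSIX ustar (tar) and cpio archive formats; vendor formats like TSM's won't match anything here, and with DLT you'd still have to guess the blocksize before getting even this far:

```python
# Rough format sniffer for an unlabeled backup block.

def sniff_backup_format(block: bytes) -> str:
    # POSIX tar puts the "ustar" magic at offset 257 of the first header.
    if len(block) >= 262 and block[257:262] == b"ustar":
        return "tar"
    # ASCII cpio archives start with one of three magic strings
    # (odc, newc, and newc-with-CRC variants).
    if block[:6] in (b"070707", b"070701", b"070702"):
        return "cpio"
    return "unknown"
```

Tools like file(1) do essentially this with a much larger magic database.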
: If the data is static, and one isn't worried about an intruder knowing the data's size, gpg or PGP Zip come to mind. Drive images are harder -- since TC is gone, one sort of has to bet between VeraCrypt and CipherShed to see which one will continue versus which will be discontinued.
Maybe I have been lucky. I have CDs I made in the late 1990s when CD-R writers were 1-2x speed, and I can still read data from those. I once had to pull some files, so I grabbed a DVD from about a decade ago, extracted the files, and called it done. Since I use WinRAR as an archiver, I do know if there is bitrot, and if damage did happen, there is a chance that it can be handled by a recovery record.
I've also been lucky with tapes as well. I've restored DLT media over a decade old with zero errors.
Of course, when it comes to hard drives, I have a nice pile of dead ones over the years, including a batch of drives which failed at the same time. Similar with USB flash drives.
I am hoping the Sony and Panasonic Archival Disc product gains some steam and the price of drives drops by a factor of 10-20. A 300 GB AD, or a 160 GB Ultra HD Blu-ray (yes, that is the name, announced a few weeks ago), would be useful as a long-term backup/archive format, especially since the technology is innately WORM-driven.
Of course, here is something I wonder about which would help immensely with backups: Why isn't there a decent backup/archive/retrieval program out there that works well with multiple media types? Retrospect used to be good, but doesn't support USB Blu-Ray drives (making it worthless for archiving). In the enterprise, there is NetBackup, Tivoli TSM, ArcServe, Networker, heck, even Backup Exec. These not only do backups and restores, but can transfer stored backup sets between media types, validate backups, retrieve archived files, periodically move data from one pool to another (say, from disk to tape), and handle one set of data (documents versus OS files) differently from another.
Why do I have to pay insane prices for an enterprise-tier of software if I want the ability to select some documents, click "archive", have them copied to an archive media pool, then go on? When I want to make sure the backups are secure, I create another pool on an external drive, copy the data there, and flag that pool as offsite. This way, every single file I have is backed up, and archived or deleted files are retrievable with just one command (plus perhaps additional time to attach the media if it is offline).
This isn't state-of-the-art functionality here... ADSM (now TSM) had this stuff back in 1998. This shouldn't be locked to an appliance either. The Unitrends appliance and the former WHS were nice devices, but it would be nice to have a server handle the backup coordination, and then, if the need arose, separate media servers could be used as well... for a price well under five digits.
Backup software -- and I don't mean Acronis TrueImage and the other clones that can copy data to a drive or offsite and back, but software that can keep track of multiple media types, move files between them, deduplicate files, and figure out where some spreadsheet from 2008 is out of hundreds of burned DVDs -- just has not kept up with the times for average users. I just don't see why Symantec, EMC, or IBM doesn't offer this for home users. Not only would it make data safekeeping easy, but because the server could be installed on a separate machine that accesses the local desktops, malware on a client would not be able to destroy the data on other machines.
There is no -best- medium:
Paper is always readable, but can be easily destroyed by water or fire, and stores the least amount of info per unit of size of anything else.
The cloud will be present barring SHTF, but there are security issues, so data needs to be encrypted at the endpoint.
Tape is an archival-grade medium, but the drive is expensive ($3000+), it requires a fast computer to prevent shoe-shining, and it either requires a program for backups/restores, or one can use LTFS to have the tape appear as a hard drive. (Going this route, one can use LTFS or even just tar to stash a copy of the backup program and its keys for install, then install/use the program for the rest of the tapes.) Tapes can be physically set read-only so malware can't tamper with contents. One can also buy WORM tapes that further guarantee protection against data modification.
External hard drives are cheap and easy to use... but are not an archival grade medium, can fail, and can be zapped by malware.
Optical drives can function well as WORM media, and are inexpensive... but their present capacities are minuscule (25 GB is the best bang-for-the-buck price point, although the next-gen Archival Disc format may actually make optical media viable again for backups.) If Sony and Panasonic can make AD drives and autochangers at a price point well under LTO 4-6, they may just have a major untapped market. Sony does have high-capacity optical disk drives... but they run in the $6000-$7000 range, so hopefully this price will drop by a large amount once mass-produced.
SSD is decent and fast... but it is nowhere near permanent (the charge will eventually leak out of those flash gates), and once the data is lost, it is gone for good.
My take: I use various redundant media. Critical files get burned onto Blu-Ray media using Nero's SecurDisc or DVDisaster (for error checking/correction), stashed in an encrypted container. I also periodically buy a large external HDD, copy everything from my machines onto it, let it deduplicate, then copy all the stuff from the normal backup drives onto the volume as well. With deduplication, this doesn't use up that much space.
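The deduplication idea above can be sketched as storing each unique piece of content once, keyed by its hash, so copying the same files onto the big drive again and again costs almost nothing. Real dedup implementations work on blocks rather than whole files, but the principle is the same:

```python
# Whole-file deduplication sketch: one stored copy per unique SHA-256.
import hashlib

def dedup(files):
    """files: dict of path -> bytes.
    Returns (store, index): store holds one copy per unique hash,
    index maps every original path back to its content hash."""
    store, index = {}, {}
    for path, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        store.setdefault(digest, data)  # kept once, however many copies exist
        index[path] = digest            # every original path stays resolvable
    return store, index
```

Ten full-machine copies that share 90% of their files then only cost a little over one copy's worth of space.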
: You never know who has access to the files, and the provider can go bankrupt at any time, allowing the next owner of the physical servers free access to the stored data without any legal ramifications whatsoever. In fact, one cloud provider even has it in their TOS that the next person owning the firm gets all data free and clear.
: You used to be able to spend a few C-notes on a 400-disc CD changer. An optical silo holding 400 discs isn't much different, so with the 300 gigs promised this year per disc, that gives 120 terabytes of WORM media in 3-4 rack units.
Not all chargers are alike. There are reviews about how clean the 5 volt DC power is on various models, especially when one plugs and unplugs devices... and it varies from quite good to pure crap.
A lot of them only handle 500 milliamps, making them worthless for newer devices; some devices require 2.1 amps in order to even bother charging.
There are many brands that are decent. If I am going to use a wall socket, I want as many ports as possible, so I like the Lumsing five port model, assuming there is space. This way, I can leave a set of charging cables in place, so regardless if the device is a BT earpiece, a smartphone, a tablet, or an external battery, it can find a teat to suckle on.
I am not into the "boutique" chargers. There comes a point where you get what you pay for, then hit diminishing returns -- similar to audio equipment, where the audiophile exponential price curve at the high end gives little back in return.
Of course, there is one invention I'd like to see: a decent charger that has a very good, replaceable battery pack (at least 20 ampere-hours). This won't be a small device, but it would definitely come in handy camping/RV-ing, especially with the 40-watt-equivalent USB bulbs available. One can provide enough light for reading; a few can provide light for the entire camper. It would also be useful for power blackouts, since when the battery gets discharged, it can be swapped with another.
: This is audiophile stuff. Studio/professional stuff can get expensive... but there is good reason for it, other than "it uses a rare substance that only discerning ears can tell."
I would love an 8-port charger, especially for two people. It could top off both of our iPads, both our phones, my Bluetooth headsets (one earpiece for the road if I'm not using my car, another for stereo listening), my Kindle (an e-Ink model which is easier on the eyes for long reading), and an external battery so when I'm camping/RV-ing, I have the ability to keep my phone topped off.
On a computer where I'm using multiple external hard drives, having USB ports becomes more critical, preferably USB ports on different cards, so I can run drives on separate buses.
The problem is that convenience got ahead of security. Until the hit on Sony, the biggest threat to companies was hardware failure. So, companies went with SAN installations that had RAID6, async replication via WAN, snapshots, multiple tiers, and deduplication. More backups needed? Add more drives, maybe a controller.
Tape (and also optical, although optical has not kept up with the times when it comes to storage) became something considered a dinosaur.
This model worked perfectly when the bad guys were logging in to copy off the plans for the next mouse trap, and then go about their business.
The Sony hack has changed things. It only takes one command issued as root to completely purge an entire SAN of all LUNs and directories. Replication? The remote SAN will happily replicate the deleted directories and zeroed LUNs. Snapshots? Easily deleted.
Even non storage items are affected. Firmware can be easily zeroed out, and bricking expensive machinery can be a victory for an extremist group looking for publicity.
As stated above, it is time for physical write-protect switches to happen, and it is time to start factoring storage tiers with offline (perhaps WORM) media... media that can't be erased with just one command.
: The best is a physical switch or jumper, but even if it is a button or combination of buttons held down, this is better than what we have now. We should never have left the concept of "flip to writable and boot from clean media to initiate the flash update process" behind in the first place.