No, definitely the first one. Apt > yum by a long shot.
Notice how he uses the words "breach of contract" in the post. You can't have a breach if there was no contract.
Either Shuttleworth is being VERY loose with legal terminology, which would generally be a bad idea for public statements from a former CEO and still the public face of the company, or there was some agreement in place.
That's just Hyundai. Even my Ford with the widely hated MyFordTouch (aka Sync 2) system doesn't have that problem, nor did my previous Kia (which shares corporate overlords with Hyundai, but strangely they don't share infotainment systems even in their platform-sharing models like Optima/Sonata).
how do you solve the logistical problem of replacing 10 satellites all in completely different positions around the earth in one launch?
You don't. Iridium handles it by keeping some of the satellites in orbit allocated as spares, not in active service. They have 66 active birds plus six spares. The spares run in a different orbit which circles the earth faster than the active constellation but can still easily transfer to the correct orbit, minimizing the fuel needed to activate one in exchange for a longer wait for the orbits to sync up properly for the transfer.
Basically you set things up like a large "cloud" host where there's enough spare capacity that individual device failures just aren't really a priority and you can replace the failed hardware in bulk every so often rather than having to do something one-off immediately.
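For the curious, the orbit-phasing tradeoff works roughly like this. A spare parked in a slightly lower orbit circles faster, so it drifts forward relative to the active plane, and you just wait until the phase lines up before raising it. Quick back-of-envelope sketch (toy altitudes, not actual Iridium parameters):

```python
import math

MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def period(altitude_km):
    """Orbital period (s) for a circular orbit at the given altitude."""
    a = 6371e3 + altitude_km * 1e3  # semi-major axis = Earth radius + altitude
    return 2 * math.pi * math.sqrt(a**3 / MU)

# Toy numbers: active constellation at 780 km, spare parked at 700 km.
t_active = period(780)
t_spare = period(700)

# The lower, faster spare gains this much phase per day (degrees):
drift_per_day = (86400 / t_spare - 86400 / t_active) * 360

# Time to close a 90-degree phase gap just by waiting:
wait_days = 90 / drift_per_day
print(f"spare gains {drift_per_day:.1f} deg/day; 90 deg gap closes in {wait_days:.1f} days")
```

The real cost driver is plane changes, which are far more expensive than phasing, but this shows why "wait for the orbits to sync up" trades time for fuel.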
When someone can read your passwords off your disk, the point of encryption is already moot.
No, encrypting the password database with a master password that's not saved means it can no longer be read directly, significantly raising the bar for capturing passwords.
A) FTP is typically plain text anyway so you could just wireshark it
Depending on user privileges this may not be possible, and would only gather one at a time.
B) you can replace the binaries and have them emailed any time they are entered
Depending on user privileges this may not be possible.
C) you can install a keylogger
This "user" could've just as easily encrypted his entire hard drive or user directory. Still wouldn't have helped though.
No shit that wouldn't have helped: as long as the drive's mounted, the file is plaintext as far as the malware is concerned.
I would seriously reconsider taking a "secure" anything from anyone who can't be bothered to think their own security through.
Clearly you're not capable of thinking through security yourself.
Let's say I'm shithoused and inadvertently run some kind of malware that wants to steal my FTP passwords. I realize what I've done almost immediately after and shut down to restore from backups. If they're stored unencrypted, that malware could have already sent my full stored password list to wherever. If they're encrypted with a master password, the malware gets absolutely nothing. Even if I don't catch it immediately the malware still can't get it no matter what until I actually go to use those passwords.
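To make the difference concrete, here's a toy sketch of a master-password-encrypted store, stdlib only. This is deliberately simplified (a raw PBKDF2-derived XOR keystream, no authentication); a real tool would use an authenticated cipher like AES-GCM, but the point stands: the blob on disk is useless without the master password.

```python
import hashlib
import os

def derive_key(master_password, salt, length):
    """Stretch the master password into a keystream via PBKDF2-HMAC-SHA256.

    Toy illustration only -- a real password manager would use an
    authenticated cipher (e.g. AES-GCM), not a raw XOR keystream.
    """
    key = b""
    counter = 0
    while len(key) < length:
        key += hashlib.pbkdf2_hmac(
            "sha256", master_password.encode(),
            salt + counter.to_bytes(4, "big"), 200_000
        )
        counter += 1
    return key[:length]

def encrypt(master_password, plaintext):
    salt = os.urandom(16)
    key = derive_key(master_password, salt, len(plaintext))
    return salt + bytes(a ^ b for a, b in zip(plaintext, key))

def decrypt(master_password, blob):
    salt, ciphertext = blob[:16], blob[16:]
    key = derive_key(master_password, salt, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, key))

db = b"ftp.example.com user hunter2"   # hypothetical stored credential
blob = encrypt("correct horse", db)
assert db not in blob                  # malware reading the file gets noise
assert decrypt("correct horse", db and blob) == db
```

Malware that grabs `blob` off disk has nothing until you type the master password; an unencrypted store hands over everything instantly.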
If you can't see how huge of a difference that is I don't know what to say.
And how much solar power do they generate when covered by snow/ice? Your objection is short sighted...
My objection is about it taking more power to keep them clear than they could generate.
If they generate 48 watts per panel, but are drawing 150 watts to run the heating elements, they're losing 102 watts the whole time the heating elements are on.
Maybe they have figured out some way to require far less power per square foot to melt snow/ice on a flat surface than the roof heating systems I looked at for reference, but they'd have to be down below 10 watts per square foot to break even under ideal conditions. That is not much heat at all, and as others have pointed out in the sort of conditions where you'd need the heaters running the weather tends to not be anywhere close to ideal for solar so the chances you'd even get 48 watts are slim.
According to all the articles and press releases power generation is the primary purpose of these panels. They claim they'll have enough surplus to offset the energy usage of the entire town square. If they are consuming more power in an hour than they could generate in three, just to keep them able to generate power, that doesn't make a bit of sense.
Now if they were hyping this as an interactive LED sidewalk that's heated to stay clear on its own in winter, and it also happens to generate some solar power, that'd be an entirely different thing. That's not what they're doing though.
just how much snow and ice melting does it take to turn these into a net negative rather than positive generator of energy?
My thoughts exactly. This installation has 30 tiles over 150 square feet, so five square feet per tile, with each tile generating 48 watts total under ideal conditions. Let's be nice and round it to 10 watts per square foot.
Looking at a variety of heated driveway and heated roof systems it seems that most use somewhere between 30 and 60 watts per square foot to effectively combat snow and ice. That's 3-6 times the best-case power generation of these panels.
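Spelling out that arithmetic, using only the figures already quoted in this thread:

```python
tiles = 30
area_sqft = 150
watts_per_tile = 48  # best-case generation per tile, per the press numbers

gen_per_sqft = tiles * watts_per_tile / area_sqft
print(f"best-case generation: {gen_per_sqft:.1f} W/sqft")

# Typical heated driveway/roof systems (range from the comment above):
for heat_w_per_sqft in (30, 60):
    ratio = heat_w_per_sqft / gen_per_sqft
    print(f"{heat_w_per_sqft} W/sqft heating draws {ratio:.2f}x best-case output")
```

So even before accounting for the lousy solar conditions that accompany snowfall, the heaters would eat roughly 3-6x what the tiles could ever produce.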
Your information is years out of date. I've been using an SSH application on an e-reader and getting around half a second refresh. There's also a Debian distro for the pocket Kobo from maybe three years back that has an on-screen clock ticking every second - so less than one second refresh there.
Router/switch activity lights blink at a rate I'm not entirely sure of but definitely exceeds 10Hz. 1-2Hz is not enough to be useful for the purpose IMO.
I have a Kindle Paperwhite; I know how quickly modern displays can refresh. I actually want to build a thermostat that uses an e-ink display because it makes perfect sense in that role. But for a network device's status indicator it's no good.
Even cheap electronic paper can be updated once per second with fairly low power requirements. For activity, the lights have basically been useless for decades: unless you're the only one on the network and are sending pings one per second, they're basically always on. It would be far more use to have a few more pixels and display a logarithmic scale bar of total throughput. For power on, something that alternated between - and | once per second would let you know that there was power flowing, without needing a static light.
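The logarithmic throughput bar could look something like this (my own sketch of the idea, assuming a 1 Gbit/s line rate and one display cell per decade-ish of throughput):

```python
import math

def log_bar(bps, width=10, max_bps=1e9):
    """Render throughput as a logarithmic bar.

    Each filled cell represents a slice of the log scale from 1 bps
    up to line rate (assumed 1 Gbit/s here), so idle, light, and
    saturated links all look visibly different.
    """
    if bps < 1:
        return "." * width  # idle link: empty bar
    filled = min(width, round(width * math.log10(bps) / math.log10(max_bps)))
    return "#" * filled + "." * (width - filled)

for rate in (0, 1e3, 1e6, 5e8, 1e9):
    print(f"{rate:>12.0f} bps |{log_bar(rate)}|")
```

A linear bar would pin near zero for anything short of saturation; the log scale is what makes a VoIP trickle and a backup job distinguishable at a glance.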
I'm looking at the gigabit Cisco switch on the desk next to me and definitely have to disagree with you there. I can clearly see the difference in activity between for example the port my VoIP phone is on and the ports my server and router are on. I can see how heavy the broadcast traffic is based on how often all ports blink simultaneously. I don't know what their actual blink rate is but I can say for sure it's greater than 10Hz on a highly active port. Many times over the years I've used the lights to help locate the source of a network loop or broadcast storm. The fact that the lights can blink rapidly is the key to that working.
An LCD might be able to go fast enough; I'm not sure.
The utilization indicator definitely could work though, I won't deny that.
Make it e-paper, not LCD, then it will be readable under any light. If e-paper displays are cheap enough to put on store shelves as price tags, then they should be cheap enough to serve as a status display on a router.
E-paper would be a terrible display for this purpose. It can't change fast enough to work as an activity light, and since it maintains an image effectively forever until updated it's not trustworthy for lower rate status monitoring like power on. If the device crashed or even powered off entirely without resetting the display first it'd look normal at a glance.
Tuning adapters suck.
Tuning adapters suck for the same reason CableCard as a whole kinda sucked. Because the cable industry as a whole wanted them to suck. Ever notice how their own boxes never had the same problems, even during the time they were forced to use the same CableCard interfaces? Or how variable the support was between providers, with some providers happily shipping cards to consumers and offering self-service interfaces to activate them where others would insist on a truck roll and scheduled appointment (with standard cable company timing)?
Look at the same concept as implemented in Europe. Over-the-air, cable, and satellite television all use variants of the DVB standard. It even has an IPTV variant, though I'm not sure how widely it's deployed in that context. There's a standard interface for a service provider's encryption solution. Any consumer can use any compatible device with any television provider, and it works great.
For whatever reason (read: doesn't benefit the right companies) in this country we have a history of looking at problems Europe's already solved and saying "nope, we can make something much worse for consumers". See also GSM vs. CDMA and the fact that Verizon still insists to this day that they need to individually certify each device while the majority of cell carriers on the planet happily work all day with whatever phones happen to be compatible.
It's easy to be compatible if you want to be compatible. What these companies try to avoid saying outright is that they don't want to be compatible.
Are there a lot of cell towers in these areas where cell service for internet is a viable option?
I have 250/25 cable and theoretically 24/2 DSL (really 14/1.5) at my house. A friend of mine two miles away has no cable and theoretically 6/1 DSL that really delivers about 3/256k most days. The same T-Mobile tower covers both of our houses, off which my old Note 4 gets 65/30.
They've been procrastinating on this "remove NPAPI" thing for years now. They always say they will and never actually do, since it would regress their market position by breaking most of the web.
Chrome (which has about 50% of desktop users) removed NPAPI entirely almost a year ago. None of the mobile browsers (which depending on country may be the majority of internet users) have ever had it. The vast majority of the web isn't going to miss it, because they don't have it right now and they clearly don't care.
What if it were a disk failure instead? Cryptolocker? Inadvertent keystroke, or even cat on the keyboard?
The partition getting deleted is obviously Microsoft's fault. The fact that it caused permanent loss of important data however is more the user's fault. If it's important it needs to be on at least two different disks, and the further separated those disks are physically the better.
Just because someone is the victim doesn't make their actions or lack thereof perfect. If you're not backing up your important data you're guaranteeing that many possible problems which would otherwise be an inconvenience immediately get bumped up to disaster.