Then women are the ones advantaged. They get more pardons, reduced sentences, and easier parole. And I haven't even started on domestic issues and divorce, where they are more than likely to be seen as the victim, get alimony, and get child custody...
The domestic and divorce issue depends heavily on the state in which the divorce/domestic violence case occurs. In California, the law does tend to favor women in divorce proceedings; in Georgia, on the other hand, men are favored. Marriage equality laws are too young to provide any evidence about whether these biases are sex-related or are a reflection of the state's definition of fair division. With a national average of around 15% of dual-spouse families having a woman as the primary earner, it may be difficult to get a statistically realistic answer, although without marriage or income equality, it's essentially a gender bias.
I'd like to hear from more people with smart watches who are happy with them, to better understand the appeal.
I'm really happy with the almost-smartwatch Pebble. I wear it in preference to my other watches (and I've always worn a watch). There are three things I like most about it:
- I can, and do, have the sound and vibrate on my phone turned off. I don't worry about silencing my phone going into meetings, the theater, or dinner (or unsilencing it after), and when I leave my phone somewhere in the house or on my desk at work, I still get alerts on my watch when the phone rings or I get a text (the Bluetooth range is surprisingly good). And I don't have to worry that my phone is sitting there, ringing on my desk, and annoying co-workers. Basically, it helps me to be less of a cellphone asshole.
- The battery life is acceptable. Charging once a week is not as good as having a year between battery changes, but it's acceptable.
- It's pretty simple to program (if you're a programmer), and it lets me have bizarro faces, like showing the time in dozenal, or having a YesWatch-style face, or whatever.
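To make the "bizarro faces" remark concrete, here's a minimal sketch of the base-12 conversion such a face needs. This is illustrative only, not the author's actual watchface code (a real Pebble face would be written in C against the Pebble SDK); the digit symbols "X" for ten and "E" for eleven follow a common dozenal convention.

```python
# Dozenal (base-12) clock digits. "X" = ten, "E" = eleven is one
# common dozenal notation; pick whatever glyphs your face can render.
DOZ_DIGITS = "0123456789XE"

def to_dozenal(value: int, width: int = 2) -> str:
    """Render a non-negative integer in base 12, zero-padded to `width`."""
    digits = []
    while value:
        value, rem = divmod(value, 12)
        digits.append(DOZ_DIGITS[rem])
    return "".join(reversed(digits)).rjust(width, "0")

def dozenal_time(hours: int, minutes: int) -> str:
    """Format an H:M pair as two dozenal fields."""
    return f"{to_dozenal(hours)}:{to_dozenal(minutes)}"

print(dozenal_time(23, 59))  # -> "1E:4E" (23 = 1*12 + 11, 59 = 4*12 + 11)
```

On an actual watch, the same arithmetic would run in the tick handler each minute and the result would be drawn to a text layer.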
I have gotten more positive comments about this watch from strangers than about any other single thing I've ever owned. I've been asked about it on public transport in NYC and in check-out lines in Philadelphia and London, and twice I've had people literally stop mid-sentence to ask about it.
I hear this a lot from people who write unmaintainable code that's full of 'clever' tricks that usually have no measurable impact on performance and, when they do, actually end up making things slower.
A microcontroller has almost no relationship to the kind of system that you find in a modern desktop or even mobile phone.
And I hear this a lot from developers who write buggy, inefficient code that fails under load. Ain't anecdotal evidence great? It's almost as good as straw man arguments.
OP was talking about the fact that, when people don't understand the basic fundamentals of how computers work, they make poor design and coding decisions. Having a good understanding about what's going on under the hood, at all levels, is critical for a lead developer.
There is a difficulty of course: cripple the NSA, and you give free and secure communication to all sorts of undesirables.
And herein lies the problem: who gets to define who the "undesirables" are? How do we know they're undesirable? There's a large segment of the American population who think gays are undesirable. There's an even larger segment who think Muslims are undesirable. There are an amazing number of people on
j/k. Even conservatives deserve privacy.
Computer programmers weary of optimizing code
Auto engineers weary of increasing fuel economy
Home owners weary of insulating their houses
Electricity costs money. Reducing the cooling costs of data centers isn't a green issue; it's a cost issue. TFA mentions this specifically:
Steven Brill and his analysts have pounded the table on the importance for IT to pay the electric bill so they understand just how much power they consume.
so I find it odd that the take-away is "green fatigue."
I've had data corruptions with reiserfs.
I've lost data with ext4 (which happened to be the most frustrating, tedious, and complete failure of all).
Most recently, I had some HD failures on a fully RAID-1'd server running entirely on XFS, and had to re-install the OS from scratch and restore from backups. The new install was onto btrfs.
I've had partitions running on btrfs for a little over a year, and have not yet lost data on these, but it's just a matter of time; I will lose data. I used to blame it on cheap drives, but I've seen SMART failures on young Seagates, so I'm now convinced there's no such thing as a high-quality, high-density drive. At the moment, I find btrfs easy to use (intuitive and simple) and full-featured, so it's what I'm currently using. But I suffer from no illusions; at some point, I will have FS corruptions and have to restore from backups, and I can only hope that any FS corruptions won't go undetected and be propagated to my backups for very long before that happens. Failures are inevitable no matter what I use, so now I value simplicity, convenience, and speed.
Right now, btrfs beats the alternatives for convenience and features. I put my trust in backups, not file systems, and value is in features and convenience, not some false perception of safety or reliability.
I guess I don't see the advantage to having a corrupt corporation not looking out for me over a corrupt government not looking out for me. I can't change the corporation, but I can at least try to change the government. Both options seem to have roughly the same success rate overall, so why not support the one that gives me a voice?
Exactly. Unused RAM is wasted RAM.
I keep seeing this assertion. If applications aggressively grab memory and resources that they might use, and if I'm a user who uses the computer for more than just a single-domain application (say, web browsing), then I'm going to encounter a lot of OS swapping as I jump around between applications. If the OS has a free buffer of RAM, then new applications I open are going to open more quickly, as the OS doesn't first have to swap out currently in-use memory.
I can't help but think that this philosophy that unused RAM is wasted RAM is what's led to application bloat. I see this most in my career in situations such as frantic last-minute GC tuning resulting from UAT load testing, or when a developer discovers that some new feature is going to push the app over some threshold and has to go back and spend extra time analyzing and tuning other parts of the code base that don't, strictly, have anything to do with the feature they were tasked with developing.
It's as if, when it became widely recognized that premature optimization had consequences, the industry threw the baby out with the bathwater and took it as permission to entirely ignore resource-use considerations during the initial design and development phase. This philosophy has apparently permeated the general computing zeitgeist, as evidenced by your (commonly held, and understandable) conviction.
Don't you know American companies are crooked, evil, liars?
Tip: sarcasm works best when you don't skirt so close to the truth.
In my limited experience judges don't find it clever if you violate the spirit of the law without violating the letter.
So, what you're saying is that we don't have rule of law in the US, just rule of judge's opinion? It's one thing for a judge to interpret in the case of ambiguity, but you can't convict somebody of violating the spirit of the law. That's why suspects "get off on technicalities."
The law must be a strict definition, or it is subject to being applied differently to different people, based usually on one person's personal biases. Historically, this is Not A Good Thing(tm), and you see evidence of bias abuse in sentencing.
Why is this a bad thing?
Because unless the formatter is truly idempotent, after you've reformatted and then reformatted again, your version control system may think you changed lines that you didn't. This causes spurious conflicts in merging and renders history annotation useless.
Which allows me to rant a little: one of the best and worst things about Go is gofmt. It's nice to have such a tool; it's not so nice that it defaults to using the OS's line endings. If you're going to define whitespace rules, don't wimp out when it comes to line endings.
The iPhone also represented a huge effort
... radically different from other cell phones
Are you suggesting that there weren't full-screen, touch-sensitive slate phones prior to the iPhone? You can go back as far as 2000 (five years before Apple started designing the iPhone, and seven years before it was first sold) to the Ericsson R380; the Sony/Ericsson P800 was even closer -- if you removed the clip-on keyboard, it was the same form factor as the iPhone, with a touch screen and full PDA functions. So how, exactly, was the iPhone "radically different" from other cell phones?
Recently, I've come to the conclusion that products are irrelevant; popularity is all in branding and marketing. We developers (of hardware and software) like to kid ourselves into thinking that we're the ones who do the "real work," but really, it's the sales and marketing people who are the backbone. Apple didn't "invent" the smartphone, any more than they invented the MP3 player (they were three years late on that), or the laptop, or the slate PC (again, late by several years), or any of the other stuff they've been successful with in the past ten years. They've just been able to corner the "sexy" market through good advertising and branding. I think that since Jobs returned to the company, they also paid more attention to quality and product polish, and were willing to sacrifice volume to cover the increased costs that often resulted. But I really think what makes a successful product is the cult of personality.
- Microsoft. There's almost always been a better competing product to whatever Microsoft is selling, but Microsoft managed to capture the business sector through its early and intimate association with IBM. Even OS/2, an arguably better OS, couldn't wrest that crown away, and that's because it didn't have Bill Gates, not because it was a technically inferior product.
- Linux. Minix predates Linux, and had the potential to be as successful as Linux, and can be argued to have a better architecture, but Tanenbaum had different priorities and isn't, I dare suggest, the personality that Linus is. Or, if you don't like microkernels, BSD. Same thing: they don't lack technology, they lack Linus.
- Java. There are a lot of at least equivalent languages out there, even if you restrict yourself to the OO space, but none of them had Sun behind it, pushing Java so aggressively. I'm not going to give McNealy or Gosling credit for that; I don't think there was a personality behind that one, just aggressive and persistent marketing.