Can the Internet of Things stop people from using inappropriate hashtags in long-form content? If so, then please sign me up.
The article is talking about enterprise-grade SSDs using data from 2012. As for the performance difference, it seems to be mainly due to the difference between SLC, MLC, and TLC. From page 4:
Even if economic forces are favorable to continuing price reductions of SSDs and NAND flash, a 2012 study by Microsoft Research (PDF) has found that a dilemma arises when trying to increase density and reduce cost of SSDs. The study looked at 45 flash chips from six different manufacturers and found that, as density increases, bit error rate (BER) and performance get worse. This is because the number of distinct electrical charge ranges needed to store data on a single cell increases as density increases.
The researchers found that, as feature size decreases (increasing density), bit error rates increase. While the SLC and MLC chips with cells that had feature sizes of between 80 and 60 nanometers (nm) usually had BERs of 1e-08, those with feature sizes of 40 nm had BERs at or below 1e-07, and the TLC chips with feature sizes of 20 nm had BERs of, at best, 1e-03.
In addition, the researchers found that increasing density also increases read and write latencies. NAND chips with feature sizes above 64 nm had read latencies of 20us or less and write latencies of 0.5ms or less, while those with feature sizes of 32 nm or less had read latencies between 20us and 60us and write latencies between 0.5ms and 2.5ms.
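To put those BER figures in perspective, here's some back-of-the-envelope arithmetic (my own, not from the study): the expected number of raw bit errors per 4 KiB flash page at each quoted error rate.

```python
# Expected raw bit errors per 4 KiB flash page at the BERs quoted above.
# (Rough illustration only; real drives layer ECC on top of the raw cells.)
PAGE_BITS = 4096 * 8  # one 4 KiB page

for label, ber in [("SLC/MLC, 60-80 nm", 1e-08),
                   ("40 nm", 1e-07),
                   ("TLC, 20 nm", 1e-03)]:
    print(f"{label}: ~{PAGE_BITS * ber:.3g} expected raw bit errors per page")
```

At a BER of 1e-03 that works out to roughly 33 raw errors in every page read, which is why the densest parts need much heavier error correction.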
This leaves SSD and NAND manufacturers with a choice among density, cost, reliability, and performance. In any scenario, at least one of these four must be sacrificed to improve the others. This means that, even if SSDs can achieve cost parity with HDDs, it will be at the expense of reliability, density, or performance. In fact, as discussed above, enterprise-grade SSDs already sacrifice write performance, cost, and even density to address the threat of reduced reliability and data integrity: manufacturers have built non-2.5” form-factor configurations and added special coding and other technologies to meet reliability and performance demands, resulting in more costly products.
More bits per cell requires more precise current sensing, which slows down reads and writes. I suspect parasitic capacitance due to the physical size of the array is also a factor. Performance is also affected by the controller, which may mask some of the bit-level performance differences.
Yeah, I have a lot of those too. I even ended up with a duplicate copy of Civ IV and all its expansions, plus games like Sniper Elite and Red Faction: Armageddon that I definitely never purchased. Also, the multiplayer for a lot of games is a separate Steam title.
But let's talk hypotheticals: if there's a worldwide catastrophe in which civilization is interrupted, somebody specializing in gymel wouldn't be of much use to fellow survivors.
Are you kidding me? Without electronics and industry, all performance arts are live and local. There's no high-quality music on demand from iTunes or YouTube, no recorded music playing at restaurants, parties, or festivals, no constant background music in television and movies. Maybe you can get crappy records made out of wax if you're lucky.
During the day, when most people are doing grunt work, the gymel expert might not be anything special. (Or they might -- people are not solely defined by their profession.) But at night, when everyone's sitting around a fire relaxing? I bet someone who can make strange and beautiful music would be very popular indeed.
That map is so much better and more informative than the tube map that I don't know why the latter exists at all. I know it's supposed to be a simplification, but if you condense that many cables into one route you end up with a map of countries that border the sea, not network routes. For example, there's nothing on the tube map to indicate that the UK is only one or two hops from Japan, or that the Seychelles are at the end of a line, even though it's clearly visible in both your map and the tube map's questionably accurate source material.
At the state level yes... but was overturned later... what's your point?
Five years after it was passed, yes. And the Supreme Court case was resolved on a technicality about Article III standing.
If you bothered to do any research, you'd know that same sex couples who were already married prior to Prop 8 being passed were grandfathered in... so there was no legal limbo, they were married before it passed and married after it passed.
No, they most certainly were not. The full text of Prop 8 was:
Section 1. Title
This measure shall be known and may be cited as the "California Marriage Protection Act."
Section 2. Article I, Section 7.5 is added to the California Constitution, to read:
Sec. 7.5. Only marriage between a man and a woman is valid or recognized in California.
There is no grandfather clause in there. The California Supreme Court did the grandfathering the year after Prop 8 passed. And the same sort of people Brendan Eich donated money to showed up to defend Prop 8 there, too.
This was not a small thing for the people affected by it, nor were the resulting court trials insignificant. If you'd like to understand this better, I recommend reading the transcripts of Perry v. Schwarzenegger.
Seconded. 2-4 GB is enough to turn off your page file in Windows, and that's where performance improvements for normal desktop apps end for me. I have 16 GB for gaming, but I'm not sure I've ever used more than 8.
His $1000 donation did not deny anyone anything; it did, however, assist an organization which could be seen to try to 'deny rights'... that group and its side lost.
You know Prop 8 passed, right? Plunging thousands of gay and lesbian couples who had already married into years of legal limbo? Which was part of an ongoing movement to continue denying gay and lesbian couples legal recognition forever? And you're aware that that movement springs directly from millennia of unjustified prejudice and violent persecution that still lingers today, right? And that all of this deeply affects the lives of many Mozilla employees and Firefox developers? There's a larger context here, and none of that disappears just because a federal court throws out a law.
Eich had every right to be CEO of the foundation
Nobody has a right to a specific job. Especially a job that makes them the public face of an organization that relies on a large and diverse group of outside developers. Eich chose to be an oppressive bigot, and chooses not to apologize for it. That's his choice, but he doesn't get to dodge the consequences just because he's a good coder or manager.
The Mozilla Foundation board should have known better. This isn't a new criticism.
Why is marriage a "basic human right?" It's never been a basic human right.
And the concept of a gay marriage never existed in the 6,000 years of recorded history until about 15 years ago.
When did marriage become a basic human right?
There are many possible answers to that:
1. It was always a basic human right.
2. Around the same time freedom of association became a basic human right.
3. Around the same time the idea of basic human rights developed.
As a matter of American law, it goes back at least as far as 1967 with the unanimous Supreme Court decision in Loving v. Virginia. The UN's 1948 Universal Declaration of Human Rights also mentions marriage:
(1) Men and women of full age, without any limitation due to race, nationality or religion, have the right to marry and to found a family. They are entitled to equal rights as to marriage, during marriage and at its dissolution.
(2) Marriage shall be entered into only with the free and full consent of the intending spouses.
(3) The family is the natural and fundamental group unit of society and is entitled to protection by society and the State.
Why is the government involved pro or con with it to begin with?
By law and custom, marriage is a special relationship. It involves things like formalized joint property ownership, inheritance rights, power of attorney, and responsibility for and authority over children. Historically, marriage sometimes involved a legal union of two people into one person, with the woman's identity disappearing. (This is a bad thing and is no longer done in the U.S., but it was there.)
Why is it only limited to two people?
It may be possible to create a form of marriage that works for three people, but it's not necessarily straightforward. One example is automatic power of attorney when one partner is in the hospital and unable to make decisions. With a bilateral marriage, their spouse has full decision-making power. With a multilateral marriage, what do you do if two spouses disagree about treatment? I don't know if that's a showstopping problem, but it doesn't exist in the context of gay marriage, which is functionally identical to straight marriage.
People here tend to forget that the UN is filled to the brim with corruption.
Nobody forgets that; it's just that the scientists involved don't actually work for the UN. I don't think they even get paid for their (volunteer) work on the IPCC report. There are some UN-paid staffers, but I only see about a dozen listed on the IPCC site. They're all part of the World Meteorological Organization. If you want to call the WMO a hotbed of corruption, you can try, but I'm pretty sure you don't have any reason to do so.
That their human rights body is chaired by countries with the worst human rights records -- and worse, that this is allowed to continue -- demonstrates why everything that comes out of the UN should be looked at with the greatest scepticism.
Well, a worldwide council with maybe five nations in it wouldn't be much use... Joking aside, you're about eight years out of date on that one. Regardless, I don't see how it follows that one bad organization in the UN implies the whole thing is worthless. The UN is a forum where the nations of the world get together to talk. It works about as well as the participants do. There are few (if any) nations that consistently value human rights over convenience, safety, and prejudice. There are a lot more with an interest in accurate weather and climate forecasting.
But computers in 2004 may have had a 20GB hard drive and 1GB of RAM. Today they have 2TB hard drives and 16GB of RAM.
But again, what about the OS needs to change to accommodate that? WinXP can handle 2 TB hard drives just fine. And 16 GB of RAM is neither common nor a necessity for most users. Best Buy still sells plenty of computers with 4 GB of RAM.
Now what we do (and did) need is a good 64-bit operating system, and XP-64 never fit the bill. But what are the alternatives? Vista was a mess. Win7 is good on newer hardware, but only the OEM versions are sold anymore. Today there's a choice between sticking with XP, buying Win8, or taking a chance on an eBay copy of Win7. I went the eBay route for my wife's computer, but it's hard to recommend for the general public.
I'm not disagreeing that we need to move on. But Microsoft has spent most of the last decade screwing up their upgrade path. Maybe if they stopped wildly redesigning the UI every time they put out a new OS, more people would have upgraded by now.
Computers in 2004 weren't all that different from today's computers, though. The AMD64 instruction set was out and consumer-level 64-bit CPUs were available. PCI Express and Serial ATA were standardized the previous year. DDR2 was in use, and you can still buy that today! The biggest changes in PC hardware since 2004 have been multi-core CPUs (which XP handles just fine) and solid state disks, which aren't exactly a compatibility killer. There have been a lot of huge changes in the mobile space, but that has nothing to do with XP. Virtualization is a big deal for servers now, but there are plenty of applications where it's irrelevant.
As a gamer, I upgraded to Win7 for hardware support and newer versions of DirectX. Aside from that, I didn't see a compelling reason to do so. It's not like I could suddenly do anything new with my computer. I can understand why people wouldn't want to shell out tons of money to upgrade. And then there are embedded applications. Where I work we have a ~$20k oscilloscope that runs XP. We're certainly not going to throw *that* out.
Maybe checking the status of an oven (or oven timer?) over the net is useful, but there's no reason to allow the network to turn it on. Separate device control from device status at the hardware level, and you at least keep people's houses from burning down.
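A minimal sketch of that separation (hypothetical class and method names, just to illustrate the shape): the network layer is handed nothing but a read-only status view, while the method that changes heater state is reachable only from the physical controls.

```python
class Oven:
    """Hypothetical oven controller: status is network-readable, control is not."""

    def __init__(self):
        self._heating = False
        self._temperature_c = 20.0

    def press_physical_button(self):
        # Only reachable from the front-panel wiring, never from the network.
        self._heating = not self._heating

    def status(self):
        # Read-only snapshot; safe to hand to the network stack.
        return {"heating": self._heating, "temperature_c": self._temperature_c}


def network_handler(oven):
    # The network-facing code only ever calls status(), so there is
    # no code path from an incoming packet to the heater.
    return oven.status()
```

Enforcing that split in hardware (a status line the network module can read, with no control line wired to it) is even stronger than doing it in software.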
It's teaching a shortcut for doing arithmetic -- one that's easy to do in your head, in fact. The idea is to do the subtraction in pieces by getting to round numbers. So in the example, 15 - 7, you start by getting to 10 (15 - 5 = 10), then you have 2 more left from 7 so you subtract that too (10 - 2 = 8).
The end result gives you 15 - 5 - 2, but to write it that way you have to already know how to break up the 7. Doing it one equation at a time lets you apply the method to larger numbers.
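The method can be sketched in a few lines of Python (my own illustration, not from any curriculum):

```python
def subtract_in_steps(a, b):
    """Subtract b from a by first stepping down to a round number."""
    to_round = a % 10          # distance down to the nearest multiple of 10
    if b <= to_round:
        return a - b           # no decomposition needed
    a -= to_round              # e.g. 15 - 5 = 10
    b -= to_round              # 2 left over from the original 7
    return a - b               # 10 - 2 = 8
```

So `subtract_in_steps(15, 7)` computes 15 - 5 - 2 = 8, exactly the two-step decomposition described above.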