Did Netcraft confirm it?
Well, his CR32032 would be amazingly rare.
Out of morbid curiosity, I searched "CR32032" on Amazon. I think that's the first search I've done there that came up with only one item.
That's strange. I've never had a seller demand anything from me. About half of the sellers I've bought from recently gave me positive feedback as soon as they received payment.
Here's my experience buying from eBay or Amazon.
Well, we've bought a lot of batteries from various sellers. I've been harvesting laptop batteries for the 18650 cells to put into phone recharging packs, so we can play Ingress for effectively limitless hours, and for eCigarettes. That's given me a look inside them, and at what condition the actual cells are in. Leftovers, I sell to friends and friends-of-friends at cost.
The recharge packs I have take 4 18650s, so if I get 2500mAh cells, I have a 10000mAh pack. I went with carriers that have a physical on/off switch, rather than a soft switch like the Anker has, so they can sit a long time without discharging. I haven't needed to change batteries in them yet.
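The pack arithmetic is just parallel addition: cells wired in parallel sum their mAh ratings. A minimal sketch (this simplification ignores losses in the pack's 5V boost circuit; the function name is mine, not anything from a real product):

```python
# Cells in parallel sum their capacity; voltage stays at the cell voltage.
# Ignores conversion losses in the pack's 5V boost circuit.
def pack_capacity_mah(cell_mah, n_cells):
    return cell_mah * n_cells

print(pack_capacity_mah(2500, 4))  # 10000, matching the 4-cell pack above
```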
Generally, I buy from eBay. I'm looking for the higher cell counts, and aiming for about $1 to $1.50 per cell. So for an 8-cell pack, I want to spend $8 to $12.
When I crack them open (always more work than it sounds), they all have the standard overheat sensors, which were the earlier concern about exploding batteries. They have all been wired well. Out of, say, a couple dozen packs, I received one that had a dented cell in it. The dent didn't hurt the performance of the cell, but since it was dented, I refused to use it or give it to anyone. Some cells, I've damaged the wrapper on; those I re-shrink-wrap if I'm in urgent need of them, or I dispose of them.
Regardless of whether the listing says it's an OEM or 3rd-party pack, almost all of them have had no-name cells in them. I did get a few true Sony, Panasonic, or Sanyo cells, but they are rarer.
They've all tested out to be the listed capacity, and they all have worked at the expected life expectancy.
The only big exception was the battery for my old cell phone. It originally came with a 1400mAh battery. The only cheap seller listed 1600mAh for about $10/ea. I used them, and they were fine, but they only lasted as long as my original battery did when it was new. When they finally started failing, I peeled the stickers off, and the original markings showed they were 1400mAh batteries. If I had been paying extra for the extra capacity, I may have been upset. Since I just needed batteries that worked, it didn't matter much.
I played Ingress a *lot* with my phone through that period. That draws a lot of power, so I kept a couple spare batteries in my pocket all the time so I could swap them as needed.
My new phone came with a much larger battery (part of my selection criteria), and I don't play as much. I let it charge in the car when I'm driving. If I'm walking, I still carry the external pack, just in case I need it.
So... Pick something cheap on eBay. Look for listings saying they're "new". Don't expect a higher-capacity battery to be any better than the original battery. Since you're looking for cheap, you can generally afford to get a spare.
It's good for your house too. I've seen houses where the homeowner never ran their A/C and they were proud that they saved money. They also had problems with mold, paint peeling, drywall falling apart, and various wood things in their house warping.
At one place I lived, there were ceiling fans throughout the house, which was nice. There were also some on our back porch. The ones inside stayed in almost original condition. The ones outside had rust on the metal parts, and the blades warped.
But this was a discussion about datacenters, so I talked about the corrosion problems with IT equipment.
Well, both sides get charged. We're all either charged on capacity or 95th percentile throughput.
I've never known a residential provider to charge for used throughput, because people have a hard time understanding it. People would flip out if their bill was $20 one month, and $300 the next. Rather, residential providers do a bit of math. They look at their own upstream bill, the aggregate bandwidth used, and the total Mb/s available to customers. Of course, they tack on a nice profit. There are additional considerations, like what they need to provide extra services like IPTV, how much it costs to maintain existing circuits, add new circuits, keep employees paid, travel costs for technicians, etc, etc, etc...
So, you get a nice low flat rate, because consumers don't use 100% of their bandwidth 100% of the time. Basically, they oversubscribe. If they do it right, you never know. If they do it wrong, you have shitty service and everyone complains.
At the datacenter where we have equipment, we pay for the rack, power, and the 95th percentile utilization of our circuit. So if we idle everything for a month, we barely pay anything. If we dump all the load to that datacenter, the bill climbs accordingly.
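The 95th-percentile scheme works roughly like this: sample throughput every five minutes over the month, throw away the top 5% of samples, and bill on the highest one left. A minimal sketch (the function name and sample data are made up for illustration):

```python
def billable_mbps(samples_mbps):
    """Return the 95th-percentile sample: sort, discard the top 5%,
    and take the highest remaining value."""
    ordered = sorted(samples_mbps)
    idx = int(len(ordered) * 0.95) - 1  # last sample below the top 5%
    return ordered[idx]

# A mostly-idle month with a few big bursts: 1000 five-minute samples.
samples = [5] * 950 + [900] * 50
print(billable_mbps(samples))  # 5 -- the bursts fall inside the free 5%
```

This is why short bursts are effectively free under 95th-percentile billing, while sustained load for more than 5% of the month sets the bill.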
If you're running a business where you need to be in a datacenter, your business model better cover all your costs. Otherwise, you'll be out of business quick.
No one gets a free ride. You pay for your end-user line. I pay for the line my server is on. Everyone's paid, and everything works.
since there is no air conditioning, you don't have a condensation problem
No, it's not HVAC-induced condensation. Meteorologists call it the dew point.
Right at this moment, the temp is 53.3F with a relative humidity of 78%. The dew point is 47F.
You're supposed to run a datacenter between 40% and 60% relative humidity. Without a system in place to dry the air, they're asking for corrosion on parts.
You can't say computers are corrosion proof either. When I worked in a computer store, we had computers come in all the time that were in houses with no HVAC, so they were exposed to outdoor humidity.
I left some old gear in a friend's garage for a while. One of the units was a used Catalyst 5000, with cards I didn't really care about. When I put it in the garage, it was in functional condition.
I decided to bring it back up to play with. There was corrosion on the line card handles, and I'm sure corrosion inside. Nothing looked bright and clean. There was visible corrosion on the pins of the cat5 ports. When I took it out of the garage, it barely worked, with lots of errors. Reseating the cards didn't help at all. I don't know (or care) which parts went bad; I sent it off for electronic scrap recycling.
Someone's going to be really pissed off when the servers they spent a fortune on have to be trashed because they stopped working properly.
There are other parts machines in the garage too. I only go to them for fans, power supplies, etc. I had already pulled out all the memory and CPUs. Sometimes they still work. Sometimes they don't.
Specs have some wild numbers on them. Some say they operate in 10% to 90% humidity. Sure, they *can* run in it for a while, but they aren't expected to survive those conditions indefinitely. I've seen some specs that say they'll operate over 120F. Sure, for a very short time. I had one place argue with me because the spec showed wild numbers, but they were already experiencing hardware failures from operating servers in an uncooled server room (the HVAC broke, and they didn't want to fix it).
I should make an obligatory reference to Jurassic Park.
I was guessing, by the fact that they had employees accessing the building, and parking lots, that it was a facility with some sort of access control.
Technically, it's just where you're buying the connection. Netflix is already at a shitload of peering points.
So now I'm even more confused about WTF they're bitching about.
Did you look at their floorplan? There are huge wedge shaped gaps.
Or let's do math. For the sake of argument, let's say that the diagram in their virtual tour is to scale. We're also going to say that each rack is a standard 19" rack, taking up 22" each. That can be wrong, but it's what I'm using for measurement.
The entire circular structure has an area of 24,052 sq ft.
A square structure on the same property would be 30,625 sq ft.
The circular structure wastes 6,573 sq ft.
Each pod, with a 3' buffer on each end and a 3' buffer between rows, would have a footprint of 768.4 sq ft. Since I only included one aisle buffer on each (they share a common aisle), add one more aisle at 102 sq ft.
The total datacenter rack space is really only 3,944 sq ft.
In the square structure, you could put all the racks and aisles and still have 26,681 sq ft left over.
Or about the size of two Olympic size swimming pools.
Or 0.017 LoC.
Or 53,362 bread boxes one layer deep.
Or you could tile the floor of the wasted space with approximately 106,724 AOL CDs, which coincidentally is about half of the total number of AOL CDs received in Centennial, Colorado in one bulk mailing. Unfortunately, it will be very ugly, because tiling a square floor with round objects leaves lots of wasted space.
I could dazzle you with more numbers, but you've already started cursing me, and I really don't care.
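For anyone who wants to check the arithmetic, here is the whole back-of-the-envelope rerun in a few lines. All the inputs are the figures from the text; the pod count of 5 isn't stated above, it's inferred from the 3,944 sq ft total:

```python
import math

circle_area = 24052                           # sq ft, the round building
side = 2 * math.sqrt(circle_area / math.pi)   # diameter, about 175 ft
square_area = side ** 2                       # about 30,625 sq ft on the same lot
wasted = square_area - circle_area            # about 6,573 sq ft

rack_space = 5 * 768.4 + 102                  # 5 pods plus the shared aisle
leftover = 30625 - round(rack_space)          # room left in a square building

print(round(rack_space), leftover)            # 3944 26681
```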
(really? Cogent? really really? Well done, Netflix. Not pinching any pennies, at all)
It seems most people either don't know whose service is how good, or they ignore it.
But hey, they could have gone with Internap. Did they ever lay any of their own fiber, or are they still pushing traffic over the cheapest possible transit?
I used to run a big adult site. We wanted servers closer to the customers for speed. We made enough that we didn't really care about the connection costs. We'd put up server farms around the world where it suited our customers best.
We owned every piece of equipment in our cabinet or cage (depending on the location). The provider equipment ended at the fiber they dropped to us, and the power outlets.
Their own CDN site talks about putting Netflix gear out for free. So they are basically saying they want a free ride. No one gets rack space, power, and connections for free. The right thing to do would be to lease the space like everyone else does.
But hey, they love to cry about being treated unfairly. They are the loudest ones about it. Honestly, other than speed complaints that are usually a fault, not a conspiracy, I don't know of anyone else talking about the same thing.
It is possible that the world is ganging up on Netflix. It happened to Cogent, more than once. That was mostly because they refused to pay on their contractual obligations.
As described, after looking at their materials, I don't see an advantage to the radial design over a grid design. There is nothing about it that would improve airflow, and it leaves huge underutilized areas.
On the other hand, a traditional grid design optimizes the space, and it would still allow for the same airflow.
It's not a matter of being round, or having dead space, it's simple things we teach children. Square boxes don't fit through round holes. Round objects don't stack optimally.
One of the Equinix datacenters in Los Angeles (previously Pihana Pacific) has all of its cooling on one side of the room, and returns on the other side. Each row is basically a wind tunnel. There is no appreciable temperature difference between the two sides. Both the front and back of the cabinets have the same airflow, and maintain roughly the same temperature.
As far as the total power load, they could keep the load the same, and have almost half of the building for just storage.
Of course, a square building, which the industry uses as a standard for this kind of work, would not make the news. No one would be talking about it.
I guess if they have money to burn and real estate to waste, it doesn't matter what shape they make it or how much space is underutilized.
Did you notice that he talked about the doors to the warm side? Controlled and logged access. And just a couple seconds later he says the tops of the pods are all open to the common upper area. I'd hope they'll have some sort of barrier, but I doubt it would be anything that bolt cutters (or just tin snips) and a few minutes would have a problem with.
Particle filtration does not mean it dehumidifies.