Yeah, it's not even the vomit of one bee; they have a massive mutual "20 bees, one cell" party, eating each other's vomit and sicking it up again.
Andouillette - sausages made from colonic intestinal linings - are a fine example. I ordered them unknowingly in a French bistro on holiday and figured out what they were made from about halfway through. Finishing the plate required a kind of self-administered Jedi mind trick, but they were still mighty tasty.
Actually much more boring - pork meat, suet, and oats. It's basically a cheap porridge sausage, which is about what you'd expect from the Scots. It's popular in Ireland too. I rather like it.
A friend used to speculate that if black pudding was the red blood cells, then white pudding was the white ones (i.e. pus). But it would be prohibitively expensive to separate them out, and it really would be a luxury foodstuff, as white cells are only a tiny fraction of the total cells in the blood.
Auto == self. You eat bits of yourself. The linings of your mouth, accidentally biting your cheeks and tongue, etc. Many people also nibble their nails, their skin, etc.
Cannibalism is daft though. You're far more likely to catch a disease that thrives in humans that way.
Puerto Rico: 2 people killed and 25 injured annually from New Year's Eve celebratory gunfire.
Maybe not collaboration between Google and Microsoft, but maybe collaboration between Microsoft and the content cartels.
"Hey, people are downloading stuff from YouTube and saving it. We wondered if y'all at Microsoft could fix that."
"If we give Google a reason to require obnoxious DRM on all YouTube content, it will serve both your needs, and also ours, because Google will have to spend a lot more on CPU time encrypting all that stuff."
Shoes, tyres, etc. already have them. Walmart love them for stock control - which means at least one in each carton, and then one each on high-ticket items. They've experimented with pickup loops on the shelf, tracking stock in real time. They've run commercials showing a man billed for all the items in his pockets via RFID - not showing their hand much...
It would be amusing to pick it apart and see how much prior art and how many ridiculous claims it contained.
Futurama requires that you feed and clothe a shitload of honest hardworking Koreans. This just requires a few asshats.
Yeah; it's even worse. Regulated means that people with power/lobbying cash can use them, and the general public cannot.
Technology like this is fundamentally democratising - the sad side to that is that it democratises snooping, drone attacks, etc. The glad side is that intelligence gathering is no longer the sole province of those able to afford a vast intelligence apparatus.
Those with power love to protect their own privacy, because they are more likely to have something to hide. Citizens with drones scare them, because drones create remote sensing platforms with a low entry cost that scale with the number of participants. Current remote sensing platforms require very high buy-in (you need to buy a CCTV network, plane, helicopter, or satellite launch) and are thus the sole province of large organizations - and large organizations are more likely to be sociopathic in nature.
* Find environmental violations (drone with pollution sensors)
* Detect abnormal nocturnal activity (drone with IR camera and some software that learns where IR hotspots usually are - and aren't)
* Work out footfall density in urban areas (useful to know where to site stalls / shops)
Think up your own. Corporations love intelligence, and hate the idea of other people having better intelligence. Especially about them.
Imagine, if you will, a cloud of drones. You can't control the drones, but there are a lot of them, and they contain a bunch of algorithms that cause them to congregate in areas you tend to find interesting anyway. All the drones upload their data to you, and you have a giant server farm dedicated to extracting useful intelligence from the data. That's Glass. It's ironic but unsurprising that Schmidt will promote this squadron of drones, and try hard to stop people owning and operating their own.
One of my eyes has a lazier focus than the other. Being a nerd and reading books and screens all day, I noticed from the age of about 17 that my distance vision starts to get a bit fuzzy unless I get outside and look at distant objects, and that this is more pronounced in one eye than the other.
3D films help with the difference between the eyes, because you have to focus both eyes correctly for the effect to work. It's not like the real world, where a slightly fuzzy object seems acceptable to your brain; in a 3D film the fuzziness is really noticeable, and your eyes work harder (in my experience).
I also wear reading glasses (+1D) when my distance vision gets fuzzy - not because I need them to read, but because they move the focal point at my monitor distance out to what is effectively infinity, solving the problem of having to look at distant objects without cutting into my hacking time.
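If you run the numbers (a minimal sketch, assuming a monitor roughly 1 m away - simple thin-lens vergence):

    vergence of light from a monitor at 1 m:  -1/(1 m) = -1 D
    power added by the reading glasses:       +1 D
    net vergence reaching the eye:             0 D  (parallel rays, i.e. optical infinity)

So the eye relaxes exactly as it would for a distant object. A closer monitor would need a slightly stronger lens (+2D for 0.5 m).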
who exactly does that? using 8 bytes when you only need 4 is just stupid.
CPUs move memory around in register-sized chunks ("words"). Therefore a CPU operating in 64-bit mode moves memory around in 64-bit sized words.
You can gain some ground by packing smaller variables together, but there will be some slack for things that don't fit into the chunk size. And it's more efficient to access memory aligned to word boundaries.
You may as well ask "why use 8 bits when you only need one?" - most databases store boolean values as a whole byte, because it's a total pain in the arse to write a single bit and then offset the rest of the row by one bit just to save 7 bits of space. If you have multiple boolean fields (up to 8 per byte), they get packed together, because it's much cheaper to shift bits within a single byte than it is to shift the rest of the row.
So the answer is, everyone does that, because their compiler takes care of it for them.
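To make the boolean-packing point concrete, here's a minimal Java sketch (the class and method names are mine, purely illustrative):

    // Pack up to 8 boolean flags into a single byte, the way a database row might.
    public class PackedFlags {
        private byte bits;

        // Switch bit 'index' (0-7) on or off.
        public void set(int index, boolean value) {
            if (value) {
                bits |= (1 << index);
            } else {
                bits &= ~(1 << index);
            }
        }

        // Read bit 'index' back out.
        public boolean get(int index) {
            return (bits & (1 << index)) != 0;
        }

        public static void main(String[] args) {
            PackedFlags row = new PackedFlags();
            row.set(0, true);               // e.g. "is_active"
            row.set(5, true);               // e.g. "is_admin"
            System.out.println(row.get(0)); // true
            System.out.println(row.get(1)); // false
            System.out.println(row.get(5)); // true
        }
    }

All eight flags live in one aligned byte, and setting one never disturbs its neighbours - which is exactly why databases pack booleans this way rather than splicing single bits into the row.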
It might also be because of L2 cache sizes, or bus speeds.
The P4 Northwood had an L2 cache of 512 KB.
The Athlon 64 had an L2 cache of 1 MB.
Most of the text-processing jobs I run (XML, XSLT, HTML Tidy, regexes) get a really big boost from a larger cache. The jump in performance from a Core 2 Duo to a two-core i5 is very noticeable for parts that run at similar clock speeds.
He's de-duping files with SHA-512, from the listing.
That will get a major boost on 64-bit machines just because of the increased word width - SHA-512's compression function is built entirely from 64-bit operations, so it runs much faster with 64-bit registers. I imagine the hashing step is what's consuming most of the CPU time, making the code CPU-bound rather than I/O-bound.
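Something along these lines - my guess at the shape of it in Java, not the actual code from the listing:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;
    import java.util.stream.Stream;

    // Sketch of SHA-512 de-duplication: hash every regular file under a
    // directory and print any groups of paths whose digests match.
    public class Dedup {
        public static void main(String[] args) throws IOException, NoSuchAlgorithmException {
            List<Path> files;
            try (Stream<Path> walk = Files.walk(Path.of(args[0]))) {
                files = walk.filter(Files::isRegularFile).collect(Collectors.toList());
            }
            MessageDigest md = MessageDigest.getInstance("SHA-512");
            Map<String, List<Path>> byHash = new HashMap<>();
            for (Path p : files) {
                // digest() resets the MessageDigest, so it can be reused each iteration.
                byte[] digest = md.digest(Files.readAllBytes(p)); // fine for smallish files
                StringBuilder hex = new StringBuilder();
                for (byte b : digest) hex.append(String.format("%02x", b));
                byHash.computeIfAbsent(hex.toString(), k -> new ArrayList<>()).add(p);
            }
            for (List<Path> group : byHash.values()) {
                if (group.size() > 1) System.out.println("Duplicates: " + group);
            }
        }
    }

Every byte still has to be read and hashed; if the hashing dominates, the 64-bit word width is exactly where the speedup lands.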
I agree that 64-bit machines are somewhat niche, but I work in that niche.
If you do anything serious with Java on Windows, then because of the memory layout and the HotSpot VM's insistence on being allocated a contiguous stretch of address space, you're limited to about 1.2GB of heap space. When you have a domain with object counts in the 3-5 million region, that fills up rapidly. This is for a big graph of objects, and the queries for them involve lots of graph traversal. The code in question can do set queries in about 0.5s that an RDBMS takes over 5 minutes to do, so there's real value in caching all the objects on the heap.
Yes, I could use another language that doesn't have a stupid VM and have ample overhead in 4GB, although this data set will grow (even if it's not "social network" level of growth). But with working code in Java, it's much cheaper and easier to throw a 64-bit OS and another stick of RAM at it.
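For anyone who hasn't hit it, asking a 32-bit JVM on Windows for a big heap typically fails at startup with something like this (the exact ceiling varies with DLL layout and address-space fragmentation):

    C:\> java -Xmx1600m -version
    Error occurred during initialization of VM
    Could not reserve enough space for object heap

On a 64-bit JVM the same flag just works, which is the whole point.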
A shame that my employer is still tragically stuck in the 90s and thinks 32 bits should be enough for anyone...