Probably why I couldn't reach NeoGAF for most of yesterday, unless I went through tor. Which I did, because I'm a man and I have my needs.
Newman writes: "With next-generation technology like non-volatile memories and PCIe SSDs, there are going to be more resources in addition to the CPU that need to be scheduled to make sure everything fits in memory and does not overflow. I think the time has come for Linux – and likely other operating systems – to develop a more robust framework that can address the needs of future hardware and meet the requirements for scheduling resources. This framework is not going to be easy to develop, but it is needed by everything from databases and MapReduce to simple web queries."
Seeing a lot of pro-Russian "psyops" on one local forum attached to a news outlet focused on economics, so much so that it's pretty obvious it's organized. Massive amounts of downvotes on anything negative about the pro-Russian side, and weak conspiracy theories written in broken English moderated up.
Not sure why they're wasting their time, but there you go. I guess the proud Cheka men have nothing better to do than troll forums.
The Titan-Z was and is a PR product. It was conceived simply to create buzz around nVidia. They had the misfortune that AMD put out a better card before they could get the darn thing to market though. First they delayed it, then as pressure mounted they finally sneaked it out without much of the ado they were hoping for. I doubt there exists or will ever exist more than a couple of hundred Titan-Zs IN THE WORLD.
Anyone who tells you that this card "is for X" where X is something other than PR is wrong and/or lying. It doesn't make sense anywhere else.
Here's a new sneaky approach, less destructive but so far effective: U.S. Marshals Seize Cops’ Spying Records to Keep Them From the ACLU
Very disappointing. So it's almost exactly the same price as the products that are already out but using 2x nm NAND. Oh well, I guess it's my fault for thinking Crucial would actually make a move here.
There seems to be a step missing from A ("that's not how memory works") to B ("therefore uncomputable"). The premise that memory isn't lossy sounds like rubbish, even IF it's perhaps not simply a question of 'read errors'.
I recently watched this talk, Modeling Data Streams Using Sparse Distributed Representations, which seems able to represent memory in a layered and lossy way perfectly well in a computer.
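To make the "lossy memory in a computer" point concrete, here's a minimal sketch of the SDR idea from that talk: items are sparse sets of active bits, and a "memory" is just the union of many such sets. The union is lossy (you can't reconstruct the exact items), yet stored items remain recognizable by bit overlap. The function names and parameters here are my own illustration, not anything from the talk itself.

```python
import random

def random_sdr(n_bits=2048, n_active=40, seed=None):
    # A Sparse Distributed Representation: a small set of active
    # bit indices out of a large binary vector (~2% sparsity here).
    rng = random.Random(seed)
    return frozenset(rng.sample(range(n_bits), n_active))

def overlap(a, b):
    # Similarity between two SDRs is just the count of shared bits.
    return len(a & b)

def union_store(sdrs):
    # Lossy "memory": OR many SDRs into a single set of active bits.
    # Individual patterns can still be *recognized* by overlap, but
    # the exact set of stored items cannot be reconstructed.
    store = set()
    for s in sdrs:
        store |= s
    return store

def probably_seen(store, sdr, threshold=0.9):
    # An SDR "matches" the store if most of its bits are present.
    # High sparsity makes accidental matches extremely unlikely.
    return overlap(store, sdr) >= threshold * len(sdr)

# Store ten items; each remains recognizable, a novel one does not.
seen = [random_sdr(seed=i) for i in range(10)]
store = union_store(seen)
print(all(probably_seen(store, s) for s in seen))   # stored items match
print(probably_seen(store, random_sdr(seed=999)))   # novel item does not
```

With 2048 bits and 40 active per item, ten stored items light up at most ~400 bits, so a random novel SDR overlaps the store on only a handful of bits, far below the match threshold. That's the sense in which the representation is both lossy and still useful.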