Yeah, right, I'm sure a letter from the credit card companies is really believable. And even they had to insert this text before they started complaining about the evil guberment:
"The CARD Act has provided consumers with significant benefits, among them the elimination (with
few exceptions) of increases on interest rates on existing balances, whether the regular purchase
interest rate or an introductory or promotional rate. These restrictions help consumers avoid surprises
due to increases in their interest rates. In addition, since implementation of the CARD Act, customers
are paying significantly less in late payment fees and over-the-limit fees. Customers also appear to be
paying a higher portion of their outstanding balances, perhaps due to the minimum payment disclosure
of the CARD Act, which explains how long it will take customers to repay a credit card balance if they
only pay the minimum payment.
"While the CARD Act has provided clear and significant benefits to consumers, there have also been
I was just trying to say that if you want to run Maya you don't need Windows emulation.
You can get Maya for Linux.
Mostly because high resolution was easier on CRTs, especially if you didn't mind horrible flickering, and it took LCDs a long time to catch up.
4K is still very demanding for 3D gaming, but since it's exactly 4x the pixels of 1080p, scaling isn't a big problem. And artwork looks really beautiful in 4K, which seems a good fit for a game like Starcraft.
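A quick sanity check of the 4x arithmetic (the resolutions are standard; the rest is just illustration):

```python
# UHD "4K" is 3840x2160; 1080p is 1920x1080. The pixel count is
# exactly 4x, and each 1080p pixel maps to a clean 2x2 block of
# 4K pixels, which is why integer scaling stays sharp.
uhd = 3840 * 2160
fhd = 1920 * 1080
print(uhd // fhd, uhd % fhd)  # -> 4 0
```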
Didn't you notice that we're not gaming at 320x200 anymore?
Yes, indeed the normal resolution for games does go up over time. 4K will be entirely mainstream eventually.
Everyone here should know that the best possible and worst possible cases are usually extremely artificial and almost never happen.
So I am curious: what has the actual impact of this been? If companies had managed to charge 5x what they did before while delivering the same amount of power, their profits would have soared in an amazing manner. That probably hasn't happened, because it would have been noticed far sooner.
So I am also curious whether the resulting average error could be estimated by looking at energy companies' financial data.
I looked into it out of curiosity about a year ago and concluded that I could make somewhere around $5 - $15 a month, while spending more than that on power. It long ago stopped being worth mining with common hardware.
Of course, using someone else's equipment you don't have that downside, but the consequences far outweigh whatever pocket cash he made from it, unless it was installed on an entire cluster.
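For what it's worth, the break-even math above is easy to sketch. Every number below is made up for illustration; real hashrates, payouts, and electricity prices vary wildly:

```python
# Hypothetical figures only -- none of these are real market data.
hashrate_mhs = 25.0               # assumed GPU hashrate, MH/s
reward_usd_per_mhs_month = 0.50   # assumed payout per MH/s per month
power_watts = 150.0               # assumed card draw under load
usd_per_kwh = 0.15                # assumed electricity price

revenue = hashrate_mhs * reward_usd_per_mhs_month        # $ per month
hours_per_month = 24 * 30
cost = power_watts / 1000.0 * hours_per_month * usd_per_kwh

# With these assumptions the power bill exceeds the mining income.
print(round(revenue, 2), round(cost, 2))  # -> 12.5 16.2
```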
SQLite isn't remotely competitive with Oracle. It's nowhere near the same league as even PostgreSQL or MySQL.
SQLite is a toy database with a huge number of limitations that has found a niche in "I need an RDBMS for something simple and rarely used." Hence its use on desktops to store things like configuration and music databases. In such cases it works well.
If you're even thinking at all of multicore performance, SQLite is not the database for you. It's got absolutely dreadful concurrency and will die under anything resembling a serious load.
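The single-writer behavior is easy to demonstrate: with the busy timeout set to zero, a second connection's write is refused outright while the first holds a write transaction. A sketch using Python's sqlite3 and a throwaway file:

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path)
writer.execute("CREATE TABLE t (x INTEGER)")
writer.commit()
writer.execute("BEGIN IMMEDIATE")  # take the write lock and hold it

# timeout=0 means "don't wait for the lock at all".
other = sqlite3.connect(path, timeout=0)
try:
    other.execute("INSERT INTO t VALUES (1)")
    locked = False
except sqlite3.OperationalError:   # "database is locked"
    locked = True
print(locked)  # -> True
writer.rollback()
```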
"Jobwashing" - Similar to "greenwashing" but updated for the present era.
Sorry, no. They broke strings entirely in Python 3.0, and that is why people cannot port to it.
Here is how to do strings correctly: use UTF-8 and DO NOT BARF ON ENCODING ERRORS!
It is absolutely 100% a requirement that a program be able to read a random byte stream into a "string", then write it out again, and get the same byte stream.
In Python 2.0 this only barfed if you tried to convert that string to "Unicode" (it would have been nice if it did not barf, but at least you could store, read, and write strings).
In Python 3.0 it will BARF ON READ. This makes it impossible to write reliable software.
Yes you can use "bytes" in Python 3.0. But that really sucks if in fact you expect your bytes to be readable text, with only RARE (but not magically non-existent) errors.
It's satire, stupid
What this country needs is a good five cent nickel.