I wish I had known that I knew nothing
What does this mean for Star Citizen? AFAIK their entire codebase is built on the Crytek engine...
Is the duty for password complexity correctly placed on the user's shoulders? I think not...
The user has two jobs:
1. Select a password he can remember
2. Select a password that others do not associate with him
Raising password complexity requirements makes both jobs harder. In my observation, as complexity requirements rise, users tend to re-use passwords more often (which is more detrimental to security than a less complex password).
For password complexity to matter at all, the service provider must have failed (lost the password data) and succeeded (chosen a half-way decent hashing algorithm) at the same time.
Therefore I consider the burden of password complexity wrongly placed at the user's end.
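To make "chosen a half-way decent algorithm" concrete, here is a minimal sketch (my own illustration, not from the comment) of storing passwords with a slow, salted key-derivation function, using PBKDF2 from Python's standard library. With a scheme like this, a leaked database still forces an attacker to grind through many iterations per guess; the passwords and iteration count are illustrative.

```python
import hashlib
import hmac
import os


def hash_password(password, salt=None, iterations=100_000):
    """Derive a slow, salted hash; a leak then costs the attacker
    ~iterations of SHA-256 work per password guess."""
    salt = salt or os.urandom(16)  # fresh random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest


def verify_password(password, salt, digest, iterations=100_000):
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)


salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("Tr0ub4dor&3", salt, digest))  # False
```

If the provider instead stores plain or unsalted fast hashes, even a very complex password buys the user little once the data is lost.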
I hope the Nest Thermostat is better than the Nest Protect smoke detector. Those gave me a serious case of "early adopter burn".
The Nest Protect detectors tend to generate false alarms in clean air (no smoke, no dust, no steam) and are very hard to silence (get a ladder, dismount the device, get a screwdriver, open it, remove the battery). The idea of silencing a false alarm over WiFi has not occurred to them yet.
It says "Civilization" in the title, so I will buy it anyway...
If it is any consolation, the competence of political decision-makers in Germany is at about the same level. The ballpoint pen is the last technological innovation they use.
The correct headline would be:
German court refuses to force Valve Steam to allow resale of games
Yep, but we come back to my argument: the biggest risk for Google on the search market is regulation (see the EU proceedings).
Actually, I think Google knows that it is getting too big: the breakneck speed of its acquisitions results from the intent to get as big as possible before more confining regulation sets in.
The system will no longer receive any updates while still sharing a code base with newer systems. Any patch released for Vista/7/8 from April on will be analyzed for a matching bug in XP, which will quickly be turned into exploits.
Any Windows XP system will be a real liability when connected to the Internet.
Windows 7 is way better than XP, and even 8 (with a bit of tweaking) can be used properly.
There is no more excuse for running XP than there is for housing people in ramshackle buildings prone to collapse at any minute.
With Windows XP still at 28.98%, you can only weep and cry. That means nearly one third of all PC users are running a disastrously old system.
Kernighan and Ritchie were well aware of Turing completeness. Dennis Ritchie started in theoretical computer science before he wrote his first software (see http://www.gotw.ca/publications/c_family_interview.htm). You can be sure that designing C without Turing completeness would have been, for them, like designing a car without tires.
Languages without Turing completeness only make sense in special applications because they are so limited (e.g. the C preprocessor is not Turing complete unless you invoke it recursively).
One of the marvels of the Turing machine is its simplicity: you can describe what a Turing machine does in 2-3 pages, yet it is as powerful in expression as modern languages whose specifications run to thousands of pages.
A lot of coders have no idea about the theory behind it. That is why a lot of code sucks. It's not the lack of Turing machines but of the theories connected to them (e.g. automata theory, complexity theory).
What you are saying is like: "I am a tiler; I never check the foundation when I am building the roof, so it can't be important."
You can make a living as a coder without all that knowledge. More than half of all coders do. But if you look at the people who shape the world of software (like Dennis Ritchie, Linus Torvalds, James Gosling, etc.), you will notice they are all well versed in computer science theory.
P.S. Concerning AI and the Turing Test: computer games have no AI. Game producers call their software opponents "AI", but they are a collection of heuristic algorithms cobbled together.
When you are playing against an opponent, you can usually tell easily whether it is a computer or not. In fact, you are then conducting a Turing Test, and the other side usually fails miserably.
I cannot blame you for not seeing it. I studied the theory for five years and thought it dull and boring. Then, after decades of working on real-world problems, it hit me. Nowadays I can, for example, look at code or database designs and easily recognise coders who have understood the theory and those who haven't.
Not understanding the theoretical background puts an upper limit on anyone's capabilities as a coder. It is like being restricted to level 20 in World of Warcraft: some skills in the skill tree will remain out of reach no matter the grinding.
The best way to illustrate the genius of Turing: he saw computers coming before the first one was ever built. He developed a trivial "assembler language" (the Turing machine) that is so powerful that no computer and no programming language built today can compute something his machine could not.
When he was finished with that, he thought not about calculations (as his German counterpart, the genius Konrad Zuse, did) but about processing symbols. He thought of computer code being processed by computer code, and thereby invented compilers and interpreters before there was a name for them.
Then he extrapolated the capabilities of those (not yet existing) machines, recognised that they would appear to have some kind of artificial intelligence, and started thinking about how to tell computers and humans apart (60 years before the first spam was sent).
Looking back, with all the tools already at your fingertips, all this may sound trivial. But to achieve even 1% of his visionary power, I would have to grow by several orders of magnitude.