Computing... Verification complete.
You seem like a sensible person.
You've got a good point, but in fact the entire setup of the test is nonsensical. Here's how it should work:
1) Have a judge hold two conversations, not necessarily at the same time: one with a human and one with a computer. (Obviously the judge does not know who is who.)
2) Give these conversations some time - more than five minutes, for sure.
3) At the end, have the judge declare who they think is the computer and who they think is the human.
4) Do this repeatedly, and use statistical methods to determine, at certain confidence levels, whether the judges were doing better than random guessing.
When someone's devised a program that fooled, say, n=200 judges whose judgement was tantamount to random guessing at a confidence level of p=0.01, start the presses.
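That protocol boils down to a one-sided binomial test against chance. A minimal sketch in Python (pure stdlib; the function names are mine, and the n=200 and p=0.01 figures are just the ones from the comment above):

```python
from math import comb

def binom_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): the chance that judges
    identify the human correctly at least k times by pure guessing."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

def judges_beat_chance(n_trials, n_correct, alpha=0.01):
    """True if the judges' correct-identification count is significantly
    above 50/50 guessing at significance level alpha."""
    return binom_tail(n_trials, n_correct) < alpha

# With 200 trials, 120 correct identifications is enough to reject
# "random guessing" at p < 0.01, while 110 is not.
```

If the judges' accuracy can't clear that bar, the program has (so far) survived the test.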
Thank you. Got it.
Congratulations AC - that is a concise, scientific definition of the test.
All that these researchers have achieved is to redefine the Turing test downwards, and to redefine it so far downwards that it is a completely different test, sharing only the name. However, I see from Wikipedia that there is some history behind this redefinition. It's as if 30% success, from a blind audience, over 5 minutes may be a first step towards passing the test - according to some. Yah... like jumping 5 feet in the air is a successful first step towards jumping into orbit, or proving Fermat for n=1..10 is a first step towards proving it for all numbers.
The OP is a technological achievement, but nothing more. It is not even relevant to the science of the Turing Test.
It's not just the Start menu, it's EVERYTHING that's changed all at once, and requiring users to make a fundamental change in the way they use their tools is going to meet resistance.
Changes like this should be introduced gradually, not because we're a bunch of whining, sniveling children, but because requiring a large adjustment all at once leads to user frustration and poor efficiency during the transition.
That was my experience with Windows 8. I was very frustrated with losing the start menu, because, over time, I had customised mine to be able to do almost everything through it. All of a sudden I couldn't find anything, and couldn't understand what principles (if any) were behind how it was laid out. I had to google for how to shut down the computer, how to close a Metro screen, how to run a DOS command, etc...etc...
I had expected it would take me about a week to master Win8, and by then I'd like it. After a month, I was still cursing it.
I considered getting a Start menu replacement, but before I did I started to "get" the Metro interface, and realized that I actually like having two layers for my work, i.e. that I can leave some things on the top (Metro) layer and keep the desktop uncluttered. After a while I had also put all my regularly used programs into the taskbar.
So, it was a vicious learning curve, which I would have abandoned if I weren't locked in, but now I'm happy with the Win8 UI, and would opt out of the Start menu if that were available.
If that's the experience of a tech-savvy Windows veteran, I can only guess at how less technical people have handled the transition. I expect that many of them still don't know how to close a Metro window (you "grab" the top with the mouse and pull it down, btw).
I'm not really seeing it catch on either, but OCaml's sweet spot is writing fast code that deals with very complex data structures. It enforces static typing, but uses type inference to work out the types of variables. It has powerful operators for assembling and splitting up data structures that let you write very concise code that is checked at compile time for correctness.
I use F# daily for the Model (and ViewModel) in WinRT and ASP.NET MVC. The advantages you list (from OCaml) are exactly the ones I enjoy.
In a nutshell: "... let you write very concise code that was checked at compile time".
What F# adds, mostly, is the .NET integration.
I got into F# seven years ago, when it was just a research project and looked more like OCaml than a .NET language.
By 2010 it had become fully integrated into Visual Studio and the .NET platform.
By 2014 it had evolved into a complete language with its own killer-features and it had spawned a large community, with blogs, tutorials, books and sample code. There are several significant third party add-ons, and numerous high profile adopters.
In five years' time, rather than F# disappearing, it is more likely that it will be the preferred language of many developers and shops, and we early adopters will be thankful for our extra years of experience.
As for me, I'm thankful not just to have it on my CV, but because it helps me build better apps for WinRT, the web (with ASP.Net MVC) and Android. The root advantage is that it is a functional language in the .NET ecosystem.
I expect that in five years' time, or, hopefully, just two, I won't have to mix F# (for the model) with C# (for the UI) for WinRT and ASP.NET MVC.
I'll link back to this in five years with "I told you so". I'll still be Javaman59 then.
>> Not much chance of that. F# just hit #12 on the Tiobe Index, up from #69 this time last year:
Yep. The sooner you get into it, the better off you'll be in five years' time.
Until last year when I went to my GP for a "general checkup". He looked surprised and suspicious, and asked "Why?". I told him I had been drinking too much. Since then I've had about four visits in one year as the effects of 20 years of alcohol, smoking, stress, obesity and asthma have all started to take a toll.
Agreed, it may be difficult to change directions once employed, but it is possible (I once worked with someone who resigned a programming job because she'd just got a job with the police - something she'd always wanted to do. Someone else gave up a project management job to do a Radiography degree. And others I've known have resigned to travel the world for a year).
Agreed, it is possible to change career direction - particularly if you are strongly motivated for the new career, and have savings and/or other backup.
I think the trick is not to get "locked in". In this case I would possibly advise taking the job that is definitely there, but treading lightly until you get the job you want. Don't buy a house, don't get married, and don't have children until it is certain that the games programming job isn't going to materialise.
I really liked this observation! Indeed, the best time to take a risk with your career is when you are young and independent. While it is never easy to change direction, it is certainly easier now than it will be once you are financially locked in. After that, it'll be twenty years - if you are lucky!
In the end, an interim job is better than no job at all (or flipping burgers).
That seems to be the unanimous advice here.
He's a friend. Do you have friends? Do you care about your friends? Do your friends care about you? If you saw a friend making what you think might be a mistake, wouldn't you perhaps talk to them? If your friends saw you making what to them might be a mistake, wouldn't you want them to talk to you? Personally, I can understand where the Original Poster is coming from. He's a friend to his friend. It's what friends do.
When someone is taking their first job out of college, I think they should be given lots of advice from those with more real-world experience. The first job often sets up the whole of your career. If you start in banking, you will probably stay in banking. Ditto for defence. Once you have two years' experience in any industry it will be very hard to change to another. It's not impossible, of course, but it may involve a period of unemployment and a pay cut. And if you have dependants by then (which often happens in one's twenties), you are locked in.
For the advice to be useful it should be based on fact, and the adviser should be careful of overemphasising their own emotions. It should also be open ended, e.g. saying "The games industry often has some notorious sweatshops, but that is not universal", rather than "Don't work in the games industry!".
Perhaps the one bit of advice which must be emphasised to new graduates is that the first job is a very significant choice which they may not be able to easily change - so choose wisely.
Now, I am talking from personal experience here. When I was about to graduate someone gave me exactly this advice - the job I take now will probably be the one I have for the next twenty years. I rushed into it, and took an exciting-looking job in the defence business. I quickly hated it, but I was already locked in, and had dependants. The defence business wasn't nearly as exciting as it seemed, and it took me twenty years, and a massive pay cut, to get out of it.
I live in Western Australia and it's winter here.
I live in South Australia, and it's winter here, too.
"Later this summer" doesn't start until December here.
I would say it does, because using seasons as a unit of time is a distinctly Northern hemisphere convention. In my observation, Americans and Canadians are the main users of it (more than the British).
I often get confused talking to an American when they talk about doing something "in the summer", and it's not so much that they have a different summer, but that I'm not used to measuring time like this. (We only use it for things that are specifically related to the weather, such as sports).
In Australia we wouldn't say "later this winter", we'd just say "around August/September".
Either like Daredevil, or paired with the ability to emit ultrasonic pings...
Sonar, so that I can fly around in caves in the dark.
When I graduated from uni 25 years ago, Dijkstra was my hero and, under his influence, I tried to solve problems on paper, and prove my programs, etc. It took me years to understand that real software engineering is about sitting in front of a computer, typing out code, testing and debugging it. I wasn't much use to myself or anyone else until I discarded the Dijkstra influence.
I still remember him as a great writer and humorist, and his ideas are useful, just so long as the young programmer doesn't try to put them into practice.
It used to be 60% to 80% when I had more of a "life", and spent time doing housework, reading, going for walks, etc.
Now my only breaks are the gym a few times per week, and when I do the bare minimum of cooking and house chores, and a few regular social commitments, particularly church. Apart from that, I just put myself in front of the screen, and stay there.
No porn, btw, just internet news and forums, and work. In the last few years it's become more of the former, and less of the latter.
This is a timely reminder to change my ways.
Is that the best you can do? There's nothing wrong with the first quotation or the third. The second prediction was made in 1998, for God's sake. Good thing you posted as an AC: how many incorrect predictions have you made?
Still, the second prediction is worth quoting in full. The premise "Most people have nothing to say to each other..." is so near, and yet so far, from web 2.0 that it's just delicious. To be able to say something so wrong (in hindsight) is an achievement beyond most of us. It must rank with the all time great failed predictions.
The growth of the Internet will slow drastically, as the flaw in "Metcalfe's law"--which states that the number of potential connections in a network is proportional to the square of the number of participants--becomes apparent: most people have nothing to say to each other! By 2005 or so, it will become clear that the Internet's impact on the economy has been no greater than the fax machine's.
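As an aside, the "square" in Metcalfe's law just comes from counting pairwise connections; a quick illustration (pure Python, numbers chosen arbitrarily):

```python
def potential_connections(n):
    """Number of distinct pairs among n participants: n*(n-1)/2,
    which grows roughly as n**2 -- the core of Metcalfe's law."""
    return n * (n - 1) // 2

# Doubling the participants roughly quadruples the connections:
for n in (10, 20, 40):
    print(n, potential_connections(n))   # 45, 190, 780
```

The formula itself was never in dispute - the failed part of the prediction was the premise that most of those potential connections would carry no value.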
ASHes to ASHes, DOS to DOS.