Right up until you want a way to define API changes and continue to maintain back branches.
A rockstar programmer doesn't bang out a lot of code.
They pick the right algorithm, one that scales well (doesn't need to be rewritten); they consider and handle most error cases cleanly (few bug reports); and they leave easily maintainable code (another person can take over, so no support team is required).
But those criteria will all be evaluated by humans. Which they will NOT be able to predict.
Interesting hypothesis. Wouldn't it make sense to test it and see if it's right or not?
And there are much better ways to teach programming. For a very long time there has been a movement to bring programming to the masses, as if, somehow, everyone would be able to write beautiful, intricate code to solve their most complex problems.
You probably already know how to program, and I think that's why you seem to miss the point entirely.
That niche is mostly served nowadays with apps for phones and tablets, which the end user can discover and use on their own from the app store to solve most common needs; but the developer is not totally removed from the equation yet, as even for really simple needs, the user is restricted to the subset of interactions that developers have created in advance, and the user can't build their own on top of them.
Writing programs requires clear, linear thought. It requires thinking in terms of structures and systems.
That's true of most current programming environments, but it's not an inherent property of what we call "programming" if understood in a general sense (that of creating new automated behaviours), especially when we restrict it to the basic tasks I'm talking about. Programming by example, case-based reasoning, procedural inference, constraint-based layouts... or even declarative markup languages are tools that allow creating some kinds of automations without using a procedural language or learning an exact syntax.
The field of End-user Development studies those tools in a scientific context, and has some achievements in its history. Many of these tools are limited in scope (they apply to special situations) and are not general-purpose tools; but some of them, like the spreadsheet and HyperCard, are Turing-complete at their core and can ultimately be used for any programming task.
The essence of EUD tools is not defined in terms of linear thought but in terms of semantics and inference of meaning - i.e. being able to make sense of the system as presented and use it to solve your current problems. All humans are good at sense-making, provided the tools are tailored to cover their needs and knowledge background. I've read some research on semiotics applied to human-computer interaction, and it shows how to study and build such tools. That is not the approach that is taught in common comp-sci curricula, though.
There are plenty of graphical programming languages that reduce the need for precise syntax, but they only REDUCE it, not eliminate it, and they still require procedural thinking which, ultimately, presents an insurmountable difficulty for many people.
True again, but again not an insurmountable problem. Information workers have managed to use Excel as a shared database with abstract datatypes and Outlook as a workflow management and collaborative creation tool, and as structured personal storage, all without learning how to create a single function definition. That's a Good Thing, though it would be better if they could use similar tools created specifically for those purposes.
Sure, everyone should be given basic skills in writing, and perhaps in drawing or painting as a child, and so perhaps everyone should be given basic skills in programming, but beyond that, why?
Who says there needs to be anything beyond that? The main problem for end users is that their essential needs are still not covered by current programming platforms, as basic programming skills are not enough to build even the most basic automation. The knowledge required to build anything practical, like a simple app or Web page, is still too much; the learning curve is just too steep. That's where a tool like HyperCard, with its hands-on approach and simple conceptual model, would help.
Wikipedia is dead, for anything other than keeping track of trivia about popular media anyway. All the policies about removing content in the name of improving quality, without adding proper quality processes on top, killed it around 2007 - not coincidentally, that's when the decline in editors started.
The huge knowledge base that is Wikipedia is merely waiting for someone to successfully fork it; it may very well be Google's Knowledge Graph, as they're the best positioned.
The first company that manages to define a process to separate spam from good content - keeping the knowledge clean and growing from all valid contributions through a semi-automated technique, while avoiding all the drama over rules and the edit warring over content - will be the one to keep all the users. And then it will be instantly bought up by Google, who have been eager for a way to replace Wikipedia for a long time.
Yes, thank you. Sorry, I thought it was obvious.
The first Altos were built as research prototypes. By the fall of 1976 PARC’s research was far enough along that a Xerox product group started to design products based on their prototypes. Ultimately ~1500 were built and deployed throughout the Xerox Corporation, as well as at universities and other sites. The Alto was never sold as a product but its legacy served as inspiration for the future.
With the permission of the Palo Alto Research Center, the Computer History Museum is pleased to make available, for non-commercial use only, snapshots of Alto source code, executables, documentation, font files, and other files from 1975 to 1987. The files are organized by the original PARC server on which they resided, corresponding to the files restored from archive tapes. An interesting look at retro-future.
The gamers behind GamerGate make a really good point: they like their games violent and showing b00bs, female bare skin, and women in scant armor, and these games should not cease to exist merely because some people are offended by them.
The journalists against GamerGate make an equally good point, though: such games do not belong in mainstream titles intended for all audiences; they should be distributed through special channels as the soft porn they are.
Average electricity consumption per capita in the USA is 1,683 W. For the EU, it's 688 W, which makes 2 kW ample for a small household. If my electricity consumption went to 1.5 MWh/month, I'd start to seriously worry - my electricity bill would be about three or four times what it currently is. According to Wikipedia, electricity in the USA costs 8-17 cents per kWh. That works out at $120 to $255 for 1.5 MWh. Do people seriously spend that much money on power each month?
Yes. My monthly usage ranges from 800 kWh to 1,800 kWh (the peaks being due to HVAC).
(for some reason the first time I loaded this page there were no comments, so some of this is duplicate)
Excellent! Very glad to hear it. There are a number of resources worth checking out:
* CTFTime : http://ctftime.org/ : Website that tracks team scores, upcoming events, and writeups for previous events.
* CapTF : http://captf.com/ : My CTF dump-site that includes a calendar, links to "practice" sites (aka Wargames), and many years' worth of archived CTF events.
* Field Guide : http://trailofbits.github.io/c... : Specifically covering the skills / approaches, the field guide is a good read for anyone getting into this world.
* Guide for Running a CTF : https://github.com/pwning/docs... : Written by PPP (CMU's ever-dominant CTF team) along with feedback from the broader CTF community, this guide is more relevant when making a CTF, but can aid in understanding how the good CTFs are designed.
* PicoCTF : https://picoctf.com/ : PicoCTF is designed for high school students, but has an awesome difficulty curve, getting up to some relatively advanced challenges by the end of it. It's also extremely well designed, runs for a longer period of time, and is a great place to start.
* CSAW : https://ctf.isis.poly.edu/ : One of the best events targeted specifically at college students. Unfortunately the qualifier round just finished and the participants have already been selected for the final round, but you can always check out the archives of previous challenges to get a feel for the difficulty. Note that the qualifier event is typically intended to be much easier than the in-person finals, to better encourage new students to get into the sport.
* IRC : irc.freenode.net#pwning : There's a lively and active community in #pwning on freenode that would be happy to help you with questions/advice related to CTFs.
* YouTube : There are a couple of different presentations/talks on CTFs over the years. If you're interested in learning more about attack-defense CTFs and in particular DEF CON CTF, I gave an old talk that's mostly still relevant (https://www.youtube.com/watch?v=okPWY0FeUoU), though I'd recommend you not focus on A/D at first, but just get into the regular challenge-based events (or "jeopardy boards", as they're sometimes called).
The best way to prepare for CTF is by... playing CTFs. There's no real magic formula, just go out there and start working on challenges. Old CTFs are great as learning exercises since you can usually cheat and read a writeup, but avoid the temptation as much as possible. If stuck, go off and try another problem first, and only if you're completely stuck should you peek at the writeup.
I find that the Speedtest.Net results are a realistic estimate of my actual best case upload/download speed, but there are certainly some websites which are much slower to load, for various reasons. If you suspect your ISP is throttling some websites intentionally, you can always browse through a VPN service.
As mentioned previously, local WiFi problems are often the root cause of slow page loads. Go wired. You can also use the network debugging tools built into Firefox (Network Monitor) and MSIE to try to determine what parts of a page are particularly slow.
While I enjoyed those older cartoons as a child, now, as an adult, I can totally see why they are no longer screening. They were rife with racism, violence, sexism, and other crap that I wouldn't want pumped directly into my child's brain.
On the other hand, you watched them and grew to know it as crap. Your children, not being exposed, will not learn to recognize it, and as adults they may be more likely to fall prey to it.
There's something to be said about playing with risky or shameful behaviors in safe environments - it's the natural way for learning to face the darkest aspects of life.
User Error is the Primary Weak Point In Software.
Corollary: designing software that fails to work well under user error is the primary engineering mistake.
Uh... because each particular species can't choose to be among the survivors or the extinct? That's mostly a random outcome, based on their adaptability to the new environment - which you can't know a priori, nor which survival skills it will require.
We as a species were on the verge of extinction once; it could very well happen again.
Actually, they didn't. The benchmarking was done as the JSONB feature in Pg is brand new and they wanted to see how it stood up to the competition (being much much slower is a sign that something could be improved).
Being faster on a single node was a surprise.