Mr. Obama is president of the United States.
Is this really necessary? I mean, I'm sure there are a lot of people who don't know who he is, but how many of those people read to the end of Wall Street Journal articles about cybersecurity?
The Fed, headed by Chairman Ben S. Bernanke, argued that revealing borrower details would create a stigma -- investors and counterparties would shun firms that used the central bank as lender of last resort -- and that needy institutions would be reluctant to borrow in the next crisis.
No shit, Sherlock! I sure wouldn't invest in something that racked up that much debt, and I'd be wise to avoid doing so. But I guess if you're not going to keep secrets in a way that entices people to make terrible investments, it makes sense to plan for the next crisis instead of fixing this one.
Intel's move to SW upgrades of CPU microcode is creating the tech and business infrastructure for regular FPGA upgrades to these new hybrids.
Not to mention undetectable, CPU-resident malware. Yeesh. If FPGAs become cheaper to manufacture and program in huge quantities than regular non-reprogrammable hardware, then sure, use them. But don't make the whole freaking CPU reprogrammable via software running on that CPU. That's going to be nasty. Think AESDEC becomes "AESDEC, and also copy the key material off to somewhere shady."
I don't think it's a particularly scary movie. Most of the bad guys are just normal people in latex cat suits, and they don't meet their ends in particularly horrific ways (except for the one guy that gets thrown through the propeller). To me, Red Skull seems too weird looking to be scary. If he looked more human, he might be kind of scary, but he looks slightly more like a real person than Shrek. The costume is pretty good, which is to say that he doesn't look fake; he just doesn't look human, so it's not very freaky. Just my two cents. If anything might scare your son, the scene where they perform the experiment on Steve might. It's obviously pretty uncomfortable, but you can't see anything, and Steve takes it like a man.
Having seen Transformers 3 when it came out, I was quite relieved that this movie lacked the sexual exploitation and inane humor that are so common in big action movies and that Michael Bay has turned into a science. If you want your kid to enjoy and get something out of any of the recent superhero movies, this is probably your best bet. Cap's an imitable, stand-up guy, but his story doesn't feel preachy or overly simplistic, and it's not diluted with raunchy filler for its own sake. Plus, it's big, loud, and exciting.
If I hadn't come to hold such a negative opinion of Facebook, I'd give Google+ a shot. I pretty much stopped using Facebook for communication last summer. They kept making more of my private stuff public, which made me less and less comfortable having any info on Facebook. It also seems like Facebook has gradually gotten better and better at delivering annoying things to me. Meanwhile, I noticed that I would check it multiple times a day, but rarely did people communicate with me directly. I just looked at other people's inane posts and conversations. It was actually kind of a lonely and depressing experience. So I took almost all of my information off of Facebook, disabled my wall, and made my profile list the other ways in which I prefer to be contacted. I didn't leave entirely, because there are some people who, annoyingly, do not communicate outside of Facebook.
Along comes Google+. I have slightly higher confidence in Google's willingness to respect my privacy than Facebook's primarily because Facebook is so astoundingly bad at it, and I have much higher confidence in Google's ability to deliver a clean interface that lets me see what I want and doesn't just annoy me. From what I've seen initially, this confidence appears to be somewhat well-placed. The ads for Google+ show a sparse, efficient interface and, among other things, the ability to keep people from tagging you in photos. The stated goal seems pretty much the same as Facebook's, and if I were a heavy Facebook user, I would switch to Google+ in the hope of being less annoyed and somewhat more private. But I gain almost nothing from Facebook right now, and although I think Google will try harder to keep my data private, Google is also probably a lot better at thinking of interesting things to do with it that I wouldn't like.
I'll pass on Google+ for the moment. I'm on AIM, Google Talk, and Skype, and I check my email accounts frequently. My friends know where to reach me.
It's been pointed out before, and is worth pointing out again, that US government default is prohibited by the 14th Amendment of the Constitution [wikipedia.org]. Whether they follow it or not remains to be seen.
IANAL, but if the US government defaults, aren't they just saying that they are unable to meet their obligations? That doesn't sound the same as questioning the validity of the public debt. To do that, I should think they would have to default and then lie and say they didn't.
I'm a junior in college majoring in Electrical and Computer Engineering. I haven't "accomplished anything" that Evans would take seriously at this point. The main reason is that I'm extremely busy for most of the year. I work probably 55 to 65 hours a week on average. Could I contribute to open source projects or develop Android apps on the side? Sure, if I wanted to regularly stay up for four days at a time and accept a hit to my QPA. (I know people who do this.) Last summer, I wrote a good amount of code for internal use at the company I worked for, but I can't really go sticking that in portfolios. I hope that Evans will forgive me for taking an actual break on my winter break, as opposed to seeking out "real-world projects with real-world users" to contribute to in a way that I can demonstrate pre-interview. (I want to do systems software, so things that I would seek out to work on probably won't have that many direct "users.")
I can buy that a technical interview with no demonstration of coding ability might let through some inept people. I got my last internship with just one technical interview. I hope I turned out OK. The company I'm working for this summer had a more thorough process. The recruiter comes to campus and interviews people whose resumes they liked from the career fair a few weeks prior. I can't remember much about that interview, but I don't think it was very technical. It only lasted about half an hour. Later that day, everyone who interviewed got an email directing them to go to a website and take a timed test with various programming questions. Most or all of it was multiple-choice, and there might have been some short-answer questions. People they liked on the basis of the interview and the test were brought out for on-site interviews. There, I was given a programming problem and five hours by myself to solve it optimally. A guy somewhere else in the building was, I think, looking at my code periodically; he would come over at various times and ask whether I could do anything to improve performance for a particular input. After I was done coding (actually the next morning), the actual interview occurred. The interviewer had read my code (and maybe talked to the guy who watched me work), and he asked me to explain it and describe my thought process as I designed it. I'm pretty sure that was the most important interview. In the other one, I asked the interviewer how he liked the surrounding city, and he talked about that for 20 minutes.
I haven't started working there yet, so it's possible that I could still show up and be the new guy that can't code. I think I can code. In the last year, I've helped write most of a small OS kernel for ARM, and I've helped implement a basic MIPS processor in Verilog (not real programming, I know). Those were both partner projects, but the commit logs will show that I pulled my weight. Nevertheless, if Evans recruited for this company, he would probably complain that I was working on contrived problems and that nobody actually used my results. I don't think that means that it wasn't freakin' hard or that I didn't do a good job. The interview process seemed pretty solid, though. It also seemed pretty time-consuming. I've never hired or managed anybody, so I don't know how you decide how much time to spend on a candidate.
This course is about how to use theoretical ideas to formulate and solve problems in computer science. It integrates mathematical material with general problem solving techniques and computer science applications. Examples are drawn from Algorithms, Complexity Theory, Game Theory, Probability Theory, Graph Theory, Automata Theory, Algebra, Cryptography, and Combinatorics. Assignments involve both mathematical proofs and programming.
To the best of my knowledge, this course is currently taught and will continue to be taught using functional programming. I took the Fundamental Data Structures and Algorithms course referenced in the article. It has some of the problem-solving ideas from Great Theoretical Ideas, but it's more about just knowing what algorithms are out there. Up to this point, the course has been taught in Java with heavy focus on object-oriented ideas, but that will now change to functional programming with ML. I appreciate the benefits of the knowledge it provides, but its mathy nature was sufficient to convince me never to take Great Theoretical Ideas. I'm more of a systems guy.
As an Electrical and Computer Engineering major at CMU, I agree with you. ECE students currently get almost all of their programming experience from courses offered by the CS department. Those of us who go the software route generally tend in the systems direction, or at least toward things that are useful in a non-theoretical sense. The upper level systems courses in the CS and ECE departments depend on the course Introduction to Computer Systems, which will stay the same during the change from OOP (Java) to FP (ML). Currently, this course depends on a freshman course, Effective Programming in C and UNIX. That course is to be replaced during the transition; its replacement will apparently start by gradually weaning students off of the nice, clean functional model of execution and will only allow them to use a "safe" subset of the C language until the very end.
I fear that there will be a nasty chain reaction of sorts: students will start with one of the intro courses and get the functional view of the computer without learning much at all about how it actually works or why that might be important. When they get to the imperative course, they will spend significant time learning what were previously considered the basics of introductory programming. Because of this, they will spend less time actually becoming proficient with C and Unix. When they get to Intro to Computer Systems, their lack of proficiency will make what is already a very challenging course an absolute nightmare. I predict that, as a result of the planned changes, fewer students will stick with computer systems than do now.