I think the US system is different, but in the UK you don't pay tax on the first part of your income and a student stipend is non-taxable. This means that you can live very comfortably as a PhD student: the stipend covers your cost of living and then you can earn roughly as much as someone working a full-time minimum-wage job on top of that before you start paying taxes. When I did mine, I was coming close to the tax-free allowance from consulting work, so at the end of it I'd saved enough for a deposit on a house (not a massive achievement: I was living somewhere with very low housing costs).
This also leads to some unfortunate unintended consequences: the main funding body has made it very hard to fund PhD studentships from grants, so the workaround is to hire PhD students as research assistants and enrol them as self-funded PhD students. This means the university charges overheads and the student pays tax, so it ends up costing 2-3 times as much as a funded studentship for about the same take-home pay for the student. Worse, because they're then above the tax-free income threshold, they pay tax on any other earnings too, so PhD students funded from a grant get a much worse deal than ones funded from a scholarship or other award.
Yes, my brother in Florida lives in a nice subdivision in a house less than ten years old, but connects with Juno dial-up for $14 a month. Compare that with my mother-in-law in Cambodia, where in 1998 they were laying fiber around her neighborhood; they went from dial-up to megabits/sec overnight.
Residential fiber in 1998? In Cambodia? I call bullshit. Here in Norway the first delivery of Internet over cable TV was in 1998, ADSL came in 2000, and I see our first FTTH company was started in 2001, but that was in a very small area where they were rolling out lots of fiber for the oil industry anyway; it was rare as unicorns and super expensive even for a first-world country. It was only in the recovery after the dot-com bust in the mid-2000s that it saw any real traction.
People who actually believe it are in the minority and are simpletons or mentally ill.
Sadly I know a guy who really, full-on believes this... and that Roswell was real, 9/11 was faked, and a bunch of other conspiracies. I wouldn't be surprised if he believes in chemtrails and owns a tin foil hat too. But he's pretty good with words and is neither stupid nor obviously crazy. It's as if he's simply decided the world is a sham, that we're all puppets dancing to some agenda, and that warps his perception of everything else. It's like he's just waiting for Morpheus to show up and offer him the red pill. Even if you manage to push back and disprove one little bit of his ramblings, the response is "okay, maybe that was wrong, but the other 99.9% still stands."
It reminds me of those otherwise seemingly functional people who've held ordinary jobs but end up fixated on some crazy idea that a Nigerian prince is offering them money, or that they have an online girlfriend they've never met who totally loves them, and completely lose it. Many of them don't fit the "simpletons or mentally ill who could never hold down a job" profile. I saw someone else here mention Ben Carson: a brain surgeon who thinks ancient aliens built the pyramids. No matter how smart you are, you see what you want to see. You believe what you want to believe. Then you use your brains to wrap reality around your beliefs, not the other way around.
That's how you end up with scientists whose religion is full of facts science has refuted. It turns out people don't have to hold one coherent set of thoughts: we live quite well doing a day job while also believing the first woman was made from a rib bone. It's just that for some, the latter kind of "facts" take over and consume them, to the point where they can't accept reality as reality anymore. Mainstream media (MSM) is fiction, my alt-news is reality. Mainstream medicine is fiction, my homeopathy is reality. And the Internet is the greatest boon to alternate realities ever: here they all meet to agree on how right they are. Most are pretty harmless, though.
Because "fake news" has a very clear meaning that should be apparent to anyone who knows what the word "fake" means. Where do you use the word "fake"? You use it in places where something that is known to be false by the originator has the appearance of truth.
You do realize that would not include most conspiracy theories, right? Sure, some are created and perpetuated because they fit somebody's agenda, but most of the unpopular ones are simply people seeing patterns that aren't there, reading explanations as cover-ups and ridicule as naivety or complicity, resulting in some bizarre theory that doesn't make any sense or serve any recognizable purpose. Cardinal Richelieu said, "If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged"; just as surely would a conspiracy theorist find the Illuminati.
When has an intern done any actual productive work? Every place I worked, the interns were always assigned the most menial, busywork tasks that we could come up with.
a) The companies you've worked for pay interns peanuts so you get monkeys
b) The companies you've worked for have already decided they're no good and not to be hired
c) The companies you've worked for are too stupid to give them a chance to prove themselves
d) The companies you've worked for are overworked and needed a steam valve
e) The companies you've worked for are testing their ability to suck it up and endure
f) All of the above
Even if they're serious students, you can end up with a lot of strange things from people with little or no real-world experience. But I'd give anyone a fair chance to prove themselves, and only then fail them down to coffee fetching and busywork.
Employees are fickle: they switch jobs, get ill, get hit by a bus, or go on strike, taking all their knowledge and experience with them and disrupting day-to-day operations. Sure, you might say some companies considered their software their asset before, but they mostly still needed expert operators. Automated systems accumulate far more of that value; take Google's car project. Any one person on the team is probably easily expendable; it's the collective result that holds value. Banks used to have personal customer relationships; today 90%+ of customers just want a webpage and some help if they have a problem. The people are not critical to the process in the short term, as long as the system is up and running.
This obviously shifts the balance of power towards the employer and away from the employee. After all, conflicts are typically about who blinks first, and if they can keep the lights on without you, that makes your position that much harder. I suppose it was always like that for mega-corporations, but usually their customers would suffer, which would be bad for business and bring them back to the negotiating table. But stare into the crystal ball: imagine Amazon with automated trucks, warehouses, pickers and wrappers, a purchasing system, inventory management, warranty and support systems, and so on. It could almost run itself. I suppose there'll always be people at the fringes, but they would be just that.
I hear it does great things for 4K, so it seems it would be really great for HD, and even older 720p or 480p content too.
The main reason it does so well on 4K/UHD is that H.264's fixed 16x16 macroblocks are too small; HEVC brings flexible coding tree units (CTUs) ranging from 64x64 down to 16x16, which naturally has the most effect at the highest resolutions. If you restrict HEVC to 16x16 CTUs, you see roughly a 37% bitrate penalty at 2160p, 19% at 1080p, and only about 9% at 480p. So it's not as big a deal for older content as you might think.
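To put the CTU-size point in concrete terms, here's a back-of-the-envelope sketch (plain arithmetic, not encoder code; the frame sizes 854x480, 1920x1080, and 3840x2160 are the usual ones for those labels) of how many coding units it takes to tile one frame at each resolution:

```python
# Rough illustration: number of coding units needed to tile a frame,
# comparing H.264's fixed 16x16 macroblocks with HEVC's maximum 64x64 CTUs.
import math

def units(width, height, block):
    # Blocks per row times blocks per column, rounding partial blocks up.
    return math.ceil(width / block) * math.ceil(height / block)

resolutions = {"480p": (854, 480), "1080p": (1920, 1080), "2160p": (3840, 2160)}
for name, (w, h) in resolutions.items():
    mb16 = units(w, h, 16)
    ctu64 = units(w, h, 64)
    print(f"{name}: {mb16} 16x16 blocks vs {ctu64} 64x64 CTUs")
```

A 64x64 CTU covers 16 times the area of a 16x16 macroblock, so large smooth regions, which are common in high-resolution frames, can be signalled with far fewer units, and the encoder is still free to split a CTU further wherever there is fine detail.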
It describes the very low level of a program and a computer.
No it doesn't. It describes the very low level of a program running on a computer from 30-50 years ago. The lessons it teaches about algorithmic complexity are still valid, but the low-level stuff is not. Once you hit the limits of the implementation rather than of the algorithm, artefacts of caches and pipelines are far more important to performance. Not only will you not find, for example, Hopscotch hash tables in TAOCP, you also won't find an explanation of the underlying reasons for their performance.
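As a minimal sketch of that point (illustrative Python, where the effect is muted compared to C because Python lists aren't flat arrays): the two loops below have identical algorithmic cost, yet on real hardware the row-major walk tends to be faster because it touches memory sequentially, while the column-major walk strides across rows and defeats the cache.

```python
# Same algorithmic complexity, different memory access pattern.
N = 1024
matrix = [[1] * N for _ in range(N)]

def sum_row_major(m):
    # Visit elements in the order they sit in memory (cache-friendly).
    total = 0
    for row in m:
        for x in row:
            total += x
    return total

def sum_col_major(m):
    # Same work, but jump a whole row between consecutive accesses.
    total = 0
    for j in range(len(m[0])):
        for i in range(len(m)):
            total += m[i][j]
    return total

# Both compute the same answer; only the access pattern differs.
assert sum_row_major(matrix) == sum_col_major(matrix) == N * N
```

Big-O analysis in the TAOCP style treats both loops as identical; an account of why one beats the other needs the cache hierarchy, which is exactly the kind of modern detail the book doesn't cover.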
Sure, yeah, you could take a few weekend courses and bang out some stuff and possibly even find a job paying decent money. But if you want to move up in the world you need to turn your hack and slash techniques into a refined art. The kind of crap commodity programmers write is the stuff that skilled developers get paid a lot of money cleaning up or just re-implementing. (...) If you want to work in the big leagues on important things, you need to be open to learning some things and respect the craft.
With all possible respect to the CS experts of the world, that's not what they teach. Finding a good organization for your application, one that makes structures easy to break down, processes easy to follow, and changes easy to implement, doesn't involve deep, abstract mathematical formulations with optimal answers. It's about creating functional units (objects, layers, modules, services) with clear responsibilities that abstract away internal details, define narrow and well-defined interactions, break up and explain complex logic, behave like and contain what you'd expect from common language definitions and naming conventions, and come with enough high-level documentation that anyone of moderate intelligence can understand which bits need to go where.
Or to put it another way: if you ran the source code through an obfuscator, the CS experts would probably be just as happy with the output as with the input; after all, the algorithms and functionality are unchanged. It would become an incomprehensible mess of spaghetti code and "here be dragons" that nobody understands how or why it works, but those are merely practical concerns. The same goes for error and exception handling: CS is all about correct algorithms that never get called with invalid input or run into any of those practical problems that cause poorly written software to crash, often without leaving behind any useful indication of why, or any consideration of whether it could just fail that one operation and move on.
I think you're onto something about the craft and the art. If you're making swords for an army, it's a craft; if you're making a nobleman's fine blade, it's an art. Most of the time what we want is robust craftsmanship: produce as many passable swords as possible and discard any failures. Not very glamorous and not very artistic; we're not awarding points for style or elegance, only for whether the code you've built is a reliable workhorse that gets the job done. Or maybe it's the difference between an institutional chef and a fine-dining chef: one serves a hundred people a good meal, the other can spend forever making a plate of fine art. Both are very different from being a poor chef, but being good at one doesn't really make you good at the other. And CS is the Michelin guide department.
Lend money to a bad debtor and he will hate you.