Personally I think XCOM should have been picked over WoW. It's super iconic, consistently ranks as one of the best videogames ever made, and I would argue has had more influence on video game development than WoW.
Don't be absurd. There's no evidence that it's inherited, and he's arguing that it is a result of events in the person's life. It's entirely nurture.
I don't know if it's entirely nurture, but I do think that's a big part of it. My wife and I are Millennials, and when we grew up all these technologies like the Internet and video games were coming in, and I don't think anyone thought too much about it. I know I've played a lot of video games in my time and wasted absolutely copious hours online, though I don't look at porn, both for the sake of my wife and because I'm a Christian and it isn't right. But the Internet and video games seem to be enough to cause problems even without porn. I'd actually wondered whether they were rewiring my brain several years ago, as I began to notice how hard a time I had putting down my smartphone and just sitting quietly without technology in my hand.
My wife and I have three children, and we've realized phones, iPads, etc. have become so ubiquitous that they're interfering with our ability to spend time with our kids and raise them well. It's just way too easy to grab one of those and not talk at the dinner table, or to completely numb out in the evenings. We are lucky that we are old enough to remember a time before the technology, when we were different; we can see that changes have happened, and we've realized we don't like them. We're actually turning things off as a result. We've deactivated our Facebook accounts, the iPad stays at work where I use it only to read work-related research materials, and my wife is going back to a flip phone with no real Internet browser (I may follow her shortly). Oh, and my giant flight simulator setup is up for sale too. Our oldest child will be six this summer, and I'm just hoping we haven't lost too much time, and that the rewiring can work the other way.
While a lot of people would say I'm pretty successful, since I've gotten a lot of promotions at work, make a good income, and have an intact family, I still don't think I'm really reaching my potential as a dad, husband, Christian, or even engineer because of all these distractions. So here's to turning it back off and partying like it's 1995 again, with an isolated, inconvenient-to-access desktop being the primary way to get online, discouraging access.
Sorry to ruin it for you, but even though Gen X is generally bounded by roughly 1965 and 1980, your upbringing, technology-wise, was exactly that of a Gen Xer, and that struggle is exactly what gives the tech-savvy Xers and Boomers that extra insight in IT.
As far as IT is concerned, you can consider yourself a product of Generation X.
I don't think it's the same at all, unless you are confining yourself to a very high-level statement where "the same" means "technology was immature enough that the average user had to have computer troubleshooting skills to get things to work". In that respect, yes, things were the same, and are quite different from the situation the new generation has with highly refined and simple technology.
But on a more specific level, due to the rapid progress of technology, the Gen X and early Millennial experiences were very different. When most Gen Xers I know were growing up in the 80s, they were all about better and better baud-rate modems, getting books of phone numbers and dialing into BBSes, soldering things together, dealing with computers that didn't have hard drives, playing lots of text-based MUDs or simple Atari 2600 games, browsing USENET, and working on pure DOS or Unix systems.
By the time we got our first computer, which ran Windows 3.1, the GUI was firmly entrenched, hard drives were standard, no soldering was taking place, and many of the BBSes were gone or dying (I didn't even hear the word BBS until 20 years later, when an older colleague mentioned it to me). My family's first computer even had a caddy-loading CD-ROM drive. Our second computer, the one I was old enough to most clearly remember, had a Pentium, a 56k modem (nothing slower was being sold by that point), a hard drive of over 1 GB, a normal CD-ROM drive, nice sound and video cards, and Windows 95 with Microsoft Office.
Of course even the Windows 95 machine was not particularly refined or bug-free, so I learned many of the same troubleshooting methodologies as a Gen Xer. But I used almost none of the same technology. None of my computers used floppy disks that were actually floppy, I never got on Usenet (my first Internet exposure was using Netscape Navigator to browse to Yahoo, probably around 1996), I never used the Internet on anything slower than a 56k modem, and we had broadband and Windows XP in high school, where my friends and I worked with FireWire-connected camcorders and video editing software. And I have to admit, while I can hold my own with my older Gen-X friends at work when it comes to software troubleshooting, I have NONE of the hardware or electrical skills they had. I've never used a soldering iron in my life. So I really don't view my technology background as being the same as theirs, although luckily things were still buggy enough that they honed the same troubleshooting methodologies, at least on the software side.
Self driving cars/trucks/farming equipment, robotic factories, etc., will reduce the need for a massive workforce, especially once we get closer to the singularity.
Lol, we aren't anywhere near "the singularity", despite the media hype. Back in the day "AI Research" at MIT consisted of getting computers to beat people at chess. This was accomplished by making better and better rules to allow the computer to make better moves in various situations. It wasn't real intelligence, but it made the computer "seem" intelligent.
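The "better and better rules" point can be made concrete with a sketch. This is my own illustration, not anything from MIT's actual chess work, and it uses tic-tac-toe instead of chess purely to keep it short: the program just looks ahead through every legal move and scores the outcomes with hand-written rules. Nothing is learned, yet it plays perfectly and "seems" intelligent.

```python
def winner(board):
    """Return 'X', 'O', or None for a 3x3 board stored as a 9-char string."""
    lines = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]
    for a, b, c in lines:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Score a position for 'X' by exhaustive look-ahead: +1 X wins, -1 O wins, 0 draw."""
    w = winner(board)
    if w == 'X':
        return 1
    if w == 'O':
        return -1
    moves = [i for i, sq in enumerate(board) if sq == ' ']
    if not moves:
        return 0  # board full, no winner: draw
    scores = [minimax(board[:i] + player + board[i+1:], 'O' if player == 'X' else 'X')
              for i in moves]
    return max(scores) if player == 'X' else min(scores)

def best_move(board, player):
    """Pick the legal move with the best look-ahead score for `player`."""
    moves = [i for i, sq in enumerate(board) if sq == ' ']
    key = lambda i: minimax(board[:i] + player + board[i+1:],
                            'O' if player == 'X' else 'X')
    return max(moves, key=key) if player == 'X' else min(moves, key=key)

# X has two in a row on the top line and can win at square 2;
# pure search finds it with no "intelligence" involved.
print(best_move('XX OO    ', 'X'))  # -> 2
```

The chess programs of that era worked the same way at heart, just with much deeper search and ever-more-refined hand-tuned evaluation rules, which is exactly why they could seem smart without thinking.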
Today, the big AI hurdles we are crossing are translating sounds to text, synthesizing natural speech, and being able to better contextualize spoken text so that computers can do better searches on data and discover more relevant results. All of those fields have made big strides, but they were done by writing better and better rules and using better and better statistics, not by any true "learning" or "intelligence". They are similar to the chess work done by MIT... they make computers seem intelligent and make them more useful, but they still don't think or have consciousness.
What we are heading towards is the USS Enterprise computer on Star Trek: The Next Generation. It will be capable of understanding human speech and generally delivering what people want, but not capable of independent thought. True AI, true independent thought, is extremely hard, and we haven't even scratched the surface. We have no idea how our brain works or how we think, let alone how to build applications that can think on their own. True AI (the singularity) will not just magically happen. Things don't happen by magic, least of all computer programming. It will not happen until we have enough understanding and design prowess to design and construct a thinking computer. Until then, we will have Siri/Enterprise-computer-like devices: they can translate speech to text, then compose Google searches run by ever-improving algorithms that deliver ever-improving answers, but the leap to thought is not going to happen.
Also born in '83 here. I don't really understand why we're considered millennials.
My assumption would be because we were still growing up during the turn of the Millennium. I had turned 16 just two months prior to Jan 1, 2000 and all those big celebrations. I think that's why the cutoff is usually listed as '81 or '82 or something like that... Millennials are pretty much anyone who was born but not yet 18 by the turn of the millennium.
THE defining event for the millennial generation (IMO) is the recession and the just God-awful job market when they got out of uni (one might argue 9/11 was, but I'd disagree). I graduated into a booming economy, and my career (embedded and electronics engineering) has pretty much roared thanks to that. By the time the recession hit, I was already considered essential (or useful) enough to keep my job throughout the recession. No raises in that time, but I've caught up. The poor souls who graduated just a few years later will never catch up.
I would say 9/11 and the second great depression (honestly, looking at the length of it I think it deserves the title... plus the IMF has labeled it the worst slowdown since WWII, so yeah, only the great depression stands equal with it) were both equally defining.
And I think people in my age range at least got hit just like people born a few years later. I graduated in December 2006, and the housing bubble had already peaked in early 2006, and started to accelerate into collapse at the end of 2006 and into 2007. The U.S. National Bureau of Economic Research records that the recession began in December 2007 and lasted 18 months.
So people my age got hit just like the people behind us. The job market wasn't great when we graduated, and though not as bad as a year later, I had a number of friends that got furloughed or laid off in the resulting chaos. I did OK because I was in a recession-proof job (making 99-cent Totino's super-cheap pizzas, which naturally stayed in demand), but my bonuses, merit increases, and 401(k) matches were all cut as I recall. But I probably did the best of anyone my age, because as I said, many who graduated with me got furloughed or laid off, and quite a few who were a year or two older and at the very oldest age of the Millennials had bought houses at the peak. A lot of them are still paying for that mistake because they are still underwater and can't sell, or if they have sold, have done it in the last year or so and have lost every dime they ever spent on the house in the last decade, having nothing at all in savings or equity to show for their first decade of work.
So while I admittedly lucked out on timing and avoided a worse fate than so many others, I wouldn't say early Millennials in general had it way better than those born three years later. Most everyone got hammered by either housing losses, layoffs, furloughs, or a bad job market, and the only ones who seem to have it good are the late Millennials who missed it all and are graduating now. (But with the easy-money policies at the Fed and wild government spending, bubbles already appear to be forming again, especially in stocks, and I would not be surprised if the process repeats itself in just a year or two. Stocks are seriously insane right now, and if you look at graphs of stock prices compared to forward price-to-earnings ratios, they have left their historical moorings and are now rising with no relationship to underlying performance.)
So you grew up with 95/DOS AND instant messaging?
Umm, I hope you are being sarcastic, because those technologies came out right around the same time and certainly aren't mutually exclusive. Windows 95 came out when I was in 5th grade, and AOL Instant Messenger when I was in 7th grade (in the Des Moines area, everyone used AIM even if you weren't an AOL subscriber... it was the thing to do in middle school and high school in the pre-texting years).
And in case you forgot, Windows 95 was still more of a wrapper over DOS than anything else in those days. A lot of games being sold were still for DOS, and you accessed the DOS command line to install and launch them. So yeah, DOS and Windows 95 went hand in hand, definitely not mutually exclusive. The first computer we had ran Windows 3.1, also a very DOS heavy experience depending on the application.
Anyone remember playing Star Trek: A Final Unity, SimCity, SimCity 2000, X-COM, Across the Rhine, or games like the PC Mega Man X port? I remember Mega Man X requiring quite a bit of work to get going on my machine, but man, those were all great games and so worth it.
If this trend continues, we're going to be awash in smart financial or medical people. Y'know, stuff that's harder to outsource so easily.
I understand why medical is hard to outsource, but I would think finance would be incredibly easy. I'm pretty sure Excel and calculators are plentiful in other countries.
The implication of that post seemed to be that "I wanna make games" = "not serious", and therefore less likely to learn a "serious" language like C++. I just thought it was an odd thing to say when C++ happens to be the language of choice in the videogame industry.
No, he's right. The "I wanna make games" crowd is usually not very serious. The "I make games" crowd is where the serious skill is at. But only a tiny, tiny subset of the "I wanna make games" crowd is actually serious enough to make it to the "I make games" crowd.
Side story: I was a senior-year computer science / computer engineering double-degree student at my university. My senior year, I happened to move onto the floor in the dorm for the computer science learning community (something I had never lived in before). Learning communities were places where freshmen sharing a major could live together and learn/study together. Anyway, all these CS freshmen, about 30 of them, were in CS because they "liked video games and wanted to make games". They would run around dressed like medieval people with spears, play Dungeons and Dragons, play video games, etc. None of them understood that programming is challenging, requires a lot of theory and math, etc. I kid you not, I don't recall even one of them making it to their sophomore year as a computer science major. They all switched, and it was pathetic.
So that's what he's referring to when he talks about the "I wanna make games" crowd. They are a dime a dozen and not serious at all. The "I make games" crowd, on the other hand, is extremely skilled and respectable technically.
Stop hiring Indians and Chinese.
Ridiculous. Actually, part of the problem is that wealth transfers (welfare and tax credits), government handouts to unions (especially federal union jobs), etc., have made it so that engineering take-home pay gets held down through taxes, while some other jobs get paid more than they should. It's not that I want lower wages for some people, but when the disparity in earnings gets artificially reduced, a lot of people may not be willing to take the much harder STEM career path for only marginally higher earnings. In countries like India, where engineers make ten times the average wage, EVERYONE lines up to be in STEM.
Here, you can have government or factory jobs making $45,000 a year and starting engineering jobs paying $55,000, and while there is probably more upward potential with engineering, it takes way more work and leaves a lot less time for goofing off in college. If the government make-work job paid a more realistic $25,000-$30,000 and the engineering job started at $75,000-$80,000, you'd see everyone with any ability flooding into the STEM courses, and you'd be more likely to reach a supply/demand equilibrium when it comes to STEM talent.
Note: STEM jobs also take a very considerable amount of constant lifelong learning to keep up with technology changes. Constant studying, test-taking, and certifications are often the norm, whereas in other fields you learn how to do a job and then never crack a book again for 20 years. Tech is a tough treadmill to be on, and if you want people to go that way, you have to make it worth their while by not monkeying with wages and wealth redistribution.
Pay us well (and give us raises as we gain experience so we don't have to job-hop to be paid market rates).
Treat us well (no more 70 hour weeks, no more rollout-on-weekends-with-no-comp-time, no more demand to fix bugs on our own time, no more keeping us in meetings all week then wondering why work didn't get done on time, etc).
Give us job security (no more you-are-useless-if-you-are-over-40).
Do that, or even some of that, and the workforce will swell with tech workers.
Wow, these are all so true. I was at a company I really liked... really liked the people and my boss. I was the lead engineer on a team of 15, but was the second-lowest-paid guy. Everyone coming in got to negotiate, but I couldn't. I went to my bosses; they agreed I deserved the same wage and fought for it, but HR shot them down. I guess HR didn't think I'd leave or something. But I did. I have a young family of five to support, and I can't afford to be underpaid. At the end, the difference between my pay and the industry average was $30,000. I left and immediately ended up at the average. Now they have to replace me with someone who doesn't have eight years of experience with the company (and new people are always a risk), and they will have to pay the market rate I was asking for. And I actually wanted to stay, and would have if they had just paid me what they WILL now have to pay the external hire. Why are idiot HR departments so short-sighted?
And yeah, the meeting thing is so true. Seriously, STOP the meetings. If I have five hours of meetings and three hours of emails being sent to me each day (many of which turn out to be FYIs that I didn't need to be copied on and that waste my time), how can I get anything done? I truly believe the fix is agile for infrastructure: pick what you are doing for your two-week sprint, and work solely on those items for two weeks. Instead of that, though, most places give you an annual list of 15-40 projects that you work on simultaneously (impossible), and you have the overhead of having to go to status meetings and send constant emails about them every day/week, even though you really aren't working on most of them in a given week. Such a waste... it's like a computer that has too many processes and spends all its CPU time doing context switching rather than actually processing meaningful work. I really think the ideal number of projects at a time would be about three. If people were allowed to work on a small number at a time, knock them out, and then move to the next thing, I think they'd actually get more total projects done in a year than with the "work on them all at once" method that seems way too common.
Sidenote: IMO, the "do them all at once" method is nothing more than a crutch for bad managers. They don't want to tell anyone their project is less important and needs to wait until mid-year to start, so they pretend they are going to start it right away. They don't care if having 20 active projects at a time bogs everything down in project overload and everything takes longer, just so long as they can make themselves look good because they are "servicing" it.
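The context-switching analogy can be put into a toy model. Every number here (hours of real work per project, status overhead per active project, hours in a week) is made up purely to illustrate the shape of the effect, not drawn from any real data:

```python
from fractions import Fraction

def weeks_to_finish(n_projects, hours_each=80, week=40, overhead_per_active=2):
    """Weeks to finish everything when all projects are 'active' simultaneously."""
    remaining = [Fraction(hours_each)] * n_projects
    weeks = 0
    while any(r > 0 for r in remaining):
        weeks += 1
        active = [i for i, r in enumerate(remaining) if r > 0]
        usable = week - overhead_per_active * len(active)  # status overhead comes first
        if usable <= 0:
            raise ValueError("overhead consumes the whole week")
        share = Fraction(usable, len(active))              # effort split evenly
        for i in active:
            remaining[i] = max(remaining[i] - share, Fraction(0))
    return weeks

def weeks_sequential(n_projects, batch=3, hours_each=80, week=40, overhead_per_active=2):
    """Same total workload, but only `batch` projects are active at a time."""
    weeks = 0
    left = n_projects
    while left > 0:
        active = min(batch, left)
        usable = week - overhead_per_active * active
        weeks += -(-active * hours_each // usable)  # ceiling division
        left -= active
    return weeks

print(weeks_to_finish(15))   # all 15 at once -> 120 weeks
print(weeks_sequential(15))  # three at a time -> 40 weeks
```

With these assumed numbers, 15 simultaneously active projects each eat two hours a week of overhead, leaving only ten productive hours split fifteen ways, so everything drags on for 120 weeks; batches of three finish the identical workload in 40.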
Makes me glad I'm one of the leading-edge Millennials, one of the ones that grew up with Windows 95/DOS and all the associated bugginess and user-unfriendliness of the applications of that era. We actually had to learn how our computers worked and how to really get in and fix things. These trailing-edge Millennials that got iPhones in middle school and high school have utterly no idea how any of this stuff works.
For reasons I don't understand, the media continues to refer to the trailing-edge Millennials as technology whiz kids who have grown up with technology and are "technologically savvy", but to my way of thinking they really know nothing about technology at all. It takes absolutely no skill to use some App Store-approved iPhone app with a super simple, refined UI. It did take skill to try to install and run old DOS games and get all those crazy, primitive drivers to install, work, and not conflict with each other. Those issues led to a curiosity about computers, which led to me learning programming, which led to a computer engineering degree and ultimately a good career in IT, but had I grown up with an iPhone, I wonder if it ever would have happened.
Oh, and let's not forget leading-edge Millennials are phenomenal typists too, because we grew up with instant messaging clients, not texting with our thumbs. Not a bad skill to have in IT.
-Born in late 1983.
Actually, no, they didn't announce a damn thing, you completely inaccurate headline and summary. They very well might, but that is just one article speculating about what they might announce based on what other companies are doing.
I knew the summary was full of baloney when they called the Virtual Boy an ill-fated "peripheral". A mouse or an add-on device is a "peripheral". The Virtual Boy was not a hardware add-on to something; it was a full-blown console in and of itself (I still own one).