And if they had not continued their heavy development after the game was released, it would have been a flop.
Where did I say that hitting that development time guarantees a good game? I didn't. Going over it does have an effect. You can choose to ignore that, and in that case we are in disagreement.
Let's start with D3. 11 years is a horrendous joke. We can ignore graphics. Let's just talk COMPILER. You think they were still using the exact same compiler over those 11 years? Same development environment? Same freaking server and desktop versions to run it? Or even using 11-year-old graphics cards as their model? Hell no. Those kinds of upgrades and migrations cost additional time and money.
WoW was announced in 2001, and released in 2004. That's 3 years. Not the 4 to 5 you claim.
StarCraft 2 started in 2003 and wasn't released until 2010. Is it a good game? Sure. Would it be as well received if it weren't the sequel to the biggest RTS ever? That's debatable. That it's now split into 3 separate releases doesn't make it better. It makes it worse. Similar to The Hobbit, a short novel, getting split into 3 long-ass movies for the sole hope of making more money. Blizzard needs to recoup 7 years of development.
So while you will blindly say I'm wrong, it's more a case of you don't wish to acknowledge the truth I'm presenting.
Nothing you say actually suggests a link between development time and the quality of the resulting product. If I were to go on listing games with 18-36 months of development time that came out bad, I could go on for days, any long-time gamer with Google's help certainly could. That doesn't mean that's a bad timeline either. The fact is, Blizzard rebooting the project will have no real effect we'll ever see on the outcome.
Again, hitting the timeline is no guarantee of greatness. Missing it is definitely linked to seeing games rushed out. Hmmm, the latest Aliens game comes to mind.... Anyways, Blizzard rebooting the project means there is a very real effect: what they had been working on is crap and they're starting over. They do not have the luxury of another 3 to 10 years hoping WoW holds on while they restart their new MMO development from scratch. You think I'm wrong there??
I see no big problem with Titan being delayed. A game taking longer to develop is generally a good thing. And the last thing Blizzard wants is a chunk of its WoW players coming to a subpar game, then leaving for something else that is new.
I wholeheartedly disagree with this statement. There is a sweet spot of time spent on game development. My guess is 18-36 months. Once development passes 3 years, the graphics engine the game is built on is old enough to be noticeable compared to newer releases. Now, not everybody cares about that, but why does it matter so much here? Because the original timeline was already within that window. That means the game is getting grossly overdue, and grossly overdue games are in that state because the devs cannot get them to a releasable state.
Most recent example in my head. TOR. You may have heard of that incredibly expensive, overdue boondoggle that EA put out. I bought it. Was excited to play it. Until I played it. There are many problems with that game. I won't even blame the devs for them, because IMO it's fundamental flaws in the game's design.
Duke Nukem Forever is another. Or the recent Blizzard offering, Diablo 3. Look, once a computer program (any program, really) goes too far over schedule, there is something wrong with it. Titan being delayed, plus large-scale developer changes, means that game is fatally flawed and they're probably looking to push it to any functional state possible so they can sell a crappy-ass game to as many unsuspecting fools as possible.
He's not getting back into coding. He dabbled a bit as a hobby 20 years ago. This mindset that just anybody can pick up and become a programmer is a problem in this industry. Sure, anybody can hack away and make "something" happen, but it takes a whole lot more than that to be a competent software engineer.
If he's looking for a career change, programmer is not it. Network Administrator is not it either. Maybe get into hardware repair somehow, e.g., copier repair.
People going around saying "anybody can do it" do a disservice to those who should not spend the time and resources attempting the jump. There's a lot of competition for entry-level programmers, a lot of bad competition, but those young kids will always be given the opportunity first. They're cheaper, and they'll most likely not be taking sick time (this guy even references his aging body).
It was written by PRIVATE employees, paid for by PRIVATE monies. Obama's campaign did not take PUBLIC money for his re-election.
The code is either owned by the Obama campaign, or the DNC, or perhaps a specific individual. It all depends on who paid for and who commissioned the work. Regardless, no civilian government workers drew their government paychecks for coding the Obama campaign's get-out-the-vote widget.
So the developers of MariaDB took all their experience and knowledge that they obtained at Oracle while working on MySQL and created a direct competitor product to their prior company's product? I don't care why they did it, it's underhanded and doesn't speak well to the character of those people.
That "litigiousness" you reference should have given them pause before doing this. Oracle just may find enough crossover code and algorithms to go after those guys. It wouldn't matter if they're innocent or not; the court costs alone could wipe them out.
(everyone who's working on it has a PhD, while I'm an undergrad)
There's your problem. PhDs don't necessarily know how to write good code. They spent all their time learning and not applying.
They can certainly take very advanced concepts and make a program that does it. It just won't be very pretty or maintainable.
As a general rule, the closer someone is chronologically to having finished college, the more correct and prettier their syntax is, while older programmers simply refuse proper retraining, or learning in general, really.
This statement is all sorts of wrong. Newer kids write "more correct" code? Or have prettier syntax? If that were true, then every programmer would be out of work by age 35. The truth is that it takes about 5 years of good professional experience to become a decent programmer. Schools just cannot possibly prepare a kid for dealing with a program with a code base of over a million lines of code. Sure they might write some pretty webpage, or some basic database shit, but you're seriously deluded if you think a 24 year old who isn't some type of coding genius is a better programmer than a 10 year professional.
Totally agree. I'd just add if I were in charge, I'd explain why his behavior got him fired and not another chance. Guess that's one reason why I'm not in management!
He's an arrogant douche. Get rid of him. Any new kid who has the balls to call existing code terrible to your face is going to be nothing but far more trouble down the line. He already knows everything, and nothing you can say is going to get through to him. It's just not.
Get rid of him if possible, stick him in a little box otherwise. Don't let him work on the good stuff and hopefully he'll leave on his own because he's "awesome". Your bigger fear should be that he won't ever leave...
We have a few of those types in my office, and sadly we don't get rid of them because they have value just being present. It's depressing watching them bounce from project to project because they can't do anything right, and yet think they are god's gift to programming and that we're all just idiots.
Don't take the time to explain jack shit to him. It's been tried here, and they didn't change. People may say I'm too harsh and that everyone deserves a chance. My response is that they don't. People that act like dicks to others like that don't deserve another chance. Their next chance is at another job.
I like this post. Big corporations like HP are run by bean counters now who only look out for the stock price. Employees are an overhead cost that gets in the way of profits, so whenever they can scale back those costs, they will.
HP cut 5% this year. How many will be cut next year, or the year after? Those remaining employees know that their workload just goes up and up as more coworkers get cut, and it's guaranteed that corporate management will cut more when they need more profits. The key personnel jumping just means they understand that and don't want to deal with it anymore.
Certain classes are 4 credits each. For instance, I remember my Chemistry being 4 credits, while Biology was 3 (not including lab credit); the credits directly represent the number of lecture hours per week. My various English, history, etc. courses were generally 3 credits. That's gonna put your math off a bit.
I'll even go a bit further and recommend against summer courses. Not because they're harder, but because they pack an entire semester of study into a few weeks. It's a high workload, and coupled with full-time employment it is going to burn a guy out. One summer, maybe, but make no mistake: that course load and the homework/projects, coupled with the work hours, are going to make the guy really, really want some time off. I did this for 3 years while working full time on 2nd/3rd shift. The summer courses were the most demanding because of the time requirements.
I just wouldn't want to see the guy jump in and start without fully understanding that he's looking at a very, very long process and a ton of work to get his degree. Throwing out some slightly flawed math and saying, yup, you can do it in 6, does not fully inform this person that he will in effect be giving up his personal life for the duration of those years.
This is what I was thinking. However I think you're being a little naive about what it will take to get a BS doing part time coursework.
For the submitter:
Getting your Bachelor's with no previous coursework is going to be a Herculean task while working. There's a ton of classwork involved in areas you just might not be interested in. For a Master's, maybe a PhD (I don't have one, so I don't know for sure), part-time coursework can be done while working. For that Bachelor's, though, there are just too many credits required to get the degree in a short period. And by short period I am talking 5-6 years.
You're looking at probably 10 years or more to pick up a Bachelor's doing part time class work. Depending on the school policies, you risk some of those credits expiring before you would graduate. Further, you need to realize that 10g is not going to go that far at traditional 4 year schools. 2 classes per semester is realistic.
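To put rough numbers on that 10-year figure, here's a quick back-of-the-envelope calculation (my assumptions, not any particular school's: a 120-credit BS, 3 credits per class):

```python
# Rough estimate of years to finish a BS part time.
# Assumptions: 120 total credits, 3 credits per class,
# 2 classes per semester, 2 semesters per year.
total_credits = 120
credits_per_class = 3
classes_per_semester = 2
semesters_per_year = 2

credits_per_year = credits_per_class * classes_per_semester * semesters_per_year
years = total_credits / credits_per_year
print(years)  # 10.0
```

And that 10 years assumes no skipped semesters, no failed classes, and no school requiring more than 120 credits.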
Look, this isn't impossible. If you want to get that BS before you turn 40, you are going to have to work afternoons and be a full time or close to full time student in the mornings. You're a programmer now, so hopefully your company will work with you on that. If they won't, your dreams of a BS are going to be just that for a long time. On the bright side, for a lot of programmer jobs the job experience is far more powerful than the education. Granted you are locked out of certain companies and industries, but by no means does that mean your career could be any less successful than others.
For home users, yes, it's a no-go. For that price point you can get a good Android tablet and an OK desktop combined.
I think where Microsoft is headed with the Pro is enterprise. At least I believe that's their plan. Enterprise can deal with the price if the device seamlessly integrates with their networks and supports everything the system administrators require of it. Then there's no need to migrate existing software or make web portals to them. From my experience, the web portals of the software systems I use are all woefully lacking compared to the desktop versions.
You're right, though, about it being late to the game. If people already have their iPads and love them, they are not going to be happy with a me-too tablet. I'm not saying the Pro is a slam dunk, or will even be moderately successful. Just that it's nicely positioned to provide enterprise with a good tablet that could match their existing security requirements. There are a lot of ifs, but it's something Apple and Google just haven't provided. At least not yet.
I have a theory that it's impossible to prove anything, but I can't prove it.