But it wasn't just Valve. TF was a Quake 1 mod. DOTA was a Warcraft 3 mod. I also remember playing Urban Terror on Quake 3, and tons of Neverwinter Nights mods. But yeah, Half-Life mods were really influential: we had Counter-Strike, Natural Selection, and Day of Defeat.
Early-'90s shareware was very different from late-'90s and early-2000s mods. A bunch of eventually-AAA titles and studios spawned out of mods for existing games: things like Counter-Strike, Team Fortress, and the original DOTA. I'd put that in a completely different era from the console scene and the shareware scene.
Temper this with the fact that I'm one of the few people who actually likes Gnome 3, enough that I switched from Ubuntu to Fedora just to not have to put up with Unity. But, fine, people are angry that the developers didn't respect their user base, when what that user base wanted was yet another rehash of the Win 95 desktop layout. The Gnome developers actually tried to do something new in desktop UIs; they actually tried to innovate. And as with any innovation, some of the things they did worked and some didn't. Gnome 3.0 had a lot of problems, but the potential was there, and some of us saw it. As of Gnome 3.8 there is a ton more polish, and a lot of that polish came from user feedback. No, they didn't listen to feedback that said "Bring back Gnome 2! No change evar!" They just continued to refine what they had. And they laid down a ton of backend libraries that allowed things like Cinnamon to exist. If they had adopted Cinnamon as one of a few official skins for Gnome 3, would people support them then? Because in terms of development there wouldn't be any change: some devs continue to work on the new UI, some devs on the rehashed old UI, and many on the shared core. Just like today.
I'm going to go contribute to a project that has done amazing things for open source.
They already have some heuristic that puts up the "Reboot Required" message when certain packages are installed, and it seems fairly conservative. I would hope they continue to use it and just do the delayed install, instead of hoping you still have a functioning-enough system to run halt. But yeah, that wording does sound overly broad. Then again, "when additional metadata becomes available" also sounds like they're waiting on another feature and expect it before release. Just speculation, but I really hope they wouldn't reboot for anything without a prompt.
I didn't want to imply that either did what it did to intentionally avoid anything. Both filesystems behave the way they do for legacy reasons, nothing more; those legacies just have different consequences. And no, it's not all bad: there are seamless upgrades you can do on Linux that just aren't possible on Windows without jumping through a lot of hoops. But it's a feature that can be abused. In the same way, Windows' behavior produces more consistent results, until some file is locked and you need to delete it, but the process locking it won't die no matter what you do, and you're stuck rebooting.
In short, all software is shit.
If that's the case, why does installing any random hotfix scatter files marked for upgrade throughout the system?
(Disclaimer: I'm a Linux/Unix programmer. My experience with Windows is limited to porting code originally written for Linux and writing systems-automation scripts for SQL Server, where I dealt with this exact problem.)
It's semi-true. Since XP they've gone through and tried to untangle the mess of DLL dependencies, and there's a lot that doesn't require a reboot. But if you look at the dependency graph of Windows' core libraries, it's still a big circular mess.
The lock-file-on-open behavior is still definitely true, though; it's core to the way Windows filesystems work, and changing it would break compatibility with decades of software.
OK, you guys (and TFA) seriously misunderstand this feature, and yes, it is a feature. This won't affect any update that doesn't already require a reboot. The difference is that currently, if you update a critical system library, everything that depends on that library has the potential to act in an unstable manner until the next reboot. This change says that if you're updating one of those libraries, the update doesn't actually happen at package-install time; it gets scheduled to occur on the next reboot. That's it. No extra reboots, just more stability, with updates scheduled for boot time.
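The stage-now, apply-at-boot scheme can be sketched in a few lines. This is a toy illustration only, not the actual Fedora/systemd mechanism; the paths and function names are invented:

```python
import os
import tempfile

# Toy model of a deferred library update: at package-install time the
# new file is only staged next to the live one; an "early boot" step
# applies anything staged before any services have started.

root = tempfile.mkdtemp()
lib = os.path.join(root, "libcritical.so")
staged = lib + ".staged"

with open(lib, "w") as f:
    f.write("v1")

def install_update(new_contents):
    """Package install: stage the file, leave the live copy alone."""
    with open(staged, "w") as f:
        f.write(new_contents)

def apply_staged_at_boot():
    """Early boot: nothing depends on the library yet, so the swap is safe."""
    if os.path.exists(staged):
        os.replace(staged, lib)  # atomic rename on the same filesystem

install_update("v2")
print(open(lib).read())   # "v1" -- the running system still sees the old library
apply_staged_at_boot()
print(open(lib).read())   # "v2" after the simulated reboot
```

The point is that the live file is never half-written while programs depend on it; the swap happens in one atomic rename at a moment when nothing is using it.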
The fact that Windows has this feature isn't the problem; it's the fact that Windows requires it on nearly every DLL update. The reason is that Windows locks files while they're in use, so it's actually impossible to update a file until the services that use it (which are often core system services) are stopped, i.e. at boot time. Linux has avoided this by making its filesystems refcounted: if a file is in use and you delete it, it stays on disk until the thing using it exits. So library updates just delete the old library and install a new one, while programs using the old library continue to do so until they're restarted. This works until you have something dynamically loading libraries, or IPC between programs using different versions of the same library, or a million other modern techniques that the Unix designers didn't think of.
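That delete-while-in-use behavior is easy to demonstrate. A minimal sketch (Linux/Unix only; on Windows the unlink of an open file would be refused):

```python
import os
import tempfile

# Demonstrates the Unix behavior described above: an unlinked file stays
# readable through any open handle until that handle is closed, which is
# how an update can remove a .so while running programs keep the old copy.

path = tempfile.mktemp()
with open(path, "w") as f:
    f.write("old library contents")

fd = open(path)   # stands in for a running program holding the old library
os.unlink(path)   # stands in for the package manager removing it

print(os.path.exists(path))   # False: the name is gone from the filesystem
print(fd.read())              # prints "old library contents": data still there
fd.close()                    # only now is the disk space actually freed
```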
Anyway, this really is not the travesty everyone here thinks it is.
Parent was me; forgot to log in.
I keep trying IDEs and have yet to find one I work faster in than Emacs and command-line tools.
It's been 6 years since I worked there and I haven't kept up with them, but at the time Amazon employed some core developers of Perl and of some of the major libraries (I believe they paid people to work on Mason). My knowledge is very out of date, however, and they may not even be as big a Perl shop as they used to be.
My girlfriend is deaf, and traditional office environments have typically been hard for her. At some point she decided to start working from home so most of her interactions would be over email. She had experience working for non-profits and picked up some SQL, plus knowledge of some of the database-backed financial systems those non-profits use (notably Raiser's Edge). She found a decent amount of work on Elance doing financial reports for non-profits using Crystal Reports and SQL Server Reporting Services, and when it came to hard SQL she was able to ease her way in, taking on more and more complex reports over time. Now she's a full-time contractor, working from home, telecommuting to non-profits all over the world, making good money.
Learn some SQL, one of the report-generating tools, and how some type of business stores its financial data, typically a non-tech-heavy business that's not likely to do this stuff in house.
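For a flavor of what that kind of reporting work looks like, here's a toy example using Python's built-in sqlite3. The gifts table and its columns are invented for illustration; a real nonprofit system like Raiser's Edge has a far bigger schema, but the shape of the queries is the same:

```python
import sqlite3

# Toy "donations per campaign" report of the sort described above,
# against a made-up schema. Real report work is this plus a lot more
# joins, filters, and date ranges.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE gifts (donor TEXT, campaign TEXT, amount REAL);
    INSERT INTO gifts VALUES
        ('Alice', 'Annual Fund', 100.0),
        ('Bob',   'Annual Fund', 250.0),
        ('Carol', 'Building',     75.0);
""")

query = """
    SELECT campaign, SUM(amount) AS total
    FROM gifts
    GROUP BY campaign
    ORDER BY campaign
"""
for campaign, total in con.execute(query):
    print(f"{campaign}: {total:.2f}")
# Annual Fund: 350.00
# Building: 75.00
```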
I actually like Gnome 3 as well; I thought I was the only one. Unity drives me insane, however. It seems like the two were going for some of the same concepts, but Unity missed the mark.
Thank you, sir, for that insight. The takeaway of this whole discussion is that the key to job security is a four-digit Slashdot ID.
It wasn't actually phone support, but it was low-level end-user desktop support. Basically, instead of asking "have you rebooted it?" over the phone, I would go to their office and reboot it; anything more complicated I was supposed to send downstairs to level-2 support (I actually got yelled at once for reinstalling a driver).
And this was by no means a hierarchy; tech support was just the easiest to break into without experience. The move to sysadmin was natural: work on servers instead of desktops. QA was a calculated move, because I wanted to write production software. It was actually a step down responsibility-wise, but it got me closer to dev.
All of those are roles you can rise up in and build careers out of. I know from trying to hire them that senior QA people are worth their weight in gold. I just knew from the beginning that I wanted to write code.
One other thing I should point out: I've found startups to be generally more accepting of the lack of a degree. My current large company is the result of my previous startup getting acquired.