Flex was already open source; they are just pushing the responsibility of maintaining it onto the community. Now, if they were open sourcing the Flash Player, I would commend them for that, as it could ease the pain a little for those stuck relying on this legacy technology.
I almost lost several weeks' worth of work on a project due to the drive crashing
It blows my mind when people running OS X don't use Time Machine.
My hard drive died a few weeks ago, and it was so easy to restore from Time Machine. I was right back where I left off when the drive died. In my case I bought a 2TB 7200 RPM Hitachi Deskstar. I had heard that those tend to fail, but the price was right, and I have enough confidence in Time Machine and the off-site backups I make every few weeks (I rotate external drives, each holding a complete backup of my entire system) that I can take that risk.
Using that logic, we should all be buying souped-up computer monitors that have computers built into them, as opposed to buying the monitor as an accessory to your computer.
You mean like an iMac (and countless other computers throughout the history of desktop computing)? I personally don't like the concept and wish it had died in the 80s, but the iMac is popular enough and PC vendors are trying to bring it back, too.
I had a similar experience buying my TV. I bought a 58" Panasonic plasma TV in 2009 which works great, but it will never get Netflix support (they claim this is because the TV lacks some hardware for DRM), whereas the 2010 model did get a firmware update adding Netflix support.
Have you run Windows 2000 recently? That shit is blazing fast compared to XP and later versions of Windows. Windows 2000 was also faster than Windows 98 in my experience. I consider it the best Windows ever (not saying much, I know). If they had tacked on the Remote Desktop feature XP has (maybe you can get this with Terminal Services?), it would have been perfect, and we would never have needed another Windows version (it runs great in a VM).
Today, the average user builds their own machine or knows about it and gets a geek friend to do it; "upgrading" still pretty much means getting a new PC, but now a custom-built machine is more common
Huh? I think the opposite is true, and I think the general trend is towards laptops and mobile devices where there is no assembly and very little upgrading. More people are using computers these days, so in absolute terms more people are assembling their own computers and doing their own upgrades, but percentage-wise I think it is less common than before.
If you dislike Vista and 7, use a different operating system. Don't pretend Microsoft should support 10-year-old software.
I already run a different operating system, but have to use Windows in a VM for some software. My Windows 7 VM is much slower than my XP VM. Forget XP, I wish Windows 2000 was still supported for what I use Windows for (.Net development). Windows 2000 in a VM is blazing fast and doesn't need as much RAM.
CLI lovers may be welcome, but do they actually use it? Everybody I know who said that OS X was great because of the CLI has since switched to Linux.
Did these people switch from Windows to OS X? OS X happens to have a usable UNIX userland, which is good, but what makes it great is the combination of being a UNIX and the excellent GUI, commercial software availability and support, and no hassle with wifi / power management / multi-monitor support.
I'm a long-time UNIX user (started using various proprietary UNIXes a few years before Linux existed) and a software developer. Of course I use the command line (with Homebrew as my package manager of choice, BTW). I also use Microsoft Office, iTunes, commercial music apps (like Logic and GarageBand), and commercial video editing software (Final Cut). I switched from Linux as my host OS (running Windows in VMWare) to OS X as my host OS (and needing a Windows VM a lot less; I don't use Windows at home at all anymore). OS X hits a sweet spot where it meets all of my needs, one of which happens to be a proper UNIX userland, even though Linux is better for that. It's the sum of the parts that makes OS X and Mac hardware nice.
I think Apple used to have a very expensive MacBook Pro that gave you a choice between glossy and matte, but I don't think they offer that choice anymore. No more Apple hardware for me.
While there was a short period of time where they didn't offer matte screens in any laptops (when the first 15" unibody MBP came out), you can get matte in either the high-res 15" MBP or the 17".
YouTube. Now was that so hard?
Streaming video is hardly a new use. During the dial-up days, I was doing video conferencing, and it worked perfectly well. Faster connections just mean higher quality, not new uses. YouTube could have existed in the days of dial-up, it just would have had lower quality video streams. No one would have complained, either, because we were used to those lower-quality streams.
If you don't know what Cmd-H is for, GTFO.
Seriously, I used to hunt for pixels too, but after about 1280x1024 I stopped caring.
I don't like my desktop at much higher resolution than that; it becomes uncomfortable. I know gamers and drafters really want giant screens at massive resolutions, but besides them, who else really wants it? 2560x2048 resolution doesn't exactly help me see my web pages or documents any better - in fact it can make them downright hard to see, so why do I need it?
You may want to add programmers to those who want massive resolutions. I was actually offended when the company I work for bought us 1280x1024 LCDs (fortunately they made up for it later). I had been happily using 1600x1200 on my CRT before that (and occasionally bumping it up to 2048x1536). I can make use of all the pixels available to me. The thing is, I want things to be immediately visible without having to switch apps or windows. The more area I have to display code, logs, documents, browser windows, terminals, etc., the better. Seeing multiple source files at once is essential to my productivity. Vertical resolution is good for me, too, as it means less moving up and down a source file or document (and also I can tile things vertically).
I currently use a 1920x1200 monitor (actually two of them when I plug my laptop into a monitor) which I think is a good resolution that meets in the middle of what people want and how much they are willing to pay. What I hate is the proliferation of 1920x1080 computer monitors (which *IS* because of HDTV). Did they really have to remove 230k pixels? 1080P content displays fine on a 1920x1200 monitor with 1:1 pixel scaling.
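The ~230k figure checks out; a quick sanity check on the arithmetic, using just the two resolutions mentioned above:

```python
# Pixels lost going from a 1920x1200 panel to a 1920x1080 one.
wuxga = 1920 * 1200   # 2,304,000 pixels
fhd = 1920 * 1080     # 2,073,600 pixels
print(wuxga - fhd)    # prints 230400
```

That's 230,400 pixels, a full 120 rows of vertical space gone.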
What I really want (and it is currently available) is one 2560x1600 30" display (but my budget doesn't allow it at the moment, and the options available are lacking features I want/need). The more text I can see on screen (at a comfortable font size) the better. The less swapping between applications just to read things, the better. Though sometimes I just want a single app (such as Emacs, Eclipse, or VS) to take up the whole screen, and multiple monitors just don't work well in most of these apps.
Those people coding on netbooks are what I call "crazy people." Many coders don't even know they'd like higher resolutions until they get them. Maybe that's part of it: people just don't know what they are missing.
For gaming, I don't really care. Even the ~720P or often less resolution of most XBox 360 / PS3 games looks great to me on my 1080P plasma (I'm one of those who think LCD is inferior to plasma for gaming and video; if I ever get back into PC gaming, I'm getting a plasma display to hook up to my computer and will be happy with 1080P and the frame rates I get with it).
Something to the effect of "And the inferior multitouch support won't bother you nearly as much if you are always using your phone one-handed..."
I can pinch to zoom one handed, but sometimes I wish I had a simple + / - for zoom like the browser on Android does.
So I'd say the simple fact is SSDs simply aren't needed on the desktop. Mobile is another story, with the non-volatile nature of SSDs making them a good choice, but since most of my customers are simply doing the basics on their laptops (word processing, surfing), they really don't need anything bigger than the basic bottom-of-the-line SSDs.
I kinda see it the opposite, but I am not a typical user: on the desktop, you can easily have multiple drives, so having one SSD for the OS and apps, and having one or more HDDs (either internal or external) for general storage would do the trick. On most laptops, you can only have one drive, so better make it a big one.
Who (as an end user) needs any kind of storage medium for porn? Porn is in the "cloud" these days.
Maybe you can get creative with your router configuration: block update servers during peak time, or set up your own bandwidth cap during that window. It could be tricky, and you may need something more than the Tomato firmware; a full Linux or OpenBSD box (running on a low-power x86 machine) might be necessary, but it seems like it should be doable.
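A rough sketch of both ideas on a Linux-based router, assuming iptables with the `time` match and `tc` are available. The hostname and interface name are placeholders (real update services use many hostnames/CDNs, so you'd have to enumerate the actual ones for your devices):

```shell
#!/bin/sh
# Idea 1: drop forwarded traffic to a known update host during peak
# hours (19:00-23:00). "updates.example.com" is a placeholder.
for ip in $(dig +short updates.example.com); do
    iptables -I FORWARD -d "$ip" \
        -m time --timestart 19:00 --timestop 23:00 -j DROP
done

# Idea 2: instead of blocking outright, cap bandwidth on the WAN
# interface (eth0 is an assumption) to 1 Mbit/s with a token bucket.
tc qdisc add dev eth0 root tbf rate 1mbit burst 32kbit latency 400ms
```

The iptables time match uses the router's clock, so you could also just add/remove plain DROP rules from a cron job if your firmware's iptables build lacks that module.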