And I don't know if the shift towards iOS-ness is as minor as we think it is. Right now, it's not so bad. It's far less shocking than the jump from Windows 7 to Windows 8, but what's really chafing me is the hardware. I kind of like some user serviceability. I like being able to buy some RAM and jam it in myself, or doing the same with a battery. And when it comes to my desktops, I especially like being able to upgrade things like my hard disks. It seems pretty absurd that you have to make the decision about how much RAM you want in your machine up front, with no upgrade path.
What is the feeling/experience of other 'traitors' who run OS X for the desktop and Linux for everything else?
Wrong way to look at this. When I was in high school I almost exclusively used Linux (Mandrake, eventually switched to Debian, back before everyone had proper package managers). Eventually I got an iBook running OS X 10.3 because I wanted a Unix laptop that delivered great battery life, and at the time the iBook's promised 6 hours was far better than the two hours you could find in most competing laptops; plus it had a dedicated GPU, so I could at least dream about gaming on it in my off-time. But it had a big bonus: it ran Photoshop and Illustrator remarkably well (again, this was back in the day when, if you threw a 200+ MB PSD file at the Windows and OS X copies of Photoshop, you'd see the Windows copy freak out and crash while the OS X copy would eventually open it). At the time, I thought I was going to get into graphic arts, and this seemed like the best combination of features I could get, though it was admittedly expensive.
When I went to college, I bought a MacBook Pro. I had changed directions in education, and now I was sprinting full throttle into video production, so being able to run Final Cut Pro was a must-have (especially since my school only taught Premiere and Avid Media Composer). The five-day turnaround on repairs was a huge plus, as was the reliability factor. Major bonus points were awarded that I wasn't being conned into getting Windows Vista, but had the option to make that mistake if I wanted. On top of that, Time Machine saved my bacon more than a couple of times (mostly from user error). This ended up being a very good decision, since knowing both Final Cut Pro and Avid Media Composer is what landed me my first job.
All the while, I maintained a Linux server. Why? Because just like the Macs I bought were great tools for image and video editing when I bought them, Linux was great at being a reliable file server, firewall, and DVR (thanks to MythTV). I still run one today, though because of the direction Apple is taking the entire OS X platform in (hardware and software), I'm considering changing that. I use Final Cut in the office enough that my skills are kept in tip-top shape, though when I have my druthers I personally prefer Avid, which runs on Windows.
For me, it's not about the politics of this or that, it's about what tool will do the job I need to do, and do it well. Beyond that, decisions about what kinds of companies I want to support, their practices, environmental record, prices, etc. get factored in.
I'm out of state for schooling nine months out of the year, and I can only imagine what kind of hell my family is going to go through if we ever need to deal with something like this again. I'm so glad they've got a pair of Mac laptops I can fix remotely, built on a vertically integrated platform. Not to evangelize Apple too much, but it's perfect for a family that wouldn't know the difference between PCI, AGP and ISA slots, and couldn't figure out how to install a driver to save their lives.
That's not to say that Windows is worthless in this respect, but when the company's support assets have been sold two or three times, and you can't even find a spec sheet for your computer any more, that's pretty bad. It's more about information being lost and having to go down rabbit holes. The same thing would happen to Mac OS if Apple did the same things Gateway has done.
There was a big hoopla when a user discovered their Apple ID was stored in the M4A file, and all of a sudden there was this big paranoia trip over security and privacy issues. To me, this isn't a much bigger issue than a hardware manufacturer linking a serial number to your credit card/user account, or writing your name on something with a Sharpie. Having that opinion, I didn't bother stripping the information from my iTP file, but as far as I'm aware, my MP3s from Amazon don't have that. Even in that situation, with a 256 kbps VBR MP3 copy of the file, complete with album art and metadata, I haven't copied the file anywhere. Even more so, I don't feel like sharing it. Same goes for my copy of OS X. I bought Leopard and Tiger, and in both cases, I felt that I shouldn't pirate it, and I didn't feel any pressure to share it (despite friends asking to borrow my disc for an upgrade, I didn't quite feel right about it). But that doesn't mean that people don't share iTP files, or Amazon MP3s, or copies of OS X.
I love the idea, but in practice, it's too easily beaten. The only way you're getting around that is to embed the data directly into the stream itself. With an audio file, you could easily plant the data in a frequency range outside human hearing, but a simple filter would remove it. With video, you'd have to burn it into the video or audio stream (the audio stream's problems were already discussed). If you put it in the video, it'd likely be visible, which would ruin the viewing experience; you could try a transparent overlay, or hide it in visual detail that has to be deliberately filtered out (maybe using spare bits of a 32-bit colorspace?), but simple transcoding would destroy that.
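To make the "simple filter" point concrete, here's a toy sketch (not any real watermarking scheme — all names, frequencies, and thresholds are my own assumptions for illustration) that keys an identifying bit pattern onto a near-ultrasonic tone, reads it back via an FFT, and then shows how a plain low-pass filter erases it while leaving the audible signal intact:

```python
import numpy as np

SAMPLE_RATE = 44_100   # Hz; CD-quality audio (assumption for this sketch)
WATERMARK_HZ = 19_500  # near the edge of human hearing (assumption)

def embed_watermark(signal: np.ndarray, bits: str, amplitude: float = 0.01) -> np.ndarray:
    """Naively embed bits by switching a high-frequency tone on/off per segment."""
    out = signal.copy()
    seg = len(signal) // len(bits)
    t = np.arange(seg) / SAMPLE_RATE
    tone = amplitude * np.sin(2 * np.pi * WATERMARK_HZ * t)
    for i, b in enumerate(bits):
        if b == "1":
            out[i * seg:(i + 1) * seg] += tone
    return out

def read_watermark(signal: np.ndarray, nbits: int, amplitude: float = 0.01) -> str:
    """Recover bits by measuring spectral energy near the watermark frequency."""
    seg = len(signal) // nbits
    bits = []
    for i in range(nbits):
        chunk = signal[i * seg:(i + 1) * seg]
        spectrum = np.abs(np.fft.rfft(chunk))
        freqs = np.fft.rfftfreq(len(chunk), 1 / SAMPLE_RATE)
        band = spectrum[(freqs > 19_000) & (freqs < 20_000)]
        # A bin-aligned tone of amplitude A peaks near A*seg/2; threshold at half that.
        bits.append("1" if band.max() > amplitude * seg / 4 else "0")
    return "".join(bits)

def lowpass(signal: np.ndarray, cutoff_hz: float = 16_000) -> np.ndarray:
    """The 'simple filter' attack: zero out everything above the cutoff."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / SAMPLE_RATE)
    spectrum[freqs > cutoff_hz] = 0
    return np.fft.irfft(spectrum, n=len(signal))
```

Running it on one second of a 440 Hz carrier, `read_watermark(embed_watermark(audio, "1011"), 4)` recovers `"1011"`, while reading the low-passed copy yields all zeros — the audible tone survives, the watermark doesn't. Any lossy transcode that discards inaudible frequencies does the same thing for free, which is exactly why this kind of out-of-band tagging is so fragile.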
With those holes in your solutions, the ever controlling MPAA will never back it. The RIAA is starting to lose the fight (look at iTunes Plus, Amazon's DRM-free music store, Radiohead, and everyone else going DRM-free or independent), but it's a bit harder to break the MPAA mold. The business world seems to think that just because you add pictures to sound, it's a whole new ball game. In my eyes, it's just another way to experience a creative vision.
Such a liability could break AT&T and any other telcos bearing it. This analysis also explains recent DoJ filings taking AT&T's position against Network Neutrality. This "private/public partnership" might have done irreparable damage to everyone plugged into the switchboard. "In short, it is increasingly evident that the major US TelCos enabled the surveillance of every single domestic communication, or cannot prove that they did not." So in light of the possibility that The Program monitored the communications of every American with a phone or a web connection, nearly all Americans may have standing to participate in a lawsuit should any plaintiff succeed in showing standing and damages from the program.
But is the triple-core Phenom really that amazing? Some have argued that it is nothing more than typical AMD bluster — all wind and no substance. However, there may be more to this triple-core processor than meets the eye. Here's a quote from the editorial:
""According to AMD, these new processors represent a "multi-core triple threat" to Intel's current hemogeny in the multi-core desktop segment. They claim that current quad-core desktop processors (the Intel quad-core Core 2 processors, in other words) only represent less than 2% of the market. Hence, AMD believes their triple-core Phenom processors will fill the market's need for more powerful processors without paying for more expensive quad-core processors."