I stopped buying AMD laptops, firstly because the new processor naming scheme doesn't give me any clear picture of whether one processor is more capable than another; Intel's i# scheme does a better job. Secondly, AMD graphics chips suck on Linux: a high percentage of the time you need to do some command-line work to get things right. (Folks bash Nvidia too, but in my experience it's just install and go, with great performance.)
Previously I sought out AMD laptops with nVidia graphics chipsets.
I run Ubuntu at home and I don't know what my ISP is using - probably CentOS...
My reason is that there's no compelling reason for me to switch right now. Once it's in my next Ubuntu upgrade, or my ISP switches to it, then I'll do so as well.
I figure "frequent" would count for all the Windows machines I've purchased, wiped the HDD on, and installed Linux... Though historically I went from OS9/X to Linux.
Earth has more than just a bunch of rocks, minerals, and elements. There are surely unique organisms here, and beyond that, there's our culture and inventions. There are many ways to do things or to express ourselves, and I don't think any advanced civilization has already thought of all of them. Most likely they're just as screwed up as we are and pick the first idea that works, not always the best one... so they'd be in the market for different stuff, styles, and ways of thinking that can be easily exported.
TFR was incomplete:
TFS = The F#$@ing Summary (I think)
PC = Personal Computer
p2p = Peer to Peer, usually in reference to file sharing
Not really a fan of this technology, but my thought is this would be a good place to work on fine-tuning the system to increase its effectiveness. You have several real-life image sources and raw footage, and you know what the result should be... time to work on debugging.
Clippy... AND Comic Sans, AND that fish theme/screensaver from Windows 98SE.
On the programs I built and then later rebuilt (I was fortunate enough to work in a place where I could build and maintain systems over a long period), I probably put a lot more time and research into improving those systems than the users ever had.
If it were left up to the users, they'd prefer the same thing with maybe a few new entries and features, because that's all they're accustomed to. Not that that's bad, but it's not a basis for good innovation.
As the developer, I knew what limitations I faced when I first developed it (and maintained it), as well as a list of "if I could rewrite it" pet peeves. I also knew intimately how the system worked, so I knew which processes could be expanded with new technologies, and I had learned better ways to solve problems I'd run into in the past.
So user input is important, but it should be tied in with expert developer knowledge. Marketing is also a good minor factor: the first early databases I built were called Foxbase, because that's the name the users saw when the DB engine started up. The later one was named WANDA, which gave it more of an identity, a personality, and I think that helped adoption. It was probably better than just calling it the "Web Database," which is what its name would likely have become without branding.
And now a comment from the entertainment industry...
I think we nerds need more face-to-face access to the rest of the world. All those "stranger danger" kids are now stranger danger adults.