It looks like they have taken an interesting step following that philosophy by enabling functional expansion through interchangeable backs.
Sailfish also has a pretty slick interface. I will hold off on judgement until I get a chance to use it for a while.
If a user-centric design philosophy (including openness/freedom) doesn't really matter to you, and you don't care about the user interface, then yes, it's just another phone. But then again, any modern cell phone is essentially Turing-complete, and you can build/connect accessories and power supplies around them. So at a high enough level of abstraction, no modern phone is distinguishable from any other, nor should we expect one to be any time soon.
To me, it looks as though Lenovo has been trying to make the Thinkpad line more profitable by slowly adopting common components/methodologies from other lines. They could have assimilated the Thinkpad designs to make the Lenovo-branded products better, but that would have increased production costs, even if only slightly.
It's evident even in the firmware, where each generation moves more functionality out of hardware (think of things like the independent volume control/amplifier) into software that makes it look like it still functions like a Thinkpad.
It was also completely idiotic to follow Dell on the tablet design. The latchless lid on the Latitude XT was stupid, and for whatever reason, Lenovo decided to follow suit. The x220 and x230 tablets are clearly yet another step down (I've had my nose in most of the tablet models since the x41). Don't get me wrong, the speed and low-power capabilities (I frequently run at 6W when I want the battery life) are superior, but I think that's more a statement of the general trends in laptops (thank Intel for the CPU/chipset). The build quality and the bells and whistles under the hood, the things that differentiated the line, are trending toward the mean.
The software is Windows-only, which is a major (but not killer) downside for me. I've only played with one for a couple of hours, but it was enough to make me want to try them out for a few things around the house, when I find the time.
I highly doubt that's the case. I suspect the defect/variation distributions of few if any generations of Intel chips have actually matched the distribution of market demand. Going back at least to the 386, Intel has artificially crippled higher-end models (beyond what was necessary from defects) to provide different price/feature/performance points for consumers. The SX line was just DX chips with the internal floating-point unit disabled.
We might feel a little less cheated if Intel actually designed and fabricated different products, so that we got exactly what we paid for rather than an artificially crippled CPU. Realistically, though, the production cost of the extra silicon is far less than the cost of designing different chips. And yes, some chips will naturally fall into the lower-end models because of defects and variations. So whether or not we feel cheated, they are actually delivering better value to the customer by artificially differentiating the models. People are accustomed to market segmentation, airfares being one major example; I suppose it just feels a little different knowing that your black box is really a first-class seat with extra blocks added to squish you because you only wanted to pay for economy class.
As to why the segmented market is reasonable: the fact of the matter is that people have different values and needs, and they want a price tag that matches those needs. Intel could charge a uniform price for all CPUs, but then they would have to choose between alienating a huge number of customers by setting too high a price, or drastically reducing their profits. Say what you will about corporate greed, and even Intel's stagnation; they do reinvest huge amounts of their profits into building new fabs and other aspects of producing subsequent generations. Market segmentation enables them to put more of the cost burden on the customers who have more money to play with and really care about getting the performance now. Really, it most benefits the consumers who will feel cheated (I haven't heard people complaining about the higher prices of the higher-end chips). If chip makers were acting with more bad faith (just look at the telecom and cable industry), then this would be more upsetting (and less about consumer value).
As for the lying issue: I don't think this has been much of a secret for the last 25 years. It's probably just that more people are becoming aware of it now, people who are less familiar with the literature and the issues. Also, I think Intel has been fairly stupid about marketing. The post-sale "upgrades" drew people's attention to issues that most people just don't want to know about, and probably didn't really appeal to all that many consumers. It's also a little sickening to see them put effort into developing a "secure" system that lets them sell hardware upgrades. Something like that might work better for consumer relations as a trade-up program, even if it would cost consumers more to get the same upgrade (assuming the same profit for Intel).
Different domains have different ways of communicating and different standards of acceptable behavior. We wouldn't expect the same tone from a kindergarten teacher, a librarian, and a sailor on a navy ship in the middle of the Atlantic. So sure, if that's your environment and it pisses you off, go ahead and rag on Linus for a while; he can take it. If not, try imagining the librarian wandering around your office telling you to keep your voice down, or some other absurd situation that would drive you crazy but might not offend you so directly.
Yes, the kernel mailing lists technically constitute a public forum, but to an extent that says more about the public listening in on his domain than about him shooting messages out to the world. He put the kernel out for all of us to use and enjoy. He didn't force himself upon the world (like some other OS developers).
Linus has a reputation for being harsh, but how often does he go off on a rant? How often does he rag on a clueless passerby who shoots off a silly patch to the kernel? Mauro is not a fresh kernel-developer wannabe; he's been working on the V4L stuff for years. Presumably, he understands that geek communication is not as slick and polished as corporate or political discourse, where saying things politely seems to be more important than actually conveying any useful information.
Some comments mentioned Dale Carnegie and other stuff about sweet-talking people into giving you what you want. I don't think Linus really has anything to prove. People respect him for his work. I'd love to see people protest in the streets and reject the Linux kernel because they think Linus is an ass. Try switching to Windows, Mac, hey, maybe even Hurd. Hmmmm, all projects led by even bigger assholes. Never mind...
Finally, what's really in our best interest? Would you prefer that Linus bottle up his frustration long enough to compose himself so he can be more polite? Maybe he should go sit on the beach every time he feels like being rude, until he calms down. Personally, I think he's more useful to me bruising a few feelings here and there and getting back to work.
On a separate note, I don't think Mauro was right in his response. Pulse and KDE blowing up like that because the kernel returned a different failure than expected is a clear failure, particularly for things that try to position themselves as core facilities, even if in userspace. That said, this did come in response to a patch in an RC. If someone catches a change that will cause immediate problems, whether they are wrong or not, and as long as the change wasn't a critical fix for an even worse problem, this is the time to revert first and commence discussion afterwards. Teasing an outsider for buggy userspace code, or for an audio server interacting with video systems (which is actually not too uncommon; I don't know why he went off on that), was more inappropriate than Linus' blowup.
A fair number of people have worked on the infrastructure and building the community and development methodologies. If anything, his reliance on other people to take care of their domains is probably part of why he went off like that.
For workstations, I mostly use Asus boards. They tend to have more bells and whistles and have also worked really well for me. I have used Intel boards and can confirm they are also great, but they generally fall behind Asus and Supermicro on one end or the other (particularly considering their prices).
Except for laptops and smaller devices, I generally only buy boards that support ECC, which probably weeds out most of the crappy stuff on the market.
As for EFI, I haven't used it on a server or workstation yet, but I have used it on a laptop with an older BIOS that doesn't have the secure-boot crap, or at least doesn't have it enabled. From what I've seen, it's an ugly mess. Some cool ideas, but really lacking solid, user-friendly tools. If you do a fresh install, it might not be too bad; I've only played with converting legacy BIOS installations to UEFI, with and without GPT. For the most part, you can't configure/install the key components for EFI booting on a running system that was booted in legacy mode. The machines I've played with only support net booting with the legacy BIOS (I typically net boot for installs and repairs).
Converting a Linux install without rewriting the entire disk is actually not too difficult, if you do it just right. However, don't try it unless you are comfortable with losing the data on the disk. Windows seems to be a lot more finicky about EFI. For one, it will only boot from GPT partition tables (my BIOS and the Linux kernel don't seem to mind using either GPT or legacy tables). Can't say I care enough about Windows to have put in the effort to get the conversion to work.
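Since whether the EFI tooling works at all depends on how the current session was booted, the first thing worth checking before attempting a conversion is the boot mode. A minimal sketch (the kernel exposes /sys/firmware/efi only when booted via UEFI; the conversion outline in the comments reflects my own notes, and the device name is a placeholder):

```shell
# Report whether the running system was booted via UEFI or legacy BIOS.
# The kernel creates /sys/firmware/efi only on an EFI boot, and
# "grub-install --target=x86_64-efi" generally needs an EFI boot
# (with efivars available) to register the firmware boot entry.
boot_mode() {
    if [ -d /sys/firmware/efi ]; then
        echo "UEFI"
    else
        echo "BIOS"
    fi
}

boot_mode

# Rough conversion outline (destructive if done wrong; back up first):
#   1. gdisk /dev/sdX    # can convert an MBR table to GPT in place
#   2. carve out a small EFI System Partition (gdisk type EF00),
#      format it FAT32
#   3. mount it at /boot/efi and run:
#        grub-install --target=x86_64-efi --efi-directory=/boot/efi
```

Here /dev/sdX and the /boot/efi mount point are placeholders; note that gdisk's in-place MBR-to-GPT conversion only works if there is slack space at the end of the disk for the backup GPT.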
Anyway, EFI is still surmountable, but life is easier if you avoid it and get a friendly board that still supports legacy booting.
Researchers spent several months collecting samples of groundwater and stream water to determine which source removed more mineral material. They also used surface-water estimates from the U.S. Geological Survey to calculate the quantity of mass that vanishes from the island each year. The researchers point out that Oahu is actually rising in elevation at a slow but steady rate due to plate tectonics.
“The Big Island is so large that it actually depresses the ocean crust, kind of like a dimple on a golf ball,” BYU geologist Steve Nelson told The Science Recorder via email. “Oahu is close enough to the Big Island such that as the plate drifts to the northwest, Oahu is moving up and out of the side of the dimple. Kauai is far enough away (and older) that it has moved out of the dimple. The estimate of the duration of rising is based on the time it will take for Oahu to drift to where Kauai is now.”
Mr. Nelson and colleagues believe that Oahu will continue to grow for as long as 1.5 million years. Beyond that, the force of groundwater will eventually win, and Oahu will begin its transformation into a flat, low-lying island like Midway. Researchers are confident that it will be a very long time before Oahu begins its descent.
Link to Original Source
For the first time ever, three pharmaceutical companies are poised to test whether new drugs can work against a wide range of cancers independently of where they originated — breast, prostate, liver, lung. The drugs go after an aberration involving a cancer gene fundamental to tumor growth. Many scientists see this as the beginning of a new genetic age in cancer research.
Great uncertainties remain, but such drugs could mean new treatments for rare, neglected cancers, as well as common ones. Merck, Roche and Sanofi are racing to develop their own versions of a drug they hope will restore a mechanism that normally makes badly damaged cells self-destruct and could potentially be used against half of all cancers.
No pharmaceutical company has ever conducted a major clinical trial of a drug in patients who have many different kinds of cancer, researchers and federal regulators say. “This is a taste of the future in cancer drug development,” said Dr. Otis Webb Brawley, the chief medical and scientific officer of the American Cancer Society. “I expect the organ from which the cancer came from will be less important in the future and the molecular target more important,” he added.
Link to Original Source