It's possible to thin wafers right down from the backside after initial manufacture. This is already done for the ICs used in smartcards - apparently the processed wafers are surprisingly flexible.
You just stick a man in it, and if he makes it, it's man rated.
Were you involved in the space shuttle program?
The original N95 firmware was basically unusable though.
It took about a year for Nokia to release a version that supported demand paging and actually made the phone usable.
For Samsung the cost of producing an update is non-zero.
It's not as though they can just download the latest version of Android, compile and sign it, then make it available.
For a start there are the hardware drivers - if the underlying kernel has changed then these may need to be updated and/or QAed again. If they're using a custom skin then that will probably need to be updated as well.
If they're making the update officially available then they also need to be able to support it - from initial QA to documenting the new software for technical support personnel.
Of course Samsung should provide some updates; however, you can't pretend that it costs them nothing to do so. Also, the phone you bought 12 months ago still does everything it did when you bought it - bug fixes are one thing, but why should they continue to add new features for free when they've stopped selling that model?
Surely what's really needed is a direct monetary incentive for Samsung (or another manufacturer) to provide updated firmware.
How would people feel about paying $10 to get the latest version of Android with some new features? It worked for Apple with the iPod touch, although they also released a version with just bugfixes, available for free.
I think the key innovation (from a consumer point of view) was the laser diode. Whilst some early laser disc players used gas lasers, it was the laser diode that enabled the CD player and all the other consumer electronics applications you describe.
Yes, however Google makes its money by selling advertising alongside search results. As part of the scraping process, Scroogle removed these adverts. Therefore Scroogle cost Google bandwidth without giving anything back.
Personally I use Adblock, but sites are free to block people using that if they like. Which is just what Google has done in this case.
...by using a different search engine.
Oh wait - you weren't generating any revenue for them and were actually costing them bandwidth.
That will really show them!
This article from the BBC talks about doing something similar to make rain rather than just clouds.
This all rather presumes that Apple simply wants to sell as many iPhones (or iPhoneOS devices) as possible.
Apple wants to be No. 1 in the top 50% or even 25% of the market. That's where the profit margins are.
You can now buy phones running Android for £100. The hardware sucks. The margins must be pretty thin.
That isn't a game Apple wants to be in.
The key to Apple's success is selling aspirational products. Sure their hardware is more expensive, but it also *feels* more expensive.
I'm not sure why people think it's in Apple's interest to discourage the adoption of Theora (apart from the fact that h.264 is meant to be a better codec technically anyway).
Yes, it would be great to have a freely implementable video codec standard in all web browsers - but with the current patent system (in the US anyway) that just isn't going to happen. Add to that the fact that h.264 decoding hardware is available in more and more SoCs and graphics cards (so, relatively speaking, Theora performance would be even worse - requiring higher bitrates and more power for equal quality) and this battle is already lost.
Active matrix OLED displays are actually really hard to manufacture compared to TFT LCDs.
A major issue comes from the fact that the TFT backplane has to supply an appreciable current to each pixel, rather than just a voltage as in LCDs. This means you can't get away with using amorphous silicon; you have to make the backplane out of polycrystalline silicon, which makes the whole production process a lot more complicated and also limits the size of panel that you can make.
Also, you generally want to run the OLED elements in constant current mode, so you end up needing a current source circuit in each pixel. This increases the number of transistors you need per pixel from 1 or 2 in TFT LCDs to between 2 and 6 with OLED. And if any of them has a fault then you've got a dead pixel.
Indeed. One thing people are doing now is building optical feedback into every pixel, so as the OLED material ages more current is pushed through it to keep the brightness the same.
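The aging-compensation idea can be sketched with a toy feedback loop. This is purely illustrative - the numbers, the `compensate` function, and the linear brightness model are all assumptions, not how any real OLED driver circuit works:

```python
# Toy model (assumption): pixel brightness = efficiency * current.
# As the OLED material ages, efficiency drops; the optical feedback loop
# raises the drive current so the measured brightness stays at the target.

def compensate(target_brightness, efficiency, current, gain=0.5):
    """One step of a proportional feedback loop on the pixel drive current."""
    measured = efficiency * current               # simulated photodiode reading
    error = target_brightness - measured
    return current + gain * error / efficiency    # nudge current toward target

current = 1.0
for efficiency in (1.0, 0.9, 0.8):                # material ages, efficiency falls
    for _ in range(50):                           # loop settles over many frames
        current = compensate(100.0, efficiency, current)
    # after settling, efficiency * current is back at the target brightness
```

Each time efficiency drops, the loop converges on a higher drive current that restores the original brightness - which is exactly why aged panels end up pushing more current through each pixel.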
What about regression testing?
It'd be quite possible to run a check and throw a warning if a change affects greater than a certain percentage of domains. Or you could check against a sample of domains that really aren't going to change (I'm thinking mcdonalds.se, ibm.se, etc.).
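A minimal sketch of that check, assuming you have the old and new record sets as dicts (the `check_change` function, the 1% threshold, and the `KNOWN_STABLE` list are hypothetical, not part of any real registry tooling):

```python
# Hypothetical regression check to run before deploying a zone change.
KNOWN_STABLE = ["mcdonalds.se", "ibm.se"]  # sample domains that shouldn't change

def check_change(old_records, new_records, threshold=0.01):
    """Raise if the change affects more than `threshold` of all domains,
    or if any of the known-stable sanity-check domains resolve differently."""
    changed = [d for d in old_records if old_records[d] != new_records.get(d)]
    if len(changed) / len(old_records) > threshold:
        raise RuntimeError(
            f"change affects {len(changed)} domains - review before deploy")
    for d in KNOWN_STABLE:
        if old_records.get(d) != new_records.get(d):
            raise RuntimeError(f"sanity check failed: {d} changed")
```

A change that touches a handful of domains passes silently; one that flips a large fraction of the zone (or any of the sanity-check domains) gets flagged for human review instead of going live.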