This was the last key iPhone component that didn't have two sources, and the Journal estimates that Intel's revenues could now increase by up to $700 million before the end of 2016.
I don't want or need a "smart TV". I have a Sony from a few years ago that has Netflix and iPlayer and a few other things I've never wanted to explore. Likewise, I don't want to use Netflix, YouTube or even catch-up services via my cable box.
I do have a Chromecast and a Pi hooked up to my TV. These give my TV smart capabilities, just not built into the TV. I have them hooked into Home Assistant, and the Pi is linked to a Synology which automatically downloads new episodes, but the average person isn't really interested in the effort required and is happy enough to be able to use Netflix or YouTube on a smart TV - after all, why spend £30 on a Chromecast when the facility is already there?
It would be nice if all software was open source, but I don't think the business model always works. I think we have to recognise that free/libre open source software quickly becomes legally available via other sources, even if the original source charges for it.
Big organisations like Red Hat make a lot of money on support contracts with enterprise companies, even though the software is repackaged and made available free as in beer. So it works well for them.
For smaller companies or even one-man shops, the support-for-a-fee business model just doesn't work. I bought Reflector and a particularly useful browser tab recorder in the past six months for business purposes. Would they have been developed if they had been released under an open source license? Probably not, as the developers would not have seen any value from it.
I worked for an ISP when the blaster worm was a thing. We suspended internet access for customers whose computer(s) were infected and unpatched, by identifying the traffic that was coming from their pipe.
Now, you were saying something about your lack of understanding about network and technology?
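For what it's worth, the detection side of that is simple enough to sketch. Blaster propagated by scanning TCP port 135 (the Windows DCOM RPC port), so an ISP watching its flow logs could flag customer IPs making unusually many outbound connections to that port. This is a hypothetical illustration of the idea, not the actual tooling; `suspect_ips` and the threshold are invented:

```python
from collections import Counter

# Hypothetical sketch: Blaster scanned TCP port 135, so a customer IP
# making a large number of outbound connections to that port is a
# strong infection signal. The threshold is an arbitrary illustration.
BLASTER_PORT = 135
THRESHOLD = 100

def suspect_ips(flows):
    """flows: iterable of (src_ip, dst_port) tuples from flow logs.

    Returns the set of source IPs whose port-135 connection count
    meets or exceeds THRESHOLD.
    """
    hits = Counter(src for src, port in flows if port == BLASTER_PORT)
    return {ip for ip, n in hits.items() if n >= THRESHOLD}
```

A flagged IP would then have its access suspended pending patching, much as described above.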
The point is that we don't know what is on the 'real' iPhone - that's why they have recruited this team of experts. The people doing the work don't know what's on the iPhone either, so they will not know whether or not the iPhone they are working on at any given time is the real one or a decoy.
You can pick up a modern iPhone for under £20 per month in the UK. That's the price of two packets of cigarettes, or a meal for two at one of these burger restaurants that are all the rage at the moment.
I think it's definitely the case that your judgement is clouded by the phone you use (or indeed vice versa!). I am an iPhone user and feel negatively towards Android and neutral towards Windows Phone. We each have our opinions on this, and the world would be a boring place without diversity.
But I think this programme will attract the users who don't care about the platform they use. For a lot of people, all they know is that their new phone menu is a bit different or the SMS icon is a different colour. Apple is betting that a) this scheme will bring new users to the platform and b) that the customers will become loyal iPhone customers.
It will be a sponsorship deal to raise their profile. The people who will be travelling by helicopter to a film festival may well be potential influencers in the purchase of consumer or military products.
I look forward to the day when I can have a single micro USB cable (or whatever the future version might look like) on my desk and in my bag.
I have a device that still uses mini USB, several using micro USB, and proprietary Pebble, Fitbit and iPhone chargers. Manageable when I'm at my desk if a little messy; more difficult when I'm travelling and either take a whole bunch of cables or just the most important (usually one lightning and one micro USB).
No-one on a plane has ever questioned the small square bluetooth device clipped onto my collar, or the headphone cable coming from it.
I really hope they don't become mainstream.
I suspect you might well be surprised at the hardware used by people who use Slashdot. With a few exceptions, I think the majority of people here seem quite intelligent and logical but have something of an aversion to change and unnecessary innovation.
Speaking purely for myself, I built my computer in 2011 with an i5 2500K and maxed out the RAM. Then I spent £60 on a low-end graphics card, because why spend more when I have a console for gaming and a media player (a WDTV at the time) for watching downloaded shows and movies?
I might be in the minority, but I don't think it will be a tiny minority.
I found hardware QWERTY phone keyboards the most difficult to use; they introduced the most errors, probably because the keys are so small. I find the iOS software keyboard really simple to use, especially with autocorrect. I can usually walk and type with my thumbs (looking ahead rather than at the phone) and most of the time it comes out correct.
But nothing will beat the original 0-9 and T9 for ease of use and accuracy. I don't really remember which I liked best - I used 0-9 (multi-tap) for years, then one day I got a new phone, made a conscious effort to learn T9, and after that I was hooked.
The market for mobile devices which might run ARM is far bigger than the market for laptops and desktops.
I'm in the UK and have an iPhone purchased here and a UK-based account. The app appeared on my phone along with the rest of iOS 9.
However, as time goes by, people use more and more data on their mobile devices, all of which requires additional investment or incentives for customers to lower their data usage.
A few years ago, average data usage on one of the UK's mobile networks was c. 2GB per month. This year, average data usage on the same network is c. 4GB per month. I have an unlimited (genuinely unlimited) plan with the same network and my average monthly usage has gone from 2GB to 15GB in the same period of time.
So, in spite of prices going up, we are probably actually seeing a decrease in the cost per unit used.
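To make the cost-per-unit point concrete, here is a toy calculation. The usage figures (2GB doubling to 4GB) come from the comment above; the prices are invented purely for illustration:

```python
# Toy arithmetic only: the GB figures match the comment above,
# but both prices are assumed for the sake of the example.
old_price, old_usage = 15.0, 2.0   # £/month (assumed), GB/month
new_price, new_usage = 20.0, 4.0   # price up ~33%, usage doubled

old_cost_per_gb = old_price / old_usage   # £7.50 per GB
new_cost_per_gb = new_price / new_usage   # £5.00 per GB
print(old_cost_per_gb, new_cost_per_gb)
```

Even with the headline price rising by a third, doubling the usage drops the effective cost per GB by a third.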
Feel disillusioned? I've got some great new illusions, right here!