A way in which Donald Trump can be useful to all of us! Let's hook him up and see if his hot air can make things go.
Not true: if you call Siri a useless bitch, she will make some sort of comment. Of course, only if she understands you, which is 99% of the reason I call her a useless bitch.
So much telemarketing is just a scam these days.
Most of them seem to be trying to get me to donate to a political campaign or charity which, after further research, turns out not to exist.
You could replace the fingerprint sensor with something that could provide arbitrary fingerprints, possibly based on a collection you have made of them, then use your collection to buy stuff. This requires no memory in the sensor at all, and it's much faster than creating molds of fingerprints and applying them to the sensor. I can see Apple not wanting to tolerate the replacement of anything tied into your CC #.
Replacing a battery seems less defensible to me, if that aspect of the story is true. It's hard to argue the battery is tied into any trust chain.
Well, I still submit that my Antarctic-born, Hogwarts-attending, international, two-spirit-gender Facebook avatar has sabotaged someone's productivity and finances. At the very least a past employer, on whose dime I created it, and whatever Big Brother-ish agencies are slurping Facebook and have the unholy job of fumigating the nonsense. Most of my uploaded photos are of cats I do not own, internet memes I didn't create, and lately every derp-face Trump picture ever taken. The latter alone could fill a sizable storage array.
Still, I can do what passes for interaction with my family in a synthetic and non-time-consuming way, which is all Facebook has ever been good for anyway.
Looks like a classic case of mafia tactics.
It's not really that simple, except for reasonably large, well-studied components. But if you are doing the design of, say, a motherboard or the main board of a cell phone, you are essentially constructing a new thing, based on components that themselves may or may not be well understood even in their own environments. Processors are a crapshoot; many of them (including our favorites) don't have a published MTTF at all, or any reliability data, period. In fact, quite a lot of smaller ICs are like that too. In a mature organization we do study the lifetime curves of the components (in some fashion or another), and there are standards of acceptability based on the market, but that is definitely not a good assumption to make about most consumer electronics (for example). A lot of those are made in shady, fly-by-night environments.
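For what it's worth, when reliability data does exist, the textbook way to combine it is a series model: assume constant failure rates (the exponential model) and sum them, so one missing component figure (say, the CPU's) sinks the whole estimate. A minimal sketch, with entirely invented FIT numbers:

```python
# Series-system MTTF estimate under constant failure rates.
# All FIT values below are invented for illustration, not real data.
failure_rates_fit = {            # FIT = failures per 1e9 device-hours
    "cpu": 50.0,                 # often not published at all, as noted above
    "dram": 30.0,
    "pmic": 20.0,
    "mlcc_bank": 5.0 * 40,       # 40 decoupling caps at 5 FIT each
}

# In a series system the failure rates simply add.
lambda_total = sum(failure_rates_fit.values()) / 1e9   # failures per hour
mttf_hours = 1.0 / lambda_total
mttf_years = mttf_hours / (24 * 365)
print(f"system failure rate: {lambda_total:.2e}/h, MTTF: {mttf_years:.0f} years")
```

If the CPU's rate is simply unknown, there is no honest number to put in that dictionary, which is the point above: the math is easy, the data is what's missing.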
The whole topic, in the context of consumer electronics, is kind of dumb. Nobody designs things to fail in a given window; that's hard to do even if you have reliable statistical models. You design not to fail in a given window, and inevitably, outside of that window, something eventually goes wrong somewhere. In reality you are often up against some sticky design choices (quality, reliability, cost: pick one).

My favorite is selecting decoupling capacitors for big digital ICs like CPUs. Failure to have adequate decoupling will result in random and unpredictable failures, yuck. Proper decoupling is frequently physically impossible; some people who make chip packages don't think this through very well and don't simulate. Yay. But the designer does the best he can, trying to find the smallest parts to get into all the nooks and crannies, with the least inductance he can introduce. In choosing that small package he has chosen quality over reliability and cost: the smaller package will have a lower voltage rating, and thus the MTTF will be lower (often very much lower in practice). You often add cost in choosing those components too, because they require SMT lines that support small parts, the smaller footprints have higher manufacturing fallout (tombstoning, bridging, etc.), and sometimes they just cost more because only one guy sells them.

No one will ship if the derating curves are too bad, but at some point we say "a life of 3 years is good enough," and that's that. In reality, decoupling in many environments is black magic: no one has the technical data to know how much is enough, so we massively overdesign it, and even as components fail, nobody ever notices!
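To make that package trade-off concrete: one commonly cited empirical rule of thumb scales capacitor life with the voltage-derating ratio raised to a power, and roughly doubles it per 10 °C of temperature margin. A hedged sketch, where the exponent, base life, and temperatures are all assumptions for illustration rather than any vendor's data:

```python
# Rough empirical derating model: life ~ (V_rated/V_applied)^n * 2^((T_ref-T_op)/10).
# n, base_life_h, and the temperatures are illustrative assumptions, not vendor data.
def estimated_life_hours(base_life_h, v_rated, v_applied, t_ref_c, t_op_c, n=3.0):
    """Rough life estimate; real vendors publish their own derating curves."""
    voltage_factor = (v_rated / v_applied) ** n
    thermal_factor = 2.0 ** ((t_ref_c - t_op_c) / 10.0)
    return base_life_h * voltage_factor * thermal_factor

# Same 3.3 V rail, two package choices: the smaller part carries the
# lower voltage rating, so its projected life is dramatically shorter.
big_pkg   = estimated_life_hours(1000, v_rated=16.0, v_applied=3.3, t_ref_c=85, t_op_c=55)
small_pkg = estimated_life_hours(1000, v_rated=6.3,  v_applied=3.3, t_ref_c=85, t_op_c=55)
print(f"16 V part: {big_pkg:.0f} h, 6.3 V part: {small_pkg:.0f} h")
```

Under this model the two estimates differ by (16/6.3)^3, roughly 16x, purely from the voltage rating; that is the "smaller package, lower MTTF" choice in numbers.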
Then there's manufacturing variability. Your design may be absolutely correct on paper; it may even have met your DFM criteria for your factory. But there is a non-zero probability of failure in the fab and assembly of every part of the design. Things happen. I mentioned surface-mount part tombstoning (the part literally standing up at 90 degrees to the PCB, like a tombstone), but that's just one of many things. Not all of these produce a hard failure immediately; many of them make it through whatever physical and functional tests you apply to a device after it is manufactured. But they fail early, because the circuit as designed by the engineer, as hopefully studied for standard component failure, is now outside of its design spec and is going to fail early. Or possibly someone mishandled a component and induced a latent ESD event, reducing the device's lifetime. So all that work above, designed to make sure your design works "just long enough," gets horribly ruined when it is physically assembled.
In reality, yes, we are making lifetime choices based on the market, but not in any devious technical way. Given the low costs the market demands of consumer goods, and the fast design cycles, a number of less-than-optimal choices get made that impact the final product. There is no way to predict what is going to fail first; all we can do is look at failures as they come in and identify where the weaknesses must have been (and even that is usually only done for the first 90 days, or maybe one year). But since products change so significantly from one generation to the next, it is difficult even to use the historical data.
If we get to a point where technology in a given area has mostly stagnated, we might see an improvement in reliability such that we could engineer it more deliberately. But as things stand right now, as a designer I would never intentionally design something to fail earlier... it would be inviting disaster.
We did, he chopped it up and glued it to his head.
If they allow this, expect cable rates to go up $10/month.
Because we need more of an excuse to cut the cord.
I would never deny that SOME people might benefit from the extra 5-10% of performance (let's even call it 20%); but you're only talking a couple hundred thousand machines WORLDWIDE. For a company the size of Apple, the numbers just don't add up, given the extra R&D (both hardware and software), extra testing/support, and extra supply and distribution logistics.
Explaining the business concerns to me does not change my opinion about the product I need; that is not my problem. I also think you grossly underestimate the performance delta and the number of users who would benefit from a high-quality, well-tested, and artistically sound performance machine. What is the only real reason to run Windows at home? Games. You can do virtually everything else on OS X, Linux, or just via some web app, usually better. Windows has one place in the market: the office, not so much on the merits of its technology as because of the infrastructure built around it (both software and IT). But at home? Games; otherwise it's a barnacle. But a lot of people have one.
I would argue that power & thermals are the main issue: an iMac with these parts would be thick and noisy. Enter the Mac Pro, which has solved these things nicely. The Mac Pro targets a vanishingly small segment of the market, but Apple has stood behind it (and it's a very nice machine for what it is; as a system designer, I find it very impressive technologically considering what the competition is doing). The irony is that its form factor seems less than ideal for business, where you want to stack things, and much more suited to the home office or desktop, where you may have only one computer. If only it had a single-processor option with a more suitable video card; by not using Xeon, the price would be lower (enticing more people to buy a turn-key solution).

Next, note this thing about gamers: they want the latest and greatest, and from a business case that is a feature. Since the Mac Pro isn't a standard form factor (and that's OK; the standard form factor is not great from most quantitative perspectives, never mind the aesthetics), you are in a position to sell them the latest and greatest CPU and video card on a yearly basis, without necessarily having to do a yearly upgrade cycle of the whole system. I believe Apple could solve this neatly, way better than what we currently do with aftermarket cards and manually mounting sensitive electronics. It's just a matter of wanting to serve the market, and understanding how many people have more money than time (over 30, with kids, and a gamer, I imagine) and how much extra you can charge for a turn-key system.
Also, about the trend toward video articles... I guess it's the 21st century and some trends can't be stopped, but remind those involved that most of us read Slashdot, and the articles. Reading is silent (if done properly), and suitable for the workplace, where a bunch of us tend to do our Slashdotting. Also, let's be honest, most nerds aren't that photogenic; if I wanted to look at grumpy-looking ugly men I could find a mirror, or alight from my swivel chair and walk through cube land.
For desktop use, Linux is second best. The Linux desktop has become significantly worse in the past few years since Canonical mostly abandoned it; I remain optimistic against all odds that it will improve, but OS X is much cleaner and more responsive. It manages to combine the best parts of Unix with the best parts of a modern UI. There are things I'd change, but compared to Linux (where even on a fast machine X responds slowly and with high latency), OS X wins; I use both and prefer OS X.
For server use, for multi-user setups, or for a workload that is largely headless, there's no question that Linux is best.
If only Cmdr Taco were here, we could have him add "Ghostface Killah" to the lameness filter for news posts.
I love my Macs, but Apple does not sell anything that represents a performance machine, and never has. In fact, that is why some of us learned to hate them in the '90s: we could put together a much faster machine for less than their not-so-fast machines, built for users we, frankly, disrespected. Now I'm older and have a life, and I am sensitive to the argument that I want to use the machine, not constantly tinker with it; and although I have designed computers literally from the copper traces up, I respect the investment Apple makes in building a very high-quality machine that lasts and requires very little TLC. They are the best machines out there for casual use.
But I still wish they made one with a high-end processor and a high-end GPU (hint: AMD does not make any, but then neither does Intel). I don't want to hear about "not needing the performance"; that is a horrible answer on many different levels, and in point of fact, it is wrong for some of us.
So, Apple having forsaken us, we're forced to use the next-best OS (Linux), cope with whatever drivers the gods of Proprietary Hell (right above Special Hell) deign to give us, and frequently bitch and moan about their idiocy. I don't especially care about Intel graphics myself (I always replace it), but I can understand the attraction some might see, given the wider array of form factors some of these low-end machines can come in, where Intel graphics is a key feature.
If only the stock market were concerned with the profitability of a company, rather than the belief that one should grow to be so large that it dominates the world.
The person who's taking you to lunch has no intention of paying.