Interesting idea, but the hardware spec for that device is so lacking in basic facilities that it will probably be a non-starter for a lot of people.
My cellphone works while the power is out too.
Sure, as long as the batteries last and you have useful reception in your current location (and the base station isn't affected by the outage). These are relevant concerns with a cell phone, while they matter little with a traditional land line.
You act as if smartphones somehow don't do their jobs, or that they're all massively unstable which is total bullshit.
That's a matter of opinion. Do they crash every five minutes? Of course not. Do they crash often enough to be annoying and potentially dangerous? Yes, every major mobile OS platform has had this problem at various points in recent years. Given this is a device you might need to call an ambulance one day, none of the major platforms has a great record on stability.
As for doing their jobs, there have been a few antennagate-style stories over the years, where some fundamental design flaw has undermined the basic functionality of the device as a phone. Lately it seems popular to make thinner smartphones with larger screens that then bend or break in your pocket.
Modern smartphones seem to be about on par with PVRs and so-called Smart TVs. They do their job up to a point, and they do offer some advantages over the devices we used before. On the other hand, they are also trying to do too many different things to do any of them really well, they often try to be a bit too clever about how they do them too, and at some point these things affect the reliability of the system and/or raise security and privacy concerns.
I often have a feature phone in my pocket and a tablet in my case/bag, and I have yet to find anything I want to do while I'm out and about where a typical modern smartphone would be better at it than one or other of the devices I actually use. YMMV, but I'd be genuinely interested to hear of any common tasks that a modern smartphone really is better at than other widely used but more specialised devices, because I can't think of any myself.
You laugh, but old school rotary phones could still call for emergency help if the power went out, they didn't hang, they didn't get viruses, they didn't get firmware "upgrades" that stopped them from working properly or at all, they didn't run out of their own batteries in the middle of a long call...
For once, I'm 100% in agreement with Khyber. Smartphones in a world with modern laptops, tablets, headsets and feature phones just look like a mediocre compromise to me. About the only thing they seem to be better at than any of the numerous other devices available is letting someone check Facebook every 10 seconds without actually having to take anything out of a pocket. At least until someone updates something remotely for them and breaks that functionality, anyway...
It sounds like you're a little older than me but we both see this much the same way.
I have as much interest in useful or interesting new technologies today as I had when I was 21. I'm also significantly quicker at getting up to speed with them and more aware of things like pros and cons and the importance of choosing the right tool for the job than I used to be at that age.
However, if you asked me right now, I'm quite sure that I couldn't crank out a new TodoMVC example in this week's front-end JS framework as fast as a 21-year-old who just learned it can. Since not a lot of people solve real problems or make real money writing toy to-do apps, I don't find this situation too threatening.
The thing is, I've long since stopped being impressed by this week's front-end JS framework, this week's UI trends and visual design language, and this week's new programming language that looks and feels like C or JS with a thin coat of paint over it. I could get up to speed with them to the point where I too could write to-do apps in half an hour, but to me that's like deciding to learn some new GUI toolkit just to write Tetris or learning some new database API just to write a PIM or whatever we're calling them these days. As you say, these kinds of tools are so ephemeral now that they tend to be very trendy and generate a lot of hype, but they are often popular more because of some big sponsoring organisation than any particular innovation or technical merit.
To me, about the only thing more dull is evangelists for a specific browser (why?!) telling us all about these great new features it has for writing large-scale applications... when the biggest web apps out there still tend to be orders of magnitude smaller than stuff many of us "old programmers" were working on in the last millennium, at which time some of those features actually were quite innovative.
Next week, all these elite young programmers, who are leaving people like you and me and our meaningless track records of building actual working and revenue-generating projects in their wake, will probably notice that MV* is not the only possible UI architecture, that building an application that has to run for years around a framework that has a shelf life measured in months might not be such a great idea, and that JS is actually a very bad and very slow language that just becomes not quite so bad with the ES6 changes and only moderately slow with modern JIT compiling engines.
Just don't tell them that the entire web apps industry probably represents closer to 5% of the programming world than 95% and some of these state-of-the-art ideas are actually 50 years old. Such talk is the stuff of nightmares, and they aren't old enough to hear that kind of horror story yet.
Older people seem to be more resistant to going along with the flow of technology...
You might consider that there are at least two plausible explanations for this.
1. Older people can't or can't be bothered to keep up.
2. Older people can keep up just fine, but actively choose not to use certain new technologies or to avoid them for certain types of projects because in their judgement those new technologies aren't the best option for what they need to achieve on those projects.
There are plenty of both types of older developer around in the software development industry. Obviously one type tends to get more useful work done. Unfortunately but inevitably, inexperienced developers frequently mistake one for the other. Knowledge and wisdom are not the same thing.
I'd say Google's median age of 29 sounds about right. Obviously exceptions exist, but given that wages tend to be rather logarithmic relative to experience, they're not that big a driver for hiring younger.
That's partly because by somewhere in their 30s, a lot of the good programmers aren't working for someone else on salary any more. They're working freelance and picking their gigs, or they've founded their own business(es), or they've specialised and now do contract work with a combination of programming and industry-specific knowledge and skills.
In each case, they are probably earning at rates much higher than almost any salaried employee at almost any employer. Notice that in all of these scenarios the rates you can charge are based on real value generated, which doesn't have a glass ceiling the way wages usually do.
Good programmers who are still working for someone else as a full-time software developer at 40 probably have their own reasons for choosing that career path. Those reasons will often mean they aren't particularly looking to move either, and if they are, they're not going to do it by sending out numerous CVs to different employers the way a new grad does.
Young people are more energetic, more eager to learn, and more likely to know the things you'd need to help Google, like modern programming languages.
Who do you think is creating those modern programming languages, kid?
Most of the new grads we hire at my company turn out really well. Most of the old people we hire either can't actually write any code, or they can only write code (but only in their preferred language) and can't be bothered to learn or follow prescribed design patterns or coding standards.
Have you considered applying Occam's razor here? Maybe your hiring process sucks. Maybe the compensation and conditions you're offering simply aren't good enough to attract older developers who are any good. Are these theories more or less likely than entire generations of developers who presumably once had that enthusiasm and aptitude you seem to see in new grads mysteriously becoming incompetent and unmotivated a decade or three later?
You continue to make the same assumption but apparently still without any hard data to support it.
You asked how sites are able to choose not to use Google. I gave you several significant alternative sources of traffic, any one of which might generate more traffic for some sites than search engines.
Whether or not you choose to believe that some sites do in fact generate most of their traffic in those other ways and would continue to do so if Google disappeared tomorrow is obviously up to you. However, whatever assumptions you choose to make won't change the real situation for those sites or make them any more reliant on Google's preferences for their effectiveness.
What sites get most of their traffic from a different search engine?
You implicitly assume that sites get most of their traffic from any search engine. Plenty of sites don't. Sites get traffic from paid advertising (on ad-supported sites, social networks, physical media, and so on). Sites get traffic because people already know what they need (public services with widely known addresses, for example). Intranet sites obviously don't rely on public search engines. And of course there's old-fashioned word of mouth advertising, and its new high-tech counterparts like hyperlinks on related sites and social media.
Of the commercial projects I currently work on -- and there are several, because I do freelance/consultancy work -- I don't think any gets the majority of its visitors from search engines, and in some cases if Google disappeared tomorrow you'd hardly notice on the bottom line.
A search engine is about Content not Presentation.
Your search engine might be. Apparently the most successful search engine in the world thinks its users want well-presented content more than content whose presentation is poor or inappropriate. And they're probably right.
I'd be the first to agree that Google shouldn't get to dictate how the Web works and that sometimes Google, or at least some of its employees, appear to be extremely arrogant in assuming they are every webmaster's #1 priority. The reality is that if you're running a site that doesn't depend primarily on Google for traffic, you can and should implement whatever works best for you and your visitors, regardless of what Google wants or says.
However, if you're relying on Google's service for most/all of your visitors to find your site at all, you have to play by their rules if you want the best treatment from them. This is the basic principle of SEO, and it's as old as search engines themselves.
A copy of what, exactly? No-one has a physical copy at that stage, so there's nothing to borrow.
If they have an Apple device that they can use to watch it.
I know, but charging me X for the box set just after it finished wouldn't cost them anything compared to charging me X nearly a year later. In fact, it would benefit them a little in terms of cash flow and probably very slightly due to inflation. And obviously it would benefit them compared to me being fed up with the spoilers and consequently not bothering to buy the next season on disc at all. I enjoy the show, but I enjoy plenty of other shows too, and I could just as easily spend similar money on 20+ episodes of one of them instead of 10 episodes of GoT next time I'm on Amazon.
So what you're saying is that there *are* legal ways for you to get the show earlier and avoid being spoiled?
Reportedly, but as far as I know I don't have any way to use any of them without spending many times the cost of the box set just on one kind of equipment or another and then another significant multiple of the box set cost on the subscription/streaming/whatever for the show itself. So as long as I don't mind a 1000-2000% mark-up, sure, I can probably avoid being spoiled (unless you count the other inferior aspects I mentioned as spoiling the show in another sense, of course).