Interesting. But not entirely surprising.
That the main result of giving kids laptops was that it prepared them to be better consumers of Facebook? Lots of people called that early on.
Did you get one of the high end iPads?
I've got the 2018 base model, and it hasn't received any software updates in two years now. We reached the point of needing a new one about a year ago because it was too sluggish and the limited RAM was causing problems. We still keep it around for simple tasks, but not for anything remotely demanding.
Sounds like you're not from the US.
Most people get their phones from their service provider, and the major providers offer a free phone upgrade every 2 or 3 years if you trade in your old one. It's structured as paying for the phone in monthly installments, with a credit applied each month to cover the installment. They give you the option of paying upfront for the phone, but you don't get the credits that way, so paying upfront is almost always a terrible idea.
They are very, very clear about what the full price of the phone is. The catch in all of this is if you cancel your service before the term is up, you lose the remaining credits and have to pay the remaining balance.
The result of this is that upgrading more frequently than the 2- or 3-year cycle is very expensive. You can keep your phone longer than the cycle if you really want to, but it rarely makes sense, because there's usually no benefit to turning down the upgrade offer.
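Roughly, with made-up numbers (actual prices, terms, and credit schedules vary by carrier), the math looks like this:

    # Hypothetical carrier installment plan with trade-in credits.
    # All figures are invented for illustration; real terms vary.
    phone_price = 800.00                             # full retail price, disclosed up front
    term_months = 36                                 # typical 3-year installment term
    monthly_installment = phone_price / term_months  # ~$22.22/month
    monthly_credit = monthly_installment             # carrier credit offsets the installment

    net_monthly = monthly_installment - monthly_credit  # $0.00 while you keep the service

    # The catch: cancel service after, say, 12 months and the remaining
    # credits vanish, leaving the rest of the phone's price due.
    months_served = 12
    remaining_balance = monthly_installment * (term_months - months_served)
    print(f"Owed on early cancellation: ${remaining_balance:.2f}")  # $533.33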
There are only two reasons to have a new piece of hardware rather than make this an app on your phone:
1) It adds new sensors that your phone doesn't have (yet) that enable new functionality. This won't be the case, as there's no use case for it.
2) It adds new I/O methods that aren't possible on the phone. AR goggles might do this. An AI assistant doesn't; it's all audio and voice.
This is basically just going to be replaceable with a Bluetooth microphone paired to an app on your phone. Which means nobody is going to buy it, even if they can actually find a use case people want AI for (doubtful).
I suspect you have a different definition of "cheap phone" than the parent poster.
Your logic seems reasonable if you're looking at a $400-$500 Android phone instead of an $800 iPhone or Galaxy.
But you can get $30 Android phones. I would not expect those phones to last for very long, both because the tech quickly becomes obsolete, and due to physical durability.
Also worth noting that most people getting a flagship phone are getting it as part of their service plan, usually on a 3-year upgrade cycle. As a result, upgrading every 3 years is by far the most common pattern, since upgrading at any other frequency doesn't really make sense.
Those exist, but divide the view count by the number of comments: for the most part you'll see thousands of views per comment. That means most people aren't using the social part. I've yet to write a YouTube comment, but I use it daily. So if you asked me if I use YouTube you'd get a yes, but it's not social media for me. If they limited it to those who read/write comments it would be fair, but I'm not sure they did that.
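As a rough sanity check with invented numbers (not from any real video):

    # Views-per-comment ratio on a hypothetical popular video;
    # the figures are made up for illustration.
    views = 2_500_000
    comments = 1_200
    print(f"{views / comments:,.0f} views per comment")  # ~2,083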
I'd say the same for YouTube. It's used to watch videos. The number of people who comment on them is minimal compared to the userbase. I'd be very curious what exact definition of "social media" they used. I don't think it's what most people consider to be social media.
Not anything, especially when dealing with nuclear. There are some parts that, once degraded, cannot be safely replaced: the containment structure, for example. And there are others where building new makes more economic sense than replacing, even when it's technically possible. What state this plant is in I have no idea, and I'm not qualified to have an opinion on it. I just hope experts are making the decision based on economics and power requirements, not politics.
No, it isn't. You're absolutely deluding yourself. And even if it were, nobody uses it to actually write anything. Learning to write cursive (vs. reading it) would be, is, and has been for 50 years an utter waste of time.
I am absolutely serious. I have never, in 45 years of my life, seen anyone write in cursive past 3rd grade.
The main question is whether the plant is still safe. It hasn't been used in years. Has it been kept in good repair? Was the design meant to be idled for years? What are the risks of restarting that particular reactor design after all those years? Is the land there safe for plant workers after reactor 2's accident all those years ago? And what plans are in place to prevent what happened at reactor 2 from happening at reactor 1?
I actually don't know the answer to any of those questions. But I hope experts are actively asking them.
It really isn't. I haven't seen cursive anywhere but on documents in a museum at any point in my life. That includes signatures, which are more likely to be a squiggle than anything resembling actual cursive. There is zero point to mandatory instruction in it anymore (if there ever was: the idea that it was a faster way of writing is backed by zero proof, and even if it were, the ease of reading print more than cancels out those speed gains).
People buying essentials on credit has been around for a very long time.
Longer than most think.
You load sixteen tons, what do you get?
Another day older and deeper in debt
Saint Peter, don't you call me, 'cause I can't go
I owe my soul to the company store
-Sixteen Tons, Tennessee Ernie Ford
Are they a university for research and learning, or a tax haven for investments^W gambling?
"Harvard is a hedge fund with classrooms" - Scott Galloway
A few things to note...
Over the past couple of decades, more and more roles within the British healthcare system have become able to prescribe: pharmacists (as noted in the summary), nurse prescribers, physician associates (who technically should be under the supervision of a GP, but the way the NHS has that set up, it's very much a case of "PA prescribes, GP actually has little say")...
The role of doctors in the British healthcare system is being diminished, replaced by lower-paid, lower-trained positions, and GPs are particularly hard hit by it, which is why GPs are retiring or moving overseas at record rates, far beyond the ability of the current GP training schemes to replace them.
The UK is actively doctor hostile these days, and British doctors do not want to be part of it any more.
It's not just in Britain. All across the West, there's a shortage of native-born doctors. The expense and hassle of getting an MD is bad enough. Then you also have the modern stresses of being an MD (which in America include a highly litigious culture where doctors have to buy maddeningly expensive malpractice insurance). The workload is huge, and the money is only good for the hyper-specialists now. The home-grown family doctor is an endangered species in the US, and we're addressing it in two ways: handing doctor duties to those lower on the chain, and importing doctors from the third world. Every single new doctor at my not-large Southern US hospital in the past three years has come from three places: India, Pakistan, or East Africa. This, of course, robs those areas of badly needed doctors. And it doesn't really matter whether your system is private or nationalized. Look at the ranks of doctors that staff your local services. You'll see the same pattern everywhere in the West: there are fewer of them, and they tend to come from overseas.