You forgot "Hey you kids, get off of my lawn!"
Because vaccines aren't always effective for everyone, and some people (e.g. immunocompromised individuals) can't get vaccines at all. Those people rely on herd immunity, which depends on enough people being vaccinated that any disease that shows up can't spread through the whole community. It's not the kids who don't get vaccinated that the legislators are worried about; it's everyone else.
And the rate of vaccination in parts of California is so low that health officials are already seeing the effects of lost herd immunity.
You pan with your finger, and zoom with the crown. Seems to me like an easier way to find the right app than paging through many screensful of icons.
Let me give you a hint: take statistics.
Actually, the classic Mac OS just treated extensions as part of the file name and used Type and Creator codes to associate a file with an application. When OS X came out, the Type and Creator codes were phased out, and now extensions are used the same way as on Windows.
Yes, it's | sed 's/\.[^.]*$//' (quote the expression, or the shell will swallow the backslash before sed ever sees it).
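A quick sanity check: echo archive.tar.gz | sed 's/\.[^.]*$//' prints archive.tar. The pattern anchors on the last dot in the name, so only the final extension is stripped.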
Just like the song says, "you have to be carefully taught." If an AI isn't carefully taught to believe in God, it certainly won't come to the conclusion that God exists by logic alone.
Most people forget that religion is virtually brainwashed into most people at an age when they're too young to question authority. An AI isn't going to have that experience, unless the AI's creator intentionally puts it there.
Christianity is a buffet. Each denomination chooses different fixin's from the salad bar.
Why not adopt the new standard, jump your major revision number to 10, and then leave it there forever; just like Apple and Microsoft?
This is exactly what I did. I measured the distance from my eyes to each of the three monitors on my desk. My optometrist made me a single-vision pair of glasses that was appropriate for that range of distances, but not for anything else. I got a second pair of progressives for everything else.
The author writes "It's 1920 x 1920 pixel square resolution is said to offer 78 per cent more pixels than a traditional Full HD monitor." Is it so hard to calculate the number of pixels in a traditional Full HD monitor and divide the number of pixels in this monitor by that number to actually *confirm* the fact?
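It isn't: 1920 x 1920 = 3,686,400 pixels, Full HD is 1920 x 1080 = 2,073,600 pixels, and 3,686,400 / 2,073,600 ≈ 1.78. So the 78 per cent figure does check out; it just takes ten seconds of arithmetic to confirm.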
I swear journalists get lazier every day.
Apple loves their walled garden, but doesn't make decisions like this capriciously. Until we know *why* Apple's doing this, it's hard to judge the situation. They may have a reason that seems insignificant to the end user, but you don't get to be the biggest company on the planet by making decisions like this for no reason.
SoC stands for "system on a chip."
Basically, if you're displaying 80 or even 120 lines of code, it doesn't matter whether the monitor is 1080 or 2160 pixels tall. Sure, the higher resolution will render a well-designed, highly detailed font better than a lower resolution will, but that's all. Programmers normally use a monospaced font like "Courier" anyway, so a finely detailed font is pointless.
Spoken by one who hasn't done much programming on a HiDPI monitor. I can tell you from first-hand experience that the higher resolution significantly reduces eye fatigue. I have two 24" 1920x1080 external displays connected to my 15" rMBP, and I always put my main window on the small 15" screen because the text is much easier on the eyes at 220 dpi. As a matter of fact, text is the only thing that looks dramatically better on the retina display than on a standard display. Images and icons may be more detailed, but the difference is nowhere near as noticeable as the improvement in text rendering.
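(For the numbers: the 15" rMBP's 2880x1800 panel across its 15.4" diagonal works out to sqrt(2880^2 + 1800^2) / 15.4 ≈ 220 ppi, while a 24" 1920x1080 display comes to about 92 ppi, less than half the pixel density.)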
I'd rather give up my external monitors than my retina display.
Because that was the argument that Woolworth's used to keep black people from eating at their lunch counter. Woolworth's claimed that blacks could eat somewhere else, but virtually everyone else said the same thing. If you're a public accommodation, you accommodate *all* of the public, or none. They're called public accommodation laws for a reason.