So sure -- if you're just browsing
Could you give us some examples? Outside extreme cases, the highest-bandwidth apps only require 3-4Mbps (and this has nothing to do with any Internet standards; we run high-def apps on our 1Gb LAN and we still have nothing requiring more than 5Mbps). So no, even if you had 1Gbps you couldn't use it if you tried.
Sure I could. I shuttle around AMI images, and do checkouts against large Subversion repos with 11+GB of data in them. I can easily saturate a 1Gb connection.
But that's neither here nor there. If I knew what the next-generation hit application would be, I wouldn't be here chatting with you about it -- I'd be out there writing it. The thing is nobody really knows what sorts of applications we can come up with that benefit from ubiquitous, high bandwidth availability. Perhaps we start working more with applications that can offload their processing needs on-the-fly in a nearly invisible manner. If the network speed were crazy high enough, you could run as if you had completely dynamic RAM online for loads that suddenly require it (that would require an approximately 100Gbps connection, FWIW).
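For what it's worth, here's a rough back-of-envelope check of that ~100Gbps figure (the DDR3 number is an illustrative assumption for comparison, not a measurement):

```python
# Rough sanity check of the ~100Gbps "RAM over the network" figure.
# The DDR3-1600 bandwidth below is an assumed comparison point.

GBPS = 1e9  # bits per second

link_bytes_per_sec = 100 * GBPS / 8   # 100 Gbps -> 12.5 GB/s
ddr3_channel_bytes_per_sec = 12.8e9   # one DDR3-1600 channel, ~12.8 GB/s

print(f"100 Gbps link:     {link_bytes_per_sec / 1e9:.1f} GB/s")
print(f"DDR3-1600 channel: {ddr3_channel_bytes_per_sec / 1e9:.1f} GB/s")
```

So a 100Gbps link lands in the same ballpark as a single local memory channel -- ignoring latency, which is the real killer for remote RAM.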
But without those speeds, such applications can't be built. And as they can't be built, we can never know what amazing ideas people could come up with to make use of them. It's like a farmer with a cart and a mule saying, "I can move both hay and milk from home to market -- what use would anybody have for an 18-lane paved freeway?" And yet we have 18-lane paved freeways, and we make use of them all the time.
Back in 1985, 2400bps was fast enough for anyone -- users typically didn't need the kind of speed 4800bps (or -- gasp -- 9600bps) gave you.
But you know what? As more bandwidth became available, developers were able to write different kinds of applications to take advantage of it.
So sure -- if you're just browsing
Fortunately, I sit here in Canada with a 120Mbps home cable connection, and don't have to give much of a crap about idiot Senators in the US.
I've seen in the news items of women who were tested for Zika after microcephaly, but that's just confirmation basis (sic).
It's incomplete data, but it isn't confirmation bias.
If the researchers were taking tests of women who gave birth to microcephalic babies, and Zika was not the cause, you'd expect that the women being tested would have some closer-to-even distribution between Zika infected and non-Zika infected, given a suitable sample size.
Now if the testing were done the other way around (checked for microcephaly only in women known to have had the Zika virus), then you'd potentially have confirmation bias if the results appeared to show a correlation. The problem you'd run into here is that without checking against the birth results of mothers who didn't have Zika, you wouldn't know if there were some other cause for the microcephaly.
Think of it this way. If you went to a village and rounded up every mother who had a microcephalic baby, and you found that 99+% of them had Zika, there is no confirmation bias. You'd still want to determine how many other mothers infected with Zika had non-microcephalic babies, and you'd further need to determine when during pregnancy the Zika infection began (as it's possible that microcephaly only occurs if the virus is caught at or before a certain point of gestation), but the result would point to possible avenues for research.
If, however, you rounded up all of the women who had Zika during their pregnancy, found that 80% of them had microcephalic babies, and stopped there, then you'd have a case of confirmation bias. It could turn out that 80% of non-Zika-infected mothers also had microcephalic babies. What you called "confirmation bias" is good research method. It's certainly not the end of the research, but correlations are not confirmation bias in and of themselves.
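To make the distinction concrete, here's a toy simulation of the two sampling directions (all rates are made-up numbers for illustration, not real epidemiology):

```python
import random

# Toy model: each mother either has Zika or not, and microcephaly
# occurs at an assumed rate depending on infection status.
# All three probabilities below are invented for illustration.
random.seed(1)

P_ZIKA = 0.30              # assumed infection rate among mothers
P_MICRO_GIVEN_ZIKA = 0.80  # assumed microcephaly rate if infected
P_MICRO_BASELINE = 0.02    # assumed background microcephaly rate

mothers = []
for _ in range(100_000):
    zika = random.random() < P_ZIKA
    p = P_MICRO_GIVEN_ZIKA if zika else P_MICRO_BASELINE
    mothers.append((zika, random.random() < p))

# Direction 1: sample on the outcome (microcephalic births), test for Zika.
micro = [z for z, m in mothers if m]
print(f"Zika rate among microcephalic births: {sum(micro) / len(micro):.0%}")

# Direction 2: sample on the exposure (Zika) and stop there -- the flawed
# design. Without the non-Zika comparison group, 80% alone proves nothing.
zika_group = [m for z, m in mothers if z]
no_zika_group = [m for z, m in mothers if not z]
print(f"Microcephaly rate with Zika:    {sum(zika_group) / len(zika_group):.0%}")
print(f"Microcephaly rate without Zika: {sum(no_zika_group) / len(no_zika_group):.0%}")
```

With these made-up rates, nearly all microcephalic births trace back to Zika, while the exposure-only sample gives you an 80% figure that means nothing until you see the 2% in the comparison group.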
(Aren't all babies) [b]orn with abnormally small heads?
If all babies were born with small heads, we wouldn't classify it as abnormal, would we?
But why aren't we seeing the same thing in Africa or Asia? It's not like the Zika virus in Brazil has had thousands of years to mutate into a version that causes microcephaly, but not the original strain in Africa and Southeast Asia. It's the same virus.
First off, it doesn't take a thousand years for a virus to mutate. Influenza mutates on a yearly basis, for example. And as a general rule, any organism that finds itself in a different environment faces different selective pressures, which may influence the mutation rate, or at the very least, the likelihood of a mutation being more fit for the environment than the pre-mutated strain.
If the cause were due to chemicals, you should see an equal number of non-Zika infected mothers giving birth to children with microcephaly. That doesn't seem to be happening from what I've read. There is no data pointing towards chemicals being involved in any manner. Obviously more diagnosis and testing is needed -- as yet we don't know whether or not Zika has mutated in South America, how the virus is passing the placental barrier, or the exact action which is causing the microcephaly once infected. Wild guesses won't get us closer to a solution to these outstanding questions.
I have a rather mundane, obvious reply to this article. But I can't be bothered to type it all out right now.
Please come back tomorrow, when I write an instant "+5 Insightful" comment in its place.
The Internet is probably better off without NAT
Short response: Fuck you.
Long response: I should be the one who decides whether my local network appears to the outside as a single IP address, or multiple. Also, fuck you.
Short response: Okay.
Long response: Don't go around bitching to the rest of us when developers decide it's no longer cost effective for them to run STUN servers or include thousands of extra lines of code into their products to work around your broken-ass NAT implementation after everyone else has moved on. In the post-NAT world, all of those work-arounds you rely upon daily are going to go bye-bye.
Valid question. I used to install QuickTime... 4? on my Pentium II MMX 200MHz computer back in the mid-1990s so I could watch movie trailers on Apple's website in middle school. That's the last time I installed QuickTime that I can remember. I'm honestly curious what purpose it serves today. Is it a web browser plugin or what? I haven't even thought of QuickTime in YEARS... let alone had a reason to use it.
My understanding is that versions of iTunes prior to 10.5 required QuickTime. QuickTime has always been more than a video player -- it's an entire multimedia framework, with APIs for a whole host of multimedia playback, editing, and conversion tasks. It was the main multimedia framework for Mac OS X up until 10.7 (Lion).
iTunes would have used it both for media playback and for transcoding video into various formats/sizes for various Apple devices (iPhone, AppleTV, etc.). Newer versions no longer require QuickTime so far as I'm aware -- however, this article is about people who aren't keeping their software up to date, so it wouldn't be surprising to learn that they're still running older OSes and older versions of iTunes.
Perhaps I am mistaken, but doesn't the Canadian Charter include a stipulation in it that essentially says, "The government can ignore any of these rights if they so choose"?
That's the short version.
The longer version is somewhat more nuanced. No government or government agency can just willy-nilly ignore someone's Charter rights and then declare "Notwithstanding Clause!". The clause pertains specifically to legislation. So while you could pass legislation that violates certain Charter rights (but not all of them, which I'll get to in a minute), you can't just randomly violate people's rights on a whim and use Section 33 as a defence.
Section 33 also pertains only to very specific rights, not all of them: namely, those in Section 2 (Fundamental Freedoms), Sections 7-14 (Legal Rights), and Section 15 (Equality Rights). Not covered are Sections 3-5 (Democratic Rights), Section 6 (Mobility Rights), Sections 16-22 (Official Languages), or Section 23 (Minority Language Education Rights).
Further to all of that, Section 33 specifically states that any such legislation automatically ceases to have effect after five years. A Parliament or Legislature may re-enact any such legislation that is about to expire; any re-enacted legislation has the same five-year expiry. Thus any legislation that goes against the Charter has the opportunity of being repealed by democratic means: Parliaments and Legislatures are limited to five-year terms under Section 4 (one of the sections that cannot be overridden by notwithstanding legislation), so if the people have a problem with notwithstanding legislation, they can vote in a new government at the next general election to repeal it or let it expire.
Not specifically stated in the act (but upheld by case law) is that a government can only apply Section 33 to legislation it has the authority to enact. Thus, for example, in the year 2000 the Alberta Legislature tried to use Section 33 to make same-sex marriage illegal in that province; the Supreme Court held that the legislation was null and void, as only Parliament has the authority to enact legislation pertaining to marriage. This, of course, could also work the other way -- the Federal Government wouldn't be able to use Section 33 to invalidate something under Provincial jurisdiction.
It is also important to note that at the Federal level, no Federal Government has ever used Section 33; indeed, previous Parliaments have sworn that they will never use it. That doesn't have any real protection in law, of course -- but to do so would be political suicide.
None of the above should be read as my endorsing Section 33. I don't. I understand the political expediency that brought it into being (at the time, the choice seemed to be: add Section 33, or have no Charter of Rights and Freedoms at all...), and I have a pretty good handle on its legal standing and the politics surrounding it, but like a lot of Canadians I would like nothing more than to see it repealed. We are fortunate that it is rarely used by the Provinces and has never been used by the Federal Parliament, but an even better long-term protection would be to scrap it altogether. Unfortunately, doing so requires agreement from the ten Provinces, and as we know from experience, you can't open up Constitutional negotiations to change anything like this without everyone coming out of the woodwork demanding that all of their changes be discussed (and accepted) as well.
Smaller ad networks might not be able to, but Google probably could. Imagine if Google ran their ads, with encoded scripts and links, through their other domains at random. Your only choice then would be to completely block Google, which would break many sites that use their hosted scripts and content, or put up with some ads. Google hasn't done this yet because an anti-adblock arms race would be both costly and a public relations disaster. It's not yet worth it for Google to push the nuclear button on ad blockers, but that day may yet come if things keep going the way that they are.
Except that, unless you only use a HOSTS file (which is the easiest ad-blocking method to work around), modern ad blockers don't block solely based on host. You can also block based on HTML tags, IDs, and classes. Sure, you can always randomize these on each page load, but you risk breaking a lot of things, and it adds extra server computation to serve out a page (to the point where, for someone like Google, serving an ad page might become more expensive than the revenue it generates).
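As a hypothetical sketch of that randomization trick (the markup, class names, and helper are all invented for illustration):

```python
import secrets

# Hypothetical sketch: a server rewrites its ad-container class name on
# every page load, so a static cosmetic filter like ".ad-banner" no
# longer matches anything.

TEMPLATE = '<div class="ad-banner">Buy stuff!</div>'

def randomize_classes(html: str, fixed: str = "ad-banner") -> str:
    """Replace a known ad class with a fresh random token."""
    return html.replace(fixed, f"c{secrets.token_hex(4)}")

page_a = randomize_classes(TEMPLATE)
page_b = randomize_classes(TEMPLATE)

# The static filter no longer matches, and two loads don't even match
# each other -- but every page load now costs extra server-side work,
# and any stylesheet or script referencing the old class breaks too.
assert "ad-banner" not in page_a
assert page_a != page_b
```

That last comment is the trade-off in a nutshell: the trick works, but it taxes every single page load and breaks anything else that depended on stable class names.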
Something that's made better by ads?
We don't get the American Super Bowl ads here in Canada, so I really wouldn't know.
Besides which, if the Super Bowl isn't good enough without the ads, why watch it in the first place?
When it comes to the Internet, the biggest problem they're going to encounter is that there is nothing in this world that advertising improves
I've sat and tried to think of anything that advertising actually improves (in my mind at least). About the closest I can seem to get is movie trailers before a movie. And that's it. And I don't see how that would apply to websites.
There is no advertising anywhere that improves the web experience, thus users will always have an incentive to block it. It uses end-user and ISP bandwidth, so it actually costs the consumer (and everything in-between) for its delivery.
Anything that costs me money which detracts from the overall experience, even by a tiny bit, is going to get blocked when there is an easy technological means to do so. There is absolutely no way Google or anyone else can change that -- being less annoying is still infinitely worse than not being present in the first place.
LEDs, arrows on the floor, smartphone, Arduino, flip book, GPS... all fail
My grandfather suffered from Alzheimer's in his final years. Even though he had been a military court reporter throughout WWII and had worked in the office of a chemical plant for his entire career after the war, in his later years, any time the phone rang, he would pick it up and toss it in the garbage. Any time my grandmother needed to use the phone, she had to go fish it out of the trash.
Years later, when I was working on my MSc, my research supervisor came to us with an idea for a device just like the one the OP wants to create. He had no experience with people with dementia. I related my grandfather's story. He was never able to secure funding, and the project eventually fell through. Unfortunately, I think he felt that with sufficient funding and enough research, his idea for a handheld device to help the elderly with dementia would eventually have worked, and our relationship was never quite the same again. But I had experience behind me -- my grandfather was an educated man who had used telephones his entire career, but couldn't operate one in his final years (and I am talking about a landline, not a cell phone), and out of frustration and confusion would just dump it in the nearest waste bin. If you had given him some fancy and expensive electronic device that made unexpected sounds and flashed meaningless text and colours at him, it would very quickly have wound up in the exact same place.
The parent is correct. A time may come, when we are of that age and our memories start failing, when our long experience with touchscreen smartphones will make such a concept tenable -- but we're at least 20-30 years out from that stage. And I suspect that some form of more invisible technology, such as technology built into the environment, will rule the day in the end. Until that time, you need a flesh-and-blood person to provide aid and guidance. The parent poster is right. There really is no substitute here.
10^-6 Movie = 1 Microfilm