There's just one thing I need to say: An audience of one, is still an audience.
I do not use weasel-worded statements often, but when I use them, I use them for their full meaning.
Your comparisons are ridiculous to anyone who has ever played a violin.
There are so many things that are wrong with this study. There are so many things that differentiate violins BESIDES how they sound to an audience.
But the question is: given that any musician's ultimate goal is to eventually have an audience, shouldn't how an instrument sounds to that audience be the essential criterion for evaluating its quality?
Remember: Price and rarity are another set of entities altogether. A solid gold violin couldn't be played, but would be worth a ludicrous amount of money. The very first violin ever created in the world would be a rare find (as it probably does not exist anymore), but would probably be in a condition in which you simply could not play it at all.
You are right that there are many qualities a musical instrument can have, but you are wrong in assuming that they have any bearing on the most important quality of an instrument: whether it can create the music people want to hear, at the quality they want.
The article highlights that 150 people is too low a number to preserve all the genetic diversity over multiple generations. This is in line with other estimates, which place the critical threshold between 250 and 500 individuals; below that, genetic diversity collapses rapidly.
But I have always wondered about one thing: all these estimates assume normal sexual selection, under which some gene lines die out in the long run.
But what if one could remove the element of chance? What if you knew the genetic pool of your colonists and could ensure, over dozens of generations, that no genetic diversity is lost? Additionally, what if you could preserve the original genetic pool via cloning or DNA storage and synthesis?
Since the initial stock of colonists is presumably genetically healthy, it follows that their offspring should be healthy too, provided you eliminate the loss of gene lines. And even if some issues appear, you still have the originals "on backup".
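To make the intuition concrete, here is a toy simulation (purely illustrative; the function name and the model are my own, not from any population-genetics library) contrasting random mating, where founder gene lines are lost by chance, with a managed scheme in which every lineage is guaranteed exactly one descendant per generation:

```python
import random

def surviving_lineages(pop_size, generations, managed, seed=0):
    """Count how many founder gene lines are still present after
    `generations`. Toy model only: each founder starts with a unique
    lineage label, and each generation every slot in the next
    generation inherits one parent's label."""
    rng = random.Random(seed)
    pop = list(range(pop_size))  # one unique lineage per founder
    for _ in range(generations):
        if managed:
            # managed pairing: every lineage contributes exactly one
            # offspring, so no lineage can ever be lost
            pop = list(pop)
        else:
            # random mating: parents are drawn by chance, so some
            # lineages are simply never picked and die out
            pop = [rng.choice(pop) for _ in range(pop_size)]
    return len(set(pop))
```

Running this with 150 colonists over 50 generations, the managed scheme retains all 150 lineages by construction, while random drift loses a substantial fraction of them.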
Of course, like others pointed out, such a strict procreation scheme might lead to adverse psychological effects in the population.
But all the US actors and Pop stars use American social media now. So good luck getting rid of the huge swaths of followers using US services.
Europe is the second most profitable market for such media worldwide, often accounting for 25-40% of the gross. Do you really think that those people, for whom money always comes first, would ignore that market just because it means opening up a second account they need to flood with sock-puppeted postings?
The additional cost wouldn't even show up in their budget (apart from withholding money from those poor souls who went for a share of the profit margin instead of the gross margin).
Worse yet, this whole thing is not about routing moronic teenager BS emails through US services; it's about keeping the NSA out of everyone else's data... and that won't happen.
Do not let the perfect be the enemy of the good. To use a car analogy, it's the difference between leaving your car unlocked in the streets of a Mexico City slum, and keeping it in your own garage in Beverly Hills.
Sure, the garaged car might still be stolen, but that requires a concentrated, deliberate, directed effort by the right kind of person with an investment of resources, whereas the unlocked one only requires a random passerby of questionable repute.
At the same time, these guys complain that they can't run their offices with Linux: "It's too complicated for our staff. Give us back our Windows XP, our MS Office, our Internet Explorer."
May I remind you of projects like LiMux, which migrated the entire IT infrastructure of the city of Munich from Microsoft products to open-source products based on and around Linux?
Projects that, instead of failing, succeeded quite well. Where the users -- after some initial grumbling -- not only accepted the new system, but gave it notably better usability marks than the MS products. And these users work in government offices, which are not exactly known for quickly embracing new ideas, in a federal state that is Germany's equivalent of Texas in terms of conservatism.
So, given that this project showed quite nicely that moving away from the US software companies to truly international open-source software is very much feasible -- even when you're just spending the money you'd have spent on licensing costs anyway, year over year -- what exactly is the holdup?
Also, before you raise the flag of "lowered productivity", the entire switch-over happened progressively, without impacting users beyond them having to learn a few new clicks and buttons.
Now, avoiding US-based internet services is also not that hard.
This list goes on and on, at least for Europe. Therefore, ignoring US services is only a matter of overcoming complacency, not one of sheer impossibility.
A lot of the wildlife around Chernobyl has dramatically recovered despite high levels of radiation.
Actually, all that Chernobyl's wildlife proves is this:
It is beneficial to wildlife populations to not exist in proximity to humans.
Given that fact, the recovery and increase of Chernobyl's wildlife suddenly becomes very, very uninteresting. Add to that the fact that the average life expectancy of somewhere around 90% of species living in the wild is below 20 years, and you see why doing long-term exposure studies on them is also rather moot.
The Kinsey studies were flawed and debunked a while ago. Get with the times.
Just like Newton's ideas about gravity and the mechanistic universe were shown as flawed and debunked by the advent of relativity and quantum theory.
Being incomplete, yes, even being flawed, is to be expected of scientific theories and studies. Indeed, almost all such endeavors in the history of mankind turned out to be flawed and incomplete. That does not diminish their importance, though, as attempts to reduce the blurriness of our understanding of the world.
This is why I led my post with the deliberate statement of "[...] if the Kinsey studies have shown one thing [...]"; implying directly that I know that they were somewhat flawed and in many ways also a product of their times.
Still, these studies (along with similar ones done in Europe around the same time) helped Western society grasp that a binary model of sexuality is even more deeply flawed and incomplete.
That is not to say the binary model does not approximately correspond to nature -- after all, most species need heterosexual sex to procreate. It merely needed pointing out that the model was missing a lot of the nuances of reality. Nuances that, when ignored, can lead to wrong conclusions and predictions. And since these are applied to humans (instead of falling apples, to stay with Newton), the results of such errors can be quite ugly.
The 10% includes those who have bisexual urges, but identify as heterosexual.
In that case, the number would be likely much higher. After all, if the Kinsey studies have shown one thing, it is that pure homosexuality is as rare as pure heterosexuality. Most people fall into the range where they "merely" strongly favour one gender over another, but not to the exclusion of the other.
Furthermore, sexual attraction is not the same thing as actually wanting sexual intercourse. It ranges from simple, almost universal things like a benign interest in the aesthetics of human bodies -- no matter the gender -- through gendered group bonding (best example: sports clubs), up to bonding with specific individuals (best example here: soldiers in war).
And then remember that your mind is also capable of empathy on all levels. For example, if you see someone cut themselves, you most likely feel a mirror of their pain. That's why horror movies are so effective.
The same is true for sexuality. After all, if that were not the case, porn would not be as effective and desired (by any culture, any gender, really).
For example, if you see a movie in which two people kiss, do you totally ignore one partner? No, you perceive and are affected by them both. You might like some combinations of genders better than others, but you can not deny that the kiss will affect you either way and that something in your brain will mirror the feelings (physical as well as emotional) conveyed by the kiss.
Additionally, sexuality is the result of a developmental process, and like any such feature (height, skin color, etc.) it has both a genetic "pre-set" component and an environmental component that can divert the development. If you flood a male embryo with androgen blockers, the embryo will turn physically female, along with an increased chance of being attracted to men. The same happens if you flood a female embryo with the right cocktail of male hormones.
And like your final body height is influenced by the supply of nutrients during development, sexual orientation is influenced by a myriad of environmental factors. And like height, the result is a sliding scale. In many ways, your genes only supply the starting point for that first cell, but not where you will end up.
As such, if you don't limit "bisexual urges" to people who actively strive to have physical sex with either gender, you will see that your 10% is an estimate on the lowest conservative threshold.
A private law (a privilege) does not mean a law that applies only to yourself. It means that the law is your own law.
For example, a privilege of kings in times past was to hunt in the royal woods (i.e. all woods owned by the king; under strong feudalism that meant all woods). Only the king and his men were allowed to hunt. If you were caught "poaching", that was usually punished quite draconically.
As such, the law was private insofar as it applied to, or benefited, a select few; not private insofar as only the king was beholden to it.
Another form of private law is when you enact a general law that ostensibly applies to everyone, but only benefits specific individuals. One such law was the three-class voting system of 17th-19th century Europe.
For example, in the German Empire (1871-1918), every free man over 25 was allowed to vote. But the vote was not equal (aside from excluding ~70% of the population to begin with).
You see, the parliament was split into three categories of seats: one third of the seats was elected by the general populace, one third by the clergy (church), and the last third by the landed aristocracy.
Which meant that 90% of the voting populace was represented by the first third; the 8% working for the church got the next third and the last third was voted in by a paltry 2% of the voting populace (~0.5% of the general population). Three guesses how the parliament usually voted...
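The arithmetic behind that punchline can be checked in a few lines; the class shares below are the figures from the text (this is my own illustrative calculation, not from any source):

```python
# share of the voting populace in each class, per the figures above
shares = {"populace": 0.90, "church": 0.08, "aristocracy": 0.02}
seats = 1 / 3  # each class elects one third of the parliament

# per-voter weight: fraction of seats controlled / fraction of voters
weight = {cls: seats / share for cls, share in shares.items()}

# an aristocrat's vote vs. a commoner's vote
ratio = weight["aristocracy"] / weight["populace"]  # roughly 45
```

In other words, under this scheme a single aristocratic vote weighed about as much as 45 votes from the general populace.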
This law applied to everyone, and by becoming filthy rich enough to buy yourself into either the church or into landed aristocracy, you could increase the effect of your own vote. But still, this law was in effect a private law, as it applied to different people differently -- in other words: it granted a privilege.
Some private laws are unavoidable or sensible -- like withholding the right to vote from significantly mentally disabled or ill persons; or from children whose vote would be no more than either white noise or the vote of their parents.
But most private laws are just this: plainly unjust.
As a Texan I am absolutely disgusted by this. Having a conservative state legislature is bad for a lot of reasons; however, supposedly one of the benefits is keeping the government out of things it has no business in. So what the living fuck happened?
To be a cynic:
The voters got exactly what they wanted: Private enterprises buying their own law with no government in sight to stop them. That's what privilege means in its pure form: Private Law.
After all, remember that a democracy needs at least three pillars to survive: A strong executive (government), a strong legislative (parliament) and a strong judicative (courts).
Weaken one of them, and you open up the chance for people to abuse the disproportional strength of the other two (or even one).
Strong executive/legislative with a weak judicative leads to a police state, where the due process of law is abandoned.
Strong legislative/judicative with a weak executive leads to corporatism with a nice load of loophole abuse and unfair privileges -- which is what you see above.
Strong executive/judicative with a weak legislative leads to a static, reactionary state, where a small elite forms a wall against any change.
Do note that countries that lose yet another pillar are usually civil-war-torn dysfunctional messes or dictatorships of the worst calibre.
So, why do you want a weak executive again? Or, if you interpret "small government" to include both legislative and executive, why are you so crazy to want that?
Just a minor note to myself: it is "James Clerk Maxwell", not "James Maxwell-Clark". Figures I would misremember his name after just having read an article about the Lewis & Clark expedition.
Even if we restrict the definition of "science" to your definition; that is that science is purely "evidence-based, hypothesis-driven testing", computer science would still fit the bill.
Remember that CS is as diverse a field as modern physics. You have theoretical CS, where you tackle questions like "What is a good, logical definition of computability?" or "How can you logically prove that a program terminates / runs in X time / consumes X resources, no matter the input?" This is fully equivalent to the questions of theoretical physics, where you tackle the Grand Unified Theory -- joining gravity, the weak and strong forces, as well as electromagnetism.
These theoretical questions can be raised without need of evidence -- if all you're interested in is disproving something. According to your definition, this means that the theoretical aspects of both physics and CS are not "science". Okay, let's run with that.
The nice aspect of theoretical questions that can't be disproven by pure thought is that they lead us to try to discover concrete evidence that a given theory holds true or false in real application! And this is where your rather narrow definition of science comes in, and the point where we find that both practical physics and practical CS fulfill the criteria.
For example, in physics we can test relativity by building telescopes that look at stars and black holes, to see whether the hypothesis's predictions hold true and thus raise the hypothesis to the status of a theory. As can be seen from the term people now use -- the "theory of relativity" -- this has happened.
But if you look at CS with more than a superficial glance, you will see that the same process is at work in moving from theoretical CS to practical CS. One open question of theoretical CS is whether P = NP. So far, we are incapable of disproving either possibility with pure thought. Thus we turn to practical CS, where people try to find evidence either way in the real world. After all, if you can create a program on a real computer that solves an NP-complete problem while never leaving the limits of P, you have conclusively shown that P = NP. So far we've only found approximative or heuristic solutions that come close, so after 50 years of turning up no evidence we allow ourselves to say that the hypothesis "P != NP" should be treated (even if only cautiously) as a theory -- and we're indeed doing that, as you can see if you look at most modern encryption methods.
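To illustrate that gap between exact and heuristic solutions, here is a small sketch using subset sum, a classic NP-complete problem (the function names are mine, for illustration only): the exact solver enumerates exponentially many subsets and is always correct, while a polynomial-time greedy heuristic runs fast but can miss valid answers.

```python
from itertools import combinations

def subset_sum_exact(nums, target):
    # brute force over all 2^n subsets: always correct, but
    # exponential time -- the best we can guarantee for NP-complete
    # problems unless P = NP
    return any(sum(c) == target
               for r in range(len(nums) + 1)
               for c in combinations(nums, r))

def subset_sum_greedy(nums, target):
    # polynomial-time heuristic: fast, but not guaranteed to find
    # an existing solution
    total = 0
    for x in sorted(nums, reverse=True):
        if total + x <= target:
            total += x
    return total == target
```

For `nums = [5, 4, 3]` and `target = 7`, the exact search finds the subset {4, 3}, while the greedy heuristic grabs 5 first and can no longer reach 7 -- fast, but wrong.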
But you might say: that is not enough! After all, you could reduce any computer program written on physical hardware to a sequence of logical steps in a system modeled with pure thought. And indeed you can, as the Turing model of computation promises exactly that -- and so far physical evidence agrees with us. But isn't the same true for physics? After all, physicists search for such a description too! It's what Maxwell-Clark, Einstein and lots of other physicists were and are after when they ultimately search(ed) for the Grand Unified Theory. How can you blame CS for already having found its unified theory?
But this last example finally puts the nail in the coffin of your view: what about quantum computers? They are the point where physics and CS meet, both on the theoretical side (quantum theory / quantum computation) and on the practical side (building the thing and proving that the shit actually works as advertised).
So, if we accept your definition of science, it follows directly that if CS is not a science, physics can't be either.
I think you misunderstood the topic. Please re-read my posting and that of the parent again.
The topic was not about the SoC (where previous iPhones indeed already used the Apple A3-A5 bridges/systems). The topic was only about the CPU.
And the CPU part of the A6 is based nearly 100% on the ARMv7 specs. If you compile stuff with an ARMv7 compiler, it will run completely unmodified on an A6 -- because its CPU is nothing but an ARMv7 with a few additional bits bolted on.
If you're generous, it's at best like the early AMD CPUs: a core licensed from Intel, created in a manufacturing process mostly designed by AMD (since Intel doesn't license that), with a few additional bits bolted on. Newer AMD cores are diverging from that, thanks mostly to microcode translation making x86 a very "virtual" design. And if you remember, ARMv7 -- being still mostly a RISC design -- can't and indeed doesn't need to bother with that.
> IOW all based on ARM Holding chip designs - a company co-founded by Apple. Boo-fucking-hoo.
The question was about whether or not it was "designed by Apple", specifically "in California".
ARM is a British company, headquartered in the UK. They do have an office in the Silicon Valley, but they also have ones in Japan, Germany, Sweden, France and a lot of other countries. So if that's your logic you must say that Apple devices are pretty much "Designed all over the world".
I freely grant you that ARM was indeed founded as a joint venture between Apple and two other companies (Acorn and VLSI), but nowadays Apple only holds ~13% of the shares. If holding shares counts as "designing", oh boy, copyright and patent law just got a million times more complicated than they already are.
> the CPU is very much "designed by Apple in California", though manufactured by samsung and/or TSCM.
If by "designed by Apple" you mean that Apple demands all its suppliers print the Apple logo (and only that) on the chip, then yes. Other than that, though, you're sorely mistaken, as a two-minute search would easily have told you:
1st gen and 3G: Samsung 32-bit RISC ARM 1176JZ(F)-S v1.0
3GS: 600 MHz ARM Cortex-A8
4: 800 MHz ARM Cortex-A8
4S: 800 MHz dual-core ARM Cortex-A9
5: 1.3 GHz dual core Apple A6
So only the iPhone 5 has a "design" by Apple. And even that stretches the meaning of design quite a bit (thus the scare quotes). 99% of the design of the A6 is based on the ARMv7 specification; after all, it has to be, since it needs to be compatible with the ARM CPUs used in previous generations.
To use a car analogy: calling the A6 "designed by Apple in California" is like taking all the blueprints from Volvo, adding a BMW badge and a steering wheel, and claiming your Volvo is now a BMW.
"The greatest warriors are the ones who fight for peace." -- Holly Near