Comment Small sample size in the early years? (Score 1) 132

There is a possibility that the early adopters of GitHub just randomly happened to be using particular programming languages. One needs to see the number of users/projects alongside this ranking plot.

This relates to the evolutionary process of random drift, and in particular to one manifestation of it known as the founder effect.

Comment And again with more jargon (Score 2) 90

"adapted a number of pre-existing features to a new use" = Exaptation also referred to as a Co-option, this is a shift in the function a biological feature serves in the organism. The trait may have been non-adaptive (i.e. without function) before the functional shift, for example it may have been a Spandrel.

"adoption of a more babylike skull shape into adulthood" = Neoteny, or the distortion of the developmental timeline as to extend the duration of what was previously a juvenile stage into adulthood. Developmental pathways - which are regulated in part by specific biomolecular pathways - provide evolution with a set of channels through which it can naturally and easily evolve; easy to reach and viable variations morphology a few mutations away! Famously, this is how humans developed their marvellous cabbage heads.

"huge evolutionary changes can result from a series of small evolutionary steps" not equal, but at least highly related to the concept of Punctuated Equilibrium.

This type of explanation draws on the very important concept of Historical Contingency, i.e. the idea that the particulars of (natural) historical processes are largely determined by coincidences of circumstance which are effectively random, and which therefore on a larger scale do not look like deterministic or teleological processes (at least not completely, although I cannot deny there may be some features of the process which are). Whether completely true or false or anything in between, I like this approach to explaining the historical processes of complex systems. It seems to imply a type of simplifying assumption which one might call a principle of Epistemic Parsimony in complex systems: you assume that most types of events are the result of processes too complex to comprehend and are therefore - for you as observer - effectively random. Of course a collection of random events can yield a perfectly tractable and even almost deterministic cohort, just as conversely a collection of deterministic events can yield a delightfully random swarm.
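
To make that last point concrete, here is a minimal numpy sketch (my own illustration, with arbitrary numbers): many individually random events can add up to an almost deterministic aggregate.

import numpy as np
rng = np.random.default_rng(0)
# 1000 independent "observers", each aggregating 10,000 effectively random events.
events = rng.uniform(0, 1, size=(1000, 10_000))
trial_means = events.mean(axis=1)
# The events span the whole 0-1 range, yet their aggregate is almost
# deterministic: the spread of the means is ~0.003 and shrinks as 1/sqrt(n).
print(trial_means.mean(), trial_means.std())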

Most of the above concepts were - if not explicitly conceived or (co)developed - championed and expounded by Stephen Jay Gould, and represent a school of thought that critiqued the so-called Panglossian Adaptationism which Dawkins, Dennett and (formerly) Williams expound(ed).

Comment But crossroads ahead with the Swarm of Things; (Score 1) 344

would one of the dominant mobile-oriented FOSS OSs splinter and diversify into various specialized applications? Would it do so even without financial backing from Google? What are the other major players? What's been happening lately in that regard?

Please tell me - I have no clue.

Comment The normal and the revolutionary aspects (Score 5, Insightful) 232

You have a good point, but only most of the time.

Thomas Kuhn, in the context of science, spoke of 'normal' and 'extraordinary' science. Normal science is as you described: you stay within the paradigm and follow the conventional methods for resolving issues. However, those methods did not appear out of nowhere; somebody was being a clever cowboy and decided it was time to do things differently. This is where revolutionary science came in. Of course most of these innovations - like any innovation - fail miserably. But if it weren't for all the failures we would never have had the successes that changed paradigms and got us to the methods we use today.

And I think this dichotomy does apply to almost every human endeavor.

That said, for normal day-to-day operations, being a cowboy ALL the time is foolhardy and dangerous. But for people to NEVER have a little bit of an experimental and innovative mindset is also risky in its own way. Sometimes this balance (known as a bimodal or barbell strategy) is maintained within a single individual, who mostly exploits the traditional while occasionally exploring the novel; sometimes it is divided between individuals, with a good balance keeping more people stable than unstable.

Comment Baserates (Score 2) 133

Parent makes a good point but misses another. We do indeed need some baserates to make a reasonable assertion about the damage or benefits of virtualizing teams. But it doesn't really matter exactly how you measure failure, as long as you apply the same measure to both normal and virtual teams. Of course, to get a complete picture different types of failure should be considered, but for each particular measure the comparison is still valuable.

Comment The empirical side of mathematics (Score 2) 320

There is something to be said for the empiricism of mathematics.

It may be far less prevalent, but I do believe it is there; consider that one almost never knows the consequences of a set of assumptions beforehand with any certainty (although good mathematicians have intuitions, of course). Mathematics is an exploration of structures which are not completely understood. Ever. In this sense the study of highly complex human-made structures is still science, because we don't necessarily come even close to understanding our own creations. This is why we have something called legal science (in Europe at least); it might not be a hard science, but I think it's fair to call it a science.

Indeed, in mathematics we even have Gödel's second incompleteness theorem, which shows that a sufficiently strong, consistent formal system cannot prove its own consistency, and thus we must settle for an empirical exploration of the consistency of our axiomatic systems.
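
Roughly stated (my paraphrase, not a precise formalization): for any consistent, effectively axiomatized theory $T$ that interprets enough arithmetic,

$T \nvdash \mathrm{Con}(T)$

where $\mathrm{Con}(T)$ is the arithmetized sentence asserting that no proof of $0 = 1$ exists in $T$. Consistency can thus only be established from outside $T$ (in a stronger system), or probed empirically by failing, so far, to derive a contradiction.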

Comment Re:So.... (Score 5, Interesting) 265

"Nearly all surviving balances in nature are stable equilibria. They're not fragile at all. If you perturb them, it just re-stabilizes at a new equilibrium point. e.g. If you tilt the bowl in the wiki picture, the ball doesn't fall off the top of the bowl like in the first picture or roll away like in the third picture. It just settles in at a different spot on the bottom of the bowl in the second picture, now tilted slightly."

Bullshit.

That's a myth dreamt up by people more concerned with mathematics and engineering than with paying attention to how organic systems actually function.

Let us put aside for the moment that this reasoning applies to highly simplified models of ecosystems, and not to ecosystems themselves. This adds a whole epistemic layer to the problem: we don't really know shit about what would actually happen given a perturbation; we barely know this even for many models, and for actual ecosystems you can forget about it.

But then, even model ecosystems are seldom if ever in equilibrium. Classical stability-based equilibrium analysis may have been cutting edge in 1974 when Robert May published his seminal book, but plenty of problems with the approach have been found since then, and a plethora of other concepts have been developed to tackle its shortcomings - for example resilience (how quickly the system returns to equilibrium). All these concepts should be taken with a pinch of salt; it's not obvious they are relevant, or even desirable, goals in ecosystem management.

To speak of one metric particularly relevant to this issue: there are huge parameter ranges in many models in which oscillatory behavior is present. In his 2012 book, Kevin McCann argues we should focus more on whether the eigenvalues are complex (i.e. prone to decaying or sustained oscillations) than on whether their real parts are negative (the classical stability criterion). If dynamics are oscillatory and I perturb a population downward, it will overshoot its original value (possibly perturbing other populations) and then swing back down again, making the population spend more time at low numbers and increasing extinction risk.
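
To illustrate (a hypothetical two-species Jacobian with made-up coefficients, not a model from McCann's book), the two criteria can disagree: the community below passes the classical stability test yet returns to equilibrium through oscillations.

import numpy as np
# Hypothetical 2-species community (Jacobian) matrix; the coefficients are invented.
J = np.array([[-0.1,  0.8],
              [-0.9, -0.1]])
eig = np.linalg.eigvals(J)
print("eigenvalues:", eig)
print("classically stable (all real parts < 0):", bool(np.all(eig.real < 0)))
print("oscillatory return (complex eigenvalues):", bool(np.any(eig.imag != 0)))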

Another critical concept is that of fragility proper; as opposed to the dynamical concepts, fragility is a measure of the functional response to a perturbation rather than of the dynamics of the perturbation. Just because there is a stable equilibrium for some variable doesn't mean that perturbing it will have no cost in terms of other critical variables. For this see Nassim Taleb's 2012 book Antifragile.
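
Taleb operationalizes fragility, roughly, as an accelerating (convex) harm from perturbations. A toy sketch with a made-up quadratic harm function (my example, not his) shows why the size of a shock matters more than its frequency:

# Hypothetical convex harm function: damage grows faster than linearly with
# the size of the perturbation.
def harm(x):
    return x ** 2
ten_small_shocks = 10 * harm(1.0)   # ten small perturbations
one_big_shock = harm(10.0)          # one perturbation of the same total size
print(ten_small_shocks, one_big_shock)   # 10.0 vs 100.0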

Importantly, I would point out the complete disconnect between your statements and empirical observations of ecosystems. Many studies suggest that empirically measured ecosystems may be extremely fragile to particular types of perturbations; see for example Solé & Montoya 2001, who identify keystone species by food-web degree (number of trophic neighbors) and demonstrate the fragility of total biodiversity to the extinction of such keystone species. Another example is Montoya et al. 2009, where the weak spots are identified differently, via an inverse-Jacobian / indirect-interaction analysis. There is also work by Jane Memmott and her colleagues identifying fragility not only to particular species extinctions but also to particular habitat losses. One doesn't need sophisticated analysis, however, to see ecosystems collapsing at a rapid rate, not only at present but in many historical situations; indeed ecological fragility is quite possibly one of the drivers of mass extinctions (present and past).
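
For a flavour of the degree-based approach (a made-up toy food web, not the data from Solé & Montoya 2001):

import networkx as nx
# Invented species and links; edges connect trophic neighbours.
web = nx.Graph([
    ("grass", "grasshopper"), ("grass", "rabbit"),
    ("grasshopper", "lizard"), ("grasshopper", "sparrow"),
    ("rabbit", "fox"), ("sparrow", "fox"), ("lizard", "fox"),
])
# Rank species by degree (number of trophic neighbours); high-degree nodes are
# the candidate keystone species whose loss fragments the web the most.
print(sorted(web.degree, key=lambda nd: nd[1], reverse=True))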

Finally, I would be the first to point out the shortcomings of all of these methods. The burden of proof, however, is on those engaging in system-scale perturbation, and not the other way around. Of course, arguing from crude models and half-a-century-old theory does not constitute proof (not even close). Risk is not measured by estimating the probabilities of unlikely events; that is impossible due to sampling error. It is measured by looking at exposure. You don't compute the probability that you will have a motorcycle accident; you know accidents can happen and put a helmet on (or get a car) to mitigate your exposure. In this regard I cite the non-naive precautionary principle, developed by a wholly different school of thought, one coming from finance.

Comment Re: Hypersonic weapons lead to nuclear war ? (Score 2) 290

My guess is that such weapons would change the balance by undoing mutually assured destruction. The missiles and their interceptors are in a red queen race, and if you can move faster than your opponent you may be able to strike them while intercepting their attacks.

Comment Network structure might also play a role (Score 1) 516

While it is likely that - as other commenters note - above-ground lines are more prone to failure, I would like to point out another possible factor: the structure of the network itself. Network structures with more redundancy in the links between nodes are more robust, so that the failure of one line results in less damage. Conversely, certain places might be located in particularly vulnerable sections of the network (for example, areas serviced by a single line as opposed to several).
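
A toy sketch of the idea (hypothetical topologies, not any real grid): the same single-line failure disconnects customers on a purely radial feeder but not on one with a redundant tie line.

import networkx as nx
radial = nx.path_graph(["substation", "a", "b", "c"])  # purely radial feeder
looped = radial.copy()
looped.add_edge("substation", "c")                     # one redundant tie line
for name, grid in (("radial", radial), ("looped", looped)):
    g = grid.copy()
    g.remove_edge("a", "b")                            # identical single-line failure
    print(name, "still fully connected:", nx.is_connected(g))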

Comment Re:Bad phrasing (Score 1) 138

First of all, the phrasing can very well refer to clade-level trends (this is how I would interpret it in a technical text), in which case it kinda makes sense (while being admittedly somewhat ambiguous) to speak of Dinosaurs shrinking. Second of all, I resent the implicit conflation of evolution with natural selection espoused by your last sentence. Yes, this is evolution. No, this does not automatically mean every phenomenon is explained by selection (despite what adaptationists try to sell you).
