It can almost be done by rebasing and replacing a central master. But this deteriorates the history of, and interaction with, _every single cloned repository_, and is generally noticed quite quickly. The validity of all the potentially independent, separate cloned repositories is one of the very useful, decentralized powers of git.
> Without death, there's no evolution possible
Unless a species can modify its own biology, or the evolution of _technology_ or of _societies_ is included. And in practice, it is: evolution is not just DNA biology; it involves entire ecosystems and behaviors that are effective but encoded nowhere within the biology of a single species.
I was not referring to the insertion of false data: I was referring to its insistence on keeping a local cache, apparently not part of the system DNS, _after_ switching DNS servers and potentially needing new DNS answers due to being in a different DNS "view". It is common enough practice, with various proxy and load balancer configurations, to have a different DNS record on the internal network than on the external network.
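To make the "view" point concrete, here is a minimal sketch (using the third-party dnspython library; the server addresses and hostname are made up for illustration) that asks an internal and an external resolver for the same name. With split-horizon DNS the two answers legitimately differ, which is exactly the situation a browser-private cache that survives a DNS switch gets wrong.

```python
import dns.resolver  # third-party: pip install dnspython

def answers(nameserver: str, name: str) -> list[str]:
    """Ask one specific DNS server for the A records of `name`."""
    resolver = dns.resolver.Resolver(configure=False)  # ignore /etc/resolv.conf
    resolver.nameservers = [nameserver]
    return [rdata.to_text() for rdata in resolver.resolve(name, "A")]

# Hypothetical resolvers: one reachable only on the internal network (e.g. over
# the VPN), one public. With split-horizon DNS these can return different,
# equally valid records for the same hostname.
print("internal view:", answers("10.0.0.53", "intranet.example.com"))
print("external view:", answers("8.8.8.8", "intranet.example.com"))
```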
Inserting false DNS records is a whole _different_ security risk, one that is an ongoing problem that web browsers can do little about. In theory, it should be noticeable via SSL certificate failures. In practice, there are so many stolen "CA", or "Certificate Authority", signing keys in the wild that can be used to sign arbitrary SSL certificates that we cannot rely on a fake website not having a signed, apparently legitimate SSL certificate, even for a corporate site like a bank. So poisoned DNS records, which is the problem you are referring to, are a much larger risk than one might expect. And the browsers can do _nothing_ about this. It's a failure of the SSL architecture.
Oh, my. No, I'm afraid it's "greatest common denominator". It's the standard mathematical "term of art" for the largest number by which two, or more, numbers can be divided with no remainder. "Greatest common factor" may be clearer to you, but failure to use the correct label should be a troubling sign with anyone you expect to do mathematical work.
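Whatever label one prefers, the quantity itself is easy to compute; here is a minimal sketch of Euclid's algorithm in Python (the standard library's math.gcd does the same thing):

```python
from math import gcd

def euclid(a: int, b: int) -> int:
    """Largest number that divides both a and b with no remainder."""
    while b:
        a, b = b, a % b
    return abs(a)

# 12 and 18 share the divisors 1, 2, 3 and 6; the greatest is 6.
assert euclid(12, 18) == 6 == gcd(12, 18)
```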
The tendency of Firefox to preserve its own DNS cache means I cannot use it when hopping from VPN to VPN with split DNS running, unless I configure and install my _own_ local DNS server to auto-reconfigure every time I activate a VPN. I'm afraid it's become unusable for me for real work and testing when switching from internal to external website access as I debug network and configuration issues: it's the only browser that fails this way.
As an older programmer, I'm fond of some very good quality, older tools such as "webmin". Not all the modules added to it are excellent, but it's very clean and very flexible for many core system utilities such as BIND-based DNS. It's also much more robust than any configuration tool that relies on a separate, manually configured back-end database.
I'm afraid that this usually means two entirely different interfaces, with overlapping features, writing to the same configurations. That is more than twice the development cost, since the two involve distinct styles and expertise to develop and manage, and the _negotiation_ between the two styles is an added cost. And it makes debugging more than twice as expensive, since tests have to cover both sets of interfaces and the switching between them.
This is prohibitively expensive: the result is usually that the "plain" interface lacks critical features that are only available in the more sophisticated tool.
That's a fascinating guess. It's not a feature I've personally used. Although yes, the Cisco configurations and the Cisco _clients_ do tend to have a horrible morass of undocumented options.
> since that puts an upper bound on human ingenuity
I'm saying to master the components you need before planning the project. Simply saying "human ingenuity will solve that" is like saying "we'll make the software secure when we're ready to publish". It is, itself, guaranteeing project failure.
> The stars will die out long before entropy becomes relevant here.
Do you understand what "entropy" is? Even using the purely thermodynamic definition, there is a very real energy cost for preserving complete coherence of the DNA sequence. Certainly, as a dynamic chemical system, the "entropy" present in the DNA molecule itself and its complex environment makes perfect sequence coherence over lengthy times extremely unlikely. The result of such failures is degradation. One of the most unfortunate results of such losses is, frankly, cancer.
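To put one number on that energy cost (my illustration, not the parent's): by Landauer's principle, detecting and resetting even a single corrupted bit of sequence information must dissipate at least

$$W_{\min} = k_B T \ln 2 \approx 3 \times 10^{-21}\ \text{J per bit at body temperature},$$

so maintaining perfect coherence of a genome-sized store of information against thermal noise, indefinitely, is never thermodynamically free.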
Frankly, for significant extension of human life, a general cancer cure is of critical importance. Otherwise the accumulation of small risk factors will build to a near certainty of dangerous cancers of many different sorts. We already see this among our older citizens!
>> especially since scar tissue accumulates and regrowth of neural tissue has never been mastered.
> Neither which is a permanent problem.
Given the lack of progress for both issues, there's little concrete reason to assume they are _not_ permanent issues.
> I assure you that companies like Google, Facebook, Twitter, Microsoft and their relation ALL do the EXACT SAME THING.
Not from my direct experience with several of those companies. They install wildcard certificates, signed by one of the commercial root authorities, not their own root certificates.
Do you have any direct experience or instances to show that _any_ major software vendor or software service does this?
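For anyone who wants to check the claim above themselves, here is a minimal sketch (standard-library Python; the hostname is only an example) that prints the subject and issuer of the certificate a site actually presents. For the vendors mentioned you should see a leaf or wildcard certificate chaining to a commercial root, not a private corporate root.

```python
import socket
import ssl

def show_certificate(hostname: str, port: int = 443) -> None:
    """Print the subject and issuer of the certificate presented by a host."""
    context = ssl.create_default_context()  # validates against the system trust store
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    print("subject:", cert["subject"])
    print("issuer: ", cert["issuer"])

# Example host; any of the vendors mentioned above could be substituted.
show_certificate("www.google.com")
```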
> I work at a school.
Clearly, not in IT or network security. A root CA is not for "filtering". A proxy or firewall is for filtering, and a root CA doesn't help with that except to automatically authorize the certificates presented by the proxy. A root CA is for signing other certificates so that they are accepted without manual intervention by the student or visitor bringing their own device.
It may conceivably have been installed under a sealed warrant for "national security" reasons. Much as the Patriot Act in the USA demands silent cooperation with warrant-free investigations of unconstitutional scope, I'm sure that UK governmental agencies have also demanded and received cooperation with dangerously excessive search orders.
A "private boarding school" implies that the school might well have international students, or students with parents are of economic and political power. Is it feasible to contact _those_ students and their families, to explain what the school has been doing without their knowledge? A similar scandal involving the use of webcams on student laptops to photograph them at home was reported on Slashdot, http://en.wikipedia.org/wiki/R....
Doing Man-In-The-Middle attacks with the root CA and SSL certificates signed by that root CA is only one of the risks. Once certificates signed by that CA are accepted, they're permanently usable for fake websites, for man-in-the-middle attacks with proxies using those faked SSL certificates for designated websites, and for replacing ordinarily signed software or update packages with fake, rootkitted packages. The list of subtler security issues is longer: those are only a few of the leading problems.
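As an illustration of the proxy risk (my sketch; the fingerprint value is a placeholder): if the school's root CA sits in a device's trust store, a proxy-forged certificate validates cleanly, but it will not match a fingerprint pinned in advance for the real site from a trusted network.

```python
import hashlib
import socket
import ssl

# SHA-256 fingerprint of the real site's certificate, recorded earlier from a
# network you trust. The value below is a placeholder, not a real fingerprint.
PINNED_SHA256 = "0" * 64

def connection_is_intercepted(hostname: str, port: int = 443) -> bool:
    """Return True if the presented certificate differs from the pinned one.

    A certificate forged by a TLS-intercepting proxy still "validates" when the
    proxy's root CA has been pushed into the trust store, but its fingerprint
    will not match the pin.
    """
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            der_cert = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der_cert).hexdigest() != PINNED_SHA256

print(connection_is_intercepted("example.com"))
```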
I'd be profoundly concerned that the school is not competent to protect their CA, or the other certificates that have already been signed with it. Since they've already demonstrated ignorance among some personnel of their own security practices, and unwillingness to communicate truthfully with students, I'd assume that they've never properly secured the host or network on which they've stored their CA. Unless they have _erased_ the private CA and all copies of it, it can be misused at any time in the future, especially on the school's own network.
Moreover, if possible before the CA is erased, _all_ of the certificates already signed with it need to be revoked and replaced with correctly signed ones. That's quite expensive, at roughly $200 USD per certificate per year. You can get the certificates themselves more cheaply, but that estimate includes the technical time to go replace the old certificates.