
Comment Re:exception handling (Score 1) 51

Biological simulation engineers at Umbrella Corporation cannot guarantee the accuracy of any simulated systems created using this product, and cannot be held liable for any resulting products that may cause injury or harm to any species, including but not limited to uncontrolled anomalous tissue growths, genetically linked deformities, or the mass extinction of humankind via a zombie apocalypse.

By using this software you agree to the terms and conditions enclosed above, and to be bound by said agreement.

Thank you for using LifeLab(tm).

LifeLab and the Umbrella Corporation logo are the sole intellectual property of Umbrella Corporation; all rights reserved.

Comment Re:exception handling (Score 2) 51

Only if there is a process for the cell to do so. Like a computer, a cell isn't magical. This is why amyloid plaque buildup in neural tissues is a fatal degenerative disease. There is no mechanism for the cells to flush the defective products they are synthesizing from the broken synthesis chain.

The real world KEEPS the defective byproduct, and simulates its impact on the rest of the system. A computer-based simulation of that process that aims to be accurate must do the same.
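The contrast above can be sketched in a few lines. This is a toy model with made-up names, not anything from the actual product: the defective product is retained as cell state and becomes an input to everything simulated afterwards, rather than being discarded by an exception handler.

```python
# Toy sketch (hypothetical names): instead of discarding a failed synthesis
# with an exception, the defective product is kept in the cell's state and
# becomes an input to every downstream process, as in a real cell.

def synthesize_h_protein(mutated=False):
    """Return a synthesis product; a mutation yields a defective one."""
    return {"name": "H protein", "defective": mutated}

def cell_step(state, mutated):
    product = synthesize_h_protein(mutated)
    # The defective product is NOT dropped -- it accumulates in the cell
    # and must be accounted for by every process simulated afterwards.
    state["contents"].append(product)
    if product["defective"]:
        state["undefined_inputs"].append(product)
    return state

state = {"contents": [], "undefined_inputs": []}
state = cell_step(state, mutated=True)   # broken synthesis chain
state = cell_step(state, mutated=False)  # normal synthesis
```

The point of the sketch is the absence of a `raise`: nothing "handles" the failure, so the simulator's state space grows with every defect.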

Comment Re:exception handling (Score 4, Insightful) 51

The issue is that the "zombies", in this case, defective H proteins, stay in the cell and are NOT really dealt with. They become a new, undefined input in the system that must be accounted for when simulating other cellular processes being performed in parallel inside the cell.

This can lead to a very extensive chain of unexpected executions and transformations. Dealing with that programmatically is going to make any computer currently in operation that attempts it cry to the ghost of Alan Turing and beg for mercy.

If the goal is accurate simulation, then a try/catch/finally isn't going to work properly.

Comment exception handling (Score 5, Interesting) 51

Biological systems have many broken legacy "routines" that don't get called, or get called, and execute incorrectly. How do these engineers intend to deal with exception handling in this capacity?

For instance, a well-known mutation, the Bombay phenotype, involves a precursor protein called "H protein", which is then modified by additional cellular processes to become either the A or the B blood antigen. The mutation makes a defective H protein, and thus prevents the proper activation of the A or B antigen "routine".

If they try to build a programming language for cellular processes involving DNA and protein synthesis, then how will they handle exception cases such as that one? It can be likened to the halting problem, because the question asked is "given these inputs and this program, will the program ever halt?"

How do they intend to resolve this problem?
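The Bombay pathway described above can be written out as a tiny sketch (function and field names are illustrative, not any real biochemistry API): the defective H protein doesn't raise an error; the downstream A/B "routine" simply never fires.

```python
# Illustrative sketch of the Bombay phenotype pathway described above.
# Names are hypothetical; the point is that the failure is silent.

def make_h_protein(functional):
    # Bombay phenotype: the mutation yields a non-functional H protein.
    return {"name": "H protein", "functional": functional}

def make_abo_antigen(h_protein, blood_gene):
    # The A/B antigen "routine" only executes on a functional H substrate.
    if not h_protein["functional"]:
        return None  # no exception is raised -- the step just never happens
    return blood_gene  # "A" or "B"

normal = make_abo_antigen(make_h_protein(True), "A")
bombay = make_abo_antigen(make_h_protein(False), "A")
```

A genotype that "should" express A ends up expressing nothing, and no handler anywhere is notified.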

Comment Re:Uh... (Score 1) 740

Look, it just felt that if it was going to be forced into being famous by those damned paparazzi scientists at the LHC, when it had spent the entire previous history of the universe toiling in obscurity providing substance to all the masses, that it at least deserved to be compensated for the hassle.

And you people act like it did a bad thing! Shame on you!

Comment Re:Fascism is not Libertarianism (Score 1) 356

You are injecting an artificial difference that does not logically exist, between a "for profit school", and a "private school."

Private schools are for profit schools that are selective in which students they will accept.

For profit schools are for profit schools that are selective in which students they will accept.

Sounds to me like you are barking up the wrong tree. The issue isn't that the schools are driven with a profit motive, the issue is that you take exception to the school's ability to refuse admission.

This is a perfectly justified concern, because when no public schools (that have to accept anyone and everyone from a given district) exist anymore, then logically, there will be a resulting demographic of children who are systematically excluded from the "for profit only" school landscapes.

On the other hand, this is also an unsatisfied demand in the market. That means creating a school that specializes in these "undesirable" pupils would have an assured revenue stream.

At that point, the complaint changes; all the kids are going to school, but at least one demographic has few if any options, and the one school that specializes in the undesirable kids is essentially a monopoly, and can charge an absurd price, and get away with it.

True libertarians don't want to acknowledge this last situation, because it clearly paints a portrait of where government regulation is necessary. This is because government regulation of just about anything is considered offensive to diehard libertarians.

I don't mind the death of public school systems, and the rise of privatized ones in their place, as long as there is regulation forbidding the outright exclusion of groups of pupils based on any set of criteria. E.g., the schools have to admit any and all students, and the cost difference between a special needs student and a normal tuition-paying student is covered by a government assistance program that the school can make use of, paid for by tax money, but with rigorous oversight to punish and discourage abuse.

But there I go being a moderate centrist again.

Comment Re:say... WHAT? (Score 2) 335

A single process can contain multiple threads. Without some level of protection there, this kind of thing could be more vulnerable to code injection attacks, allowing a perp to own the whole VM. If they do this without upsetting the process in any visible way, they can now just soak up all the data that the VM is having shoveled through it.

Without a kernel space inside the VM looking for untoward behavior from the threads in userspace, and enforcing restrictions on who owns what resources, this is a recipe for trouble. The compromised thread can walk all over the VM's memory, and report whatever it wants to the hypervisor. In this case, the goal isn't to escalate; the goal is to compromise the VM and lie dormant. An actual, real VM with a separate kernel space keeps important parts of memory secure. Like the data reporting and monitoring threads.

They just removed a whole layer of security. It may well be mostly redundant, but given the stakes involved, redundant security features can actually pay off.

The honor system doesn't work when the threads stop being honorable.
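The "honor system" point is easy to demonstrate. In the sketch below (toy thread names, nothing from the platform in the article), two threads share one process: nothing stops the rogue one from quietly rewriting data the monitoring one believes it owns, which is exactly the protection a separate kernel space exists to provide.

```python
# Sketch: threads in one process share one address space, so a compromised
# thread can tamper with another thread's data and "report" whatever it wants.
import threading

shared_report = {"status": "clean", "payload": []}

def monitoring_thread():
    # Believes it is the sole writer of shared_report.
    shared_report["payload"].append("telemetry")

def rogue_thread():
    # Same process, same memory: no boundary prevents this write.
    shared_report["status"] = "compromised-but-reports-clean"

t1 = threading.Thread(target=monitoring_thread)
t2 = threading.Thread(target=rogue_thread)
t1.start(); t2.start()
t1.join(); t2.join()
```

After both threads run, the report the monitor would send upstream has been silently altered, with no visible disruption to the process.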

Comment say... WHAT? (Score 0, Troll) 335

Did they really just say that they removed the isolation between kernel and user spaces?

(Re-reads. Yup. That's what they said!)

Oh dear gawds. Do they not realize that this makes their processes naked little unprotected things in a dimly lit room, that are going to be savagely raped and abused by the first rogue process that comes along?

Do they have no conception of why the two spaces are kept apart!?

No thank you, I will refuse to conduct business with any agency that uses this platform. We have a big enough problem with identity theft and wire fraud as it is. I don't want to encourage such a horrifically stupid idea by giving some dumbass-led company my business.

Comment Re:Econophysicists. WTF? (Score 4, Funny) 387

Nono.

It's the non-Newtonian version of fluid capital.

It really isn't that difficult; just imagine cornstarch and water.
Ok, now imagine that as money, aka, liquid assets.

This non-Newtonian liquid asset appears firm and seems to have substance as long as it is traded quickly, or placed under high trade pressures, but for anyone attempting to hold onto it, it melts into a sticky mess, and they are left with little to show for it.

There is a considerable degree of interest and research into such non-Newtonian liquidities, as everyone seems hell-bent on finding ways to make ever more of the stuff. This means that the rate of exchange and the overall economic force behind the trading have to continue to rise to accommodate the inclusion.

We non-Newtonian econophysicists deal almost exclusively with these kinds of liquidities, and often work very closely with non-Euclidean geometric market analysts to see new angles to the market that others failed to see or exploit before.

It's really quite technical deep down, so don't feel bad if you can't quite comprehend it all.

Comment Re:End of the Universe (Score 2) 164

I just had a radical thought.

It's probably wrong, as it is the product of ignorance. As such nothing that follows should be seen as factual. It is supposition. And again, probably very wrong.

Still, what if the physical volume of spacetime is far larger than it currently appears, the force driving spacetime expansion is the energy that creates vacuum fluctuations entering the true ground state (as more spacetime with fewer fluctuations), and the currently observed universe's rate of expansion is an illusion?

Imagine:

Shortly after the bang, we have a very excited vacuum, and a volume for the universe that is very constrained. Mass-like fluctuations in the vacuum will occur frequently, even though actual massed particles don't exist yet. Over the volume constrained universe, this creates knots in the energy density of the early universe, by making spacetime "lumpy".

If we presume that the rate of spacetime expansion of this early universe is "just slightly" greater than this gravity-like influence from the combined action of the fluctuations that have mass-like terms, then spacetime will explode away from the soup, faster than the soup expands. If we say gravitational effects propagate over spacetime at exactly c, then this expansion would be a tiny fractional bit in excess.

The aggregation of this soup toward its barycenter would be arrested by this expansion. The acceleration of the soup toward its barycenter would make the apparent rate of expansion seem very tiny. (Say we are accelerating at 1 Planck unit every Planck second toward the net barycenter. We are in spacetime that is inflating at a net rate of 1.000000000.....1 Planck units every Planck second. The members of our cloud will appear to be moving *away* from the barycenter at .0000000....1 Planck units per Planck second, despite actually accelerating toward it.) The actual expansion of spacetime will be considerably greater than the apparent one.

Gravitational attraction falls off with the inverse square of distance. As the cloud expands (or rather, is pulled apart by expanding spacetime), the rate of acceleration by gravity toward the barycenter diminishes, making the apparent expansion rate increase.
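The two paragraphs above can be put into a throwaway numerical sketch. These are made-up units and numbers, not real cosmology: the apparent rate is the tiny difference between a large actual expansion rate and a nearly-equal inverse-square pull, and it grows as the cloud disperses.

```python
# Toy model of the thought experiment above (arbitrary units, not
# physically calibrated): apparent expansion = actual expansion minus
# the inverse-square gravitational pull toward the barycenter.

actual_expansion = 1.000000001   # actual expansion rate (toy units)
G_M = 1.0                        # lumped gravitational constant * cloud mass

def apparent_rate(distance):
    pull = G_M / distance**2     # inverse-square attraction
    return actual_expansion - pull

early = apparent_rate(1.0)       # compact cloud: pull nearly cancels expansion
late = apparent_rate(100.0)      # dispersed cloud: pull negligible
```

With these numbers the early apparent rate is a sliver above zero while the late one approaches the full actual rate, which is the qualitative behavior the post argues for.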

Now, the really odd thought.

If we presume that the driving force behind the expansion is the decay of vacuum energy to its lowest possible state (spacetime with no fluctuations), then the rate of expansion will not remain constant, and will slowly degrade over time as it runs out of energy.

This suggests a number of things. First, that the energy density of spacetime (as a whole) is falling off at a greater than geometrical rate in proportion to its volume. Second, that the rate of expansion is actually slowing, as the energy behind the expansion is depleted. And third, that all massive objects in the universe are currently travelling with a very large amount of momentum toward the original cloud's barycenter (with some local deflection of vector from uneven distributions, and interactions with nearby massed objects with local barycenters) that is already a significant fraction of c.

This would seem to explain a good deal.

1) where did all the missing energy go? It's basically empty and flat spacetime surrounding the visible universe like a bubble. It is far bigger than the visible universe.

2) the odd shifts in rates of expansion of the universe over time, when run backwards with regard to star lifecycles and isotopic concentrations of clouds and star clusters. The universe appears to expand very predictably, slows down for a long time, then suddenly picks up again with great force. The reason suggested: expansion is at first very vigorous, but attractive forces nearly cancel it out, making for slow apparent expansion. The rate of decay in that expansion is not sufficient for attractive forces to overcome it. The distances between major massed objects continue to increase, and the rate of acceleration between them diminishes, but more slowly than the falloff in the actual rate of expansion. The two falloff curves (expansion and falloff of attraction) almost join. Expansion continues, but appears very, very slow. Distances between objects continue to increase, reducing attraction so that the curves again separate; the rate of attraction now decays faster than the rate of reduction in expansion. The universe starts to fly apart at an exponential rate. Voila, we have our bell-shaped curve in the rate of apparent expansion.

But we also have another sticky mess.

Under this thought experiment, all mass in the universe has been accelerating (counter to expansion), and is now at a significant fraction of c in true velocity. This means it is experiencing time dilation effects, in comparison to the actual rate of spacetime expansion, universe-wide. That is to say, the empty spacetime outside our visible universe experiences time significantly "faster" than ours.

But this also explains a curious observation of our current observable universe. The speed of light appears to be changing. (Rather, the speed of light is the same, but the amount of time being experienced is changing.)

If we, as massed particles in motion, are slowing down our rate of hidden acceleration due to the relentless expansion, then our degree of time dilation will also diminish, resulting in the change in measurements. We are slowing down (or rather, not accelerating at the same rate), so we experience more time, and the speed of light appears to slow accordingly.

This means that the actual age of the universe is "waaaaaaaaaaaaaaay" older than what our astrophysicists have determined by measuring star lifecycles, and is likewise, vastly larger in true volume than measurements in our matter dense region would suggest. (By many orders of magnitude.)

This should produce useful math that makes useful predictions that could then be experimentally evaluated.

But again, it is probably wrong in more ways than can easily be counted.

Comment Re:fair use (Score 1) 215

So, don't misrepresent yourself then. Just misrepresent what you own.

"I am wierd_w and I am the legal owner of all the posts on this slashdot comments page."

I don't mis-state who I am, just what I actually own. (You would have to be an idiot to believe I owned all the posts on this page!)

Thus, "not perjury". ;)

Comment Re:That's so sad. (Score 1) 625

The model is based on observed tissue degeneration, as per the abstract, as well as wide statistical samplings of observable age related mental decline in the human population.

To quote the abstract directly:

"This development-to-degeneration model is testable through imaging and post mortem methods and highlights the vital role of myelin in impulse transmission and synchronous brain function. The model offers a framework that explains the anatomical distribution and progressive course of AD pathology, some of the failures of promising therapeutic interventions, and suggests further testable hypotheses as well as novel approaches for intervention efforts."

In not so many words, its predictions are congruent with the results of several recent studies on intervention in age-related mental decline, as well as with the post mortem and fMRI imaging data collected to date.

While I also doubt that the researcher would go so far as to say it is 100% "God's own truth", it is a theory that appears to correctly predict the observed behavior. You are making the mistake of assuming that just because it says the word "theory", it is on equal footing with crackpottery.

Comment Re:That's so sad. (Score 2) 625

You are welcome to such a belief, but I hold a different one.

The second law of thermodynamics is just that-- a law. Current observations predict that the universe will die from entropy stemming from unrestrained expansion, and that eventually all protons in the universe will decay.

This means that immortality is fundamentally unachievable. The best you can do is fight to stave it off. Much like the Carnot equation showing the maximum possible efficiency for a heat engine, the laws of thermodynamics state the maximum theoretically possible degree of resistance against entropy that you can put up in the universe. If you are truly serious about the effort, you will consume 100% of the non-entropic portion of the energy of the universe to satisfy the attempt. Eternity is a VERY long time.
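For reference, the Carnot bound invoked above is a one-liner (temperatures in kelvin): no heat engine running between a hot and a cold reservoir can exceed an efficiency of 1 minus the ratio of the cold temperature to the hot one.

```python
# Carnot efficiency: the thermodynamic ceiling for any heat engine
# operating between reservoirs at t_hot and t_cold (kelvin).

def carnot_efficiency(t_hot, t_cold):
    return 1.0 - t_cold / t_hot

eta = carnot_efficiency(t_hot=600.0, t_cold=300.0)  # -> 0.5
```

The analogy in the post is that entropy imposes the same kind of hard ceiling on any scheme for resisting the heat death, no matter how clever the engineering.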

Long before then, you will have set about on a genocidal campaign to secure energy sources to sustain your existence at the expense of other intelligent life. The universe is a big place. Statistically, we are not alone. Even if we are, another immortal being's existence would radically reduce your own ability to resist the entropic decay of the universe. Eventually, you will fight each other for the resources and energy the other represents.

The ultimate conclusion of attempting to attain immortality is complete sociopathy.

If you instead say that you only want to extend your life some degree, you still ultimately must accept the inevitability of death. If you are going to do that, why not accept the lifespan you are already afforded?

Voila-- We have reached my position.

Comment Re:That's so sad. (Score 1) 625

Actually, I avoid getting care for most of my health issues, to avoid this very form of hypocrisy. The soft tissue tumors are benign, and cause only cosmetic issues, other than being in irritating places that can restrict movement. There is no real reason to remove them.

The blood sugar regulation issue is still in the pre-diabetic stages, and is treatable with diet and exercise.

The heart condition is likewise manageable. I have health issues, but am not miserable. They restrict me, but they don't define me.

I don't seek medical interventions to extend my lifespan, and should one of my lipomas suddenly turn into crazy wild deep tissue cancer, and I go undiagnosed and die from metastatic illness, I won't be upset about it.

I have no interest in ending my life, but I don't actively seek to artificially prolong it either. I will die when I die, and I like not knowing when that will be. It causes me to live each day to its fullest, and not take life for granted. Being promised perfect, unfaltering health and perpetual youth would spoil that. I would turn it down.
