
Comment: Re:Paving to the road to hell (Score 1) 135

by Immerman (#47677435) Attached to: The Man Responsible For Pop-Up Ads On Building a Better Web

> I think the best you can do is try to teach people to understand when they are being manipulated and hopefully it will some day cease to be profitable enough for folks to continue doing (one can always hope).

I can do you one better - we could ban ALL attempts at psychological manipulation in advertising, restricting ads to strictly factual statements about the product. If that's a bit too vague, we could start with a set of concrete guidelines: no sexuality or sensuality of any form may be portrayed or implied in an ad. No social situations may be displayed or heard. No implications may be made that a product will increase your social status or other desirable qualities unless it is specifically marketed to do so (and is thus vulnerable to false-advertising charges). I'm sure we could think of a few more, and would have to add still more as marketers found new buttons to push.

That would still allow advertisers to inform their audience of the availability of their product and whatever wonderful features it has. They just can't attempt to inspire any emotions other than "this is a wonderful product on its own merits". You can show the car and its luxurious interior, you can list its impressive specs, and demonstrate the surly growl of the engine. You just can't attempt to manipulate your audience into wanting it more than they're already inclined to.

Comment: Re:What if it were Microsoft code (Score 1) 191

by Immerman (#47677177) Attached to: Larry Rosen: A Case Study In Understanding (and Enforcing) the GPL

Damn it, what's with you and being technically right? ;-) Tort then.

I believe with a torrent it would be pretty straightforward to figure out how many people received at least a part of the file directly from you - folks are usually caught by automated monitoring systems after all. I could see how it might depend on the implementation details though. It may be more challenging to determine how many people it then spread to, but that's no different than sneakernet or any other infringement avenue, and I believe that typically a person is only legally liable for the distribution they themselves perpetrated.

For open source though, how do you calculate monetary damages when the license under which the commercial pirate received the software imposes no monetary cost? Now granted, in this case a parallel proprietary license is also available, which likely changes things - but what if it were Linux or something instead, where getting an alternate license is likely impossible at any price?

Comment: Re:Cutting features and old syntax? (Score 1) 427

by Immerman (#47675785) Attached to: Interviews: Ask Bjarne Stroustrup About Programming and C++

Ooh, casts, good one.

And an excellent point with templates and overloading.

As for operator overloading - I partially agree. Vector and matrix math likewise become far less awkward thanks to operator overloading; I really miss it when working in languages without it. But the functionality is unchanged - I just have to type more. And I have noticed that as the number of overloaded complex numerical types increases, the odds of an unexpected interaction arising increase rapidly, especially thanks to implicit casts being invoked. You're just asking for trouble when "a=b+c" can invoke an implicit chain of a half-dozen functions to become a syntactically valid statement.

Comment: Re:Cutting features and old syntax? (Score 1) 427

by Immerman (#47675755) Attached to: Interviews: Ask Bjarne Stroustrup About Programming and C++

They're absolutely not. Perhaps I could have phrased that more clearly: orthogonal functionality is the best kind of functionality - maximum flexibility with a minimum learning curve. Why would you want to deprecate inheritance in favor of something else rather than letting them amplify each other?

Comment: Re:LHC distinctions (Score 1) 100

by Immerman (#47675727) Attached to: Password Gropers Hit Peak Stupid, Take the Spamtrap Bait

No, I'm arguing for conservative risk-taking in the face of a potentially species-terminating risk. You need to propose a mechanism under which you're CERTAIN that flux doesn't matter. One quantum black hole or strange particle may well evaporate faster than it can feed, but create a swarm of dozens or thousands of them simultaneously and some of them may manage to combine into something dangerous.

Certainly, we know that there are old planets and neutron stars. That's not the question. The question is "are there any we've never seen because they were swallowed up before we looked?" There's a world of difference between "very rare" events and "impossible" events.

Why wouldn't someone be comfortable working at the LHC? If it does somehow manage to destroy the world it's not going to matter where you're standing, and at ground zero you might at least have a chance to know what happened.

Comment: Re:What if it were Microsoft code (Score 1) 191

by Immerman (#47675653) Attached to: Larry Rosen: A Case Study In Understanding (and Enforcing) the GPL

You are technically right* - they are not violating the GPL, they are committing copyright infringement (aka commercial piracy) for distributing Ximpleware's software without a license. A much more serious crime. Without an explicit contract the only license they might have claimed to be operating under is the GPL, and they are violating the conditions that license is contingent on.

* contingent on precedent-establishing confirmation in higher courts that GPL compliance is a condition of the license, and not one of its terms.

At this point in a lawsuit, when the pirates are caught red-handed in a case of long-term infringement the lawyers typically hunker down and tell their client that their options are:

A) Come into compliance with the GPL - meaning release their entire derivative product under the GPL - in which case the lawsuit usually goes away. I'm not sure if the GPL is actually written that way, or just that plaintiffs to date haven't tried for damages.

B) Buy off the plaintiffs so they drop the case. This probably involves some sort of proprietary parallel license, and possibly back-payments as well.

C) Be found guilty of copyright infringement, pay statutory damages, and then be forced to do either A or B.

D) Continue to pay their lawyers to spin a fancy song and dance for as long as they like in the hopes that they can flummox the judge, but eventually, almost certainly, do A, B, or C.

It seems like the SCO saga did a pretty incredible job of showing a company twisting on that hook while some of the best lawyers in the country raked the GPL over the coals. And the GPL came out unscathed. I can't think of any case since where a company has managed to make a serious case against coming into compliance. I couldn't swear that it's causal, but I strongly suspect that the pirates' lawyers tend to do a little research, come across Groklaw's insanely well-documented and well-organized collection of case history and other relevant information, and then, once they've had a chance to fully absorb what they're looking at, despair.

Comment: Re:Is the complexity of C++ a practical joke? (Score 5, Insightful) 427

by Immerman (#47671925) Attached to: Interviews: Ask Bjarne Stroustrup About Programming and C++

And? Life is a learning experience, so break out the reference manuals and learn something new. Unless you've been thrown in way over your head it's unlikely you'll encounter more than one or two new methodologies in any given codebase, and it'll probably be pretty glaringly obvious when you run into a language feature you don't understand.

Comment: Re:Cutting features and old syntax? (Score 1) 427

by Immerman (#47671891) Attached to: Interviews: Ask Bjarne Stroustrup About Programming and C++

Why would you assume such a thing? C++ is rife with old, rarely-used "cruft". Meanwhile templates and lambdas are great, but they provide advantages largely orthogonal to inheritance. I'll grant you that purely on merit, overloading is potentially a legitimate "cruft" candidate, as it adds essentially no functionality while potentially increasing confusion - especially operator overloading - but it's so pervasive within the standard libraries that extracting it would likely drive away most of the developer base.

Comment: Re:A little behind the times (Score 1) 315

by Immerman (#47671517) Attached to: Why the "NASA Tested Space Drive" Is Bad Science

>Lets be very clear, their data shows no such force.
Oh? Then why didn't their control load generate a thrust? As I recall they tested three devices this time: a Cannae drive, a null-Cannae drive (still supposedly a "valid" EmDrive design), and a resistive control load. Both drives showed roughly the same thrust, which changed direction along with the drive, while the equivalent resistive load showed nothing. It's possible that both drives were operating as ion drives (since they weren't in hard vacuum) or magnetically interacting with the lab equipment, but until such an error source is identified you cannot simply discard the results.

I agree the theory doesn't pass the hinky-meter test, but the tests to date all seem to suggest that *something* is happening - or do you know of a number of independent, well-run tests that have shown a complete absence of thrust? This would hardly be the first time a working technology was built upon a nonsensical theory. (I give you exhibit A - most of Western medicine until the last couple centuries.)

Comment: Re:LHC distinctions (Score 1) 100

by Immerman (#47671389) Attached to: Password Gropers Hit Peak Stupid, Take the Spamtrap Bait

And you, it seems, miss my point as well: I'm perfectly aware of how often *single events* of LHC energies or higher hit the Earth, and am not terribly concerned with them - in a few billion years, if a single-event catastrophe were at all likely, it probably would have occurred by now.

But consider multi-event interactions that might permit dangerous particles to clump together into something that could expand fast enough to become catastrophic. Have you actually looked at the LHC flux? The LHC's design luminosity (interactions per second per cross-sectional area) is 10^34 cm^-2 s^-1 - or, in the units of the graph, 10^38 m^-2 s^-1. Compare that to the TeV-range flux in your graph, at ~10^-6 m^-2 s^-1. The LHC is creating a TeV flux some 10^44 times higher than that due to cosmic rays.

Besides which, those flux levels are for the upper atmosphere - only an extremely small percentage of cosmic rays actually make it to the surface since, while the atmosphere takes vanishingly little time to cross at (essentially) lightspeed, it's also incredibly dense. The odds of making it through a few hundred miles of atmosphere without interacting with a single atom are extremely low.

Also worth considering is that thanks to conservation of momentum any black hole produced by a cosmic ray will itself be traveling at an appreciable fraction of lightspeed, and unless it accumulates mass VERY quickly to slow down it will pass through the Earth in a matter of milliseconds and be free to evaporate in interplanetary space. An LHC black hole on the other hand would be created from a head-on collision of equal-energy particles - any resultant black hole would be nearly at rest, and firmly gravitationally bound to the Earth.

>Now imaging the same rain falling on every neutron star. No black holes form that way either, and that's as dense a target as you can ask for.
Citation? ;-) We've never even definitively observed a black hole, and now you want to make absolute claims about how they do and don't form? How would we know if every once in a while a neutron star collapsed due to an encounter with a micro black hole? We've only been seriously watching the skies for a few decades. Also - neutron stars tend to be extremely small, at ~20km across, and have MASSIVE magnetospheres, often with jets of material belching out along the poles which would mostly pre-collide with any cosmic rays that might be aimed at that weak point. They're probably almost completely shielded from cosmic rays.

Comment: Re:A little behind the times (Score 1) 315

by Immerman (#47670891) Attached to: Why the "NASA Tested Space Drive" Is Bad Science

> Many orders of magnitude larger than here.

Many is vague: let's put a real number on it and say 6 orders of magnitude, or 25 MW. Pretty good size I'd venture. Multiply that by ~40uN/25W and you've got all of 40 Newtons of force. How often do you really suppose a 25MW microwave resonator is operated in a situation where 9lbs of force would be noticeable? Unless I'm drastically overestimating the size of such a thing friction alone would likely hold it in place. And if not a single 1/8" bolt certainly would.

Besides which, you're assuming your average microwave resonator would generate a force - and if there's anything at all to the claims that geometry of the resonator is an essential component then your average symmetrical resonator would not.

Finally, if we take the NASA measurements as indicative of forces being generated, we're talking 1600nN/W - almost 500 times the efficiency of a radio-drive. (Incidentally, where did you find that 3.3nN/W number? I've been looking for a trustworthy value for radio thrust.) Besides which, if it actually works at all, then we're talking about an entirely new propulsion technology - the efficiency can be expected to improve dramatically as our understanding of the physics behind it improves. You wouldn't expect a Babylonian steam engine to be able to haul a train up a mountain, would you?
