Comment Re:Is the complexity of C++ a practical joke? (Score 5, Insightful) 427

And? Life is a learning experience, so break out the reference manuals and learn something new. Unless you've been thrown in way over your head it's unlikely you'll encounter more than one or two new methodologies in any given codebase, and it'll probably be pretty glaringly obvious when you run into a language feature you don't understand.

Comment Re:Cutting features and old syntax? (Score 1) 427

Why would you assume such a thing? C++ is rife with old, rarely-used "cruft". Meanwhile templates and lambdas are great, but they provide advantages largely orthogonal to inheritance. I'll grant you that, purely on merit, overloading is potentially a legitimate "cruft" candidate, as it adds essentially no functionality while potentially increasing confusion (especially operator overloading), but it's so pervasive within the standard libraries that extracting it would likely drive away most of the developer base.

Comment Re:A little behind the times (Score 1) 315

>Lets be very clear, their data shows no such force.
Oh? Then why didn't their control load generate any thrust? As I recall they tested three devices this time: a Cannae drive, a null-Cannae drive (still supposedly a "valid" EmDrive design), and a resistive control load. Both drives showed roughly the same thrust, which changed direction along with the drive, while the equivalent resistive load showed nothing. It's possible that both drives were operating as ion drives (since they weren't in hard vacuum) or magnetically interacting with the lab equipment, but until such an error source is identified you cannot simply discard the results.

I agree the theory doesn't pass the hinky-meter test, but the tests to date all seem to suggest that *something* is happening - or do you know of a number of independent, well-run tests that have shown a complete absence of thrust? This would hardly be the first time a working technology was built upon a nonsensical theory. (I give you exhibit A - most of Western medicine until the last couple centuries.)

Comment Re:LHC distinctions (Score 1) 100

And you, it seems, miss my point as well: I'm perfectly aware of how often *single events* of LHC energies or higher hit the Earth, and I'm not terribly concerned with them - if a single-event catastrophe were at all likely, it probably would have occurred sometime in the past few billion years.

But consider multi-event interactions that might permit dangerous particles to clump together into something that could expand fast enough to become catastrophic. Have you actually looked at the LHC flux? The LHC's design luminosity (interactions per second per unit cross-sectional area) is 10^34 cm^-2 s^-1 - or, converting to the units in the graph, 10^38 m^-2 s^-1. Compare that to the TeV-range flux in your graph, at ~10^-6: the LHC is creating a TeV flux some 10^44 times higher than that due to cosmic rays.
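
In case anyone wants to check my unit juggling, here's the arithmetic in Python (the ~10^-6 m^-2 s^-1 TeV flux is my reading of your graph, so treat it as ballpark):

    lhc_lumi_cm2 = 1e34                   # LHC design luminosity, cm^-2 s^-1
    lhc_lumi_m2 = lhc_lumi_cm2 * 1e4      # 1 cm^-2 = 1e4 m^-2  ->  1e38 m^-2 s^-1
    cosmic_tev_flux = 1e-6                # TeV cosmic-ray flux read off the graph, m^-2 s^-1
    print(lhc_lumi_m2 / cosmic_tev_flux)  # 1e44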

Besides which, those flux levels are for the upper atmosphere - only an extremely small percentage of cosmic rays actually make it to the surface since, while the atmosphere is vanishingly thin (length-contracted) from the perspective of something moving at (essentially) lightspeed, it's also correspondingly dense. The odds of making it through a few hundred miles of atmosphere without interacting with a single atom are extremely low.

Also worth considering is that thanks to conservation of momentum any black hole produced by a cosmic ray will itself be traveling at an appreciable fraction of lightspeed, and unless it accumulates mass VERY quickly to slow down it will pass through the Earth in a matter of milliseconds and be free to evaporate in interplanetary space. An LHC black hole on the other hand would be created from a head-on collision of equal-energy particles - any resultant black hole would be nearly at rest, and firmly gravitationally bound to the Earth.
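
If you want actual numbers, here's a sketch of the kinematics in Python (the 10^8 GeV cosmic-ray energy is purely illustrative; proton-on-proton, natural units with c = 1):

    import math

    m = 0.938    # proton rest energy, GeV
    E = 1e8      # cosmic-ray proton energy, GeV - an illustrative choice

    # Fixed-target collision: invariant mass squared s = 2*m^2 + 2*E*m
    sqrt_s = math.sqrt(2 * m**2 + 2 * E * m)
    gamma_cm = (E + m) / sqrt_s   # Lorentz factor of the center-of-mass frame

    print(sqrt_s / 1e3)   # ~13.7 TeV available - LHC territory
    print(gamma_cm)       # ~7300: anything produced is moving at essentially c

    # The LHC collides equal-energy beams head-on, so the center-of-mass
    # frame *is* the lab frame: any product starts out nearly at rest.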

>Now imaging the same rain falling on every neutron star. No black holes form that way either, and that's as dense a target as you can ask for.
Citation? ;-). We've never even definitively observed a black hole, and now you want to make absolute claims about how they do and don't form? How would we know if every once in a while a neutron star collapsed due to an encounter with a micro black hole? We've only been seriously watching the skies for a few decades. Also, neutron stars tend to be extremely small, at ~20 km across, and have MASSIVE magnetospheres, often with jets of material belching out along the poles which would mostly pre-collide with any cosmic rays that might be aimed at that weak point. They're probably almost completely shielded from cosmic rays.

Comment Re:A little behind the times (Score 1) 315

> Many orders of magnitude larger than here.

"Many" is vague, so let's put a real number on it and say six orders of magnitude above the ~25 W test level: 25 MW. Pretty good size, I'd venture. Multiply that by the reported ~40 uN/25 W and you've got all of 40 Newtons of force. How often do you really suppose a 25 MW microwave resonator is operated in a situation where 9 lbs of force would be noticeable? Unless I'm drastically overestimating the size of such a thing, friction alone would likely hold it in place. And if not, a single 1/8" bolt certainly would.
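
The 9 lbs, spelled out (this assumes the reported ~40 uN at 25 W scales linearly with power, which is itself a leap):

    thrust_per_watt = 40e-6 / 25      # ~1.6e-6 N/W, from the reported test figures
    force_n = 25e6 * thrust_per_watt  # 40 N at our hypothetical 25 MW resonator
    print(force_n / 4.448)            # ~9 pounds-force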

Besides which, you're assuming your average microwave resonator would generate a force - and if there's anything at all to the claims that the geometry of the resonator is an essential component, then your average symmetrical resonator would not.

Finally, if we take the NASA measurements as indicative of forces being generated, we're talking 1600 nN/W, almost 500 times the efficiency of a radio-drive. (Incidentally, where did you find that 3.3uN/W number? I've been looking for a trustworthy value for radio thrust.) And if it actually works at all, then we're talking about an entirely new propulsion technology - the efficiency can be expected to improve dramatically as our understanding of the physics behind it improves. You wouldn't expect a Babylonian steam engine to be able to haul a train up a mountain, would you?
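
For what it's worth, the theoretical photon-rocket ("radio drive") limit is just F/P = 1/c, which makes me suspect that number was meant to be in nN/W:

    c = 299_792_458            # speed of light, m/s
    photon = 1 / c             # ~3.34e-9 N/W, i.e. ~3.3 nN/W
    emdrive = 1600e-9          # NASA's reported ~1600 nN/W, taken at face value
    print(emdrive / photon)    # ~480 - the "almost 500 times" above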

Comment Re:What if it were Microsoft code (Score 1) 191

Actually, I believe in this case it does: the ONLY thing giving them license to distribute this software is their compliance with the GPL. Don't comply, and you have no license. (There's some legal precedent establishing that the GPL's terms are conditions for acquiring the license rather than terms of a covenant, which among other things allows GPL authors to request injunctions against ongoing copyright infringement - something only possible if the distributor has no license at all, not if they're merely in violation of a contract.)

Regardless, in this case it sounds like Ximpleware's patent pledge was basically "GPL software doesn't have to worry about these patents". And since neither Ameriprise nor Versatta is distributing software under the GPL, neither is in much of a position to argue they are covered by that pledge.

Comment Re:So, such rules are bad for keeping people worki (Score 1) 327

>But of course, if you want stuff other than just not working, then you need to come up with a way to get that or have someone get it for you.
Indeed - and that mechanism is typically simplified as money. And since most people are working 40-60 hour weeks, that greatly deflates my bargaining power for compensation - the labor market is flooded.

Just for the sake of argument let's suppose I were elected God for the week, and cut the length of the work week in half, along with doubling all pay rates so that everybody makes the same amount of money as before despite working half as long. And banned any preferential treatment for people working multiple shifts, on pain of damnation. (What? Where's the fun in being God if you don't get to dish out some hellfire?) What would that do?

First off, you'd need to pay twice as many wages for the same amount of labor, so the labor costs of every good and service on the planet would roughly double - the capital costs, however, would remain unchanged, so depending on the particular good or service the point-of-sale costs would be somewhere in the range of 100-200% of normal. Let's say 30% of the average purchase is labor costs - double that and the average item then costs (.7 + .3*2) = 130% of normal. That means your buying power from working a single job has been cut to 1/(130%) = 77% of before.

Certainly everyone could start working double shifts to launch themselves to 154% of their previous buying power, but I'm betting a whole lot of people would decide that effectively earning 77% as much while halving their workload is actually the better deal. And if 10-20% of the population were happy with one job, unemployment would vanish almost overnight as the market scrambled to fill empty shifts.

If *most* people were content with one job and a reduced income, things would start to really improve - the labor surplus would be dramatically slashed, and the law of supply and demand means that wages would rise across the board as businesses compete for a limited labor pool. Hard to tell where things might end up, but if we were to assume another doubling in hourly wages, we'd be talking about increasing the average item cost to (.7 + .3*4) = 190% of present, while the average single-job earner would be making 200% of present, for a 5% increase over current buying power despite working half as much.
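
Here's the whole toy model in one place, for anyone who wants to fiddle with the assumptions (the 30% labor share is pulled out of the air, remember):

    labor, capital = 0.30, 0.70

    # Stage 1: half the hours at double the hourly rate.
    price = capital + labor * 2    # 1.30x current prices
    print(1 / price)               # ~0.77: single-job buying power
    print(2 / price)               # ~1.54: double-shift buying power

    # Stage 2: the tight labor pool doubles hourly rates again (4x original).
    price = capital + labor * 4    # 1.90x current prices
    print(2 / price)               # ~1.05: a single job now beats today by ~5%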

Of course more advanced automation would also become more cost effective - but the price of that is in free-fall already, so I doubt it would make much difference in the long term.

Comment Re:One script kiddie made a mistake (Score 1) 100

Certainly cosmic ray events occur on a regular basis - however, how often do you suppose a tight cluster of thousands or millions of cosmic rays all simultaneously strikes the same square millimeter of the Earth's surface in order to mimic an LHC event? A single QBH or strangelet may be harmless - make a few, or a few million, in close proximity in the same instant and the same isn't necessarily true.

As for your charged black hole - what makes you think it would stay charged? It's going to be falling right through solid matter, passing through innumerable electron clouds, and if it should snare a few for itself with its charge it will no longer have a net charge. How long do you suppose an electron whose wave-function interpenetrates a black hole will avoid being absorbed?

Aside from which, you are again presuming our theories on the mechanics of black holes are correct. If that is the case, then they've run the numbers and the risk is vanishingly small - not zero, of course, but it would take a *really* unfortunate string of coincidences for anything to go wrong. As I've explicitly pointed out, I'm operating on the assumption that our theories are NOT perfect - it would be hubris to assume otherwise.

Comment LHC distinctions (Score 1) 100

Heh heh. The only problem of course being that they're not actually monitoring the LHC for all possible black holes that could potentially be created, and we have no idea how long it would take for a terminal event to build to noticeable levels. There could at this very moment be a microscopic black hole orbiting within the Earth, absorbing new matter just barely faster than it evaporates, biding its time as it grows toward critical mass.

And no, there are two more important things special about the LHC as compared to the reactions taking place in our upper atmosphere (I assume that's what you were implying):

1) The reaction density is far higher - one black hole/strange-matter particle/etc. might well decay faster than it could reach critical mass, but what happens when you're creating thousands or millions of them all at once within a few cubic millimeters? A bit of bad luck and a few of them may combine into a mass large enough to be self-maintaining - especially considering...

2) It's on the ground. Anything spawned in the upper atmosphere is going to spend the first few seconds of its existence falling through low-pressure air, so opportunities to "feed" off normal matter would be few and far between. The same self-catalyzing particle created in the LHC would encounter millions or billions of times as much matter in the same amount of time as it passed through the test chamber and rapidly into solid rock. And the matter would be solid, which could potentially accelerate things dramatically as well - perhaps a black hole could not absorb free particles fast enough to survive for long, but how do the dynamics change when it absorbs a large molecule, with electrostatic forces pulling the remaining atoms toward the black hole far more strongly than the vanishingly weak gravitational attraction which would be all it could initially muster?

Of course we could try to take comfort in the old "all events not prohibited are mandatory" - the Earth has been around for billions of years, after all, and we know cosmic rays do occasionally reach the ground. But mandatory does not mean frequent, the Earth has only been around for a few billion years, and our instruments are not yet sensitive enough to notice a collapsed planet around another star, so a statistical survey is out of the question. Would you care to speculate on how often a huge, super-tight cluster of cosmic rays manages to reach the Earth's surface all at once in order to mimic a single large-scale LHC test?

Comment Re:One script kiddie made a mistake (Score 1) 100

Indeed - stupidity is the one "resource" our species is unlikely to ever run out of - even the brightest amongst us have more than enough stupid to screw up regularly.

And I think even the nuclear arms race probably wasn't peak stupid - we almost certainly couldn't sterilize the planet, and within a few centuries the radioactive fallout would have decayed to background levels again - probably only decades in some of the more out-of-the-way corners of the globe.

Meanwhile things like nanotech and biotech have the potential to completely escape our control. You don't even need a grey-goo scenario - release enough buckyballs into the environment and virtually all cellular life on the planet will grind to a stop - you can't clean the stuff up, and it essentially never breaks down.

Not to mention doing things like operating particle accelerators on Earth that we think could well produce quantum black holes. Sure we're pretty sure they'd evaporate harmlessly, but if we were *certain* of the physics we wouldn't be wasting time building ever-larger particle accelerators. Take that shit to the Moon or something - then if something goes wrong we just end up with a black hole in orbit - sure, it screws up romantic moonlit nights, but who knows what advances might be possible with a singularity in easy reach.

No, I'm pretty sure we'll have plenty of stupidity for millennia to come.

Comment Re:What if it were Microsoft code (Score 1) 191

However, if XimpleWare granted a patent license or non-enforcement covenant to derivative GPL works (as I believe is the claim), the issue becomes moot - anyone using the code under the GPL gets a patent license automatically, so no additional restrictions are imposed. But as soon as the GPL is violated the patent license is likewise null and void, so that you are infringing both the copyrights and the patents on the software (at least that's how I'd expect the patent license to be written).

As for XimpleWare's rights - so long as they're only distributing their own code I don't think there'd be an issue even if they insisted every GPL licensee purchase a patent license separately. It's their software; they can do whatever they want with it - the license only specifies what YOU can do with that software once you have it. Of course it would probably be pretty self-defeating to release their code under the GPL only to hold the threat of patent litigation over every downstream project. Not to mention putting them in legal hot water if they were incorporating anyone else's GPLed code into their product.

But again - so long as they've granted a patent license to anyone with a valid GPL license, it's a non-issue. The problem here is that Versatta doesn't have a valid GPL license, and thus can't extend it to their customers. And their customers don't have sufficient rights to Versatta's software to be able to independently bring themselves into compliance with the GPL (Ameriprise, the customer, is itself currently suing Versatta to try to get those rights, though it's far from clear that they will succeed). And without a license under the GPL they have no claim to a patent license either. Or so I would expect a competent judge to rule.

Now if Ameriprise were an end-user I would be inclined to think Ximpleware were just being dicks, but Ameriprise is continuing to redistribute the software to its OWN customers, despite knowing that they are in violation of the GPL by doing so (they're the ones who brought it up). Under the circumstances I think Ximpleware is well within its rights, both legal and moral, to demand that Ameriprise stop selling a product that infringes on their patents without a license.
