
Comment Re:Sounds nicely balanced... (Score 1) 331

Thanks for that - that's actually the best explanation I've seen on the subject.

To play the "unjustly oppressed minority" you have to appear to have fewer rights than your "oppressors." Political correctness doesn't allow for a fact-based discussion, because some facts are "unmentionable", and if you mention them, you're automatically the oppressor.

Indeed, I have experienced this myself when discussing rather rudimentary things like how nature finds a balance between men and women in physical fitness and martial arts training.

Feminism too often isn't about supporting equal rights for men and women, but about equal or superior outcomes, political correctness, and victimhood.

So it's not just me. I describe it with a question: "Are you becoming equal, or getting even?"

Of course, it's politically incorrect for me to point any of this out, but I'm fed up with the battle of the sexes and those who promote it rather than seeking a truce and finding common ground.

I think you are onto something there. At almost 200 pounds, people don't fuck with me in real life, and I train women. The most down-to-earth and coolest chicks are the ones who accept the differences. I let them choke or hit me so they understand they can't out-strength me, but as I explain that their hips can move this way or that, or how to be more persistent in a defence to frustrate an opponent, they adjust their game to play to their strengths. We find common ground and respect that way.

They're also the ones who kick ass in tournaments, and I think they probably have the same attitude as your daughters.

Comment Re:PS4 (Score 1) 368

Microsoft products are forbidden in my house. I don't trust Sony either, but at least they didn't try to sneak a constantly monitored microphone into my living room.

I think that both platforms have their own reasons for owning one. I bought both the PS2 and the Xbox when they came out, and the thing that really pissed me off was that a game like Battlefront, which had a really good LAN game option, would not join a game started on a different console. Seriously, the most petty way to say 'fuck you' to console owners.

Consoles are pretty much the same but intentionally "incompatible". If they were compatible, the argument really would be about which one provided a better gaming experience as opposed to game availability. It would be about the games rather than the consoles.

I don't use the MS console any more, only because anything you can do on their console you can do better on a PC. The Xbox, a good console in its own right, is still a piece of hardware that will be useless at the end of its service life, as will the PlayStation.

That said, Sony has been more willing to entertain a Linux installation than Microsoft has on their currently deployed hardware, so if you want to experiment with Linux the Sony box is probably the better choice. That would be about the only reason, though.

As to the hardware, both the PS2 and Xbox still work, and they will probably end up as Linux boxes of some description if I find the time or interest.

Comment Re:In Situ Leach Mining (Score 1) 148

Depends on what you consider "significant". Coal mining was, for the longest time, the single most polluting mining activity due to sheer scale. Uranium mining is so small that it ranks below mining for metals like copper and iron.

Here is what Dr. Gavin Mudd, hydrogeologist, had to say about it.

because you don't need a lot of uranium mines to produce an equal amount of power.

and you likewise don't need a lot of ISL to produce a disproportionate amount of pollution.

our heirs would be digging it up for the valuable stuff we buried.

So bury the fuel processing and reactors with them, make it a closed loop, and teach your heirs how to maintain it, then keep building reactors in the same facility. That would support your 40/20/20/20 percent vision for a very long time. States would fall over themselves to host such facilities, as they would also attract industrial power users and infrastructure.

It's more likely that the AP1000 or EPR would last vastly beyond its original end of life.

Unless you have come up with a solution to neutron embrittlement of the reactor vessel, I very much doubt that.

We have centuries of ore available, and we can even filter the fuel out of seawater on an energy-positive basis.

Well, I'd like to see what advances in membrane and pumping technology you are referring to that achieve that in seawater, and that doesn't change anything I said when referring to extracting from hard ores. You're making a lot of assumptions about the efficiency of the process.

Basically, at about an OOM above present ore prices, filtering the Uranium out of sea water becomes profitable.

Except we are talking about joules per ton or kilolitre, not dollars. Energy costs cannot be deferred, only incurred.

wouldn't actually end mining, but would reduce it an order of magnitude or so...

Perhaps at the beginning. After that you would be burning ex-once-through plutonium and DU for thousands of years. That's what I'd prefer to hand down to our heirs: a solved problem, as opposed to a problem. Incidentally, the numbers were more like a ~100 TW advantage over once-through, and another ~100 TW if the reactor is disposed of in situ.

Comment Already crashed it (Score 1) 126

Well, after playing the first two, this version of BF looks like it has lost the LAN play capability and has become a delivery mechanism for EA.

It looks great, with similar gameplay, but the loadout is different from the first two. A bit sad; for pure fun, the first game was the best. I really just want to play the game with my mates on a LAN, not over the net where it is slow.

Comment In Situ Leach Mining (Score 1) 148

In Situ Leach Mining is pretty nasty and there are a couple of points worth considering about it.

First, it is restricted to soft ores. The *availability* of this method is not indicative of total supply. Uranium in hard ores still has to be refined with crushers before further processing.

Acid leach mining is 'in situ', meaning the acid is pumped into the ground to dissolve rock. The risk you introduce with this method is polluting the water table: any failure to assess the geology properly and it poisons aquifers. Some of these aquifers are used for farming, so water is a pretty important commodity, especially on the dry continents where uranium is mined. I believe this type of mining is illegal in the US for similar reasons.

It's a similar externality to the coal industry releasing radionuclides in ash into the air, except into the water table instead; in both cases we are talking about a natural radioisotope, i.e. before enrichment. So it may not be a worthwhile risk to take to get uranium that way.

The liquid tailings produced have to be contained in acid dams. This liquid has a chemical and radiological toxicity of its own, and we have to store this by-product.

Uranium mining already produces liquid acid effluents, and I recall that one of these sites had a 2-megalitre acid tank, full of radioactive effluents, burst in a World Heritage national park a few years ago.

It may not be wise to trust water tables to these guys based on their existing record, considering that ISL will create a lot more liquid tailings than the existing processes.

However, it neatly illustrates the core design issue of the once-through cycle used in modern reactors. The front-end industrial process (mining) forces you to make this essential choice:

1. You crush and process the ore, using a huge amount of energy to get your fuel.

2. You spend less energy but take a really big risk.

These are the options the nuclear industry has to consider to get its fuel; it's no better than coal this way, just differently bad.

Of course, the third risk is that if you run out of that available fuel you *have* to use hard ores, which means you have committed yourself to a very low energy return for the remaining lifetime of your new AP1000 or EPR.

This might also provide enough context to understand how the thorium fuel cycle creates a second, similar problem with a different waste stream, and why the development of burner reactors was so important to end mining whilst transmuting existing transuranic stocks from the once-through cycle.

Mining and refining nuclear fuel is almost a footnote

I think you will find that the energy consumption of this front-end industrial process is the whole reason to consider ISL, and that both mining processes create significant negative environmental externalities.

Comment Re: 20 cores DOES matter (Score 1) 167

"Memory bandwidth can become an issue"

Bandwidth is almost never an issue. _Latency_ is another matter.

Thanks stoatwblr, that's exactly what I was talking about.

There are a lot of tricks and bits to optimise for locality (mainly around row-based and lookahead accessing; CPUs aren't the only devices trying to predict what will be read next) and controller optimisation, but the underlying dynamic RAM itself hasn't actually improved much over the last 20 years in terms of the time between addressing a random cell and getting an answer back from it.

Which is why I'm always trying to tune down application latency so the CPU cycles can be used for actual work. Obviously everyone's workloads are different, but it makes sense to give the machine a bit of a helping hand (usually so I can go home).

The big improvements have been around the number of requests you can make while waiting for that answer instead of being in request-answer lockstep and there is only so far that can be taken.

That's an interesting development I hadn't heard of. It would have a major impact on reducing application latency; I think I'll have to get my head around how it will affect CPU scheduler behaviour, though.

In a similar vein, I've heard of new methods that move CPU scheduler functions onto the CPU and arrange the on-chip memory so that tasks can be loaded physically closer to the core processing them, so perhaps the two developments are related. I appreciate you completing the picture of what's coming.
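To put rough numbers on the request-answer lockstep point above: Little's law says sustainable bandwidth is roughly outstanding requests × bytes per request ÷ latency. A quick sketch (the 64-byte line and 30 ns latency are assumed round numbers, not measurements):

```python
# Little's law applied to memory requests:
# bandwidth = outstanding_requests * bytes_per_request / latency

LINE_BYTES = 64        # typical cache line size (assumed)
LATENCY_S = 30e-9      # ~30 ns random-access latency (assumed)

def effective_bandwidth(outstanding):
    """Bytes/second sustainable with N requests in flight."""
    return outstanding * LINE_BYTES / LATENCY_S

# One request at a time (strict request-answer lockstep):
print(f"{effective_bandwidth(1) / 1e9:.1f} GB/s")   # ~2.1 GB/s
# Ten requests in flight:
print(f"{effective_bandwidth(10) / 1e9:.1f} GB/s")  # ~21.3 GB/s
```

So a core that can only have one miss outstanding is capped at a couple of GB/s no matter how fast the bus is, which is why allowing more requests in flight was the win, and why there's a limit to how far it can be taken.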

In the context of what I'm doing, I'm trying to get more work out of the machine.

_true_ 1GHz ram would have 1ns latency, not 12-30ns - and the reason that L1/L2/L3 is so important is because of that poor response time.

Precisely. If my calculations are correct, that's about the time one instruction on a 1 GHz CPU would take. When I see a lot of context switching and minor page faulting, my application is really exposed to that, and the CPUs are doing a whole lot of nothing useful waiting for L3 to write (mainly) to RAM or to fill.
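The back-of-the-envelope arithmetic is simple enough to write down (the 12-30 ns figures are the latency range quoted above, not measurements):

```python
# At 1 GHz, one clock cycle takes 1 ns, so a simple one-cycle
# instruction finishes in about the time "true" 1 GHz RAM would
# take to answer. Real DRAM is far slower per random access.
CPU_HZ = 1e9
cycle_ns = 1 / CPU_HZ * 1e9
print(cycle_ns)  # 1.0 (ns per cycle)

# Stall cycles burned waiting on one random DRAM access:
for latency_ns in (12, 30):
    print(f"{latency_ns} ns latency -> ~{latency_ns / cycle_ns:.0f} wasted cycles")
```

And that's at only 1 GHz; multiply the wasted cycles by the clock multiple for a faster core.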

I've seen memory thrashing on production systems; it's never pretty being in a PIR.

Comment State based terrorism (Score 3, Interesting) 167

Islamic terrorism is the excuse used to roll out state-based terrorism. Bills introduced after the attacks in New York were the beginning, and they have been consistently rolled out since then.

Covering up political incompetence appeared to be the core motivation at first, but what better way to continue a campaign of harassing the populations of western democracies than by propping up and enhancing an ineffectual security theatre.

Having spent significant time reading these bills and writing to politicians to either stop them or modify their wording, it's pretty clear that ineptitude and general laziness have been behind the services' inability to stop these attacks; most governments already have ample power to stop them.

Most western countries passed effective terrorism laws back in the days of the IRA; ample powers have been available to all these countries to stop terrorist attacks for decades. Not using them allows our governments and a controlled media to whip the populace into a frenzy that allows more state-based terrorism to be rolled out in the form of laws none of us deserve.

Why? Because what the state is saying is that you have no right to protect your rights and freedom, and that it has a right to inspect the minutiae of your life. In doing so it is also happy to expose you to organized crime, which has no impact on the state.

As distasteful as it sounds, the illusion of our freedom ended a long time ago. Orwell was an optimist in terms of the capabilities the state would have and, as usual, the moronic machinations of Islamic extremists give governments the excuse to drive us closer to a police state every day.

Comment Re: 20 cores DOES matter (Score 1) 167

Let me rephrase... we do bulk builds with poudriere of 20,000 applications. It takes a bit less than two days. We set the parallelism to roughly 2x the number of cpu threads available. There are usually several hundred processes active in various states at any given moment. The cpu load is pegged. Disk activity is zero for most of the time.

Have you ever considered using sar to check the amount of minor page faulting going on? It would be interesting to measure the activity between the L3 cache and memory; it's possible that memory is thrashing as the CPU scheduler attempts to divide time between logical cores that are actually the same physical core.
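Besides the system-wide view from sar, the same counters are visible per process via getrusage. A minimal sketch using Python's stdlib resource module (the 8 MiB buffer size is an arbitrary choice for demonstration):

```python
import resource

def fault_counts():
    """Return (minor, major) page fault counts for this process."""
    ru = resource.getrusage(resource.RUSAGE_SELF)
    return ru.ru_minflt, ru.ru_majflt

before_minor, _ = fault_counts()

# Allocating and touching a fresh chunk of memory forces minor
# faults as pages are mapped in on first write.
buf = bytearray(8 * 1024 * 1024)                    # 8 MiB, zero-filled
buf[::4096] = b"\x01" * (len(buf) // 4096)          # one byte per 4 KiB page

after_minor, _ = fault_counts()
print(f"minor faults taken: {after_minor - before_minor}")
```

Sampling that before and after a run (or watching the system-wide fault rate while the build is pegged) would tell you whether the scheduler is churning pages or the processes are just compute-bound.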

My applications are messaging systems, so they aren't transient processes like a compile. Over 20,000 applications, that is a lot of context switching, and from what you have described, the amount of latency introduced by the CPU scheduler when it context switches would be high enough to consider tuning your build process.

In our case there are always two or three ridiculously huge source files in the GCC build that the Make has to wait for before it can proceed with the link pass.

They're probably a good candidate for CPU affinity; however, I don't think you can isolate a CPU at runtime, so unless you bump up its priority the CPU scheduler can still kick it off L3 and the core it is using. The point I'm making here is that if you dedicate a core to these bigger source files, the efficiencies you gain may be enough to allow the link to proceed sooner, if that is important to you.
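For what it's worth, the affinity part can be set at runtime. A minimal Linux-only sketch using Python's os.sched_setaffinity (from a shell, `taskset -c N <cmd>` does the same thing):

```python
import os

# Pin this process to a single core so the scheduler can't migrate
# it (and evict its cache working set) onto another core.
# os.sched_setaffinity/getaffinity are Linux-only.
original = os.sched_getaffinity(0)   # 0 means the calling process
print(f"allowed CPUs before: {sorted(original)}")

target = {min(original)}             # pick one allowed core
os.sched_setaffinity(0, target)
print(f"allowed CPUs after:  {sorted(os.sched_getaffinity(0))}")

os.sched_setaffinity(0, original)    # restore for the rest of the run
```

Child processes inherit the mask, so pinning the shell that launches the compile of a huge source file pins the whole job.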

It would also follow that when your processes start generating I/O to write the objects prior to your link phase, the processes would all be vying for I/O attention, and as each gets priority I am almost certain you would see very high levels of minor page faults as the scheduler tries to balance them. You might find that being *unfair* about which process gets access to CPU resources actually reduces your overall compile time.

So it's a good idea to look at the footprint of the individual processes you are trying to parallelize, too.

Absolutely! Parallelism is great, but you really have to know your processes and the behaviour of the CPU and I/O schedulers to get the most out of them.

Memory is cheap these days. Buy more. Even those tiny little BRIX one can get these days can hold 32G of ram. For a decent concurrent build on a decent cpu you want 8GB minimum, 16GB is better, or more.

Agreed! Even so, RAM is so slow compared to CPU cache that the expense of throwing a process off a core on a context switch is worth investigating, as you may still be able to yield good gains. Having more memory is great, but having a bigger CPU cache is much better, as the CPU scheduler has to context switch less often.

I'm actually more excited about the cache on these things than the cores, as that has greater potential for increasing system efficiency than increasing the core count alone.
