
Comment: Re:They used to be called UHF TV tuners (Score 1) 237

by Orp (#47815369) Attached to: Mysterious, Phony Cell Towers Found Throughout US

I never did that but a long time ago (80s) I did listen to some fascinating conversations broadcast in the clear around 1.7 MHz - just past the AM band - off of a cordless phone somewhere near my neighborhood. I had an old Hallicrafters shortwave radio that weighed nearly as much as I did (even more with the big external speaker). I don't remember the details of the conversations, only that it was mostly stupid stuff as would be expected.

Comment: $150k? (Score 1) 220

by Orp (#47201737) Attached to: NSF Researcher Suspended For Mining Bitcoin

I do most of my research on supercomputers. "Service Units" (SUs) are the currency on these machines. They are usually either node hours or core hours. Typical allocations are in the hundreds of thousands to millions of SUs.

I don't know what formula they used to come up with a dollar value. It would be nice to know, however, as I am in academia, where real-dollar grants get all the attention since they come with that sweet overhead. I'm sure my dean would appreciate the symbolism of getting the college overhead in SUs (and converting them to dollars).
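For what it's worth, the arithmetic is trivial once you pick a rate; here's a quick back-of-the-envelope sketch, with a completely made-up $/SU figure just to show how a number like $150k could fall out:

```python
# Back-of-the-envelope SU-to-dollar conversion. Both numbers below are
# hypothetical placeholders, not anything NSF or a computing center published.
ALLOCATION_SUS = 500_000       # a typical mid-size allocation, in node hours
DOLLARS_PER_SU = 0.30          # made-up $/node-hour rate

print(f"Allocation 'value': ${ALLOCATION_SUS * DOLLARS_PER_SU:,.0f}")
# With these made-up numbers you land right around $150,000, so the
# headline figure isn't implausible for an allocation of this size.
```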

But seriously, these machines are up 24/7 (unless down for a hardware fault or maintenance), and while I'm sure they draw more current when the CPUs are pegged, if this guy was mining bitcoins with his allocation then really all he did was violate the terms of his allocation. Those SUs would have either been wasted or used up anyway. But you just don't mine bitcoin on federal supercomputers, man. Dick move.

I hope he at least used GPU accelerators with his code, the bastard.

Comment: Hate Variable Air Contraption (Score 1) 216

by Orp (#46998517) Attached to: Who controls the HVAC at work?

I have had an office in three different buildings on the campus of my university. The first office was fine. In the second building, the noise level violated European Union noise standards (I had it measured with an SPL meter) but was a couple of dB too low for OSHA. It was maddening; for months I begged facilities to address the issue. The office suite I was in had been converted from a lecture hall, there was a major HVAC hub above my desk, and it turned out the air pressure through the vents was set way too high. I wore earplugs a lot.

In the third building I am in, I have a situation where the temperature fluctuates about 15 degrees F daily. Yes, I measured this and plotted it with a little weather tracker. In this case, the thermostat for the office is located in another office. And the university spent hundreds of thousands of dollars to renovate this old building. I guess that's what happens when you always take the lowest bidder.
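If anyone wants to reproduce the plot, it's about ten lines; here's a minimal sketch, assuming the logger exports timestamped readings to a CSV (the file and column names are just placeholders):

```python
# Plot the daily temperature swing from a logger CSV. The file name and
# column names ("timestamp", "temp_f") are assumed placeholders.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("office_temps.csv", parse_dates=["timestamp"])
daily = df.set_index("timestamp")["temp_f"].resample("D").agg(["min", "max"])
daily["swing"] = daily["max"] - daily["min"]   # ~15 F in my office, sadly

daily["swing"].plot(marker="o")
plt.ylabel("Daily max - min (F)")
plt.title("Office temperature swing")
plt.show()
```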

I am rather sensitive to noise, so I'd rather have the fluctuating temperatures in a quiet office than pleasant temperatures in a noisy office, and I understand that when you remodel you might get weird results like this. But that doesn't stop me from wanting to strangle people.

Comment: Re:Does anyone even use Google's office suite? (Score 3, Informative) 89

I use it for "simple" stuff - for instance, it's very convenient to have a place to take notes at meetings (I do a lot of that with my job). Since I always have wifi where I work it's just a matter of opening up the Drive website and creating a new document. And then everything's in one place and it's easy to find stuff with Google's search, which works on document names and document contents.

I do create some "production quality" documents from within the Docs world and export them to PDF or DOCX so I can share them. But these documents are generally simple; the complex stuff I do in LaTeX. I really do not like Word, with its seven thousand ways to frustrate me and the weird layout that I've never really gotten used to since they majorly changed it years ago. LibreOffice and Google's docs editor are nice and relatively simple, and I find them easier to use. But I go back to Word when I have to, which is frequently, since "everyone" seems to use it.

It's convenient to have the ability to open attachments (from Gmail) in Drive/docs for quick viewing, but stuff created in Microsoft's Office doesn't always convert very well.

I fully realize what Google is doing by "sucking me in" to their world and having everything I do be stored on their servers. Ever since I bought a Chromebook Pixel and got the 1 TB of Drive space, I'm always finding ways to use it. I know they just want to harvest everything I do - so for the sensitive stuff I have an encrypted (ecryptfs) partition with Dropbox that I can mount on my Linux machines, and for wholesale archival storage of sensitive stuff I use PGP and stick it wherever. If Google Drive could be mounted under Linux the way Dropbox can, I would probably "drop the box" altogether.
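The archival piece is nothing clever, just tar plus GnuPG before anything touches a cloud provider; a minimal sketch of the idea (the source path and key ID are placeholders):

```python
# Tar a directory and encrypt it with GnuPG before parking it on Drive,
# Dropbox, or wherever. Source path and key ID are placeholders.
import subprocess

SRC = "research_data"
KEY_ID = "0xDEADBEEF"   # hypothetical public key ID

subprocess.run(["tar", "czf", f"{SRC}.tar.gz", SRC], check=True)
subprocess.run(
    ["gpg", "--encrypt", "--recipient", KEY_ID,
     "--output", f"{SRC}.tar.gz.gpg", f"{SRC}.tar.gz"],
    check=True,
)
# Upload the .gpg file; keep the private key well away from the cloud.
```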

Comment: The Why (is obvious) (Score 1) 2219

by Orp (#46202119) Attached to: Slashdot Tries Something New; Audience Responds!
If you want to know "why," do a Google Trends search on slashdot. You see something that looks like e^-x, asymptoting towards zero.
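You can even put a number on the decay if you export the Trends data and fit an exponential to it; a rough sketch, where the file name and "interest" column are guesses about the export format:

```python
# Fit interest ~ A * exp(-k * t) to an exported Google Trends CSV.
# The file name and "interest" column are assumptions about the export.
import numpy as np
import pandas as pd
from scipy.optimize import curve_fit

df = pd.read_csv("slashdot_trends.csv")
t = np.arange(len(df), dtype=float)              # months since start of series
y = df["interest"].to_numpy(dtype=float)

def decay(t, A, k):
    return A * np.exp(-k * t)

(A, k), _ = curve_fit(decay, t, y, p0=(y.max(), 0.01))
print(f"k = {k:.4f} per month, half-life ~ {np.log(2) / k:.0f} months")
```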

It's pretty bloody obvious what's going on. A company has a unique asset, and this asset is not making them money. I can sort of sympathize with this. You gotta pay the bills, right? So they try to broaden their audience (no quotations, I get that too). But as all the old timers have said - repeatedly - they only come to the site for the comments, and WE ARE THE MOTHERFUCKING COMMENTS. This has to drive marketing nuts, I suppose. There are a bunch of other more "fun" sites out there where idiots can blather on about crap. Slashdot is unique in that it has a highly technical/educated audience and a good moderation system (hah, I remember the uproar over the current comment system when it first came out, or was tweaked - somehow people DO get used to these things!).

I think the reason you're not seeing Dice or whoever ask for the opinion of the current folks who use slashdot is that they already know the answer. It's not about you/us - it's about getting new people on board. The problem is, that is a losing proposition. Slashdot's readership/writership really goes back to the USENET days of absolutely no moderation and has a free-for-all meritocracy mentality. Slashdot has been around for long enough that I think your audience has already found you. We're already here. We are middle-aged, highly educated, highly opinionated nerds who have dealt with enough corporate horseshit to see through these things. Sorry about that.

There is a pretty high level of childish vitriol that permeates this site. If you don't at least throw us a few crumbs, a lot of the crap that goes on at -1 will get a hell of a lot worse. Your audience will turn against you, and that will drive away the folks who make the site work. There are alternatives. Christ, with the dwindling number of commenters, you could probably host a slashdot-like site on any of the many cloud server type places out there.

Tread carefully, corporate folks. And it wouldn't hurt if you just accepted the fact that slashdot isn't something that's going to make you money. Maybe you can leverage slashdot in other ways to sell other stuff, I don't know. But if you fuck with your nerdbase they will fuck with you twice as hard. I don't envy your position and I truly hope you find a solution that meets your objectives... keeping the old timers happy while bringing in a bit of new audience. But you should probably try a new approach. Have you considered sending out a survey-monkey type thing to gauge exactly what the old timers are willing to concede/put up with rather than just dumping it on us?

Comment: Re:Before this turns into a derpfest... (Score 1) 320

by Orp (#46002001) Attached to: Solar Lull Could Cause Colder Winters In Europe

His comment sounds like utter bullshit though. You can put all the CO2 in an atmosphere you want. If you don't have solar flux the heat on the surface will be minimal. One good example is Mars. There have been plenty of examples throughout history of temperatures decreasing by more than 0.1 or 0.3 Celsius even when there were no humans on the planet.

Ok Dr. Bagel. You win. I'll go burn my diploma, tell my colleagues at NCAR to eat a bag of dicks, and await your clearly superior intellect to publish that which is something other than 'utter bullshit'.

Since you are clearly an expert on the subject of the sound of bullshit, please, o wise one: Exactly what does utter bullshit sound like? As opposed to just plain bullshit? Do the flies buzz louder?

Comment: Re:Not the sun (Score 1) 320

by Orp (#46001679) Attached to: Solar Lull Could Cause Colder Winters In Europe

do you realize that the best possible thing to happen to a scientist is for him/her to make a discovery that tosses the widely accepted hypotheses on their head?

You clearly haven't done much research. As one guy used to say, science progresses when the last generation dies. That's how fossilized fields become when all the people doing peer review have the same mindset.

Maybe I should have said "successfully publish a discovery" blah blah. If you don't believe that, well, talk to Einstein, Bohr, Watson, Crick, Darwin, etc. etc. etc.

Comment: Re:Bios code? (Score 4, Informative) 533

by Orp (#46001499) Attached to: Ask Slashdot: What's the Most Often-Run Piece of Code -- Ever?

I would probably have to say whatever is the inner loop on the system idle process in windows.

Ding, we have a winner. Not supercomputer code. Sure, supercomputers are... super and all, but the biggest one only has around 1 million processing cores. How many windoze machines are out there, idling away?

Comment: Re:Not the sun (Score 4, Insightful) 320

by Orp (#46001437) Attached to: Solar Lull Could Cause Colder Winters In Europe

On the other hand the "alarmist" logic is: "we already know the cause of the warming, it is humans saturating the atmosphere with too much CO2, we just need to gather and/or create the evidence to support this theory". That's called inductive logic, and is just as unscientific as what you describe coming from the "denialists".

"Real" science comes from gathering evidence and basing your theories on the evidence gathered. You then determine what it might take to falsify your theory and try as hard as possible to falsify it.

All I see from the "alarmist" camp is people trying to support their theories at all costs, calling things causation where there is barely correlation, and making very little if any effort to falsify their theories. This behavior is more akin to religion than any sort of science.

False equivalency is false.

Guess what? A lot of the "alarmists" are the same scientists doing the research. Sure, people get attached to theories, but do you realize that the best possible thing to happen to a scientist is for him/her to make a discovery that tosses the widely accepted hypotheses on their head? In other words, if a scientist did a rigorously peer-reviewed study indicating that, say, a reduction in neutrinos from the sun was somehow tweaking aerosol concentrations, leading to a strong causal relationship between that phenomenon and observed global warming - while also showing that the greenhouse effect of CO2 was much less of a factor than previously thought - that person would be fricking king of the scientific world.

The tired, repeated bleatings of non-scientists who have not spent their careers repeatedly getting their work shredded by reviewers [this being the norm, not the exception] on the path to eventual publication do absolutely zilch to move things forward regarding understanding what's really going on. The simple-minded idea that climate science is some sort of "alarmists versus skeptics" battle is laughable; this false equivalency between two camps, each claiming to know the truth, is entirely imagined by ignorant people. Unless you've actually done science and gotten your work published in decent journals, these opinions mean absolute diddlyshit; nothing more than mental masturbation splooging text on the screen, masquerading as informed debate.

Comment: Before this turns into a derpfest... (Score 5, Informative) 320

by Orp (#46001053) Attached to: Solar Lull Could Cause Colder Winters In Europe
The NCAR link is probably the best for relating this to climate change:

So could a lengthy drop in solar output be enough to counteract human-caused climate change? Recent studies at NCAR and elsewhere have estimated that the total global cooling effect to be expected from reduced TSI during a grand minimum such as Maunder might be in the range of 0.1 to 0.3 Celsius (0.18 to 0.54 Fahrenheit). A 2013 study confirms the findings. This compares to an expected warming effect of 3.0C (5.4F) or more by 2100 due to greenhouse gas emissions. In other words, even a grand solar minimum might only be enough to offset one decade of global warming. Moreover, since greenhouse gases linger in the atmosphere, the impacts of those added gases would continue after the end of any grand minimum.

So perhaps a serious lull in solar activity could put some feeble brakes on global warming, slowing it down... temporarily, only to charge back when the sun gets over its issues.
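The "one decade" figure is easy to sanity-check with the numbers in that quote; here's the back-of-the-envelope version (the time horizon to 2100 is my rough assumption):

```python
# Sanity check of the "offsets about one decade" claim, using the figures
# from the NCAR quote. The ~86-year horizon to 2100 is my rough assumption.
expected_warming_c = 3.0                 # greenhouse warming expected by 2100
years_to_2100 = 86
warming_per_decade = expected_warming_c / years_to_2100 * 10   # ~0.35 C/decade

for cooling in (0.1, 0.3):               # grand-minimum cooling estimates
    print(f"{cooling} C of cooling offsets ~{cooling / warming_per_decade:.1f} decades of warming")
# So even the high-end estimate buys you a bit less than one decade.
```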

I'm a meteorologist, not a climate guy, but I find the hypothesis that the current solar lull is responsible for the recent cold snaps in the northern hemisphere to be extremely dubious. It is much more tenuous than the hypothesis that the meandering jet stream is a result of the reduced north/south temperature gradient caused by shrinking Arctic ice cover, which itself is physically feasible but still not shown very conclusively.

The best way to get a grip on these issues would be to run many, many ensembles of weather models and coax out statistical links. And this is where weather/climate modeling is going, for good reasons... but as all the armchair slashdot climatologists will (perhaps rightly) point out, models have issues... but they are getting much better, and ensembles help a lot to provide a handle on the probability that forcing A is causing response B.
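To be clear about what "coaxing out statistical links" means, the logic is simple even if the models aren't: run a big pile of perturbed members with and without forcing A, then compare the distributions of response B. A toy sketch with completely fabricated numbers:

```python
# Toy ensemble attribution: compare response B across members run with and
# without forcing A. All numbers are fabricated purely for illustration.
import numpy as np

rng = np.random.default_rng(42)
n_members = 200

control = rng.normal(0.0, 1.0, n_members)    # response B, forcing A off
forced = rng.normal(0.4, 1.0, n_members)     # response B, forcing A on

shift = forced.mean() - control.mean()
p_stronger = (forced[:, None] > control[None, :]).mean()   # P(forced > control)

print(f"Ensemble-mean shift in response B: {shift:.2f}")
print(f"Probability a forced member beats a control member: {p_stronger:.2f}")
```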

Comment: Earplugs - have you heard of them? (Score 1) 340

by Orp (#45997657) Attached to: Americans To FCC Chair: No Cell Calls On Planes, Please
Yes, cell phones on a plane are a bad idea, but I've been using earplugs since forever, and good ones (that seal well) will pretty much drown out everything... loud engine noise, screaming children, etc.

It would also help if the airlines charged a fee to access the cell repeater on the plane like they do with wifi. That might at least keep the most inane conversations from happening. Or not.

Comment: Power and legacy codes (Score 2) 118

by Orp (#45474803) Attached to: Warning At SC13 That Supercomputing Will Plateau Without a Disruptive Technology
... are the biggest problems from where I'm sitting here in the convention center in Denver.

In short, there will need to be a serious collaborative effort between vendors and the scientists (most of whom are not computer scientists) to take advantage of new technologies. GPUs, Intel MIC, etc. are all great only if you can write code that can exploit these accelerators. When you consider that the vast majority of parallel science codes are MPI only, this is a real problem. It is very much a nontrivial (if even possible) problem to tweak these legacy codes effectively.
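To be concrete about what "MPI only" looks like, here's the skeleton of the pattern (sketched in Python with mpi4py purely for brevity; the real codes are Fortran): each rank owns a slab of the domain, swaps halos with its neighbors, and does all the arithmetic on the host CPU, which is exactly the structure that gives an accelerator nothing to grab onto.

```python
# Skeleton of a legacy MPI-only pattern: domain decomposition, halo exchange,
# and a host-side stencil update. A sketch, not a real solver.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

local = np.random.rand(1000)          # this rank's slab of the 1D domain
left, right = (rank - 1) % size, (rank + 1) % size

# Swap single-cell halos with the neighboring ranks (periodic ring).
halo_left = comm.sendrecv(local[-1], dest=right, source=left)
halo_right = comm.sendrecv(local[0], dest=left, source=right)

# Host-CPU stencil update: this is the loop you would have to restructure
# or offload to get anything out of a GPU or MIC card.
padded = np.concatenate(([halo_left], local, [halo_right]))
local = 0.5 * local + 0.25 * (padded[:-2] + padded[2:])
```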

Cray holds workshops where scientists can learn about these new architectures and some of the programming tricks to use them. But that is only a tiny step towards effectively utilizing them. I'm not picking on Cray; they're doing what they can do. But I would posit that the next supercomputer should be designed with input from the scientists who will be using it. There are precious few people with both the deep physics background and the computer science background to do the heavy lifting.

In my opinion we may need to start from the ground up with many codes. But it is a Herculean effort. Why would I want to discard my two million lines of MPI-only F95 code that only ten years ago was serial F77? The current code works "well enough" to get science done.

The power problem - that is outside of my domain. I wish the hardware manufacturers all the luck in the world. It is a very real problem. There will be a limit to the amount of power any future supercomputer is allowed to consume.

Finally, compilers will not save us. They can only do so much. They can't write better code or redesign it. Code translators hold promise, but those are very complex.
