
Comment Re:good and bad (Score 1) 136

I work with data collected by others, and those others are typically rather protective of their data for commercial reasons. I can use them for scientific purposes, but I'm not allowed to publish them in raw form. For most of these data there are no alternatives. I'd much rather publish everything, of course, but that's impossible in this case, so I wonder whether that means I can't publish in PLOS anymore?

Just to be clear, I applaud this move, we should be publishing the data, plus software and such, where possible. Anyone happen to have a spare couple of tens of millions of euro lying around? That would probably free the data I'm using...

Comment Re:Grasping at Straws (Score 3, Informative) 552

You're probably just trolling, but you're currently modded +3, so I'm going to reply.

Such as the US just had one of the 10 coldest years on record.

Minima and maxima are by definition outliers. While there is an entire body of statistical literature on outliers, they're not used to determine trends or draw conclusions, because they are essentially (bad) luck.
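To put a number on that, here's a quick toy simulation (a minimal sketch with made-up i.i.d. data, nothing from the actual climate record): in n years of data with no trend whatsoever, you still expect roughly ln(n) record highs, purely by chance.

```python
import random

def record_count(xs):
    """Count running maxima (record highs) in a sequence."""
    best = float("-inf")
    count = 0
    for x in xs:
        if x > best:
            best = x
            count += 1
    return count

n = 100
# For i.i.d. data the i-th value is a record with probability 1/i,
# so the expected number of records is the harmonic number H_n.
h_n = sum(1.0 / i for i in range(1, n + 1))

random.seed(42)
trials = 2000
mean_records = sum(
    record_count([random.gauss(0.0, 1.0) for _ in range(n)])
    for _ in range(trials)
) / trials

print(f"expected records in {n} trend-free years: {h_n:.2f}")
print(f"simulated average over {trials} runs: {mean_records:.2f}")
```

About five record highs per century of pure noise, in other words. A handful of records by itself tells you nothing about a trend.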

The UK getting record snowfall despite AGWers claiming the UK wouldn't see snow after 2008.

Sources, please. Because no serious scientist would ever make such a definite statement. A mathematician might, but science, including climate science, is all about statistics and probabilities. In any field. Perhaps you mean this article in the Independent? The scientist quoted says that in 20 years' time, snowfall will become a rare and exciting event. So I think we can consider him proven wrong if it snows in southern England for, say, five out of ten years from 2020 onwards?

Antarctica getting within .5 degrees of the coldest recorded temperature on earth.

Antarctica is a huge and largely unexplored continent. Finding a new minimum in a situation where very little information was available is hardly surprising, and certainly shouldn't be used to draw any conclusions.

Along with 2000 record low temperatures recorded over the last couple of months.

Among how many measurements? Record since when? And see above about outliers.

Add that to the IPCC report showing no warming for 17 years.

Indeed. They also investigated why, but you're conveniently leaving that out since it doesn't fit your agenda. I'll give you a hand as to the causes according to the IPCC: an exceptionally quiet sun (there's another of those outliers), several smaller volcanic eruptions increasing the amount of dust in the upper atmosphere, and an increase in dust in the lower atmosphere, probably due to industrial pollution. According to the IPCC, the discrepancy is partially explained by these three causes (which weren't put into the models when the prediction was made), and the remaining difference is small enough to fit within the natural variation (stochasticity) of the models, or be attributed to errors in the models.

It's become pretty obvious which side has been lying. Now they are grasping at straws to report ANYTHING that shows their side "might" be right.

Sorry, this is not the 18th century anymore. Science is a quantitative affair, and necessarily so, because our world isn't binary. The question is not whether there is human-induced climate change, the question is how strong an effect humans are having on the biosphere. Maybe it's small enough to be negligible (probably not, according to what we currently know), maybe it's huge and a danger, but it's a quantitative question.

I'm going to ignore the alarmists and look at the evidence myself. If AGW was real, they wouldn't have to lie as often and at least ONE of their predictions would have happened.

Excellent idea. Try reading the IPCC report instead of The Drudge Report and you might find some.

Comment Re:Rule #1 (Score 1) 894

I'm from Germany, where gun laws are much, MUCH stricter, and therefore we aren't seeing such tragedies on a yearly basis, as has come to be expected in the US.

In Switzerland, every single person who has served in the army is not just allowed but expected to keep a personal weapon at home.

The guns aren't the problem. People are.

In particular, the US has a serious problem with the way its health care system is organised, with the way its education system is organised, and with the way in which mental health issues are dealt with in the criminal justice system. As a result, people who should be getting psychiatric treatment or other forms of professional help don't get it because 1) they can't afford it, because they can't hold a job because of their mental health issue (or even just because of the stigma associated with it) and there's no working social health insurance system, 2) they're in prison instead of having been institutionalised, because of the revenge-over-fixing-the-problem mentality towards crime, or 3) they're kids whose teachers are way too busy to build up a personal relationship with their students, and to notice the early warning signs and offer a listening ear.

The problem isn't the guns, and the problem isn't the people. Changing those won't help. It's the society that will have to change to really do something about this.

Comment Re:In English (Score 1) 139

Exactly, and that is especially a problem for the Oculus Rift and other virtual reality headsets that are coming onto the market, because it becomes really noticeable when you move your head quickly. I think that that is what they're mainly targeting here, although according to John Carmack, G-Sync won't work on the Rift. Anyway, for those interested in the technical details, graphics programming legend Michael Abrash (currently at Valve) wrote an excellent technical piece about the frame timing issues you get with VR headsets some time ago.
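The basic arithmetic is simple (the numbers below are illustrative, not measured on any particular headset): the rendered image lags the real world by head speed times motion-to-photon latency.

```python
def angular_error_deg(head_speed_deg_per_s, latency_s):
    """Apparent angular lag of the rendered scene behind the user's
    actual head orientation, for a given motion-to-photon latency."""
    return head_speed_deg_per_s * latency_s

# A brisk head turn of 120 deg/s with 50 ms of latency:
print(angular_error_deg(120.0, 0.050))  # about 6 degrees of lag
```

Six degrees of lag is hugely noticeable, which is why latency matters so much more on a headset than on a monitor.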

Comment Re:It's a solid rocket booster stack (Score 4, Interesting) 94

The Epsilon rocket is three stages of solid rocket booster, like an ICBM. So there's no fueling on the pad, no plumbing, no cryogenics, and no turbopumps. The launch team has a lot less to do than with liquid-fueled rockets.

They're also proudly proclaiming how quickly they can prepare the rocket for launch. I don't think that these features are coincidental, and I don't think that cost savings are the only driver behind developing this thing. North Korea's leadership is a bit unstable at times, it may have nuclear weapons, and Japan has had North Korean rockets fly over its territory before. It's a serious potential threat to them.

Since losing WWII, Japan has been very pacifist, but in recent years it has begun to expand its military activities a bit, taking part in a UN peacekeeping mission, for instance. Outright developing an ICBM would probably go a bit too far at this point, but making a civilian rocket that can be launched at short notice with a small crew and has the range to hit North Korea could just be an acceptable compromise between mitigating the NK threat and not rocking the domestic political boat too much with overly aggressive military moves.

Comment Re:Coincidentally... (Score 1) 293

Based off of a sample size of 1. Nice generalization.

Hey! That's one better than some of the climate change theories!

(I know this was meant as a troll/joke, but you're hitting the nail on the head.) No. They have a sample size of "1 earth". Exactly "1 earth". Of course, that's due to the lack of spare earths that we could compare ours to. But it is exactly what makes this whole subject statistically "challenging".

If all you could measure was the global average temperature then yes, you'd have one sample of a simple probability distribution, which contains so little information that you can't possibly derive any interesting knowledge from it about a system as complex as our planet. Fortunately, we have measurements of many aspects of that planet, not just temperature but atmospheric composition, ocean temperatures and salinity, albedo, ocean currents and wind, and so on, and not just global averages but measurements localised in space and time. So your one sample is actually a sample of an extremely high-dimensional, highly internally correlated probability distribution, which gives us much more information to work with.

Now, it's true that we can't do controlled experiments covering the entire Earth, since we don't have a control to compare against. We can do such experiments at a smaller scale and use the results to guide construction of whole-planet models however, and we can exploit the natural variety across our planet to test hypotheses and draw conclusions. So science is still possible, we just need to use different tools. Models make predictions, and if a model predicts our actual sample to be unlikely, we can rightfully conclude that that model is unlikely to be a good description of reality.
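As a toy sketch of that last point (all numbers invented; this is just the shape of the argument, not a climate model): given a single observed series, you can still ask which of two competing models makes that observation more likely.

```python
import math

def log_likelihood(obs, predicted, sigma):
    """Gaussian log-likelihood of one multivariate observation,
    assuming independent errors with standard deviation sigma."""
    return sum(
        -0.5 * math.log(2 * math.pi * sigma ** 2)
        - (o - p) ** 2 / (2 * sigma ** 2)
        for o, p in zip(obs, predicted)
    )

# One "sample": a short, made-up series of anomalies.
obs = [0.1, 0.0, 0.3, 0.2, 0.4, 0.3, 0.5]

flat_model = [0.0] * len(obs)                      # "no trend"
trend_model = [0.07 * t for t in range(len(obs))]  # "linear trend"

ll_flat = log_likelihood(obs, flat_model, sigma=0.1)
ll_trend = log_likelihood(obs, trend_model, sigma=0.1)
print(ll_flat, ll_trend)  # the trend model explains this sample far better
```

One sample, but because it's a whole vector rather than a single number, it still discriminates sharply between models.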

The modeller's challenge is to create a description of a complete planet that accurately describes the characteristics that you're interested in and correctly mimics the emergent behaviour (insofar as is relevant to your research question) of the actual planet, while still being simple enough to fit in a computer, give meaningful and comprehensible results, and have reasonable uncertainty bounds on its predictions given the limited amount of information we have available to feed it. I don't think that having a second Earth would make that job much easier.

Comment Re:Tipping point (Score 4, Informative) 524

We have 2 political parties in this country. They dictate the issues. They write the rules governing how you create a party, how you get on a ballot. Nearly everyone in the media belongs to one of the two parties. The parties control the message. You basically cannot vote for anyone if they do not belong to one of the parties. You can write in a name, but the fact of the matter is it's nearly impossible to coordinate a write-in voting effort.

I'm not from the US, but given all that's happened in the past 15 years it seems to me that at this point voting either Republican or Democrat in any federal election should be considered treason. A vote for either of these parties is a vote for a government of the people, by the elite, for the corporations, and as I understand it, that wasn't quite the idea of your country. Perhaps a write-in or third party vote is a wasted vote, but at least you're not actively voting for this abomination.

As for alternatives besides your current third parties, in the most recent elections in Italy (which had similar issues) the Five Star Movement got almost a third of the vote in what was previously a two-party (or two-coalition) system, with a strictly online and on-the-streets campaign (they're boycotting the Berlusconi-controlled mainstream media). They're promoting, among other things, more direct (e-)democracy, limited terms in both houses of parliament filled by ordinary people who take a few years out of their lives to serve the country, and reductions in campaign spending.

It's certainly not perfect: they are having issues with disagreements within the party, it turns out online voting doesn't work too well technically, and some of their other policy ideas probably wouldn't work in the US. You'd need your own version of such a party for sure, fix some things, and then it still will be a struggle to make it work. But it shows that it's not impossible to break a two-party system even if it controls the mainstream media, and it's worth a try. Even inexperienced and/or somewhat incompetent representatives would be an improvement over what you currently have as long as they're at least honestly trying to represent the people.

Comment Re:They aren't drowning in plastic (Score 2) 427

Actually, we recently started collecting plastic separately. That means we now collect glass (white and coloured often separate), paper, clothing, compostable waste, batteries and other small chemical waste, and plastic separately in most places, and then there is a separate recycling scheme that puts a small extra fee on PET soft drink bottles, glass beer bottles, and beer crates, which you get back when you hand them in at the shop. Supermarkets have machines that you put them into, and they're collected when the shop is resupplied. The bottles are stripped of labels, cleaned, and reused up to 50 or so times IIRC, before they're recycled.

It's not much of a burden, you just keep a few extra bins or bags with waste around and remember to take them with you when you go get your groceries. Just about every supermarket has a bunch of recycling bins on the front court. That's not to say that there aren't lazy people who just toss everything in the garbage, but they're probably a minority.

Comment Re:You can't make promises... (Score 1) 112

And as my grandpa used to say "Girls want ponies, people in hell want ice water, I want a million dollars...that don't mean any of us are gonna get it".

Unless they are gonna kickstarter the chips in the thing it'll be DAMN hard to make it FOSS, simply because the ones making the GPUs, wireless, etc., are about the most proprietary lot on the planet. Hell, I don't even think you CAN make a FOSS GPU, as everything from texture compression on up is patented up the ass. I know there was a project to make one using an FPGA, but I never heard any more about it; it probably ran into the legal minefield and ran aground.

Basically, it ran out of money; the main contributors didn't have as much time available any more, and making an ASIC is expensive. Some prototype boards were manufactured, and the employer of the main developer (who allowed him to use their tools and to work on it some during office hours) made a commercial product based on the design. It never got to producing a consumer video card, though. I see now that Kickstarter actually existed in 2010, but I don't think any of us had ever heard of it, and I don't think we could have raised the couple of million dollars needed to have the cards produced.

For those interested, there's still an active mailing list, the project isn't quite dead.

Comment Re:yes, there are a reasonable number of positions (Score 2) 237

Another option you may want to look into is working at a supercomputer centre. These are usually (semi-)independent organisations that maintain supercomputers and fast networks, and help scientists use them. Jobs there include technical sysadmin-type work maintaining compute clusters, storage arrays, and networking equipment; programming with an emphasis on parallelisation, optimisation, and visualisation; as well as more consulting-type work where you advise researchers on how best to use the available facilities to achieve their goals, and gather requirements for the programmers. As a random US example, there's one in Chicago.

As for technical skills, if you're in the geosciences then you'll definitely want to brush up on your knowledge of Geographical Information Systems. ESRI ArcGIS is the big commercial vendor there, but there's also a lot of FOSS GIS software available. Also, some knowledge on geostatistics will help you communicate; some tutorials can be found here.

Comment Re:Still not Stallman-approved. (Score 4, Informative) 126

I don't understand why simply putting the closed source firmware on the card suddenly makes it ok for free software. Same code, just different home.

Back in the days of the Open Graphics Project (since defunct, although Timothy N. Miller is still working in this area and the mailing list is still active for those interested in the subject), we had several discussions about the borders between Free software, open firmware, and open hardware.

As I understood the FSF's position at that time, the point is that if the firmware is stored on the host, it can be changed, and frequently is (i.e. firmware updates). Typically, the manufacturer has some sort of assembler/compiler tool to convert firmware written in a slightly higher level language to a binary that is loaded into the hardware, which then contains some simplistic CPU to run it (that's how OGD1 worked anyway). So, the firmware is really just specialised software, and for the whole thing to be Free, you should have access to the complete corresponding source code, plus the tools to compile it, or at least a description of the bitstream format so you can create those. This last part is then an instance of the general rule that for hardware to be Free software-friendly, all its programming interfaces should be completely documented.
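For illustration only, a toy version of such an assembler might look like this (the instruction set is entirely made up; no relation to OGD1 or any real card):

```python
# Hypothetical opcode table for an imaginary on-card mini-CPU.
OPCODES = {"NOP": 0x00, "LOAD": 0x01, "STORE": 0x02, "HALT": 0xFF}

def assemble(source):
    """Translate mnemonic firmware source into the binary blob that
    the driver would upload to the card at load time."""
    blob = bytearray()
    for line in source.strip().splitlines():
        op, *args = line.split()
        blob.append(OPCODES[op])
        blob.extend(int(a, 0) & 0xFF for a in args)
    return bytes(blob)

firmware_src = """
LOAD 0x10
STORE 0x20
HALT
"""
print(assemble(firmware_src).hex())  # 01100220ff
```

The point being: the blob is clearly compiled software, so the Free-software questions about source code and tooling apply to it just as they do to anything else.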

If the code is put into ROM, it cannot be changed without physically changing the hardware (e.g. desoldering the chip and putting in another one). At that point, the FSF considers it immutable, and therefore not having the firmware source code doesn't restrict the user's freedom to change the firmware, since they don't have any anyway. The consequences are a bit funny in practice, as you noted, but it is (as always with the FSF) a very consistent position.

We (of the OGP-related Open Hardware Foundation, now also defunct; the whole thing was just a bit too ambitious and too far ahead of its time) argued that since hardware can be changed (i.e. you can desolder and replace that ROM), keeping the design a secret restricts the user's freedom just as much. So, we should have open hardware, which would be completely documented (not just the programming interfaces, but the whole design) and could therefore be changed/extended/repaired/parts-reused by the user. The FSF wasn't hostile to that idea, but considered it beyond their scope. Of course, any open hardware would automatically also be Free software-friendly.

I tend to agree that in practice, especially if there are no firmware updates forthcoming but it's just a cost-savings measure, loading the code from the host rather than from a ROM is a marginal issue. Strictly speaking though, I do think that the FSF have a point.

Comment Re:Another day, another codec. (Score 5, Interesting) 86

Those existing codecs are all very similar technically, and riddled with patents. If Monty can make something new (and he can, see CELT) and work around those patents (and he can, see Vorbis, Theora), then it's definitely a welcome addition. And a codec doesn't have to dominate to be useful; Vorbis is widely used (Wikipedia, all sorts of software that plays sound and music, including many if not most video games) and supported on a lot of platforms (including hardware players and set-top boxes), even if it never did completely replace MP3 and AAC. If nothing else, having a free and unencumbered option will keep the licensors of the proprietary codecs at least somewhat honest.

Incidentally, isn't it about time for Monty to get an EFF Pioneer award? He's been very successfully working on freely usable audio and video codecs for well over a decade now, starting at a time when many people didn't believe that a non-encumbered audio or video codec was even possible. Someone with his skills could probably make a very good living in proprietary codec development, but he chose to fight the good fight instead (and now works for Red Hat). He belongs on that list IMHO.

Comment Re:Gosh!!! (Score 2) 318

So here we are, at a crossroads. If a project produces the source code needed to build a complete, binary-perfect copy of their executable(s), but it was run through the C pre-processor, or C++ pre-processor, is that enough? It compiles, it builds with the version of tools the provider used... if you discount the pre-processor, it is effectively the original source code provided to the compiler. Is that enough?

I believe Stallman answered that question already, and as you would expect from him, it's a smart answer too. In the GPL (v3, but it goes back all the way to v1) it says "The “source code” for a work means the preferred form of the work for making modifications to it." So, if the creator of the source code actually works on the preprocessed source all the time, then it's okay to redistribute only that. If, in fact, any work done on the program is typically done on the original, non-preprocessed source, then that is the source code and that has to be distributed. This neatly avoids having to define a minimum level of readability by simply requiring that all users/developers be equal.

So, for JavaScript, if the authors actually do their programming directly on the minified version, then distributing only that would be okay. If they don't, and use a non-minified version for development (which everyone does), then I'd want to have that original version as well before I'd call it Free software.
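The same point works in any language; here's a Python illustration (names and numbers made up): both forms compute the same thing, but only one of them is the form anyone actually develops against, and that's the one the GPL calls the source code.

```python
# The "preferred form for making modifications": descriptive names,
# a docstring, whitespace.
def monthly_payment(principal, annual_rate, months):
    """Fixed monthly payment on an amortised loan."""
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months)

# What a minifier might emit: semantically identical, but nobody
# develops against this form.
def a(b, c, d): e = c / 12; return b * e / (1 - (1 + e) ** -d)

print(round(monthly_payment(10000, 0.06, 24), 2))
print(round(a(10000, 0.06, 24), 2))  # same number, unreadable source
```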
