Comment Paul (Score 1) 25

As of right now, I would like to vote for Rand Paul. He's the only person who believes in the Constitution.

His foreign policy isn't as nutty as the media has made it out to be.

Comment Re:Money to be made (Score 1) 412

>>You know what they call "alternative medicine" that is proven to actually work? MEDICINE.

Do you know what they call people who quote memes without knowing that they're actually wrong?

Urban legend spreaders.

Ok, I guess that's not as pithy.

But seriously, that's not the difference between alt med and medicine. Peppermint oil has a strong research base showing its effectiveness for IBS, but it will never be "medicine," even though it works, because it's something you can pick up at any supermarket or GNC.

And that's not splitting hairs, either. It is not regulated by the FDA as a drug, so it is not "medicine," even though it is highly effective.

The official definition of alternative medicine (from the FDA, WHO, NHS, NIH, etc.) is any medical practice that is not typically part of conventional medical practice.

Comment Re:Claims without evidence (Score 1) 412

>I'm disappointed that even the geeks of /. are so easily persuaded by pharmaceutical industry propaganda.

Even though peppermint oil had the highest ratings in both quality of research and effect size of any of the studied IBS treatments in the paper, I'm sure there are a number of people reading this on Slashdot right now secretly suspecting it must be bullshit because it sounds too "alt meddy" to them.

>Meanwhile, a lot of prescription medication is clearly dangerous. How many herbal supplements have been taken off the market recently because of health risks? Ephedra is the only one that comes to mind, and it isn't even all that dangerous by pharmaceutical standards. How many FDA-approved drugs have been taken off the market recently? Dozens.

In the IBS paper alone, there were several drugs that were pulled from the market for being too dangerous.

But it works the other way as well. Just because something is all hippie and natural doesn't mean it's inherently safer or free of side effects. St. John's Wort interacts with a long list of drugs because of its effect on CYP3A4. Grapefruit juice, incidentally, can be dangerous as well if you're on a lot of medications, because it has the opposite effect on that same enzyme.

Just because the FDA doesn't ban them doesn't mean they aren't going to be bad for you. In general, the FDA does not regulate herbs unless they can no longer be generally regarded as safe (GRAS).

Comment Re:Claims without evidence (Score 4, Interesting) 412

While you are right that there are a lot of studies showing no effect or a negative effect from alt med drugs, there are also peer-reviewed and high-evidence studies showing that some of them do work and are effective.

My wife is a member of ASPEN, and so I get to read through a lot of their journals with her. In a paper on treatments for IBS, the "drug" with the highest strength of evidence and effect size was... peppermint oil. The research shows pretty conclusively that it is better than a lot of IBS-specific medicines that have come out in recent years, several of which got pulled from the market for being dangerous and can now only be prescribed in limited situations.

Peppermint oil will never be a medicine, because you can buy it at the grocery store. But it *is* highly effective at treating a very serious disease.

Comment Re:Money to be made (Score 1) 412

>Why? People buy it anyway and the studies cost many millions of dollars. They have NO interest in proving (or disproving) anything about these supplements. In fact they had congress pass laws explicitly preventing the FDA from regulating them so that they wouldn't have to prove their claims.

Fun fact: there are actually peer-reviewed scientific journals for reporting these kinds of results. Some common "alternative medicines," like tea (boring, I know), have had literally hundreds of studies on their effectiveness at doing various things, with well-understood results.

This does not mean they're effective. I skimmed numerous studies on milk thistle that all agreed it didn't have any of the reputed health benefits for the liver.

You can learn more about all the data the NIH tracks on alt med here: https://nccih.nih.gov/health/p...

Or read through a peer-reviewed journal here: http://www.alternative-therapi...

Comment Re:Gov't contractors are not paid by the hour (Score 1) 253

>Forcing long hours on contractors and saying "well, we pay your company hourly" is an immoral load of bullshit. This is nothing less than government-sanctioned overtime fraud.

The federal government is the worst offender against labor laws in the country.

It's nice being part of the same organization that investigates and prosecutes offenses, isn't it?

Comment Re:Problems in C++ (Score 1) 386

>1) No, that's not the case. The difference between .at and operator[] is that .at() has a const overload and operator[] is not.

Are you sure about that?

http://www.cplusplus.com/refer...

>2) Most high performance apps (eg. games) turn off bounds checking in any case for performance reasons.

Which is fine.

Personally, I'd have preferred it if I could simply enable or disable bounds checking on [], so I could test my code to make sure I'm not going to fry memory, and then disable bounds checking on performance critical code for release.

>Uh, stringstreams? http://stackoverflow.com/quest...

I'm not saying that you can't write your own functions, just that the STL string class is not a drop-in functional replacement for string.h.

Comment Re:Problems in C++ (Score 2) 386

>>1. Dude ... if you want to query the size of an array, use a vector. No, it doesn't make your code less readable. And, no, you don't have to use .at() everywhere: C++ has this thing called operator overloading. Maybe you've heard of it. You can use array syntax with vectors. Use vectors.

Dude, don't use square brackets with STL arrays and vectors just to make your code more readable. The [] operator skips bounds checking, which is the main reason for using these classes in the first place. .at() is the proper method to use in pretty much every case, unless you are so confident in your bounds that it's worth the trivial speed increase in access time.

>>2. I'd like to know what functionality you think you need string.h for when using C++ strings. I've found the standard library string type quite feature-complete.

The biggest gaps were filled in C++11 with replacements for atoi() and so forth, but there's still no standard-library replacement for strtok() or some of the other string.h functions.

>>3. C++ isn't an interpreted language; of course it won't have much reflection.

Sure. It makes life more difficult, though, by pushing those checks out to the linker instead of letting you express them directly in the language.

>>4. Forward declarations are not for saving the compiler time. They are for declaring a linkage interface with external code. If you ever even thought seriously about writing a C++ compiler you would know the language is not designed to make doing so easy.

Not my point. Obviously you need declarations for extern names not found in the file scope. But for functions within the same file scope, you still need forward declarations, which exist only to spare the compiler an extra pass over the code. That might have made sense in the '80s, but not today. Maybe it's not a huge deal, since it's just a copy, paste, and edit every time you change a function definition, but it otherwise serves no benefit.

Also, there shouldn't be much need for #ifdef guards any more. It's 2015. We should be able to include the same function definition twice without the universe breaking.

Comment Problems in C++ (Score 1) 386

>What kind of complexities has modern C++?

1. C++ still doesn't let you query a C-style array to determine its size, even though dynamic arrays track that information anyway, and it can be computed for statically defined arrays within their own scope.

So every function using C-style arrays must also pass in a size_t holding the array size. This hurts readability by cluttering the parameter list, and exposes you to buffer overflow errors.

Legal:
int arr[10];
for (int i : arr) cout << i;

Illegal:
void write_array(int arr[]) { for (int i : arr) cout << i; } // arr has decayed to int*; the size is gone

STL arrays and vectors are obviously better, at the cost of decreased code readability. Square-bracket accesses are easier to read than .at()s everywhere.

2. Strings. Even with the string type, it is still shitty to use, still has terrible support, and you still have to use the C library's string.h for some functionality, since nobody has bothered to port all of it into the C++ standard library. This means that people who want to use the C++ string class still need to know the C way of doing things, and are still vulnerable to the same off-by-one errors that have been around for decades.

3. Almost no reflection capabilities.

4. The language still enforces rather idiotic rules about class and function definitions that modern languages have done away with, and for the better. It's not like putting #ifdef guards on your code is difficult, but in these modern times it should not be necessary. And forward declarations are a way of saving compiler time at the expense of programmer time. This is the opposite of what should be happening. Compilers are there to make programmers' work easier, not the other way around.

5. C++11 and later has made great strides in simplifying the life of programmers, but its cruft accumulation shows.

Comment Re:Stop trying to win this politically (Score 1) 786

>If you want to talk about science, then show me a tested climate model that has been subjected to an empirical test of its validity. It isn't that hard guys. We have a lot of very accurate historical data. Feed in past climate data and see if your climate model can predict the past or the present accurately. The first model that can do that which isn't just a collection of plug variables is something worth taking seriously.

What? No.

You have it completely backwards. All serious models are trained on and tested using historical data. If they can't even predict the past, what use are they?

But - here's the key point - predicting the past is *worthless* other than as a sanity check. As Garrison Cottrell told me, predicting the past is easy (even trivial). It's predicting the future that is hard.

The only way to really know for sure if a model works is to test it moving forward. And the IPCC doesn't have a great track record at that.

Comment Re:C++ (Score 1) 242

>STL sucks, I still have to do single character input and output from files, so much for getline BS

Gah, I/O in C++ is so horrible. In just the last month I've come across the following:

1) No platform independent way to do non-blocking I/O.

2) No iostream-compatible way of doing dup() or dup2(). You can change the buffers on iostreams, but this is not the same thing.

3) Just how shitty iostreams are at processing input files in a fault tolerant manner. On any major project, I always seem to just drop down to reading files one character at a time.

Comment Re:No locks (Score 1) 449

>Also, some problems can't be done in parallel, but we won't know how many can until we start trying....and then try for a few decades.

Right, but there's also a grey area between completely serial and embarrassingly parallel, where methods like this will allow scaling algorithms up from "a few" compute nodes to "many," with the optimal number depending on the specific algorithm.

The biggest problems are still the same ones that existed when I got my Master's over a decade ago. Language support for parallelism isn't very good (I personally used MPI, which was awkwardly bolted on top of C++), it takes a certain amount of specialized knowledge to write parallel code that doesn't break or deadlock your machine (and writing optimized parallel code is a step beyond that), and library calls aren't all thread-safe. On the plus side, a lot of frameworks and libraries are now multithreaded by default, which nicely isolates the problems of parallel computing away from people who haven't been trained in it, and gives the benefits of parallel computing with only the downside of having to use a framework. =)

Comment No locks (Score 2) 449

Ungar's idea (http://highscalability.com/blog/2012/3/6/ask-for-forgiveness-programming-or-how-well-program-1000-cor.html) is a good one, but it's also not new. My Master's is in CS/high-performance computing, and I wrote about it back around the turn of the millennium. It's often much better to have asymptotically or probabilistically correct code than perfectly correct code, when perfect correctness requires barriers or other synchronization mechanisms, which are the bane of all things parallel.

In a lot of solvers that iterate over a massive array, only small changes are made at any one time. So what if you execute out of turn and update your temperature field before a -.001C change comes in from a neighboring node? You're going to be close anyway. The next few iterations will smooth out those errors, and you'll get far more work done, in a far more scalable fashion, than if you maintain rigor where it isn't strictly needed.
