Comment Why "on Mac"? (Score -1, Troll) 306

As a Linux user, I was about to ignore the article when I noticed the sentence "It affected not only Sun's Java but other implementations such as OpenJDK, on multiple platforms, including Linux and Windows."

If I understand it correctly, all Java implementations have this flaw, so why write that it is a "MacOS vulnerability" and not a "Java vulnerability"?

I want to know more about how it affects my Ubuntu box!

j.

Comment Re:Nothing to worry about for academics (Score 1) 303

You have not read the article (or you have read it but failed to understand it). For example, you write "so long as the user owns the input" -- and that is quite obviously not the case here, because you don't own the data that was used to produce the graphics.

Secondly, you don't own the software -- it is not running on your machine. Rather, you ask someone to produce specific results for you using software that you don't have and data that you don't own. It's like asking a friend who owns rabbits and Adobe Illustrator to draw a picture of a rabbit for you, rather than drawing a picture of your own rabbit using your own software. It's a service, not a piece of software.

Thirdly, the license is quite specific about what the subject of the copyright is: (i) the data might be copyrighted by a third party, and therefore it is necessary to link to the specific results so as to attribute that particular data correctly; (ii) "specific images, such as plots, typeset formulas, and tables, as well as the general page layouts" are copyrighted. If you care, you can reproduce them on your own, using your own software and data.

Finally, it is way better to have a precise license that clearly states what is allowed and what is not allowed than to be unsure whether the results can be used (and in what form), and who actually owns the copyright.

j.

Comment Re:Nothing to worry about for academics (Score 4, Insightful) 303

You think it's not reasonable? Then write your own Wolfram Alpha, if you really think it is that simple, and use that instead of WA for your work. Man, you have no idea what you are talking about here. Modern biology would be nowhere if the people who build such "Turing machines" were not credited for their work and, consequently, did not get grants for their research.

For example, tons of software in bioinformatics is built from completely open source code and well-known algorithms, using data gathered by experimentalists, and yet the authors get the recognition -- because someone had to come up with the idea, gather (and maintain!) the data, run tests, implement it, etc. etc. Believe me, even with simple ideas and algorithms, and for simpler data sets, this is a shitload of work. Heck, even re-implementations of existing tools get recognized.

Secondly, scientific procedure requires that you publish your methods -- if you used software X to generate figure Y and table Z, then you have to write down how you did it. And no one in her or his right mind will reimplement existing tools just for the sake of the current work without a very good reason.

That said, sometimes a tool like that lets you "get on the trail" -- which you then pursue using something else. For example, WA might give you a hint that there is a connection between cancer and, say, cigarettes, and you then show this connection using clinical trials. In such a case, however, where you publish neither the data from WA directly nor any figures derived from it, you are not required to cite it.

Note that I am in no way convinced that WA is of any use. The parts of it that overlap with my area of expertise (biology / biocomputing) are, to say the least, naive, rudimentary, and mostly useless.

j.

Comment Nothing to worry about for academics (Score 5, Informative) 303

All they ask is that you attribute them when publishing results derived from their service. Example:

Methods: "The comparative population studies were derived from the Wolphram Alpha service (Wolphram, 2009)"

Regular thing for academics. I cite the NCBI BLAST service, I cite PFAM, I cite dozens of other services out there. Most of these tools require or ask for attribution, and in most cases this is necessary anyway as part of the scientific procedure.

j.

Comment Interesting (Score 2, Interesting) 125

One thing first. There *are* certain esthetic and technical rules / guidelines that we could call "objective" in the sense that they are very general. For example, a photograph usually looks better if the composition is balanced, if the rule of thirds (or golden mean) is used, if the lines in the picture are coherent and lead the eye in the right direction (e.g. towards the subject), if the photograph is correctly exposed, if the colors match, etc. Of course, some of the greatest photographs break these rules; however, as in many things, you get away with breaking the rules if you know what you are doing, and you cannot do it very often.

I can imagine that you can come up with an engine that is able to detect how "rule conformant" a given picture is.
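
For instance, a toy scorer along these lines -- entirely my own sketch, not how the engine from the article works, and assuming NumPy and Pillow are available -- could rate two of the rules mentioned above, exposure and the rule of thirds:

    # Toy "rule conformance" scorer -- an illustration only, not the engine from
    # the article. Assumes NumPy and Pillow; the file name below is hypothetical.
    import numpy as np
    from PIL import Image

    def exposure_score(gray):
        """Penalize heavy clipping at pure black or pure white."""
        clipped = np.mean((gray < 5) | (gray > 250))
        return max(0.0, 1.0 - 5.0 * clipped)   # 1.0 = no clipping

    def thirds_score(gray):
        """Reward images whose 'visual weight' (local contrast) sits near a thirds intersection."""
        gy, gx = np.gradient(gray.astype(float))
        weight = np.hypot(gx, gy)
        total = weight.sum() or 1.0
        h, w = gray.shape
        cy = (weight.sum(axis=1) @ np.arange(h)) / total / h   # normalized centroid, 0..1
        cx = (weight.sum(axis=0) @ np.arange(w)) / total / w
        intersections = [(1/3, 1/3), (1/3, 2/3), (2/3, 1/3), (2/3, 2/3)]
        d = min(np.hypot(cy - iy, cx - ix) for iy, ix in intersections)
        return max(0.0, 1.0 - 3.0 * d)

    def score(path):
        gray = np.array(Image.open(path).convert("L"))
        return 0.5 * exposure_score(gray) + 0.5 * thirds_score(gray)

    print(score("photo.jpg"))

A real engine would presumably learn such weights from large sets of human ratings rather than hard-coding them, but the principle is the same.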

However, pure formal esthetic judgement is rarely what we mean when we talk about a "good photograph".

There is one main issue that will make it very hard to match our "overall" esthetic sense: we cannot detach the image contents from the "pure form". That means that if we see a worried woman holding a child, we cannot look at it merely as a composition. Also, we always take into account what we know about the subject: e.g. in a photograph of a man standing in water, if the frame ends just below the point where his legs enter the water, we will have the impression that his legs are cut off and that there is something wrong with the photograph. Finally, facial expression is immensely important for the perceived esthetics of a photograph.

I did some experimenting -- some of the truly great photographs of our times got rather lousy scores (e.g. Dorothea Lange's famous photograph, but also some color photographs), while some rather random shots I took of my sons got as many as five out of five stars. Well. Maybe it will still be useful to someone for filtering out the worst photographs.

j.

Comment That's why you have Impact Factor (Score 5, Informative) 213

There are hundreds or thousands of journals with fairly low standards. Even if they are not industry founded, they make it relatively easy for almost anyone to publish almost anything. I know of scientific institutions that have their own journals just so that their (lousy) researchers can publish *somewhere* and have a non-zero publication list.

That said, it is also fairly easy to see how good a scientific journal is, especially for someone who reads the scientific literature. The system is not perfect, but it is better than nothing, and it relies on the number of times articles from a journal get cited. This metric gives rise to the "Science Citation Index" (how often did I get cited?) and the "Impact Factor" (how often, on average, does an article from a given journal get cited?).
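
The two-year IF itself is a trivial calculation -- here is a minimal sketch with made-up numbers (my own illustration, not an official formula from any particular indexing service):

    # Standard two-year Impact Factor: citations received in year Y by articles
    # published in years Y-1 and Y-2, divided by the number of citable articles
    # published in those two years.
    def impact_factor(citations_this_year, citable_articles):
        return citations_this_year / citable_articles

    # Hypothetical journal: 1200 citations in 2009 to the 400 articles
    # it published in 2007-2008 -> IF of 3.0.
    print(impact_factor(1200, 400))

The hard part, of course, is not the arithmetic but collecting and curating the citation data.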

Think Google: this is exactly what the original Google algorithm used -- the number of times someone found a piece of information useful or reliable as a measure of how relevant, important, or interesting that information is. However, IF / SCI is much older than Google or the WWW.

Both indices can be misused or manipulated. Furthermore, they differ wildly between fields (in particular, medical journals have ridiculously high impact factors) because of differences in the number of citations per article and in article turnover rates. Finally, it can be really hard for a new journal to get a high IF because of preferential attachment -- scientists flock to the journals that already have high impact factors.

Still, they are better than anything else.

j.

Comment Both. (Score 1) 397

Human beings are individuals. They have a genome (well, actually two, 'cause of the mitochondria), they evolved, and they form a population of interbreeding animals.

That said, they provide an ecosystem for a large number of microbial species, some of which are symbionts, some parasites, and some can be both. In general, we cannot live without our symbionts, and our symbionts depend on us.

All that isn't news; this perspective on the human individual has been around for decades. What is new is that with second-generation sequencing it is now possible to thoroughly investigate the composition of our microbial symbionts and parasites. It is an exciting new technology that enables projects such as the 1000 Genomes Project, Neanderthal genome sequencing, metagenomics, and much, much more.

Just one more remark: given a population of genetically identical bacteria, it is sometimes wrong to call each bacterial cell an "individual". These cells can collaborate, exchange information, and shape their environment, acting more like a single organism than like separate individuals. There are even bacteria that can get together, differentiate, and form a macroscopic, multicellular structure. So saying that we are colonised by 100 trillion individuals is an exaggeration.

That said, we too can view ourselves as a colony of (mostly -- think sperm / eggs and T cells) genetically identical cells that communicate, collaborate, and shape their environment, and that are also (mostly -- think blood cells) physically linked together. And each of our cells can be viewed as a symbiosis of two organisms, each with its own genome and even its own genetic code (yep, the genetic code of the mitochondria differs from the one used in the nucleus).
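
To illustrate that last point, here is a tiny sketch (my own example, assuming Biopython is installed) comparing a few codons between the standard nuclear code and the vertebrate mitochondrial one:

    # Compare the standard (nuclear) genetic code with the vertebrate
    # mitochondrial code for a few codons where they differ.
    # Assumes Biopython (Bio.Data.CodonTable) is available.
    from Bio.Data import CodonTable

    standard = CodonTable.unambiguous_dna_by_name["Standard"]
    mito = CodonTable.unambiguous_dna_by_name["Vertebrate Mitochondrial"]

    for codon in ("TGA", "ATA", "AGA"):
        nuc = standard.forward_table.get(codon, "Stop")
        mit = mito.forward_table.get(codon, "Stop")
        print(f"{codon}: nucleus -> {nuc}, mitochondrion -> {mit}")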

j. (IAAB)

Comment Sleep deprivation (Score 1) 1127

I'm not a programmer, and programming for me is just a means of getting my science done. However, I wonder why no one has mentioned sleep deprivation yet. I agree that there is a point up to which lack of sleep actually seems to help, or at least doesn't matter. But then there is a second line, after which staying awake and doing demanding mental work is pure torture.

I have had various experiences with harsh conditions for coding (and doing science in general), but nothing comes close to that.

j.
