
Comment Re:No multi-dimensional arrays (Score 1) 434

This is all about having a standardized container class for data stored in multi-dimensional arrays. Most users, most of the time, will use whatever the default storage is. However, users writing high-performance numerical algorithms have a legitimate need to manually manage the layout of their data in memory so that they can optimize their data access patterns. There is no avoiding this, and if Go follows its current trajectory, array formats will be just as much of a fragmented mess as they are in the C/C++ world.
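For illustration, here is what that layout control looks like in numpy (a Python sketch, since Go has no equivalent today): the strided-array model stores the same logical array in row-major or column-major order, and an algorithm can pick whichever makes its inner loop walk contiguous memory.

```python
import numpy as np

# The same logical 3x4 float64 array in two physical layouts.
a_c = np.zeros((3, 4), order="C")  # row-major: rows are contiguous
a_f = np.zeros((3, 4), order="F")  # column-major: columns are contiguous

# .strides reports the byte step per dimension; this is the knob a
# performance-minded user needs access to.
print(a_c.strides)  # (32, 8)
print(a_f.strides)  # (8, 24)
```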

Comment Re:No multi-dimensional arrays (Score 1) 434

You can't. That's the problem. Because there is no good language-level support for multi-dimensional arrays, something as basic as storing an array of numbers gets pushed out to libraries. Since it's more difficult to write a library that supports multiple storage formats (even ignoring efficiency), most libraries don't, and the user is forced to write code to translate between them.

Comment Re:No multi-dimensional arrays (Score 1) 434

Last I checked, Go has no language-level support for scientific computing whatsoever. At MINIMUM they should provide a standardized, fully-strided, ndarray data container, so that the libraries that developers add later will at least be able to easily share data. And if this is all they provide, I hope at least the language definition is flexible enough to extend the syntax to support things like slicing and broadcasting.
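For reference, the slicing and broadcasting in question is what numpy already provides; a minimal Python sketch of broadcasting, where a row of per-column means is subtracted from every row of a 2-D array with no explicit loop:

```python
import numpy as np

x = np.arange(12.0).reshape(3, 4)  # 3x4 array, values 0..11
col_means = x.mean(axis=0)         # shape (4,)

# Broadcasting: the (4,)-shaped vector is stretched across the (3, 4)
# array, centering each column.
centered = x - col_means
print(centered.sum(axis=0))        # each column now sums to 0
```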

The scientific and engineering world really needs a language that combines the ease-of-use of python/numpy with language-level support for expressing concurrency. I hope Go will eventually become this language, but this is clearly not a priority for the Go developers at this time.

Comment Re:Could some one explain the following then (Score 1) 368

Unless you are using a high-end camera built for machine vision and scientific purposes that you are certain has a linear gamma curve, the numerical values in the image files are not proportional to photon counts. Likewise, if you display a linear ramp of data as an image (say, using something like MATLAB's imshow(repmat(linspace(...),...))), your monitor will not emit a linear ramp of photon intensity. Until MATLAB, PIL, OpenCV, et al. provide built-in conversion functions, you're going to have to use a home-rolled gamma conversion function whenever you display data, and whenever you load data from an image file (unless you used a camera that you're sure is linear).
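In the common case where a file is nominally sRGB-encoded, the home-rolled decode looks something like this Python sketch. The piecewise curve below is the standard sRGB transfer function; whether your file actually follows it is an assumption.

```python
import numpy as np

def srgb_to_linear(u8):
    # Decode 8-bit sRGB codes to linear light in [0, 1], using the
    # standard piecewise sRGB curve. Assumes the data really is
    # sRGB-encoded, which for most consumer files is itself a guess.
    v = np.asarray(u8, dtype=np.float64) / 255.0
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

# Mid-gray code 128 decodes to ~0.22 linear, not 0.5: the numbers in
# the file are not proportional to light.
print(srgb_to_linear([0, 128, 255]))
```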

Comment Re:The camera post-processes the sensor data (Score 2, Insightful) 368

"the assumption one makes is that these integer values are not photons ^( 1/gamma) but simple photon counts (scaled to the 0-255 range)."

That is a very reasonable assumption to make, and one that most people who don't know anything about gamma make. Unfortunately, it's flat out false. Both MATLAB and PIL return the gamma-compressed data as-is, unless you used a linear machine vision camera (and you should, if you're serious about this stuff), which of course had no gamma compression to begin with. If you need proof, load and save an image. The image data will be bit-for-bit identical to the original, indicating that no conversions were performed. (Note that the header might have slightly different metadata, and JPEG re-compression is almost always lossy.)
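The round trip is easy to check yourself; a sketch using PIL (Pillow) with an in-memory PNG, so no files are touched:

```python
import io
import numpy as np
from PIL import Image

# Synthesize an 8-bit RGB image, save it as PNG, and reload it.
orig = np.random.randint(0, 256, (16, 16, 3), dtype=np.uint8)
buf = io.BytesIO()
Image.fromarray(orig).save(buf, format="PNG")
buf.seek(0)
reloaded = np.asarray(Image.open(buf))

# PNG is lossless and PIL applies no gamma conversion on load or save,
# so the pixel data comes back bit for bit.
print(np.array_equal(orig, reloaded))  # True
```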

Gamma is so rarely handled properly, even by scientists and engineers, that OpenCV (the most popular library for computer vision) does not even contain a function for doing gamma (de)compression.
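So in practice you roll your own; with 8-bit data, a 256-entry lookup table is all it takes. A Python sketch using a plain power law (the exponent 2.2 is an assumption about the encoding, not something stored in the file):

```python
import numpy as np

def decode_gamma_u8(img, gamma=2.2):
    # 256-entry table mapping each 8-bit code to a linear value in [0, 1].
    lut = (np.arange(256) / 255.0) ** gamma
    return lut[img]  # fancy indexing applies the table per pixel

img = np.array([[0, 64, 128, 255]], dtype=np.uint8)
print(decode_gamma_u8(img))
```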

Comment Re:Gamma and Computer Vision (Score 2, Insightful) 368

I was talking about scientific and engineering uses, which often depend on the gamma curve even if most authors ignore it.

Most photographic software is oriented towards deliberately messing with gamma curves to achieve arbitrary aesthetic goals. Consumer cameras are even starting to do this on board the camera in, as far as I know, completely undocumented ways (indeed, the manufacturers probably consider them trade secrets). See features like iContrast in Canon cameras.

Comment Gamma and Computer Vision (Score 3, Interesting) 368

Gamma is often poorly understood even by people doing scientific and engineering work using images.

Does your algorithm depend (explicitly or implicitly) on the light intensity -> pixel data mapping?

If NO: You're probably wrong. Go read about gamma. Just because the picture looks right to you, doesn't mean it looks right to your code.

If YES:

Do you have the luxury of taking the pictures yourself?

If NO: You're stuffed. Pretty much all images on the internet and most public research databases have unknown/unreliable gamma curves.

If YES:

1. Spend a lot of time calibrating your camera yourself. This is only cheap if your time is worthless.

or

2. Buy a machine vision camera. A $600 machine vision camera will have the specs of a $50 consumer camera, but at least you will know the gamma curve.

or

3. Ignore the gamma issue, cross your fingers, hope it's not hurting your performance, and publish your results knowing that you're in good company and nobody will call you out.

Comment Re:Notational Velocity (Score 1) 1007

Unfortunately, you're wrong about the triple-architecture (by which I assume you really mean triple platform) support.

I recently exchanged a couple e-mails with Zachary Schneirov, the author, about what form a Linux version could take, but no port currently exists. Have a look at the (Cocoa) source code in the git repository and see for yourself.
