
Comment Re:Poster, you forgot the "controller" in MVC. (Score 1) 188

"Escape analysis and data flow analysis have existed for quite some time..."

Okay, well, neither of those is built-in compile-time static analysis of memory allocation that efficiently prevents memory leaks, so I'm not sure why you mention them.

"Lisp allows for..." ...yeah, I'm going to stop you right there.

Comment Poster, you forgot the "controller" in MVC. (Score 1) 188

I agreed with the poster up until the last part, where he suggested that the "massive view controller" problem was a misunderstanding of what constitutes a "view" and a "model". I'd ask the poster whether he understands what constitutes a "controller" in the MVC pattern. From that, I surmise that whoever told him he didn't know what he was talking about was probably correct, at least in that particular instance.

But that the poster didn't realize this throws into doubt the earlier things he said, which I had agreed with up until that sentence. It strikes me now that those perspectives might really have been the result of the poster's "arrogance and ignorance", as the phrase goes. And furthermore, as for his claims of things having been a solved problem for "20 to 50" years: while I agree this happens a lot, I think we might disagree on what the specific instances of it are.

I would say, for instance, that while it didn't need to invent a new language to do it, Rust provides a good solution to dynamic memory allocation (namely, compile-time static analysis) that is different from, and in many ways better than, garbage collection and automatic reference counting (ARC). While he might claim this problem was solved decades ago, it is simply a fact that compile-time static analysis of dynamic memory allocation did not exist until recently, except in the form of "scope" as opposed to "lifetime", which, while similar, is a different concept.
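Since that scope/lifetime distinction is the crux, here's a minimal sketch of it (assuming current Rust with non-lexical lifetimes):

    // A borrow's lifetime ends at its last use, not at the end of its
    // lexical scope -- this compiles:
    fn lifetime_vs_scope() {
        let mut s = String::from("hi");
        let r = &s;         // shared borrow begins...
        println!("{}", r);  // ...and its lifetime ends here, at the last use,
        s.push('!');        // so mutating `s` afterwards is fine,
        println!("{}", s);  // even though `r` is still in scope.
    }

    fn main() {
        lifetime_vs_scope();

        // Whereas this is rejected at compile time ("`x` does not live
        // long enough") -- a use-after-free in C, a compile error in Rust:
        //
        //     let r;
        //     {
        //         let x = 5;
        //         r = &x;
        //     }
        //     println!("{}", r);
    }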

Now, there will certainly be a great deal of rediscovery, of re-inventing wheels poorly (for instance, I just found that someone re-created the Java date object, but made the Unix epoch (zero milliseconds) dependent on the time zone), but it must also be noted that there are still a fair number of new, interesting ideas.
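(For reference, the epoch is a single fixed instant, 1970-01-01T00:00:00 UTC, so a milliseconds-since-epoch count is the same number in every time zone. E.g. in Rust:)

    use std::time::{SystemTime, UNIX_EPOCH};

    fn main() {
        // The epoch is an absolute instant; this count does not
        // depend on the local time zone at all.
        let ms = SystemTime::now()
            .duration_since(UNIX_EPOCH)
            .expect("system clock set before 1970")
            .as_millis();
        println!("{} ms since the Unix epoch", ms);
    }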

Comment other options ... (Score 1) 1067

I've often used "1" as a drop-in when dividing by zero. Also for the logarithm of zero: I'm usually working with information units, so it's p ln p, and if p = 0, I just say that p ln p = 0, which is kind of like saying ln 0 = 1.
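For what it's worth, the usual justification for the p ln p = 0 convention is the limit, which is a standard L'Hôpital computation:

    \lim_{p \to 0^+} p \ln p
      = \lim_{p \to 0^+} \frac{\ln p}{1/p}     % a -\infty/\infty form
      = \lim_{p \to 0^+} \frac{1/p}{-1/p^2}    % L'Hôpital
      = \lim_{p \to 0^+} (-p)
      = 0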

I think mathematically infinity or negative infinity would make the most sense, unless it's 0/0, in which case you'd apply L'Hôpital's rule. But I don't think the computer's going to do that. Another way: you could create a number system that keeps a running tally of divides-by-zero and multiplies-by-infinity, minus multiplies-by-zero and divides-by-infinity (flipping the sign for negative infinity).
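Incidentally, IEEE 754 floating point already behaves roughly this way; a quick sketch in Rust:

    fn main() {
        let one = 1.0_f64;
        let zero = 0.0_f64;

        println!("{}", one / zero);   // inf
        println!("{}", -one / zero);  // -inf
        println!("{}", zero / zero);  // NaN (the 0/0 indeterminate case)

        // Integer division by zero is different: with runtime values it
        // panics, and with constant operands rustc rejects it outright.
    }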

Comment definitely C, ironically the most object-oriented (Score 1) 211

Definitely C; ironically, the most modern programming language of the bunch.

Maybe someday Apple will finally get with the times and use an existing, popular, modern object-oriented programming language like C++ or Java.

You know, shortcut the whole process of making a feature-incomplete, idiosyncratic, and verbose programming language with inconsistent syntax, and skip ahead to what everyone else had half a century ago.

Why oh why don't they just use C++ or Java?

Sigh.

Comment author evidences how bad U.S. science literacy is (Score 1) 795

The author shows, by his very writing of the article, just how bad science education is in the U.S. That is, he himself is a victim of its very low standards and of the lack of teaching and emphasis on the philosophy of science. The author should be ashamed and embarrassed for being a shining example of everything that is wrong and antiquated about science education in America. His philosophy belongs to the pre-Socratics, the Sophists. There is nothing new in what he is saying; it is embarrassingly old, and embarrassingly banal.

If the article proves anything, it's by way of example: the author is an excellent specimen of the people our education system has left behind.

Comment planning to fail? (Score 1) 269

So, here's what I get from the article:

1) They start by using a measure that's not at all correlated with what they're interested in.
2) They then completely switch the measure, so you have two completely unrelated filters on the data; and the data (DNA) is very high-dimensional, so your final result set is of course going to be minuscule and effectively random. And lo and behold, that's what they got. (A quick simulation of this is sketched below.)
3) On top of that, they looked at the correlation between the first filter and the second, and found, lo and behold, that their method has no chance at all of telling them what they want to know, which we already knew from step 1.
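To make point 2 concrete: two independent 1% filters on a million candidates leave around a hundred survivors that are arbitrary with respect to any real signal. A toy simulation (the thresholds and counts are made up for illustration):

    // Tiny xorshift PRNG so the sketch needs no external crates.
    struct Rng(u64);
    impl Rng {
        fn next_f64(&mut self) -> f64 {
            self.0 ^= self.0 << 13;
            self.0 ^= self.0 >> 7;
            self.0 ^= self.0 << 17;
            (self.0 >> 11) as f64 / (1u64 << 53) as f64
        }
    }

    fn main() {
        let mut rng = Rng(0x9E3779B97F4A7C15);
        let n = 1_000_000;

        // Two filters whose scores are statistically independent,
        // each keeping the "best" 1% of candidates.
        let mut pass_both = 0;
        for _ in 0..n {
            let score_a = rng.next_f64();
            let score_b = rng.next_f64(); // unrelated to score_a
            if score_a > 0.99 && score_b > 0.99 {
                pass_both += 1;
            }
        }
        // Expected survivors: n * 0.01 * 0.01 = ~100, chosen effectively
        // at random -- a minuscule and arbitrary result set.
        println!("{} of {} candidates survived both filters", pass_both, n);
    }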

So... uh... do they see what's wrong with their methodology? Could it be more obvious? This was botched really badly.

Comment it varies wildly (Score 1) 2

It will vary widely depending on a number of things, including database indexes, system tables, machine specs, operating system, recent table usage, table size, whether an execution plan is cached, etc.

* Machine specs - obviously: memory, CPU, hard drive bandwidth and seek time, etc.
* Operating system - this determines the memory paging, process threading, disk caching, etc.
* Indexes - running a query against an indexed column vs. an unindexed one will make orders of magnitude of difference, especially for larger tables (see the sketch after this list).
* Recent table usage - determines whether the table is already paged into memory.
* Table size - determines how much of the table is paged into memory, how many comparisons are needed to produce a result set, etc.
* System tables - contain optimization parameters that affect performance and execution plan creation, such as how many rows are expected in a table; if these are out of sync with reality, the database could pick a poorly performing execution plan. System tables can also affect paging and other global parameters that affect performance.
* Whether an execution plan is cached - determines whether the database has to re-derive an execution plan from scratch.
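On the index point in particular, here's a toy sketch (not a real database, just the underlying effect: an index is essentially a sorted structure, turning O(n) scans into O(log n) lookups):

    use std::time::Instant;

    fn main() {
        // Toy stand-in for a table column: 10 million sorted keys.
        let table: Vec<u64> = (0..10_000_000).collect();
        let needle = 9_999_999u64;

        // "Full table scan" -- what you get with no index: O(n).
        let t = Instant::now();
        let found_scan = table.iter().any(|&k| k == needle);
        println!("scan:  {:?} (found: {})", t.elapsed(), found_scan);

        // "Index lookup" -- a sorted structure allows O(log n) search,
        // which is roughly what a B-tree index buys you.
        let t = Instant::now();
        let found_idx = table.binary_search(&needle).is_ok();
        println!("index: {:?} (found: {})", t.elapsed(), found_idx);
    }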

All these things are going to add so much variance that it's going to totally swamp any chance of an apples-to-apples comparison.

To do a real comparison, you really have to look at it at the logical level rather than the empirical level: which database's algorithms will optimize better, etc.

All in all, you're probably just not doing it right. They should be about the same, except in exceptional cases.
