Comment: Re:Infinity (Score 1) 1067

You seem to have that backwards. Dividing your five dollars by zero is like dividing it among zero people and suddenly, poof, it's gone!

Right. Exactly my point, except that I used "pieces" instead of "people." Which is to say, when you're dividing equally among n people, the unit isn't actually the people but the count of people. If you're dividing into labelled jars it is exactly the same; the jars aren't the units either, the count of portions is. You don't divide by the people, you divide by the count of people. The other person claiming apples/oranges is making the same mistake. And the unit issue shouldn't even come up when the question is divide-by-zero, which doesn't implicate units at all.

It always cracks me up when people misunderstand me, presume I said something that doesn't make sense, and then correct me by paraphrasing what I actually said but presented as a contradiction.

I guess it is no wonder that people can't map real human scenarios that rely on numbers to mathematics very well; they're already off by -2n most of the time. But you got the essential element: when you divide into zero pieces, it means you have zero pieces when you're done dividing. People will insist such behavior is "undefined," but in practice it is defined: either as an error condition, an action that is not allowed (an exception), or as infinity. On a computer this is defined differently for floats (infinity, under IEEE 754) and ints (an exception). The idea that it is undefined is simply ideology run amok, repeated for decades by school teachers. We have multiple competing definitions right now, for different problem domains, so the idea that we "can't" have a sensible default for non-academic situations is pretty silly.

Luckily the solution is also easy: don't use numbers directly in code; use objects that represent the units and overload the math operators on those units so that sensible defaults present themselves. For example, money classes can return 0. Graphics classes can return 0. Physics-simulation-related classes can return infinity. Still, it would reduce code complexity in most cases to have the computer default to 0 and raise exceptions or return infinity only for special academic math types.
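A minimal sketch of that idea in Python. The class name and the divide-into-zero-pieces-gives-zero policy are my own illustration of the comment's proposal, not any standard library's behavior:

```python
from decimal import Decimal


class Money:
    """Toy money type whose division defaults to zero instead of raising."""

    def __init__(self, amount):
        self.amount = Decimal(str(amount))

    def __truediv__(self, n):
        # Domain-specific default: splitting money into zero pieces
        # leaves pieces of zero size, rather than raising an exception.
        if n == 0:
            return Money(0)
        return Money(self.amount / Decimal(str(n)))

    def __eq__(self, other):
        return isinstance(other, Money) and self.amount == other.amount

    def __repr__(self):
        return f"Money({self.amount})"


print(Money(5) / 0)   # Money(0) -- the money-domain default, no exception
print(Money(5) / 2)   # Money(2.5)
```

A physics-flavored class could make the same `__truediv__` hook return infinity instead; the point is that the default lives with the unit, not with the raw number type.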

I think this shows it was perhaps mistaken of early computer scientists to treat computer numbers as if the job of the computer is purely mathematical. If early programmers had been logicians or philosophers instead of mathematicians, we'd probably have pragmatic real numbers for standard use, and special functions or types for academic math.

Comment: Re:Infinity (Score 1) 1067

Asking what is X divided by zero is no different than asking what is Y plus red, or what is Z times pineapple.

We had five bucks, and we decided to divide it up. After we were done dividing, the money was all gone. That is dividing by zero. It was divided into zero pieces. The inputs are the number of pieces you end up with, and the original value. The size of each piece is the output. So given that you know there are zero pieces in the end, you know that they have zero size, because of algebra.

This is true for almost any example involving money. Programmers who primarily deal with integers and money units find it natural to have anything that doesn't make sense default to zero. If there is no result, and the units are money, it is accurate to say the result is zero dollars.

The problem is, this doesn't work well for people doing scientific programming, who expect IEEE-standard results regardless of how much extra error checking this creates. And regardless of the fact that well over 99% of the time where divide-by-zero is possible, the sanity checks simply detect the zero and assign a result of zero to some variable without ever doing the division. Still, they consider this good.
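The float/int split mentioned above is easy to demonstrate. A small sketch in Python (note that plain Python raises `ZeroDivisionError` for float division too; the IEEE 754 convention of returning infinity surfaces through NumPy):

```python
import numpy as np

# IEEE 754 floats: positive x / 0.0 is defined as +infinity.
with np.errstate(divide="ignore"):        # silence the divide-by-zero warning
    f = np.float64(5.0) / np.float64(0.0)
print(f)                                   # inf

# Integer division by zero: defined as an error condition instead.
try:
    5 // 0
except ZeroDivisionError as exc:
    print("ints raise:", exc)
```

So both behaviors are "defined," just defined differently per type, which is the parent comment's point.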

So in the end the answer is to use high-level languages, and to use money objects with sensible defaults instead of floats and ints for money values.

Comment: Re:Mastermind? (Score 1) 121

If nerds didn't desire quality sources of information about what is new in the technology sector, then I would agree it is a joke.

But no, actually it is just pathetic. And the most pathetic thing is, if somebody did a good job at it they would probably get a lot of readers from high-value demographics. But it is impossible; editors are apparently compelled by some powerful force to cut corners continuously, without even an information-theory analysis of how many corners would be too many. If they ever catch up with economic theory from the '90s, somebody will try a different method. I'll give it another 50 years.

Comment: Re:Elop just fulfilled his destiny. (Score 0) 121

The problem with attempting to assess his competence on slashdot is that people just assume that the goal is "the company makes more money" or "the stock price goes up." But in reality, if they're chasing dollars in a corporate environment the goal isn't to increase the stock price. The goal is often for the stock price to go up and down at convenient times, without (clearly) violating any rules. Other times the goal is more opaque. You can't follow the money, because they have tools to move it around where you can't see it, and much of it moves based on secret contracts with other mega-corps.

So abstract is the modern corporate-governance environment that there is just no way to judge his competence or incompetence by the outcomes for the company itself. Even when a company appears to completely fail and go out of business, often the assets were "sold" to a subsidiary and then resold out of the company's control, and it was just a debt-laden shell that failed. From the sidelines, the worst possible failure cannot be distinguished from the most magnificent success.

In the old days I hoped MS would fail. These days, they are no threat to anybody. I want the company to succeed because they make good keyboards. And that is a bigger impact on *nix users than anything they do in the software sector these days.

Comment: Re:It's good (Score 1) 246

No, actually, having a different conclusion than you says nothing at all about my hearing. You might want to go to a pedanticologist and get checked.

I stand by what I said. And your strange attack doesn't even need refutation, because you simply present an opinion as if it somehow disproves my opinion. I don't doubt that your opinion is indeed different than mine.

I don't doubt that their speaker received applause. I find it very odd, though, this idea that that tells you anything about anything. If subjective applause gauges were a basis for industry analysis, Apple would have the largest market share in personal computers, mobile computers, personal media players, and cell phones. They don't actually lead any of those categories. Or any category. I'll bet they could get the biggest applause at many conventions, though. I'll bet if they showed up at an insurance industry event, they'd get the biggest applause.

Comment: Re:It's good (Score 0) 246

I didn't hear applause, I heard a lot of laughter at the idea that this matters. The interest is mostly from people who are happy with the proprietary toolchains from the involved vendor.

I agree it is a good thing, though. But in a very, very tiny way. CUPS was important. This is not.

Comment: Re:That last sentence makes no sense (Score 1) 260

Use of "they" being disallowed by the popular style guides is exactly the point I was making in mentioning style guides. Thank you for taking the time to understand what you're responding to before commenting. You added so very much to the discourse.

Comment: Re:Multiplatform is king (Score 1) 260

Objective-C is very, very, very usable on any *nix platform. If you think it is only useful in Apple proprietary environments, that says a lot about you and nothing about Objective-C.

The only time there is any Apple-specific constraint is when you're using their libraries. That may be most often the case, but there is no reason that it needs to be on your own projects. Serious projects don't just glom onto whatever the nearest proprietary library is, they actually have to evaluate options and make choices. You can absolutely choose non-Apple libraries whenever you want.

A truer statement would have been: "Objective-C is mostly used on 1 platform for entirely social reasons."

Comment: Re:That last sentence makes no sense (Score 0, Flamebait) 260

And for non-sexists, we just think of it as polite, inclusive communication to attempt to balance any non-inclusive terminology, such as is required in a language like English that lacks gender-neutral pronouns accepted by popular publishing style guides.

Calling it "corpspeak" is as absurd and offensive as claiming it is "political" speech. Only politicians and executives are being polite for political reasons. Everybody else is doing it just to be polite. Polite-speak.

Comment: Re: BI == Business Idiots (Score 2) 260

If you're targeting C++/Java developers with a new language, then if you're successful you do indeed get less interest over time from that group. Those who agreed left and joined you; those who stayed will like your offering less than average; the remaining non-converts become less receptive to you over time. This is true even assuming nobody ever changes their opinion: they either liked the new thing when they saw it, or not.

The case where interest "increases" is where they are saying, "awwww, how cute... but don't expect me to use it." They don't hate it, they're willing to talk reasonably about the shortcomings, but they're also probably never going to use it seriously. These people can be persuaded to agree it is less bad after you make changes to be more like them, but they're not going to actually switch anyway.

You don't do much better on your technical complaints. Are exceptions a thing that is the same between languages now? No? No. The semantics are often quite different. You can't just assume it works the same in a new language as in the old one. It is not predictable from the naming. You have to re-learn and memorize the semantics for every language. Naming a new set of semantics something different, something that isn't already overloaded, increases jargon quality within the language, both for new people and for people who use multiple languages. And it doesn't change the amount you have to learn; the familiar name doesn't mean you can skip parts because you know how it works in another language.

As for user-defined types, that is a specific feature that has advantages and disadvantages either way. There are real reasons to make the choices they did for the niche their language intends to serve. You don't attempt to make any case that their choice is somehow undesirable for that niche.

You really seem to not understand coding practices. Things like DRY: as an old-timer I can say yeah, I don't understand it either. We already had the teaching that code re-use is good. DRY just seems to take that lesson, adopt a pithy cliche phrasing of it, and then throw out the actual lesson and substitute a rule of thumb. Do you always want to avoid repetition? No, only the vast majority of the time. There are times when it is bad. How would a youngster who only grew up with DRY reconcile that? There is no provision in the way it is taught for determining when it is applicable. Easy to remember, sure. But if you can't remember that you want to maximize code re-use, cliches aren't going to save you.

The only way to use these sorts of modern ideas is to ignore them whenever they don't look useful, which actually means you can ignore them before you start and just follow traditional best practices instead. Those will lead you to ask how much code re-use you want, and then implement it. Usually that prevents repetition, except where it wasn't desired. ;)
