The common truth of the int datatype in many languages is that it is n-bit arithmetic, meaning that it is arithmetic modulo 2^n. If we keep adding 1, we get back to 0. This is a perfectly respectable arithmetic in itself, and can be used, if used carefully, to determine integer arithmetic results. But to say that int is integer arithmetic with bounds and overflow conditions is to say that it is not integer arithmetic. Similarly, to say that float is real arithmetic, with approximation errors, is to say that it is not real arithmetic.
This is not to say that mythos is by definition false, but typically if mythos was true, it would be semantics or pragmatics. Mythos is the collection of comfortable half-truths that we programmers tell each other so that we do not have to handle the full truth.
This is from Mill's Theoretical Introduction to Programming. I think it's a good way of explaining why people buy into this "rise of the machines" crap. We use very human terms like "intelligence" and "instruction" to refer to patterns of bit-flipping on an inanimate state machine. These words are shorthand at best, and it takes a lot of poetic license for them to mean what they mean in language outside of computers. The programming language, and with it artificial intelligence, is a useful abstraction. As we strip the layers away we eventually get to the essence: a glorified abacus.
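The wrap-around in the quoted passage is easy to demonstrate. Here's a minimal Python sketch (Python's own ints are unbounded, so the 8-bit mask is what supplies the modulo-2^n behavior; the function name is mine, not from the book):

```python
MASK = 0xFF  # 8-bit arithmetic: every result is taken modulo 2**8 = 256

def add8(a, b):
    """Add two 8-bit unsigned ints with wrap-around (arithmetic mod 256)."""
    return (a + b) & MASK

print(add8(255, 1))    # keep adding 1 and you get back to 0
print(add8(200, 100))  # 300 mod 256 = 44
```

Perfectly respectable arithmetic, as the quote says — it just isn't *integer* arithmetic.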
Abstractions are made possible by ignoring certain details of reality which are deemed unimportant. In the case of computers, it also involves encapsulating something in a facade that makes it easier to work with without worrying about the strange and arbitrary details that govern it. Even the CPU does this (microcode). But the key thing is that we have to limit ourselves in certain ways to use the abstraction. For instance, there are certain kinds of optimization which a C compiler can't give you. But by giving that up we are given the ability to use something that's much easier to understand.
AI is no different. You make assumptions about what intelligence looks like so that you can come up with a version so basic that a calculator can do it. But is that real intelligence? Hell no.
In short: once you start programming the things you realize how dumb they really are. People who say differently probably believe the abstractions to be real, which causes me to doubt their credentials.
You were actually serious about the WMP thing?
It comes down to a few things:
- those common device drivers can do a hell of a lot these days
- that 4k executable expands to over 300 MB in memory when you run it
- these techniques have been perfected over decades of work
- mountain landscapes are one of a handful of real-world things that can be realistically generated with small equations
- these people are exceptionally talented
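The landscape point can be illustrated with the classic midpoint-displacement trick. A sketch in Python (a toy illustration of the general idea, not how any particular demo does it): a convincingly jagged mountain profile falls out of two endpoints, a seed, and a roughness constant.

```python
import random

def ridge(levels=8, roughness=0.5, seed=42):
    """1-D midpoint displacement: repeatedly insert a randomly
    displaced midpoint into every segment, halving the displacement
    range each pass so fine detail is smaller than coarse detail."""
    random.seed(seed)
    heights = [0.0, 0.0]  # flat line between two endpoints
    scale = 1.0
    for _ in range(levels):
        nxt = []
        for a, b in zip(heights, heights[1:]):
            mid = (a + b) / 2 + random.uniform(-scale, scale)
            nxt += [a, mid]
        nxt.append(heights[-1])
        heights = nxt
        scale *= roughness  # displacements shrink at each level

    return heights  # 2**levels + 1 height samples

profile = ridge()
print(len(profile))  # 257 points of mountain silhouette
```

Feed the heights into any renderer and you have a mountainside; the entire terrain compresses to a few dozen bytes of code plus a seed, which is exactly the economy a 4k demo lives on.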
You are right to notice the similarity as there is a lot of overlap between music visualization and demoscene work. I would guess that the former arose as a result of work being done in the latter.
It's 4096 bytes, whatever you want to call that. A typical (self-imposed) demo limitation.
These things were being made long before there was a Windows or a WMP. And there are always those ones that make you feel like "this shouldn't be possible," but I suppose that's the point.
In a case where the JS is:
- harder to implement
- 100% redundant to what CSS can do
- does not involve getting a framework in place where modifications and additions become easier down the road
- the percentage of users who know what a JS vulnerability is matters
I didn't include accessibility above. Even WebTV supports JS. Its inclusion doesn't commit any sins that images didn't already.
I'm not interested in continually moving the goalposts in order to back up the flawed "JS is automatically bad" meme. JS *can* be bad but I honestly think we're past the point where there is much to gain by taking even the slightest pain in order to use it sparingly.
I didn't misunderstand the GP, but I did miss the part where "JS should only be used where truly needed" follows logically from "JS is an important technology, which 95% of users have support for."
The technology is powerful and pervasive.
In short: it's reached the saturation level where those without it can safely be ignored. An extra 100k of libraries can be ignored too. I think that if it presents even a slight advantage to a designer in terms of development time then they should use it. Their client and the 95% of people viewing the page with JS on will appreciate the quicker turnaround.
When I said "most of those" I meant the tag instance, not the articles. Again, you can't assume the tags were 100% serious.
crap, I meant "its use"
But the 95% of people with functioning browsers might appreciate those features, so why do the people stuck in 1996 get to dictate what's useful and what's not?
unless there is a compelling requirement to do so
Everyone has JS. There's no reason to have to justify its use anymore. It's there, it can be used.
Just doing some chop-busting, should have added the wink.
Don't put the word "browser" in the summary of an article that isn't really about them. Knowledge of a web browser represents the bare minimum level of expertise necessary to comment on
The guy who wanted to read informative comments about VM exploits rather than NoScript
Wait, what are you trying to sell me?
It's such an obvious thing. I'm surprised to hear that they haven't already implemented it in all this time. Not having a 360, I had assumed that XNA was some sort of indie game nirvana. I'm disappointed to hear that the good games get buried but I'm glad to have gotten some straight talk on the subject.
For the record I think Microsoft deciding which indie games to bless with XBLA status also creates a conflict of interest, since a cheaper download-only game could potentially cannibalize sales of a more expensive disc-based game.
But it's a software platform, not a courtroom, so the conflict of interest isn't necessarily a problem; it just means the users get less value out of their system.
Having the top-rated games more or less automatically make it to XBLA would be amazing. I'd buy a 360 for that (not that I won't ever buy a 360, I'm just typically a little behind on the game hardware and haven't yet).
Sounds like letting the users rate the games would be the answer to that. Microsoft doing it automatically creates a conflict of interest. The indies are paying just like everyone else.
Not at liberty? Isn't Firefox open source?
According to the stats, 95% of users have JS on. There's no reason to essentially design two separate sites to support the other 5%. And it could be argued that that 5% could either easily turn it back on if they chose (in which case, they're the lazy ones), or are using something really, really old and have no need to, or simply don't want to.
I'm not a web developer, but it seems obvious to me that while it's possible and often sensible to include the other 5% (which may include spiders, which you typically want), ignoring them because you don't have time for two designs is not at all silly. They may not even be the type of people you want on your site anyway.
Most of those could be argued to be hinting at the Blu-ray-related DRM present in Vista and newer MacBooks. And the iPhone is a closed system. There's an earlier post with some examples completely unrelated to DRM, and I think in those cases the person is knowingly using it as a joke, to say that whichever commercial OS is referenced in the headline is never going to be any good.
As that happens more, it could mean the end of DbD as a DRM flag, with people just using it because they heard it once and it sounded cool. But hopefully people will continue to parse the actual words in the phrase. I don't think I've seen it yet where I didn't think it was meant humorously.
Of course, this being the internet, and Slashdot at that, sarcasm often goes undetected.