Comment Re:What they are probably meaning: (Score 2) 169

The writer of the original article should be shot, hung, shot, and then boiled.

Ah, Slashdot. "I don't like what this guy said! Kill him!" (Applause and upmods)

Good grief. Any normal person would recognise that it's far more likely the OP was indulging in deliberate hyperbole to indicate his displeasure with the writer than that he's a psychopath who genuinely meant it literally. Especially given the repetition of "shot"(!)

Either you have some form of autistic spectrum disorder (in which case, no offence, but that did need to be explained to you), you're stupid, or you're just a would-be smartass trying to score argumentative points by feigning misunderstanding and offence anyway.

Comment Re: the AARP-files (Score 1) 166

He is well before that TOS / Harrison Ford old man event horizon.

Is the "old man event horizon" caused by stars putting on a *lot* of weight as they age?

In which case, I think Harrison Ford will escape that fate and simply turn into a white dwarf.

Marlon Brando on the other hand- yes, he became supermassive in later years, and quite likely turned into a black hole. In fact, lots of people in the film industry were inadvertently killed when they got too close. No-one noticed this because to outsiders they still appear to be hovering around the "Brando event horizon".

Comment Re:Calculated risk (Score 1) 269

And then on top of that, when fraud is caught [the banks] just take the money back out of the merchant's account. In no way do they ever "pay it from their profits".

This. A hundred times this.

I don't know if it's changed recently, but from reading Internet discussions on credit card fraud etc., it was always clear that people assumed that- despite the banks' notoriously sloppy and too-lazy-to-fix-the-obvious-flaws attitude towards security- the party paying for their apparent fecklessness was the banks themselves.

Except it isn't- it's the merchants. If there's a fraud, the money gets yanked back from the merchant, and that's the last he'll see of it. (No, you *won't* get the money back- even if they catch the people involved, proving and prosecuting fraud is more hassle than it's worth for the police. And most of the time the police won't do anything even if a blatant fraud setup is presented to them on a plate- e.g. the full address of a rented flat in London being used as the delivery address for goods bought, or attempted to be bought, with a known-stolen credit card.)

So now you know why it's "too much work" for the banks to do something about your stolen and misused credit card before you've reported and cancelled it yourself. It's because there's nothing in it for them. I can guarantee that if *they* were paying, it would very quickly become doable.

This is why the banks don't give a t***; they don't have to, they're not the ones paying.

(Note: this describes the situation in the UK- we've had chip and pin for years, but it still doesn't stamp out misuse of credit cards, especially over the Internet.)

Comment Re:Became ARM (Score 1) 106

Yeah, I heard they tried marketing it in the US, but it wasn't a major success. The BBC's main success here was in schools, and AFAIK the Apple II was one of the biggest sellers for that purpose over there; possibly it was already established by the time the BBC came out.

It should be noted that the reverse is also true to some extent- while the Apple II was far from unknown over here (my Dad had one of the later ones at work), it was never (AFAICT) as prominent as it was in the US. Possibly because they waited a couple of years to launch it here and the PAL versions didn't have colour (Wozniak's method for generating colour was tied to NTSC timing). Also, computer markets were far more localised in the late-70s/early-80s.

Comment Re:Became ARM (Score 1) 106

the Model A was £235, the B £335 in 1981

The price rapidly went up to £400 for the Model B (as the Wikipedia page states, lower down) due to supply issues.

It's far more likely that the price was hiked because more people were buying it than expected.

What the Wikipedia article *actually* says is that the price increase was "due to increased costs", same as the contemporary referenced article claims. Since UK inflation was still high by modern standards- around 11 to 12% circa 1981/82 (albeit steeply down from the eye-watering 18% it hit in 1980)- it's quite possible that the increase was at least partly legitimate. (£335 to £400 is a rise of roughly 19%, so inflation alone wouldn't cover all of it, but it would account for a good chunk.)

That aside, it's also worth remembering that most people's experience and memory of the BBC Micro will have been of the more common- but also more expensive (£335/£400)- Model B, which became the de facto base model. (The Model A only included 16K of RAM- not even enough to use the most demanding graphics modes- and omitted many of the interfaces, and despite its lower price it never sold as well. (*))

And *that* is just the base machine- it doesn't include the disk drives and RGB monitors that many of us remember using the computer with. (**) Even at my most conservative (and generous) guesstimate, those two together would probably have come close to doubling the cost of a BBC B system, putting it at around £2000 to £2500 in today's money.

Yes, even a "bare" Model A with a tape deck and plugged into a TV was still better than the ZX81, but it was three or four times the cost. I'll always have a soft spot for the BBC Micro, but it was never cheap, and the ZX81 can be forgiven because it *was* much cheaper and affordable to people who didn't have a chance in hell of buying a BBC.

(*) Just guessing here, but since schools were a significant chunk of the BBC's sales- even the Model A was an expensive machine for home users- they may not have considered the Model A worth the saving given the loss of functionality. Especially if they were going to be adding the aforementioned expensive monitor and drives anyway.

(**) Again, often in schools.

Comment Re:Write-only code. (Score 2) 757

The more powerful the language, the more it's like a loaded gun: You can use it responsibly and do amazing things with it, or you can put a bullet through your foot with it. Choice is yours... and the closer you get to bare metal with the language, the greater the chance of lead meeting foot at high speed.

Oddly, that brings to mind the famous quote from Bjarne Stroustrup himself...

"C makes it easy to shoot yourself in the foot; C++ makes it harder, but when you do it blows your whole leg off"

I've only briefly looked at C++, but when I did, I understood what he meant. C++ lets you do some very complex, powerful and abstract things compared to C, but even in the small amount I learned (and have since forgotten) you could see the potential for an overly confident smartass to misuse- or fail to understand- the subtleties of those features, and have things go wrong in a way far more convoluted, non-obvious and sadistically hard to debug than simply overflowing some poxy buffer.

AFAIK (and IIRC!) Java- which came after C++- is in many respects *less* powerful and more constrained in terms of what you can do (or at least makes you do it more explicitly), and some criticised this as dumbing down for an industry that wanted "quantity over quality" programmers. Maybe so, but would you want to deal with some horribly subtle bug that was ten levels of abstraction away, all because some naive just-out-of-college programmer did something a bit too clever for his own good when overloading an operator?

Now that I think about it, I remember reading criticism of C#'s increased flexibility in operator overloading (compared with Java, the language it was largely a clone of) as being A Bad Thing for the same reason.
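
To illustrate the sort of thing I mean, here's a deliberately contrived C++ sketch- completely made up, not from any real codebase- in which an overloaded + quietly mutates its left operand. It compiles cleanly and reads like ordinary arithmetic, which is exactly why it bites:

    #include <iostream>

    struct Money {
        long pence;

        // Looks like ordinary addition, but as a "clever" shortcut it also
        // modifies *this- so anyone writing "a + b" gets 'a' changed under them.
        Money operator+(const Money& other) {
            pence += other.pence;
            return *this;
        }
    };

    int main() {
        Money a{100}, b{50};
        Money c = a + b;              // c.pence is 150, as you'd expect...
        std::cout << a.pence << "\n"; // ...but a.pence is now 150 too- the subtle bug
        return 0;
    }

Now imagine that buried ten levels of abstraction down in someone else's class hierarchy, and you've got the kind of debugging session I was on about.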

Comment Re:"Haters" (Score 2) 196

To be fair, the argument against calling Pluto a planet was really political more than scientific--it's hard to argue that there's some nonarbitrary scientific justification for removing Pluto's planet status.

I'm sure those on the other side of the debate would argue that it's just as political and arbitrary to claim that it *is* a planet, beyond pure inertia (i.e. because it had always been called a "planet" until then).

Two wrongs don't make a right

Precisely.

Anyway, they were Pluto-haters, or haters of the idea of smaller planets messing up their tidy worldview.

Now I think you're trying too hard to rationalise the "haters" label. As I said, you don't have to agree with their opinion, nor the way they went about getting the result they wanted.

But that doesn't change the fact that dismissing their opinions and actions purely as "haters" was quite silly.

If the argument had started to become too personal, then this sort of concerted attempt to justify that way of thinking simply makes it worse. As you said, two wrongs don't make a right.

Comment "Haters" (Score 5, Insightful) 196

The efforts of a very small clique of Pluto-haters within the International Astronomical Union (IAU) plutoed Pluto in 2006

Yeah, that's right. They were "Pluto-haters". Not just people who happened to hold an opinion he doesn't agree with.

That's not to say that you have to agree with their position, nor the way they went about having Pluto stripped of its status. But to ascribe their actions to the fact they personally "hated" Pluto- rather than simply believing that it couldn't justify its status as a planet- is somewhat childish.

I don't know if he meant "haters" in the present-day sense (i.e. with its "haters gonna hate" connotations et al), but I've always had contempt for that usage. It's a cheap and easy way to counter anyone you don't agree with, to depersonalise and dismiss them as people who hate purely because they're "haters". It makes it a personal beef and a partisan issue rather than one of simple disagreement on a particular matter- one which would require legitimately addressing what they're actually saying, instead of puffing yourself up in the cod-macho bullshit "them versus us/me" manner of an adolescent who's either immature enough to see things in that light, or has nothing to say beyond the convenient "haters gonna hate".

Seriously, step away from the gangsta rap and stop acting like a f*****g fourteen-year-old.

Comment Re: Just (Score 1) 163

Why not eat magic pills while running through a maze chased by ghosts?

It's called a rave.

A rave is running through a maze being chased by ghosts?!

Seriously, I'm assuming you were trying to rip off the now-famous Marcus Brigstocke joke, except you got lazy and didn't even bother to make sure that the (now mangled) version in your head made sense as a joke any more.

Or perhaps the joke is so overused and ingrained that retellings don't have to be correct or even make sense at this point... it's just an instinctive response that only requires the vague invocation of the two elements of Pac-Man and raves that have somehow become funny because I heard a joke about that once but can't even be bothered repeating it correctly, etc. etc. etc.

Comment Re:Amateurish (Score 1) 516

The thing that really hit me about the screenshot was how crowded it looks. The example is presenting information with a clear underlying structure (a file system) and a small number of actions I can take, and probably half the area of that window is empty space. And yet, my immediate reaction is that there's no clear structure to tell me where to look, and the design desperately needs more visual hierarchy and better use of whitespace. Of course, this is a recurring problem with the current trend for flat designs

I agree that the screenshot looks more complicated than it needs to, but I'm not sure it's a problem with the "flat" graphical style so much as the layout, which (IMHO) looks like versions of Windows from the not-at-all-flat Vista onwards (and even XP to some extent, until you turned some of the crap off).

The problem with the icons there is- if anything- that they've moved *away* from flat design, which (done well) would- and should- have simplified them to their essential elements and made them recognisable at a distance (à la road signs, etc.).

But, as stated by others elsewhere, MS has always been about change for the sake of change, playing silly b*****s by introducing new technologies and ways of doing things that are discarded in the next version of Windows simply for the sake of being new, or at least for selling some "new" crap.

Comment Re:Amateurish (Score 4, Insightful) 516

Those icons look like someone's first pixel art experiments. It seems that Microsoft has fired all of its professional graphics artists.

The problem is that- in terms of style- either they can't make up their mind what they are, or they're trying to have it both ways.

They're not sufficiently clean and flat to match the current style of graphic design (which they went for with Windows 8), but nor do they work particularly well as 3D or prettified icons, or in any other style in their own right.

The end result is that they just look like horribly underdesigned versions of "old school" icon design circa XP to Windows 7. And some (e.g. the warning "!" triangle and error "X" circle) just look badly designed full stop.

The colours are also far too bright to be used in large, solid blocks like that. It's probably no coincidence that the "flat" trend in general was accompanied by the rising use of *slightly* less fully-saturated colour (see here for an example); not dull by any means, but more tolerable for solid blocks than (e.g.) #FF0000 red etc. (*)

I grew to hate the bland gradients of the previous design trend (early Web 2.0 and later), and the glossy 3D effect started to get overdone (and cheesy) when adopted by every man and his dog. So I'm a fan of the flat look when it works. The problem (which I figured out at the start of the trend) is that if it's not done well, it can easily come across as simply underdesigned or crude, and as it becomes more widespread it's likely to be adopted by people who can't tell the difference.

(*) Mind you, that was also a trend elsewhere, e.g. in clothing.

Comment 32X launches when Saturn already on the way?! (Score 2) 153

Additional stupidity: I remembered that by the time the 32X was announced in the UK, the (entirely incompatible) Saturn was already due for launch in the near future. Worse, I recently found out that in Japan, they actually launched at almost the same time.

What was the point of that?! Who was going to buy the 32X knowing that it was a stopgap for something imminent/already here? Granted, the 32X was much cheaper at launch- which was apparently the justification- but anyone with half a brain would have known that it would die when (as all new consoles do) the Saturn came down in price enough that Joe Public would buy it instead of a half-baked piggy in the middle.

(And anyone who realised that should also have realised that the software companies would be thinking the same thing and not likely to waste their time supporting a dead-end console.)

The other problem with the 32X was that Sega had *already* released an "enhanced capabilities" add-on for the Mega Drive/Genesis, i.e. the Mega CD, which you already mentioned. So the 32X was, in effect, the third separate (incompatible) "format" built around the same console.

All that is stuff that should have been obviously stupid at the time; there were other factors that led to Sega's downfall (e.g. Sony playing the PlayStation launch very well) which one could argue are easier to spot with hindsight, but those were on top of the obvious stupidity of having the half-baked 32X muddy the waters- and confuse consumers and retailers- at the time of the Saturn launch.

Comment Re:Question In Headline (Score 1) 153

Atari was a dead husk for two decades. Before the Atari name was used to rebrand some corporate consolidation.

Actually, the Atari *name* has never really been out of circulation for long.

The original Atari Inc. was split and sold off in 1984 to become Atari Corp. (shut down circa 1996, following the Jaguar debacle) and Atari Games (defunct 2003, but renamed in the late 90s (*)).

Atari Corp. and Atari Games were the only successors that (IMHO) could really claim to be continuations of the original company, and they're both now defunct themselves. Nevertheless, Hasbro bought the home rights to the name and IP from the defunct Atari Corp. and used them in the late 90s. (*) Infogrames in turn bought them from Hasbro a few years after that, and has used the name ever since.

So yeah, today's "Atari" is just Infogrames, and the real Atari is long dead. But the point is that the name never really went away- it was in almost constant use.

(*) Atari Games- who only owned the name for arcade use- was renamed to avoid confusion with Hasbro's new "Atari".

Comment Re:Scared Idiots (Score 1) 286

The bold text was because I tend to be longwinded and like to provide a "tl;dr" version of the important points.

Well, you're wrong about the constancy of isotope ratios. There're plenty of processes, chemical, biological, and physical that lead to isotopic fractionation.

I didn't claim that this wasn't possible.

The question is whether bananas enrich the radioactive isotope of potassium in the fruit.

*Your* question was whether they enrich the radioactive potassium. If you're going to use that to bash what I said as "uninformed shit", I'd sure as hell expect you to know if that was the case. Evidently not.

And, indeed, I did read several sources- including, but not restricted to, Wikipedia- and none mentioned isotope fractionation as a factor.
