Comment "Mad Max 2" wasn't the original, surprisingly! (Score 1) 776

I have a hard time imagining any remake being better than the original. It had little dialogue, but excelled at making you feel for the characters and what was happening at the moment.

Tie that with a limited budget, and it showed they knew how to create a great movie.

"Fury Road" isn't a remake of any of the existing Mad Max films.

Also, you seem to forget (or maybe didn't realise) that "The Road Warrior"- i.e. the film known as "Mad Max 2" outside North America!- wasn't the original either. Granted, the name change (which was apparently because the original "Mad Max" wasn't well known over there) obscures that, and to be fair, "Mad Max 2" *is* probably the closest to what people associate with the series.

If you watch the original "Mad Max", it's quite obviously a much lower budget (*) (and smaller-scale) exploitation film- around a tenth of the budget of Mad Max 2- and IIRC *was* more character based. For someone who had seen the sequel first, I suspect that it might almost come across as a prequel or set-up for its better known follow-up. From what I remember, the basic elements associated with the later films *are* in place, but don't come together until surprisingly late in the film.

Something like "The Road Warrior" *couldn't* have been done on the budget of the (actual) original. That said, the budget of "Fury Road"- at a supposed US$150m is still *way* higher than even the Hollywood-bloated "Mad Max Beyond Thunderdome" (supposedly US$12m in 1985, which would be around $27m today)!

(*) Something that Wikipedia confirms; the second film was apparently AU $4.5m, whereas the original was around $400,000!

Comment Re:It can run Doom (Score 1) 368

and their website looks like it's from 1995 as well!

Finally, a web site which doesn't try to overrun your browser with unnecessary rotating images and the latest and greatest shiny because some web designer said, "Why not?"

Which is ironic, because even in the mid-90s (i.e. almost as soon as the web had become popular), people were already slapping crappy, pixelated pre-rendered GIF animations of all manner of spinning crap onto their "home pages"!

Please see this historical documentary of the phenomenon.

Comment Re:No, his hack was successful (Score 3, Insightful) 246

He issued an HCF instruction.

Shame I didn't have mod points- not just for the joke itself, but because- in a discussion thread that could otherwise have been mistaken for one on Fark or whatever- it says something that this is by far the comment most reminiscent of the traditional Slashdot audience and style.

Comment Re:danger vs taste (Score 1) 630

Sales of diet Pepsi are falling because half of them are buying Pepsi Max instead. Not sure how it differs from the diet option. They both taste equally bad to me.

Depends which one you mean- apparently there's a "Pepsi Max" (nee "Diet Pepsi Max") on the US market which has more caffeine than regular Diet Pepsi. The "Pepsi Max" sold in the UK since the early 90s is really just... Diet Pepsi marketed towards men instead of women.

About 15 years ago, I tasted some (UK market) Diet Pepsi and Pepsi Max side by side just out of curiosity- the difference was minor at best.

The reason for having the two was- I assumed- more to do with marketing. Diet Pepsi and diet drinks in general were marketed and perceived as "girl" drinks, which probably put off male consumers. Pepsi Max launched with (very) 90s male-oriented advertising. (*)

What surprises me is that Coca Cola took around 15 years to do the same marketing trick with Coke Zero. That- at least- has the excuse of being a clearly different product from Diet Coke. (It sucks because it follows Diet Pepsi and Pepsi Max's nasty over-intense "sweetness" rather than Diet Coke's less intense but more unpleasantly "hollow" sensation (**)).

On the other hand, it means that some men (e.g. my boss) might still have a legitimate reason to buy Diet Coke instead of Coke Zero, which he doesn't like.

Personally, I think almost all diet soft drinks are horrible, except Sugar Free Irn Bru because it's one of the very few- if not the only one- that avoids both the above pitfalls if it's properly chilled. Plus, it's not over-sweet, it's got caffeine in it, and IT'S SCOTTISH! (^_^)

(*) What's weirder: the fact that, with the wonder of the modern Internet YouTube (cough), I can find obscure twenty-year-old adverts in under a minute, or the fact that that advert still seems familiar to me after all that time? (I remembered the annoying "ooh" at the end even before watching it).

(**) You might recognise this as "my mouth thinks this is sweet, but some part of my reptilian brain knows damn well this is carbohydrate-free chemical-flavoured water and is refusing to give me any sense of satisfaction in drinking it".

Comment Re:Will probably be used for VR applications. (Score 1) 152

But the same thing could happen to VR one day. We've got a limited view of what VR is and what it can do right now. What happens within a decade or two might be so different that you'll be writing a similar comment about VR.

You're missing the point I was making. It's not that people 20 years ago would have had a limited idea of what the "phone" could do.

It's that a lot of what we now associate with the smart-"phone" was never really a consequence of the phone- or the phone functionality- itself. Rather, it's a result of the fact that they were driven by *computers* that allowed the introduction of useful but secondary functionality (like calculators, snake, et al) of ever-increasing sophistication. It's the evolution of that to the point that it is more important than the "phone" itself- yet the device retains its vestigial name.

Of course, expensive proto-smartphones had been around since the late 90s (e.g. Nokia 9000 Communicator), but even those were never designed solely as "phones".

Smartphones are as much the successors of portable computers and PDAs as they are of phones, and would be seen as such by someone from the 80s. If you'd asked someone then where (e.g.) the early Psion Organisers might lead us in 30 years' time, you would probably have got more insight than asking them questions about "phones".

The only thing such people could be "blamed" for would be not foreseeing that we'd get there via the mobile phone rather than via the PDA/pocket-computer route.

Comment Re:Will probably be used for VR applications. (Score 1) 152

Two decades ago, nobody thought the "portable phones" market would ever overtake the laptops market.

That's misleading. Two decades ago a phone was just a phone, and people back then would assume that's what was meant.

Today's smartphones are effectively portable computers and communications devices that happen to include a phone as part of their functionality- the "smartphone" name is more a legacy of the direction they evolved from (i.e. the phone market) than a reflection of what they are now. If the concept had been invented out of the blue in a world of traditional "dumb" phones (mobile or otherwise), they almost certainly wouldn't be referred to as such.

Arguably they're more akin to a continuation of the concept of a PDA. The fact that they aren't seen that way- again- has more to do with where they evolved from and the fact that the market for PDAs (as they were then) had declined quite seriously in the years immediately preceding the iPhone.

Comment Re:What they are probably meaning: (Score 2) 169

The writer of the original article should be shot, hung, shot, and then boiled.

Ah, Slashdot. "I don't like what this guy said! Kill him!" (Applause and upmods)

Good grief. Any normal person would recognise it's more likely that the OP was indulging in deliberate hyperbole to indicate his displeasure with the writer, rather than being a psychopath who genuinely meant it literally. Especially given the repetition of "shot"(!)

Either you have some form of autistic spectrum disorder (in which case, no offence, but that did need explained to you), you're stupid, or you're just a would-be smartass trying to score argumentative points by feigning misunderstanding and offence anyway.

Comment Re: the AARP-files (Score 1) 166

He is well before that TOS / Harrison Ford old man event horizon.

Is the "old man event horizon" caused by stars putting on a *lot* of weight as they age?

In which case, I think Harrison Ford will escape that fate and simply turn into a white dwarf.

Marlon Brando on the other hand- yes, he became supermassive in later years, and quite likely turned into a black hole. In fact, lots of people in the film industry were inadvertently killed when they got too close. No-one noticed this because to outsiders they still appear to be hovering around the "Brando event horizon".

Comment Re:Calculated risk (Score 1) 269

And then on top of that, when fraud is caught [the banks] just take the money back out of the merchant's account. In no way do they ever "pay it from their profits".

This. A hundred times this.

I don't know if it's changed recently, but from reading Internet discussions on credit card fraud etc., it was always clear that people thought that- despite a notoriously sloppy and too-lazy-to-fix-the-obvious-flaws attitude towards security- the party paying for the banks' apparent fecklessness was the banks themselves.

Except it isn't- it's the merchants. If there's a fraud, the money gets yanked back from the merchant, and that's the last he'll see of it. (No, you *won't* get the money back- even if they catch the people involved, proving and prosecuting fraud is more hassle than it's worth for the police. And most of the time the police won't do anything even when a blatant fraud setup is handed to them on a plate, e.g. the full address of a rented flat in London being used as the delivery address for goods bought (or attempted to be bought) with a known-stolen credit card.)

So now you know why it's "too much work" for the banks to do anything about your stolen and misused credit card before you've reported and cancelled it yourself. It's because there's nothing in it for them. I can guarantee that if *they* were paying, it would very quickly become doable.

This is why the banks don't give a t***; they don't have to, they're not the ones paying.

(Note; this describes the situation in the UK- we've had chip and pin for years, but it still doesn't stamp out misuse of credit cards, especially over the Internet).

Comment Re:Became ARM (Score 1) 106

Yeah, I heard they tried marketing it in the US, but it wasn't a major success. The BBC's main success here was in schools, and AFAIK the Apple II was one of the biggest sellers for that purpose over there; possibly it was already established by the time the BBC came out.

It should also be noted that the reverse is also true to some extent- while the Apple II was far from unknown over here (my Dad had one of the later ones at work), it was never (AFAICT) as prominent as it was in the US. Possibly because they waited a couple of years to launch it here and the PAL versions didn't have colour (Wozniak's method for generating colour was tied to NTSC timing). Also, computer markets were far more localised in the late-70s/early-80s.

Comment Re:Became ARM (Score 1) 106

the Model A was £235, the B £335 in 1981

The price rapidly went up to £400 for the Model B (as the Wikipedia page states, lower down) due to supply issues.

It's far more likely that the price was hiked because more people were buying it than expected.

What the Wikipedia article *actually* says is that the price increase was "due to increased costs", the same as the contemporary article it references claims. Since UK inflation was still high by modern standards- around 11 to 12%- circa 1981/82 (albeit steeply down from the eye-watering 18% it hit in 1980), it's quite possible that the increase was at least partly legitimate.

That aside, it's also worth remembering that most people's experience and memory of the BBC Micro will have been of the more common- but also more expensive (£335/£400)- Model B which became the de facto base model. (The Model A only included 16K RAM- not even enough to use the most demanding graphics modes- and omitted many of the interfaces, and despite its cheaper price never sold as well (*))

And *that* is just the base machine- it doesn't include the disk drives and RGB monitors that many of us remember using the computer with; (**) even at my most conservative (and generous) guesstimate, both those would probably have come close to doubling the cost of a BBC B system and made it around £2000 to £2500 in today's money.

Yes, even a "bare" Model A with a tape deck and plugged into a TV was still better than the ZX81, but it was three or four times the cost. I'll always have a soft spot for the BBC Micro, but it was never cheap, and the ZX81 can be forgiven because it *was* much cheaper and affordable to people who didn't have a chance in hell of buying a BBC.

(*) Just guessing here, but since schools were a significant chunk of the BBC's sales- even the Model A was an expensive machine for home users- they may not have considered the Model A worth the saving given the loss of functionality. Especially if they were going to be adding the aforementioned expensive monitor and drives anyway.

(**) Again, often in schools.

Comment Re:Write-only code. (Score 2) 757

The more powerful the language, the more it's like a loaded gun: You can use it responsibly and do amazing things with it, or you can put a bullet through your foot with it. Choice is yours... and the closer you get to bare metal with the language, the greater the chance of lead meeting foot at high speed.

Oddly, that brings to mind the famous quote from Bjarne Stroustrup himself...

"C makes it easy to shoot yourself in the foot; C++ makes it harder, but when you do it blows your whole leg off"

I've only briefly looked at C++, but when I did, I understood what he meant. C++ lets you do some very complex, powerful and abstract things compared to C, but even in the small amounts I learned (and have since forgotten) you could see the potential for an overly confident smartass to misuse or fail to understand the subtleties of these features and have things go wrong in a way that was far more convoluted, non-obvious and hard to debug- in a sadistically high-level manner- than simply overflowing some poxy buffer.

AFAIK (and IIRC!) Java- which came after C++- is in many respects *less* powerful and more constrained in terms of what you can do (or at least makes you do it more explicitly), and some criticised this as dumbing down for an industry that wanted "quantity over quality" programmers. Maybe so, but would you want to deal with some horribly subtle bug that was ten levels of abstraction away, all because some naive just-out-of-college programmer did something a bit too clever for his own good when overloading an operator?

Now that I think about it, I remember reading criticism of C#'s increased flexibility in operator overloading (compared to the language it was mostly a clone of, Java) as being A Bad Thing for the same reason.
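
To give a flavour of the kind of thing I mean- this is a contrived little sketch of my own, not anything from the languages' designers or from the article- here's how an operator overload that's a bit too clever can hide a bug behind a perfectly innocent-looking line of code:

    #include <iostream>

    // A naive "clever" money class: operator+ was written to mutate the
    // left-hand operand "for efficiency" and return it by reference.
    struct Money {
        long pence;
        Money(long p) : pence(p) {}            // implicit conversion from long, too
        Money& operator+(const Money& rhs) {   // looks like ordinary addition...
            pence += rhs.pence;                // ...but silently modifies *this
            return *this;
        }
    };

    int main() {
        Money balance(100);
        Money fee(5);
        Money total = balance + fee;   // reads innocently enough at the call site
        // The caller expects balance to still be 100, but the overload has
        // quietly changed it to 105. Prints "105 105", not "100 105".
        std::cout << balance.pence << " " << total.pence << "\n";
        return 0;
    }

In C you'd have had to write an explicit function call to get that behaviour, so the damage would at least be visible where it happens; here the surprise is buried behind an ordinary-looking "+", and by the time it bites you could be ten stack frames away from the culprit.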
