Comment: new games play differently (Score 1) 102

Some old games really are better if you like certain types of game mechanics. For instance, if you like CRPGs with tactical combat, there just isn't much being made like that now. Console gamers prefer a different type of game, and that is the type being made most of the time now: faster, twitchier, and imo more repetitive. Popamole. I guess it's really faster, non-tactical combat that is the problem for players like me. I prefer combat that is more like chess, where you have to carefully consider all of your moves and search for an optimal set of moves to win.

I am currently playing Icewind Dale, for instance. If you like that type of game, there really is not much out there anymore. I have had to resort to replaying Infinity Engine games like Baldur's Gate II, Icewind Dale I and II, and Planescape: Torment, along with Temple of Elemental Evil, Fallout I and II, and Arx Fatalis, just because these sorts of games aren't being made anymore. I suppose the most recent games that I like are Neverwinter Nights 2: Mask of the Betrayer and Fallout: New Vegas.

I can't speak for everyone, but I don't have any nostalgia for games just because they are old. I don't seek out the first games I played in the late 70s, like Super Star Trek on my friend's DEC PDP-11, text adventures like Colossal Cave or Zork, or Atari 2600 games like Adventure or Combat. I don't care about games from the early 80s like Archon, Castle Wolfenstein, Crush, Crumble and Chomp!, and Choplifter either. Yes, I used to love playing them. They were fun at the time, when nothing else was available, but I would have much rather played modern games with their far superior graphics. There are games that I know I would enjoy replaying but can't because of the graphics. I used to really enjoy Might and Magic 6, 7, and 8, for instance, but can't anymore, and of course games like those are not being made anymore either.

Only certain types of gamers are being catered to now. Only certain play styles. If you are not one of those people, you have no choice but to replay old games if you want to play computer games at all. It's not nostalgia. It's desperation for any computer game you can actually enjoy, and that means saying no to popamole twitch-style combat if that isn't what you like. If I actually want that style, and sometimes I do, it is easy to find. But what about a more thoughtful style where you carefully plan your moves? You just don't see it much, at least in CRPGs.

There are some smaller independent, mostly crowdfunded developers now that are at least claiming to cater to that style of play, but so far there hasn't been much to show for it, and at least one attempt, Pillars of Eternity, failed utterly in terms of the combat imo. It ended up playing more like Dragon Age: Origins or other modern BioWare games. Again, catering to what the majority of gamers like, despite being funded by gamers like me who wanted something that played more like Icewind Dale or Baldur's Gate II.

It is unfortunate that some developers seem convinced that nostalgia is everyone's reason for replaying old games, because then you end up with games that intentionally go backwards and emulate things that were only done that way because computers were so much slower, or because rendering techniques at the time weren't advanced enough. They copy not only the good things about the older games but the limitations as well. I'm sure the developers who actually made those games would have loved to use more realistic graphics and smooth, continuous movement, but they didn't really have the choice back then.

This is the danger of attributing our love for older games to nostalgia. Maybe for some people that is all it is and they are perfectly happy with modern games, but that is not always the reason.

Comment: Re:shooting themselves in foot (Score 1) 76

Either you want them to flog the latest and greatest, or you don't. You're complaining that they're pushing 4K, and then saying they should totally push 4K.

I don't personally care what they do. If they want to be idiots and sell fewer products, that's their business. Do they sell 4K monitors, or do they sell video cards? Maybe they should ask themselves that. They should push 4K *and* non-4K applications. Both. If they have trouble finding a current game that can make use of their processing power, they can write something themselves. Maybe a video-encoding GPGPU app, or a short game with photorealistic graphics. What you don't do is tell your customers that a 4K monitor is the only thing the product is good for.

So, you're a wannabee, who doesn't use these, doesn't have the gear to run it ... but you'd totally buy the biggest and baddest out there just because

No. What I am to AMD is a potential customer. A potentially profitable customer. I never said 'just because'. I gave my reasons. I'm not a wannabe anything. I don't give a fuck about 4K at the moment, but I do respect processing power. To me there is no such thing as 'enough'. I can always find a use for those cycles, just like I can find a use for 16 GB or even 32 GB of RAM. I don't need RAM manufacturers telling me that 32 GB of RAM is useless and that I only need 16 GB.

I am a programmer *and* I work on games, but at the moment I don't feel the need to purchase a 4K monitor. It seems like they are doing a better job of selling 4K monitors than of selling their own products. Maybe they are trying to be honest, but it just seems stupid to me.

You can make the same argument for CPUs, to an extent. At high resolutions most games are bottlenecked by the video card, so why buy the latest and greatest CPU? Most people just use their computers for web browsing and checking email. Even a Conroe Core 2 is way, way more power than most people need for what they use their computers for.

Do you see Intel trying to convince people not to upgrade because it's not necessary, or necessary only if they also purchase some other product that Intel doesn't even make? I guess I'm just glad I don't own any AMD stock, with genius marketing like that.

Comment: Re:shooting themselves in foot (Score 2) 76

If you want 4k, you want the 4k offerings.

They are also implying that anyone with no immediate plans to buy a 4K monitor should not buy their high-end cards. They are telling a whole segment of potential customers not to buy their products, at least not their high-end flagship product. For someone with a video card that isn't that old, that means no upgrade until/unless they buy a 4K monitor.

It's very nice of them to worry about me wasting my money on their products, but maybe they should let us worry about that. I don't need them to convince me not to buy their products. I can figure that out for myself.

Comment: Re:shooting themselves in foot (Score 1) 76

People can already get performance for 1080P, so why advertise it?

To sell products and make money. It's up to the customer to determine what they use the cards for, not the manufacturer. If anything, you should be suggesting new applications for the device rather than excluding them.

You don't say, "Don't buy this card if you don't have a 4k monitor because it will be useless. There is nothing you can do with it. No reason to own one. Just stick with your old card until you decide to buy a 4k monitor."

If a marketing droid came up with that genius campaign at my company, he'd be out on his ass. If you sell exotic sports cars, do you really want to emphasize how the customer's current car is 'good enough', since both cars can reach the speed limit quite easily? No. You want to talk about the excitement of being thrown back in your seat by the acceleration, and even show how exciting it is to drive 100 mph in the desert or something like that.

I don't even have a 1080p monitor, but I would only consider buying the highest-end card, because I only buy a video card every 5-6 years or so and want to be future-proofed for a while. I also do my own programming and want to explore various GPGPU options. Current games are not the only application.

Comment: shooting themselves in foot (Score 2) 76

I don't understand why the marketing people are so intent on telling us what we must do with their products. This whole 'gaming at 4K' push seems like shooting themselves in the foot by excluding a huge segment of enthusiasts who are looking for any excuse to find a use for all that power. Why try to sell your top-of-the-line products only to people with 4K monitors? I realize that consoles, and the overall cost of photorealistic graphics, have somewhat reduced the need for high-end cards, but jeez. At least try to sell high-end products. Pathetic marketing strategy.

Comment: computer models can't be wrong (Score 0) 193

Computers are infallible. If a computer 'says' something, we know it is correct. The link between the assumptions that make up the model and the model itself is seamless and is science. In this 'science', experiment is not needed and is possibly even harmful.

You learn the truth about the world by building your computer model. This isn't the 19th century anymore. Now we have computers. Actually doing something out there in meatspace to observe what happens when you try it is crude, unnecessary, and in any case subject to human error.

Don't question the assumption-model connection because computer models = science and the people who code the models are scientists. The fact that those people are 'scientists' also means they are correct. Who are you to question them? You'd better at least have a PhD in the relevant field.

If you question what human assumptions the model is built from, you are anti-science and should be ignored as a religious nutjob, and no, it doesn't matter if you claim to be an atheist.

Comment: Re:Meet the New Act (Score 0) 294

About twice as many Democrats voted for it. Only 1 Democrat voted against it, compared to 30 Republicans. That's a very significant difference. I'm not a Republican or a Democrat; the petty squabbling between near-identical political parties means little to me. But it was a highly biased post.

Comment: Re:outrageous (Score 1) 363

Try making some bomb threats or death threats

Do it from an internet cafe, from cafe/restaurant Wi-Fi, from any open Wi-Fi connection, or from one only protected by WEP. Quite easy to do quasi-anonymously. This really had nothing to do with Ulbricht. He just provided the street corner where drug dealers could gather and offer drugs if they wanted. It's not like he put a gun to anyone's head and forced them to do anything.

Comment: Re:outrageous (Score 1) 363

then I think I'll stick with not having this transaction platform exist at all for the betterment of humanity.

Good luck with that. Let me know when you are victorious in your war against substances and black markets. Really. Let me know. I'll sign up for your We Are Going to Win Any Day Now newsletter.

You can't win. Ever. Because your enemy is us. Human beings in general. You can execute as many people as you want. There will always be more.

Comment: Re:A few things here... (Score 1) 272

The average house price in California is $400,000. How again is $70,000 not poor for that area?

If they bought the home 50 years ago or more, as you hypothesize, then housing prices now are simply not relevant. Why even mention them? Poor people do exist in California. I've seen some living in cardboard boxes or under bridges. The rest I see living in the massive number of apartment buildings all over California, renting. If you rent an apartment in California, $70,000 is a rather nice income. Call it 'poor' if you want, but if that's poor, I would love to be poor. I could live comfortably in California on a quarter of that income.
