Comment Re:Leaks like a sieve (Score 1) 496
Excel and Word are practically their own operating systems. You can't blame their problems on anything other than the insane legacy of backwards compatibility with a sprawling 20-year codebase.
They aren't even suppressing this article! They suppressed a previous article where he meticulously detailed a flaw in the cashback system without giving MS a chance to fix it.
This whole thing is ridiculous and nothing but anti-Microsoft / anti-Bing bashing.
This has nothing to do with Microsoft. From the article: Butterfly Photo set a three month cookie on my computer to indicate that I came from Bing.
So, a disreputable web site is setting a cookie when you click on a sales link. How is this Microsoft's fault again? What does this have to do with Bing?
A/V and photography stores are notorious for ripping off customers, both in-store and on-line. Surprise surprise, you can find these disreputable sites using search engines. Trying to blame this on Bing is like trying to blame your phone book for recommending a sketchy car mechanic.
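For what it's worth, the mechanism being described is trivial for any site to implement and involves nothing Bing-specific. A minimal sketch (all names invented, not taken from the actual site) of setting a three-month referral cookie:

```python
from http.cookies import SimpleCookie

# Hypothetical sketch: on arrival from a search-engine link, a site
# sets a long-lived cookie tagging the visit as a "cashback" referral.
# Cookie name and value are made up for illustration.
def tag_referral(referrer: str) -> SimpleCookie:
    cookie = SimpleCookie()
    if "bing.com" in referrer:
        cookie["referral_source"] = "bing-cashback"
        # roughly three months, expressed in seconds
        cookie["referral_source"]["max-age"] = 90 * 24 * 60 * 60
        cookie["referral_source"]["path"] = "/"
    return cookie

header = tag_referral("https://www.bing.com/search?q=camera").output()
```

Any web site can do this with any referrer; the search engine never sees it.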
Citation Needed.
Please provide one example of where EA released an alpha build. Or one example of where EA purchased a game already in development and then immediately diverted funds.
As much as you would like EA to be the big bad wolf knocking over studios left and right, the facts are that almost every studio that has gone down in flames under EA's ownership has done so due to its own people dropping the ball.
If you read any of the ex-Pandemic posts you will see that it was local mismanagement which led to poor quality product, not EA interference.
Likewise, if you read the Escapist's article on the acquisition of Origin, one of the most important quotes is this:
Garriott: "We doubled the size of the company from 200 to 400 that first year. We went from 5-10 projects to 10-20, and staffed those projects almost entirely with inexperienced people. It won't surprise you to learn those projects were not well managed. That was totally Origin's fault. We failed, and we ended up killing half of those products. That's probably what set up the EA mentality that 'Origin is a bunch of [deleted],' pardon my French."
This is a common pattern. EA buys a studio and gives the studio exactly what it wants, and the studio immediately hires new people and doubles its burn rate, spending tons of cash on payroll. And yet at the same time, the number of quality products at the studio declines. Growing pains, inexperienced management, whatever the cause, the result is the same. EA buys a successful studio, gives them money, the studio stops being successful.
Of course the game will be shipped before the studio says it's 100% done, because the studio is never going to claim that a shitty or buggy game is 100% done. The fact that it is still not a good game after 24 months of very-high-budget development does not mean that EA should pay for another 12 months. It means that the studio failed.
Then maybe that is something the Firefox team should have thought about. As part of their, you know, open-source tradition of focusing on usability and user-experience consistency?
I can definitely see how it would be annoying to the end user, but I can't agree with blaming Microsoft for following a well-documented API.
New ideas? Please.
The right way to manage a large problem is to periodically examine your processes, figure out the flaws and bottlenecks, and fix them. This is as old as time itself. Agile and Scrum just slap new buzzwords on old ideas, and their proponents act as if they have invented a cure for cancer.
If your business was failing under its old methodology, changing to "agile methods" is not going to help, except maybe as a catalyst to get your dinosaurs to quit in frustration over the meaningless jargon. The problem is that your business was unable to self-identify its flaws and correct them.
Nobody fails because they haven't heard of agile methods. Nobody fails because they honestly believe that a single "waterfall" cycle is the correct way to run a large project.
People and projects fail because they get locked into a specific process without any effort to identify and correct flaws in that process. Scrum is just one more process that you can be blindly locked into.
Everyone wishing that the money were spent on development instead of marketing is, unfortunately, living in an ideal fantasy world.
People are dumb. They follow trends, soak up advertisements, and generally do what marketers tell them to do. You personally might be immune, but remember that just by reading Slashdot (and therefore being somewhat tech-savvy) you have already self-selected against most of the population.
In modern culture, quality does not correlate with success. (Arguably, in entertainment, it never has... consider ticket sales for generic romantic comedies with famous actors vs thought-provoking art-house films.) Quantity is much stronger than quality. Exposure is all that matters.
Nobody bothers to do independent research anymore; Consumer Reports has been dropped in favor of Google search, and whoever has the most hits wins.
Welcome to the present day.
The lack of splitscreen is, sadly, a design tradeoff for having a huge open world where you can drive anywhere.
In most level-based games, like past Burnouts, the whole level is loaded ahead of time. Splitscreen just means having more players in the same amount of space. Every new player in splitscreen comes with a small, fixed overhead cost. The whole level has already been put into memory so there is nothing extra to be loaded.
But in an open-world game like Burnout Paradise, the players could be anywhere. The world is too big to fit into memory, so the game loads as much as possible and then intelligently loads "ahead" of where the player is, so that the world appears to be seamless.
But splitscreen players could be in totally different places on the map, driving at full speed in opposite directions. So the game would have to load twice as much data in the same amount of time. The second player doubles the cost of everything - twice as much memory, twice as much disk bandwidth to load ahead of each player, etc.
There are hardware limitations about how fast textures can be loaded from disk, how much memory is available, etc. Splitscreen is very hard for open-world games. It can be done, but it would take significant resources - making it work would probably tie up their best programmers for months.
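A toy back-of-the-envelope model (all numbers invented, not from any actual engine) of why a second free-roaming player doubles the streaming load:

```python
# Toy model of world-streaming demand. Each free-roaming player needs
# world data loaded "ahead" of their position; players in different
# places cannot share that work, so demand scales with player count.
def streaming_demand(players, speed_m_s, mb_per_meter):
    return players * speed_m_s * mb_per_meter

DISK_BANDWIDTH_MB_S = 20.0  # assumed disk budget, purely illustrative

one = streaming_demand(1, 50.0, 0.2)  # one player fits in the budget
two = streaming_demand(2, 50.0, 0.2)  # two players saturate it
```

With the whole budget already tuned around one camera, there is simply no headroom left for a second one.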
Game development is all about allocating your resources as best you can. Ultimately someone decided that it was acceptable to drop split-screen in favor of making sure that the single-player and online experiences were as good as possible, and getting the game out the door on time.
The removal of split-screen still stings, of course.
The interview (as opposed to the article) is at least a little more interesting.
There is nothing fancy here, but he is trying to explain the distinction between running five tasks at once (a classic "threading" model) and splitting one task into five work units.
Many common threading models in video-game engines do not reduce latency; eg, "render thread", "audio thread", etc. You get a big win from going to two or three threads, but after that your physics takes an entire frame, or your rendering takes an entire frame, and you bottleneck. No matter how many more CPU cores you throw at it, that fixed number of threads is not getting any faster.
Nine women can't deliver a baby in one month, etc etc.
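In miniature, the distinction looks something like this sketch (illustrative only; a real engine would do this in native code with a job queue, and CPython's GIL limits the actual speedup for pure-Python work):

```python
from concurrent.futures import ThreadPoolExecutor

# Model A: one dedicated "physics thread" does the whole step serially.
# Extra cores do nothing for it.
def physics_step(bodies):
    return [b + 1 for b in bodies]  # stand-in for real integration

# Model B: split the same step into independent work units that any
# number of worker threads can grab, so extra cores actually help.
def physics_step_jobs(bodies, workers=4):
    chunk = max(1, len(bodies) // workers)
    chunks = [bodies[i:i + chunk] for i in range(0, len(bodies), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(physics_step, chunks)
    return [b for part in results for b in part]
```

Model B is the one that keeps scaling as core counts go up; Model A stalls once each subsystem fills a frame.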
Hardly groundbreaking, but still a nice achievement given the state of most video game engines out there today. Burnout Paradise runs at 60hz with very low latency between input and screen. That's worth some kudos.
What, you mean like ActiveX and code signing? Anything goes, as long as the user knows it's dangerous?
The era of "trust the user to make intelligent decisions" is long gone. I would argue that it never really existed in the first place. Computers are for people who don't want to know how the computer works. They don't want to see Big Scary Dialog Boxes where they have to make the Right Choice or else their system could be compromised.
The onus is on application developers to minimize the frequency of scary choices, and to mitigate the impact of a wrong choice as much as possible.
There will always be malware that wants to install itself, websites that spread misinformed "security" tips, scammers who trick people into making wrong choices. And of course there will always be buggy or exploitable code.
Increased testing is useful. Increased architectural safety is MORE useful.
No, because the submission is unreasonable, inflammatory rhetoric made by someone who thinks that he is much smarter than he really is. The explanation for all the "problems" this user is having is PEBKAC, not DRM.
You need to give your customers what they want, but not necessarily what they ask for. There is often a very wide gulf between the two, and unless your customers are professional designers they are very likely to mistake one for the other.
The job of a designer is to incorporate feedback and continually improve the design. That does not mean implementing every request, but rather addressing the root problem that leads to the requests.
In other words, don't give people a free hint button if playtesting shows that it reduces overall accomplishment. Figure out why people are finding certain puzzles so frustrating, and do something about that instead. Or else incorporate the hint mechanic in a way that rewards players for using it sparingly.
The submitter is trolling, and all the arguments about DRM are pointless. This has absolutely nothing to do with DRM.
Gears of War is, like all "good" Windows programs (according to Microsoft), a signed executable. It is also a game with online multiplayer, so it has an integrity check that tries to make sure you're not playing with modified game files (eg where all walls are rendered transparently or the player models have 50-foot-high red arrows above them).
The integrity check has a simple bug. It expects the signing certificate to be valid based on today's date, instead of on the date of signing. That's it.
It has nothing to do with rights or intentional expiration. Many other applications with expired signing certs work perfectly well.
It's just a bug. Please shut up about DRM.
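A minimal sketch of the kind of bug being described (function and field names are invented, not from the actual game code; dates are illustrative):

```python
from datetime import date

# Buggy check: validates the signing certificate against *today's*
# date, so the game breaks for everyone the day the cert expires.
def cert_valid_buggy(not_before, not_after):
    return not_before <= date.today() <= not_after

# Correct check: was the certificate valid on the date the binary
# was signed? That never changes, no matter how much time passes.
def cert_valid_fixed(not_before, not_after, signed_on):
    return not_before <= signed_on <= not_after

nb, na = date(2006, 1, 1), date(2009, 1, 28)  # illustrative validity window
signed = date(2006, 11, 1)                    # illustrative signing date
```

The fixed check keeps passing forever; the buggy one fails the moment the calendar rolls past the cert's expiry.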
The most exciting phrase to hear in science, the one that heralds new discoveries, is not "Eureka!" (I found it!) but "That's funny ..." -- Isaac Asimov