Other than the network capacity issue, they are pretty "wink wink nudge nudge" about P2P, and are only doing the bare minimum to appear to comply with the government's wish to institute network filtering.
Is that a recent change? I was under the impression that Exetel was against it when used for piracy on moral grounds and had no issue with permanently disconnecting repeat offenders. From a brief search on Google I was able to locate an Exetel blog post from a year and a half back that states:
Of course, as far as copyright theft goes, Exetel has taken the hardest stance of any ISP that I know of, and we have done that since we began over four years ago. Not because of any implied threat or stretched interpretation of the Copyright Act. And certainly not because of any direct financial benefit (quite the opposite, from the 'outraged' emails sent from soon to be ex customers). Rather because, actually, it is the right thing to do. We have always made it very clear that thieves are not welcome.
Of course I do realise that P2P is frequently used for legitimate file transfers, but the "wink wink nudge nudge" you wrote seems to imply you believe they are lenient in their dealings with piracy. The blog post would indicate otherwise and outlines the process they take for disconnecting anyone that causes them to receive a copyright infringement notice. A more recent example of their P2P off-peak policy also seems to indicate they aren't very friendly towards P2P in general, regardless of what is being downloaded.
...but the other option is for all traffic to be slow if the links max out due to unrestricted P2P
Of course, the real other option would be to provide the bandwidth they advertised for the service and set realistic quotas to keep it in check, as most other ISPs in Australia do, rather than giving all plans a flat 60GB bonus during certain hours of the day that they cannot actually provide when many users take advantage of it.
When a game has problems and the developers don't publish details, everyone bemoans the fact that they are so closed about it. "Just be up front and honest about it, we understand there can be problems..." everyone says. Then when someone does exactly that, everyone starts crying out "Cheap bastards are just trying to get free advertising...".
Game developers often run into that problem, and most (almost all?) of them have decided it's not worth being candid about their problems. It's interesting that Stardock have been able to twist their problems into publicity, though: across the five articles about Demigod you'll find on Slashdot over the last couple of months, there are a lot of comments from people who'd never heard of the company or the game before. You couldn't deny that these articles have given them a serious brand-recognition boost; the only thing left to decide is whether it's been positive or negative.
I was serious when I questioned whether admitting your mistakes wins you additional sales, though. I'd be incredibly surprised if there weren't a measurable fraction of sympathy sales from the piracy problem, or sales generated by the networking-fix articles. It would actually be interesting to see whether their candid discussion of Demigod has turned out to be brilliant marketing, intentional or not, and whether that publicity has resulted in higher sales than they initially predicted for the game.
They've covered the other stuff, it'd be interesting to see them detail what effect the publicity has had on their sales. Whether intentional or not it has been a massive free advertising campaign, with positive comments about the game popping up on the articles covering it. If they truly want to be open, why not discuss how that publicity has influenced their ability to turn Demigod around from what initially appeared to be a disaster?
But then again, I've not played (or heard of) Demigod before this.
That seems somewhat odd, as they've tried to get as much free advertising as they can by posting various development or piracy related stories to social news sites; some stories even got caught by the traditional media. It's not a bad marketing effort; people seem to be falling over themselves to get the game's name out there.
I wonder how effective this type of advertising actually is though; does pointing out your mistakes and how you fixed them to a technical crowd win additional sales? Maybe that can be the next article they submit.
Ah thanks, that makes much more sense to me now. It would be interesting to see how well that sort of model turned out, logistically and from the idea that people would pay to have their content certified (if it meant enough to them or their company). Still, I think they might be better off giving the masses the power rather than giving relatively few people a massive power trip (see Wikipedia editors).
I've spent 15 minutes attempting to track down the definition of "CA" in this context to no avail; would someone be able to point out the correct definition for me? (Sad, yes, but I'm curious to find other examples, and a Google search for "CA" isn't exactly helpful.)
Community Admin/Advisor/Assistant/Agent, Certificate Authority and so on were the mildly plausible expansions I located on The Free Dictionary, but there weren't exactly any conclusive answers for gaming (unless they're hiring concentration auras, of course). Community Admin sounds too broad for someone vetting quests.
I really wish people would define their jargon acronyms the first time they use them in their responses.
I imagine there's a slight difference in the cost of providing a connection in a highly centralised location on the carrier's network compared to connecting a consumer in the middle of nowhere to that same network. The maintenance and upgrade costs are going to be far different, and when you factor in the volume of customers and support costs... I think it's pretty silly to try to compare a datacentre connection to a residential one.
As for some of the other discussions going on: bandwidth caps effectively reduce the amount of bandwidth a user utilises during peak periods through self-regulation, and allow the ISP to offer higher speeds when users do wish to use their bandwidth.
ISPs really have two options at the moment: offer fast connections kept in check by bandwidth caps, or offer unrestricted connections at the much slower speeds they can actually guarantee.
People frequently argue that overselling bandwidth is wrong, but it actually works in favour of the majority of users. Joe Blow, who wants to use YouTube occasionally, wants his videos to download quickly but doesn't want to pay for an expensive Internet plan. Buying a high-speed plan with a bandwidth cap gets him his fast but rare downloads, and the ISP doesn't have to worry about a large number of users utilising 100% of the speed all the time and congesting the network, because the cap makes users self-regulate. This works pretty well because users don't usually all download large files at once, so you can offer fast speeds without having all of the backhaul in place to provide them to every user simultaneously.
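The arithmetic behind that argument can be sketched roughly. All the figures below (subscriber count, plan speed, peak concurrency) are made-up illustrative assumptions, not real ISP numbers:

```python
# Rough sketch of why ISPs oversell backhaul: caps keep simultaneous
# full-speed usage low, so far less backhaul is needed than the sum
# of every subscriber's advertised speed.

users = 1000                 # assumed subscribers sharing one backhaul link
plan_speed_mbps = 24         # assumed advertised per-user speed
peak_concurrency = 0.05      # assume ~5% of users download flat-out at once

# Backhaul needed if every user's advertised speed were guaranteed at once
fully_provisioned = users * plan_speed_mbps

# Backhaul needed when caps make usage self-regulate to the assumed peak
oversold = users * plan_speed_mbps * peak_concurrency

print(f"Guaranteed-speed backhaul needed: {fully_provisioned:,} Mbps")
print(f"Backhaul needed with self-regulating caps: {oversold:,.0f} Mbps")
print(f"Contention (oversubscription) ratio: {1 / peak_concurrency:.0f}:1")
```

Under those assumptions the ISP provisions a twentieth of the "guaranteed" figure, which is why the capped-but-fast plan can be sold so cheaply.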
I guess my point is: based on the price most people are willing to pay for their Internet connection, it's not currently feasible to provide guaranteed high-speed bandwidth for everybody without bandwidth caps, particularly as more and more bandwidth-intensive websites and applications are created. If you wanted a truly unlimited plan, you'd have to resign yourself to slower speeds so the ISP could guarantee that bandwidth, as it's not currently economically viable to give everybody guaranteed 100Mbps connections. Most people opt for the "fast but limited" option because it's far more convenient when you want to use the connection; the argument that you should be able to utilise 100% of it all the time for the current price (a price which wouldn't fund massive backhaul upgrades across the country) is asinine.
Sorry if that wasn't entirely coherent, but that's been my experience with ISPs in Australia. Even with the fibre-to-the-home upgrade our government has proposed, we will still have bandwidth caps, because it'd be idiotic to lay enough fibre to give every internet-connected home in the country guaranteed speeds when only a tiny portion of the population would come anywhere close to using them (and let's face it, there aren't enough legal download services to really max out that connection for the average user).
You're discounting the ongoing costs of maintaining an ~8-year-old operating system: if you continue distributing it, you have to continue supporting it with fixes. The cost of fixing bugs in XP (some of which are non-existent in their newer systems but must still have resources spent on them) must be pretty high by now. I don't know whether they make a loss by maintaining it and selling it for low prices on low-end machines, but the idea that because software is old its sales are almost entirely (you quoted 99%) profit is highly unlikely to be true.
Not that often; maybe once, with the I Am Rich app. It is very easy to avoid the rubbish.
It takes me a week to get my crappy, small-time apps published.
You just contradicted yourself, sir, by saying that you don't encounter crappy applications frequently yet can get them listed within a week. One of those is incorrect, unless you rely on other users voting for top-100 lists to find good applications, in which case other people have to dredge through the rubbish to find the sapphires.
You apparently missed his point entirely while rushing to defend Apple. It's because you can push your "crappy, small time" applications past Apple's quality vetting that there's a problem; Apple's quality standards for the iPhone App store are apparently too low.
P2P throttling? Not here.
Exetel do, and we know of this only because they've been vocal about it; other ISPs may do it with more subtlety.
Forbidding servers on residential connections? Not here.
The Whirlpool broadband survey 2008 disagrees (search for "not allowed to run server"; Optus certainly restricts it).
So while the majority of ISPs don't do it, you shouldn't make out that it's all sunshine and roses in bandwidth-cap land; some of the larger ISPs (Telstra and Optus) count uploads as well as downloads against your monthly bandwidth cap (which seems to be an effective way to reduce P2P, since you'll hit your cap that much faster by "giving back").
I agree that shaping connections rather than billing for excess usage makes more sense for ADSL/cable connections, though; it's much less daunting to get throttled than to be charged extra. Internode have implemented a "Data Block" system that lets you purchase chunks of bandwidth to extend your monthly cap in a pinch if you're about to get throttled (i.e. it isn't cost-effective to do regularly), which could be worth looking into later on.
One more thing: if you do implement caps, you'd want to look into some sort of monthly usage meter that's easily accessible to your customers. Net Usage Item is an example of a Firefox addon that tracks usage from various ISPs and helps people avoid overrunning their caps.
While I like the convenience of Steam, let's not forget that if Steam goes belly up, games bought there will become unplayable.
They announced that in that case the games would be unlocked.
That statement's been debunked several times: if VALVe goes belly up, the administrators who take over are incredibly unlikely to allow anyone to flip a switch that would destroy the value of the company's assets. It's nice that they say it, but in that situation reality won't give them any control over it.
I like to purchase games through Steam to avoid having to hunt down the games in stores, as I've generally had bad luck when trying to get game-related items from stores here. I imagine it'd be similarly useful for people that would otherwise have to expend a large amount of transport effort to acquire the boxed version of the game. Some games, like Red Alert 3, even remove their boxed DRM in favour of the Steam version, which I tend to find less intrusive (it's pretty invisible to most internet-connected users).
As for the quota issue, in Australia the ISPs began implementing quota-free services on their own networks to counteract the large amount of bandwidth consumed doing things like gaming. Several even offer Steam content servers on their own networks as quota-free. Customers with Internode and Bigpond, for example, are able to acquire most (all?) Steam content quota free so bandwidth caps are irrelevant when downloading games; the only limiting factor is speed.
If American ISPs follow the Australian ones with quota-free content servers and the like, we might find the number of people downloading games from Steam won't decrease when hard caps are implemented (since traffic on their own networks is essentially free, they're likely to offer it).
You could argue that the time freed by completing tasks more efficiently in Linux allows you to perform more paid work, but claiming that this means Linux is paying you to use it is entirely deceptive, and doesn't advance the argument for Linux so much as it makes people gawk at the perceived intelligence of its vocal users. I'm quite sure anyone running around saying Windows pays them to use it because Photoshop is more efficient for them than GIMP would be smacked down with logic quite quickly; even if it frees up time to complete additional paid work, the OS isn't actually paying them.
No doubt I just got trolled, but I think if people try using this sort of argument to convince people to use Linux they're simply going to make themselves look deceptive rather than helpful.
They're two of the co-op-only campaigns that they very slightly changed to run in versus.
Heh, they'll require major overhauls to be properly playable in versus (hence their original exclusion). Those maps have far too many open/barren corridors, and the finales of both offer the infected little protection in many of the easily defendable locations (the rock in the water in Death Toll and one of the corners in Dead Air being the prime examples).
It'll be interesting to see how they change them, but I doubt it'll be "very slightly".
I had this idea presented to me a few days ago, and though paraphrased, it does ring true: once technology evolves to the point where being faster or more accurate isn't much of an issue, the focus moves to aesthetics. Watches and mobile phones are the prime examples, where form matters more than function for the general population. They don't care if your watch never loses time if it looks ugly...
While computers in general haven't reached the point where they're sold based on aesthetics rather than features, many pieces of software appear to have hit that point (at least temporarily). If the interface is better it does seem to me as though people will care more about it than if it has a couple of additional features.
I may not have described it all that well, but hopefully you understand what I'm getting at rather than picking at specific points. People (in general) do seem to sacrifice utility for fashion, and while you may use your computer in a more technical manner, others don't need it to perform faster and are thus more enticed by making it look better.
The "cutting edge" is getting rather dull. -- Andy Purshottam