


Comment Translation: (Score 3, Insightful) 105

"A giant company gave me an incredible service to use for free for many years. Now they have the audacity to require me to start paying for it instead of continuing to offer it to me for free in perpetuity. In response I will proudly brag about the completely unrelated point of how I continue to possess the physical items which I was in no way being told to get rid of before anyway."

I agree it's unfortunate that the free radio stations are gone (although Beats 1 continues to be free) but getting all high and mighty about how you still have albums on disc and DRM-free formats (which I do as well, for what it's worth) is unrelated and annoying to trot out.

Comment Re:Doesn't matter anyway (Score 2) 139

You can't patent game mechanics

Not exactly, no (i.e., you can't patent "guy with gun runs around and shoots things" for FPS games) but a board game itself, complete with its rules, can be patented. See the history of Monopoly and how several different board game patents were bought up by Parker Bros. back in the day to be able to release the game.

This, interestingly, actually works to the advantage of would-be game cloners, because patents expire relatively quickly compared to copyrights. Consider Late for the Sky, a board game company whose output consists almost entirely of Monopoly clones, right down to the -opoly suffix (e.g., Aggieopoly, Miamiopoly, etc.) Rather than try to bat them down with some sort of bullshit reason, Parker Bros. instead just decided to get in on the game too, thus Star Wars Monopoly, Hello Kitty Monopoly, NASCAR Monopoly, etc.

Strictly speaking, though, none of this is really relevant, because the article doesn't mention patents or copyright at all; it's just the shit-stirring summary that's trying to make Gygax into some thieving asshole after the fact.

Comment Re:10K ought to be enough for anybody (Score 1) 174

But then the phone companies went to extra steps to be able to block SMS, so they could charge fees for not blocking it, backwards as it sounds. And as if that wasn't enough, they went one step further, and started counting SMSes and where they terminated, so they could charge extra for both the amount and the source/destination.

Doesn't sound backwards at all if your intent is to make money off of people. I get that it would be nice to not have to pay for it but that's not how capitalism works. And the fact that people paid for it says it wasn't a bad idea to do it. Compare that to, say, if Facebook started charging for using Facebook - everyone would stop using it and go elsewhere.

Comment Wow (Score 4, Insightful) 540

This is maybe the shittiest article I've seen posted to Slashdot in a long time, and that's saying something.

First, why does the blame fall to Tim Cook of all people instead of the developers of the game?

Second, Apple has already set up a Family Sharing system to prevent just this sort of thing. Never mind the fact that you have to give your kid your password to the account tied to a credit card for this to happen in the first place.


To say nothing of the fact that in the article itself they said Apple refunded him the money. But yeah, they're assholes because he doesn't know not to give his kid access to his credit card.

Finally throw in a dash of globalization scare tactics and remind developers that they *only* get 70% of the IAP revenue, which they know about already, and you've got the Slashdot Shithead Trifecta.

Comment Re:This is nothing new (Score 3, Informative) 207

It's Oswald the Lucky Rabbit, not Oscar.

Also, kinda interesting story: Oswald was the antagonist of the game Epic Mickey. On the Idle Thumbs podcast, Sean Vanaman told the story about how he and some others were handed the Epic Mickey project because no one at Disney Interactive Studios knew what to do with Mickey. They came up with the idea of putting Oswald in the game, but Disney didn't own the character; NBC Universal did. They pitched it to Bob Iger, who liked the idea so much that he put the wheels in motion to trade Al Michaels from Disney-owned ESPN to NBC (something Michaels had wanted to do) in exchange for Oswald and a few other things.

Vanaman tells the story that he had no idea any of this was going on until he read in the sports section: "AL MICHAELS TRADED FOR CARTOON RABBIT."

Comment Re:Any real tangible merits to using Windows Serve (Score 1) 288

For a long time it was the only option to run .NET applications on (i.e., an ASP.NET site, .NET web services, .NET Windows services, etc.) so vendor lock-in plays a big part. That's potentially different now that .NET is open source and Microsoft is friendlier to FOSS stuff but for the time being most businesses will just suck it up with the devil they know.

Comment Re:OK, I'll bite (Score 1) 195

Yeah, to me the biggest thing was the memory management. I've done some C/C++ stuff, but most of my work was in C# up until then. That wasn't the issue; the issue was that these devices didn't have much RAM, especially under Android's Dalvik VM on the multiplatform project I was working on. That was a big shift backwards, but I managed. I was almost disappointed when Apple introduced ARC, since some of that newfound skill was going away.

Comment Re:OK, I'll bite (Score 1) 195

You know why we're still using semicolons? Because they're still useful

I should have been clearer before - Swift doesn't need semicolons at the end of every line. But if you want to put multiple statements on one line, you can use semicolons to separate them. And you can still put a semicolon at the end of every line if you really want to; the compiler just ignores them. I occasionally go back through my code and, sure enough, I still sometimes type them at the end of a line out of habit.

But you're right, they are still useful, which is why Swift lets you use them if you really want to. You just won't want to after a while. I was at WWDC when they announced Swift, and in the first sessions where they showed the basics of the language I thought removing semicolons was a bad idea. Now that I've used it for a while I agree with them.
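To illustrate, here's a minimal sketch of what the Swift grammar actually allows (names are my own, just for the example):

```swift
// No semicolon needed when there's one statement per line:
let greeting = "hello"
print(greeting)

// Semicolons are required only to separate multiple statements on one line:
let x = 1; let y = 2
print(x + y)

// A trailing semicolon is still legal; the compiler just ignores it:
let z = 3;
print(z)
```

So the old habit compiles fine, it's just redundant.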

Comment Re:OK, I'll bite (Score 2) 195

I used Obj-C for decades, and I still enjoy it.

I'm real sorry to hear that.

Seriously though, I may have been a bit harsh, and I only started using Obj-C a few years ago when I got into iOS development. But really, while the original version was probably good for its time, the language didn't evolve well (Obj-C 2.0 looks very half-baked - dot notation works, except when it doesn't, that sort of thing). People fluent in it are like those farmers who just know how to operate the thresher well enough to keep their hands from being chopped off. It's not because the thresher was designed to keep you from chopping your hands off. I guess this is a convoluted "shoot yourself in the foot" analogy, but you get what I mean.

Comment Re:OK, I'll bite (Score 5, Interesting) 195

My take on it is this: when a language designer (be it an individual or a company), especially one with a lot of compiler design experience, sits down to make a new language, they basically ask, "What would we do differently if we could start from scratch today?"

When Sun made Java they said "what would we improve about C++ if we had the chance?". Separate from the JVM concept, this is what they were thinking when they made the language. Back when Java was new people joked it was C+++. When Microsoft made C# they said "what would we improve about Java if we had the chance?".

Apple basically said "holy FUCK we need to get away from this shitty '80s language. C# does some good stuff, but what would we improve about it if we had the chance?"

So in C# you used to have to declare something like this:

Something aThing = new Something();

Then languages started asking themselves "wait, why do we have to say the class name twice? We could get away with just doing it once":

var aThing = new Something();

Swift says "wait, why do we need semicolons? I mean yeah it used to be that we didn't have great ways of telling lines apart but we've solved that problem now. If there's just the one statement on a line no need for a semicolon. And why do we need to say "new"? We know it's new. The calling of the class name via the constructor tells us that. Get rid of that shit too"

var aThing = Something()

Back when C# introduced "var" I was dead set against it. When Swift dropped semicolons I thought it was reckless. Now that I've been using Swift a while I get their minimalist religion. It's a struggle to go back to C# or JS and have to remember semicolons (although JS doesn't seem to give a shit either way).

To declare a constant in C# you declare its type as well as use a keyword:

const int i = 4;

In Swift, they said "well, we're already using var, why not just swap that out for a constant?"

let i = 4

and the compiler in Xcode now shows you all the times when you could use a constant, which is way more often than you realize.
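Putting all of that together, the progression ends up looking like this in Swift (the variable names here are just for illustration):

```swift
// Type inference: no class name repeated, no "new" keyword, no semicolons
var mutableCount = 0        // inferred as Int; can be reassigned later
mutableCount += 1

let answer = 42             // a constant; reassigning it is a compile error
let pi = 3.14159            // inferred as Double
print(pi)

// let isn't limited to literals - any initializer works:
let message = "The answer is \(answer)"
print(message)
```

If `mutableCount` were never reassigned, the compiler would suggest changing that `var` to `let` too.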

For all the spitballing about platform this and proprietary that, underneath it all Swift is the latest attempt at a language that applies what we've learned from previous languages. It's possible some or all of its conventions have shown up in other recent languages that just got an eyeroll from working developers, but Swift has a tremendous advantage in that it has a compelling use case: iOS developers who don't want to use Objective-C. Because no one really wants to use Objective-C. Anyone who says they do is a victim of Stockholm Syndrome.

Comment Re: OK, I'll bite (Score 2) 195

Um, no. Swift and ObjC are alike in that both are programming languages. There the similarities end.

That's not really fair. They're both beholden to the same conventions - Objective-C because it invented them and Swift because it has to be compatible with them. And up until Swift went open source, it was pretty much just like he said: Obj-C made sane. Still stuck on Apple platforms, still needs OS X/iOS stuff to be useful, etc.

Comment Re:Old vs. New Apple in one anecdote... (Score 1) 462

Yeah but like 99% of the time the laptop is open and being used so 99% of the time other people are looking at the logo, and not you. Might as well have the logo be right side up. Learning that the logo being upside down means you can open the laptop takes ten seconds max. Go back and watch old episodes of The West Wing where they use old Mac laptops with the upside down logos. They look dumb as hell.

Sometimes aesthetics are more important than tiny losses in functionality.

Comment Re:Webkit rules (Score 5, Informative) 96

How soon before Mozilla ditches desktop Gecko as well?

Firefox isn't using WebKit on iOS because it doesn't want to use Gecko; it's because it can't use Gecko. You can't release a web browser with its own rendering engine on iOS; you have to use the built-in WebKit. The Chrome app for iOS does the same thing. What you're getting with Chrome/Firefox for iOS is synchronization with your bookmarks and whatever other niceties come with the different interface styles.

The one exception is the Opera Mini browser on iOS, but it doesn't use its own rendering engine on the phone either. It renders the page on a server and then sends your phone an image of it. This workaround can make browsing really fast, but it has zero privacy or security: you probably wouldn't want to browse anything sensitive like your bank info, since Opera would get to see it too. This is assuming Opera Mini hasn't changed any, that is.
