Have we learned NOTHING from Maximum Overdrive?
Previously, I had to buy two sodas and then hand one of them to a friend.
With the magic of social networking and Pepsi, now I only have to buy two sodas, enter a phone number, enter a greeting, record a video, and send a free soda code to a friend's mobile device, which they can use to access the same machine and retrieve a free soda.
I'm not arguing this gives Apple any legal ground at all (it's not a case I'd like them to win), but I think their role in changing our use of the word "app" to mean "smartphone program," and increasingly only "smartphone program," shouldn't be ignored.
The case is a bit different than your example, because they branded a subset of the application market (perhaps the original "app") as "app," a subset that previously hadn't been very profitable or popular. For example, suppose I decided to sell an emerging but not original coffee product as "java" (bad example, I know) and called my marketplace "The Java Store." If another company, seeing my success, then created its own "Java Store" selling the same special kind of coffee product, it would be painfully obvious they were trying to ride my coattails, because I, through painstakingly precious and irritating marketing, built the associations between "Java" and some silly little coffee product, instead of coffee at large. The fact that some competitors may have tried to use "Java" to describe their somewhat different coffee products (i.e., Google and "Google Apps") but failed to brand that connection into our collective lexicon demonstrates, to some extent, that my marketing was special (because it turns out Apple is "cooler" than Google) and that I deserve credit for it.
Again, I don't think that should have a legal basis, but I do think we all know that Apple has changed how we can use the word "app," at least for the time being. If you tell your boss "I'm thinking of developing an app for brewing coffee," s/he is going to think "new $0.99 (I have no idea how much they cost) smartphone program," not "big Windows application." At least that's my hunch, especially for the kind of people who will make the most use of this new "app store."
But would you call WordPerfect an "app" today? Would you refer to Word as an "app" in an official document? I'm sure smartphone developers refer to their products as "apps" in official documents.
Perhaps more than polysemy, the concept of semantic drift is relevant. I suspect Apple has been the driving force behind that semantic drift, with their incessant (and obnoxious, IMHO) "there's an app for that" ads.
That said, I'm not sure if this is legal ground, but I do think Apple deserves some credit for the semantic drift that has taken place, for better or worse. I choose worse.
Still, I don't believe products were marketed as "apps" before Apple.
Google searches ignore polysemy -- when I think "app," I don't think Photoshop or Microsoft Excel, I think "a program for a smartphone." If you do too, then that's because Apple cultivated that word usage via the App Store.
"App" can mean a job application, a computer program (although typically non-entertainment), a great computer program (including entertainment) when following "killer," or a smartphone program. They're slightly different meanings. But while few people regularly referred to Microsoft PowerPoint as an "app," nearly everyone calls all smartphone applications "apps." More and more, the word "app" is synonymous with smartphone programs, and fewer and fewer people will use it outside of that context. Including me, and I don't have a smartphone--I've just seen many advertisements Apple made and paid to run.
I guess that didn't create any American jobs.
If you're talking about the ridiculous row limit, that went away in Excel 2007.
However, like many researchers I have used several versions of Excel to produce publishable graphs from summary data--means, SEMs, etc. I love R, but it was only recently that I decided to spend enough time learning the ins and outs of its graphing capabilities that I felt comfortable producing even a bar chart in R for publication. Since I had been producing my tables in Excel anyway--and I'm still not entirely in love with using Sweave or other LaTeX packages in R, so I still find myself going to Excel for producing summary tables--it's trivial to then tell Excel to plot away.
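The workflow described above--plotting means with SEM error bars from summary data--is simple enough in any plotting tool once you know the incantation. The comment is about R, but as an illustrative sketch of the same idea, here it is in Python with matplotlib (the group names and data are hypothetical, invented purely for the example):

```python
import math
import statistics

# Hypothetical raw data: a few observations per condition
groups = {
    "control": [4.1, 3.9, 4.3, 4.0],
    "treatment": [5.2, 5.5, 4.9, 5.1],
}

# Summary statistics: mean and standard error of the mean (SEM)
means = {g: statistics.mean(v) for g, v in groups.items()}
sems = {g: statistics.stdev(v) / math.sqrt(len(v)) for g, v in groups.items()}

# Bar chart with SEM error bars; matplotlib is optional here,
# so the summary stats above are still usable without it.
try:
    import matplotlib
    matplotlib.use("Agg")  # non-interactive backend, renders to file
    import matplotlib.pyplot as plt

    fig, ax = plt.subplots()
    ax.bar(list(means), list(means.values()),
           yerr=list(sems.values()), capsize=4,
           color="0.7", edgecolor="black")
    ax.set_ylabel("Response (mean ± SEM)")
    fig.savefig("barchart.png", dpi=300)
except ImportError:
    pass  # matplotlib not installed; skip the plot
```

The equivalent in base R is roughly a `barplot()` call followed by `arrows()` for the error bars, which is exactly the kind of fiddliness that keeps people reaching for Excel.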
That said, this book would seem very cool had the review actually talked about what sort of graphing capabilities are described in the text. I'm personally curious about its coverage of R's lattice graphics package, which is powerful but for which I haven't seen any great instructional resources. Those are the sorts of graphs I imagine you're referring to: exploratory or diagnostic plots, or plots just too sophisticated for Excel, that work over entire datasets using models you specify.
Notice how there is no P in it.
We'd like to keep it that way.
Seriously, the P stands for "program." Just drop it.
I think it's premature, though. Right now, we should be taxing gas more to encourage its abandonment.
We should, but that's political suicide. Furthermore, once one Congress raises the tax on gas, the next Congress (which will inevitably be elected to replace that one after a wave of town hall meetings) will lower it.
You can still publish anonymously.
To liken this to Paine--while he did not sign his name to the original Rights of Man, presumably he gave it to someone, and that transaction wasn't anonymous. His efforts to distribute it at first would certainly have linked his identity to the work among the publisher or publishers he used. The book was sold, and the sellers were obviously not anonymous.
You want _more_ privacy than Paine had. You want the right to publish a comment without _anyone_ knowing who wrote it. That's not really what newspaper forums are for (their purpose, I think, is just to generate page hits). If you have something to say anonymously, and want a lot of people to read it, you're going to have to work with someone who can get it out there and will hide your identity. That's really the only way to be sure your identity stays hidden.
This is what Wikileaks was _supposed_ to be for (hence the Wiki), but I think it strayed a bit from that objective in recent years.
I won't defend that - they were certainly overzealous and careless in their handling of that domain. However, it appeared to be accidental, and in three days the websites were restored. Presumably the website owners have some legal case for any lost revenue.
This happens off-line, as well. Police make mistakes, innocents are harmed. Police are sometimes punished, and the state ends up paying out if the victim can engage in litigation. It's unfortunate, and often the side-effect of having a police force that is often given far too much leeway by a public that is too often too anxious about security.
To make any comparison between this and governments like Egypt, however, is dishonest. In Egypt, the intent, quite plainly, was censorship of political thought and speech. The freedns takedown targeted images that the majority of Westerners agree should be illegal to produce and distribute, and it overstepped its bounds via either simple administrative error or a (bad) "better safe than sorry" policy. It was corrected fairly quickly.
I also question the numbers - sure, 84,000 sounds like a lot, but computers can make 84,000 different versions of the same thing in milliseconds. I've had some freedns domains in the past that I haven't used in years; I wouldn't know if there was this sort of disruption. Ultimately it sounds like a few businesses were temporarily disrupted as a result of a large police action. That's always happened in the physical world - which is unfortunate - and the Internet is not immune.
I don't think you can easily get phones dumber than that - unless you're willing to pay more. The "free" ones my wife and I got from T-Mobile recently have all those features. Does anyone actually use the calendar features on their dumb phones?
The only times I find myself really wishing I had a smart phone are when I'm waiting for something, like take-out; but then I play 30 seconds of Pac-Man (which came as a free demo on my phone) to see how high a score I can get before it times out, and repeat as necessary - and I feel like a big enough jerk standing next to a take-out counter doing that, I can only imagine how conspicuous I'd feel playing an actual game or reading email on a smart phone.
Note to self - bring book when I anticipate waiting. Problem solved. $70/mo. saved.
Unlike stuff we see and hear, you can't describe what we smell on a single dimension, and that's why we literally have hundreds or perhaps thousands of different olfactory receptors, while we have only three major types of color receptors (cones) on our retinas - and, correspondingly, three color channels in most color display adapters.
While this machine promises 20 basic scents, I suspect, even if they were delivered well and integrated into a game seamlessly, you'd grow bored of them quickly.
I could see some limited uses - warning a player of a nearby danger, for example, which would work well with the limitations of olfaction - but unlike sounds or sights, our olfactory system adapts rather quickly to smells. A brief exposure to a certain aroma might be effective at the right point in the right game, but for such little reward this seems like a rather awkward solution. That said, aromas can be quite evocative, activating our limbic system in unique ways that could provide for an in-depth experience far richer than we've seen before - for example, the smell of incense in an abbey, for me, might be the difference between "yet another generic abbey" and "this feels real to me."
Is this why few things in video games have ever been more satisfying than using the grenade launcher in GoldenEye?
Most people will listen to your unreasonable demands, if you'll consider their unacceptable offer.