Cloud

OnLive Awarded Patent For Cloud-Based Gaming 87

donniebaseball23 writes "Cloud gaming provider OnLive has secured a patent for an 'apparatus and method for wireless video gaming.' The patent gives OnLive substantial leverage over competing brands in the cloud-based gaming market. 'Hundreds of people have worked incredibly hard for more than eight years to bring OnLive technology from the lab to the mass market, not just overcoming technical and business challenges, but overcoming immense skepticism,' said OnLive CEO Steve Perlman. 'It is gratifying to not only see people throughout the world enjoying OnLive technology in the wake of so many doubters, but also receive recognition for such a key invention.'"

Comment think about the client side of the web (Score 2, Informative) 897

You might want to spend some time on jQuery and other tools for building more interactive web UIs. While there are promising newer languages for the backend, it's not yet clear that they're going to take over from Java, PHP, and .NET. But the client-side, Javascript-based part of the web is definitely growing, and new tools are being developed.

Comment Re:Download now? (Score 1) 717

IANAL, but I believe that in exercising editorial control over what applications to accept, Apple can't claim to be a hosting company or the mailman. I am not yet sure, however, that they are violating v2 of the GPL, since the terms seem to say that if there is a license specific to the product, it supersedes the generic iTunes one.

Comment Re:Why the hate.... (Score 3, Informative) 186

These protocols were designed for a different world:

1) They were experiments with new technology. They had lots of options because no one was sure what would be useful. Newer protocols are simpler because we now know what turned out to be the most useful combination. And the ssh startup isn't that much better than telnet. Do a verbose connection sometime.

2) In those days the world was pretty evenly split between 7-bit ASCII, 8-bit ASCII and EBCDIC, with some even odder stuff thrown in. They naturally wanted to exchange data. These days protocols can assume that the world is all ASCII (or, more or less, Unicode embedded in ASCII) and full duplex. It's up to the system to convert if it has to. They also didn't have to worry about NAT or firewalls. Everyone sane believed that security was the responsibility of end systems, that firewalls provide only the illusion of security (something that is still true), and that address space issues would be fixed by revving the underlying protocol to have large addresses (which should have been finished 10 years ago).

3) A combination of patents and US export controls prevented using encryption and encryption-based signing right at the point where the key protocols were being designed. The US has ultimately paid a very high price for its patent and export control policies. When you're designing an international network, you can't use protocols that depend upon technologies with the restrictions we had on encryption at that time. It's not like protocol designers didn't realize the problem. There were requirements that all protocols had to implement encryption. But none of them actually did, because no one could come up with approaches that would work in the open-source, international environment of the Internet design process. So the base protocols don't include any authentication. That is bolted on at the application layer, and to this day the only really interoperable approach is passwords in the clear. The one major exception is SSL, and the SSL certificate process is broken*. Fortunately, these days passwords in the clear are normally on top of either SSL or SSH. We're only now starting to secure DNS, and we haven't even started SMTP.

---------------

*How is it broken? Let me count the ways. To start, there are enough sleazy certificate vendors that you don't get any real trust from the scheme. But setting up enterprise cert management is clumsy enough that few people really do it, so client certs aren't used very often. And because of the combined cost and clumsiness of issuing real certs, there are so many self-signed certs around that users are used to clicking through cert warnings anyway. Yuck.

Comment content (Score 1) 1348

Whether Linux is realistic depends upon what you want to do. I use a Mac as my primary way to access TV, and to some extent movies. As far as I can tell the only major source of video content is iTunes. I'd love to find an alternative, but there doesn't seem to be any. Hulu only keeps some shows, and you can't rely on which ones. It seems unlikely that the major content providers will be interested in supporting Linux unless things change more than I think they will. It's pretty clear that if Apple hadn't pioneered with iTunes, they wouldn't even support the Mac. I find it odd that content providers haven't provided a credible alternative. It seems in their interests. But it looks like only Apple has enough clout to force them to do something reasonable.

Comment Pick a project (Score 4, Insightful) 565

The problem with books is that most people learn by doing, and toy problems don't teach you what a real application is like.

I'd suggest picking an open-source project and doing something with it. Depending upon the type of programming you want to do, add something to Linux, OpenOffice, or any of a number of Java-based projects. (I'm currently working with the Sakai course management system. There are plenty of things that need doing there.)

The languages aren't any worse than what you're used to. The problem is that real programming these days tends to involve lots of complex libraries and frameworks. Those are hard to learn in the abstract, which is the reason for my advice.

Whether it makes sense for someone to (re)enter programming as a job I can't say. That's a decision for you. There are a lot of problems with the profession. But there are also lots of important things that need to be done, and a lot of the people who think they're programmers aren't up to it. Programming approaches change often enough that skills go out of date in a few years. That's both good news and bad news for people like you. Since people have to learn new techniques all the time anyway, it's not like you have to relive the whole last 30 years.

The language depends upon what you want to do. Systems software and desktop applications typically use C-based stuff (C++ is probably the best place to start, although Objective C and other things have advantages.) Web applications use Java or .NET. I'd probably start with Java. You can find real and interesting applications in just about any language, so you can argue for Python, Ruby, and all sorts of other stuff. But C++ and Java are probably the place to start. I keep hoping that there will be some major new programming technology for using multiple processors / cores well. There are lots of nice demos, but so far I haven't seen an approach that looks like it's going to really take off. That's really pretty discouraging. I wish things were more different from when you were programming. C++ and Java are only slight improvements on what you're used to. It's really the libraries and frameworks that are new.

If you're thinking of web-based work, I strongly suggest learning Javascript and at least one major Javascript programming environment (e.g. jquery or one of its competitors). UIs are increasingly moving into Javascript.

Comment Re:Hysteria repeating itself (Score 1) 701

But not releasing your data plays into their hands. I understand that you'll then have to deal with what you think is irresponsible use of the data, but it's a lot better to say "they don't know what they're talking about" than to hide your data. The former is a scientific disagreement, or maybe even a disagreement between scientists and quacks, but the latter is an admission of guilt.

Comment What did the Church actually do? (Score 4, Informative) 369

In case anyone is interested, I just looked to see what was actually done about Copernicus. No action was taken during his lifetime. During the Galileo affair, motion around the sun was declared to be erroneous and heretical. Thus Copernicus' major work was taken out of circulation for 4 years, until it could be "corrected." 9 or 10 corrections were made, which appear to have been simply inserting the word "hypothetically" or equivalent, on the grounds that it was a hypothesis that hadn't been proven.

Note that I am not defending the actions of the Catholic Church. I just thought people might want to know what they were. The uncorrected version was put on the Index. The "corrected" version was not, so it continued to circulate. The source I looked at (http://hsci.ou.edu/exhibits/exhibit.php?exbgrp=1&exbid=14&exbpg=4) says that there was no official finding that Copernicus was heretical, although it appears that there was a general condemnation of heliocentrism (at least this is how I read a couple of seemingly contradictory statements).

Comment Re:No (Score 1) 172

The proper protection for software is copyright. Patents aren't needed to protect software. The development work goes into the code, which needs and has protection available via copyright. Patents protect the idea. I have yet to see a software patent that isn't obvious, except for public-key cryptography. And that patent is a major cause of the current Internet security problems. (It was filed during the same time that the basic Internet protocols were being designed. If it hadn't been present, it is highly likely that cryptographic checks would have been built into many of the protocols. While that wouldn't solve all security problems, it would leave us in a lot better shape than we are now.)

Comment Re:I know how *not* to make it more relevant (Score 1) 667

Now and then we've had problems where good support could help. When we moved to Java 1.6 the JVM started crashing. It turned out to be a problem with a Solaris-specific library. At the time, Solaris support included Java, so they gave us a workaround. We also thought we had GC-related issues, where we could have used help. We'd be interested in trying G1, but there are still enough bugs that Sun advised not doing it without support. Hopefully the support people would know more about the status, and could advise whether it is safe.

However we're talking about maybe one service request per year. So the very high priced contract they quoted us didn't make any sense.
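If you do end up experimenting with a collector like G1 without a support contract, one thing you can check yourself is which collectors the JVM actually ended up running with. This is a minimal sketch using the standard java.lang.management API; the class name is mine, and the collector names printed vary by JVM and chosen GC:

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class GcCheck {
    public static void main(String[] args) {
        // The JVM exposes one MXBean per active collector; with G1 enabled
        // you would see names like "G1 Young Generation" here.
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.println(gc.getName() + ": " + gc.getCollectionCount() + " collections so far");
        }
    }
}
```

If I recall correctly, enabling G1 in the Java 6 era also required -XX:+UnlockExperimentalVMOptions alongside -XX:+UseG1GC, which fits Sun's "not without support" caveat.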

Comment Re:Oracle has some work to do (Score 1) 667

I'm assuming (without having used it myself) that .NET does a better job of desktop support than Java. I am particularly concerned with JMF. Sakai (a course management system that we use) had a rather nice video conferencing component. It's been abandoned by its developer, largely because JMF was never finished for the Mac, and generally doesn't support what you'd expect. JMF just happens to be the one thing I need at the moment; I believe there are a number of other examples.

If .NET doesn't have better support for applications like this, then I retract my statement, but still think Java needs to do more with the desktop.

Comment I know how *not* to make it more relevant (Score 4, Insightful) 667

What they had better NOT do: treat it like Solaris. You're only allowed to use it in production if you have support, and the only support they sell is a site license which costs $25 * the number of people in your company + the number of users for your application.

I'm not being entirely silly. I have an application for which I would have been willing to pay for Java support. But the only support Sun would sell us (late 2009, when they had already started Oraclizing) was an unreasonably expensive site-wide support contract.

Comment Oracle has some work to do (Score 4, Interesting) 667

Java isn't about to become irrelevant. There's no chance it's going to be the latest thing, because that opportunity only comes once. But if you want a language for doing major projects with long lifetimes, there's really Java, .NET and C++. For a lot of things, Java or .NET makes more sense, and realistically .NET limits you to Windows. For that class of things, the limiting factor in Java now is that Sun doesn't support the same range of APIs that MS does, particularly the desktop APIs needed for things like multimedia and games. If I wanted to make Java as useful as possible I'd put some manpower into that, and find ways to put some of the newer interpreted languages on top of the Java JVM. That would give them access to a good compiler and to the range of packages available in Java. (Despite a more limited set of APIs than .NET, there's still more than a newer language would otherwise have available.)
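On the "languages on top of the JVM" point, there is already a generic hook for this: the javax.script API (JSR 223), which discovers whatever script engines are on the classpath; this is the same mechanism JRuby, Jython, Groovy and friends plug into. A rough sketch, assuming nothing beyond the standard library; note that which engines are bundled (Rhino, later Nashorn) depends entirely on the JDK, so the code has to tolerate finding none:

```java
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;

public class JvmScripting {
    public static void main(String[] args) throws Exception {
        // Enumerates engines registered on the classpath (JSR 223).
        ScriptEngineManager manager = new ScriptEngineManager();
        ScriptEngine js = manager.getEngineByName("javascript");
        if (js == null) {
            // Recent JDKs ship no bundled JavaScript engine at all.
            System.out.println("no JavaScript engine bundled with this JVM");
            return;
        }
        // The script runs on the JVM and can call back into Java classes.
        Object result = js.eval("var x = 6 * 7; x;");
        System.out.println("script said: " + result);
    }
}
```

Languages hosted this way get the JIT, the garbage collector, and the whole Java library ecosystem more or less for free, which is exactly the leverage described above.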
