
Apache Resigns From the JCP Executive Committee 136

iammichael writes "The Apache Software Foundation has resigned its seat on the Java SE/EE Executive Committee due to a long dispute over the licensing restrictions placed on the TCK (test kit validating third-party Java implementations are compatible with the specification)."

Comment SSL and intranets are a bad fit (Score 1) 286

A lot of responses that I have seen to this question are basically the following.

"Create your own CA (certificate authority) certificate and distribute it to the client workstations." Then they accuse the original poster of having asked an overly simple and uninteresting question.

I am going to say something nobody else seems to have said: SSL sucks big time for large workgroups inside a private intranet. It is an inappropriate solution being used for lack of anything better. IE will give you AD-based authentication in the browser, but Microsoft did not extend that to securing the communication channel itself.

This issue is much nastier and more complex than anybody has allowed for. SSL does a very good job of solving the problem of creating secure communications over untrusted, anonymous networks. However, it is a real pain when the only thing you want is a secure channel between two machines in the same room. In that case, SSL comes with a lot of overhead that is simply not needed. When two machines are in the same room (or workgroup), they are already on internal corporate IP addresses, so many of the issues SSL was designed to solve (validating that the IP address really points to the expected entity) just are not applicable. Usually the only reason you want to encrypt the data is so that somewhat private data won't be sniffed by other users. You are not trying to prove that you are a legitimate seller of any goods or services.

What really astounded me were the claims that it would be easy to get users to accept company-controlled installs of browsers and tools. I have worked in such an environment, and it was actively resisted and circumvented because the choices were so limiting. Those who say "it would work if it was done right" probably have not done cross-browser development where you had to test on Linux, Mac, and several variants of Windows, nor Java development where the Java code has to communicate with the server (over https) as well (Java has its own client CA chain distribution).

Every place I have ever worked (big or small) has had http web sites that really should have been https, because of the pain of trying to use SSL. To say that this is just bad IT management gets it wrong, I think. SSL is a bad fit for this problem space, and browsers (and Java) need to support other security solutions. It would be nice to recommend Kerberos, but Kerberos has really only gotten a full implementation with AD, and in most real world scenarios I have seen (ones with non-Microsoft machines in the mix) it is even more painful for clients to adopt. The state of intranet security is broken at its foundations, and the solutions proposed here would not work (in practical, reliable, real world usage) for many workgroups working inside a much larger corporate entity.

Comment Re:Oracle is Evil, C# Java (Score 1) 428

There is another big difference between C# and Java. In Java, you are strongly discouraged from making native calls. For example, instead of using the native desktop GUI widgets, Java writes a large wrapper (Swing) around the desktop interface and tries to port all that complexity from platform to platform. In C# on Windows, I make Windows-specific calls (through C#-friendly wrappers, but still Windows-specific) if I am trying to create a GUI. Similar points apply to certain other APIs, such as ADSI, network pipes, the registry, HTTP, and encryption. In fact, if you look at the general low level Windows APIs, there is a lot of functionality there that is not captured in Java.

I agree that C# is better than Java at doing the basics, but C# is hard to separate from the platform that birthed it. This makes C# a difficult language to deal with if you are not writing for a Windows platform. For example, from my understanding, the point of porting C# to Linux is not to make applications portable from Windows to Linux (except maybe some server apps), but to make C# as productive a language on Linux as it is on Windows. I expect to call native Linux APIs from C# when I am running on Linux. Why not? The benefit of doing otherwise is not so clear, given the general lack of portability of C#. This approach does create stress points. The .net API is a large collection of APIs, some of which run on C# on Linux and some of which don't. Some are protected by GPL-like licenses so you can use them without fear of a lawsuit; others are not. As an example, I think there are still some controversies about some of the fancier parts of the WebForms APIs (some of the complicated dynamic HTML tables, for example).

Comment Java is important for the server side (Score 1) 388

This post is in response to posts that say "Java is not important -- if we kill Java it won't matter that much". I disagree and the fact that Oracle is preventing the language from growing and potentially killing its future is big news and should not be dismissed lightly.

There is a saying: "democracy is the worst form of government, with the exception of all the others." I have a similar opinion about Java.

Let me list four key strengths that Java has:

1. If you write code using primitives (such as byte arrays and char arrays), you can write parsing and syntax processing code with near C-like performance. The same applies to other tasks that need high performance, such as querying or processing data. It is why higher level scripting languages can be written on top of Java.

2. It eliminates a lot of the dangerous, painful, and unstable aspects of programming in a lower-level language like C. It does garbage collection and does not let you corrupt your application memory or your heap in hard-to-detect ways. It provides clean stack traces when errors do occur and prevents the application from crashing from silly programming mistakes.

3. It has excellent threading and synchronization support that can be used in a flexible and high performing way.

4. It can run on more than one platform with some success.
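As a minimal sketch of point #1 (the class and method names here are made up for illustration, not a standard API): parsing an integer directly out of a byte array avoids String and object allocation in the hot path, which is the kind of primitive-level coding that gets near C-like throughput:

```java
// Sketch of point #1: parse directly from a byte array, with no String
// or object allocation in the hot path. Names are illustrative only.
public class ByteParse {
    // Parse a non-negative decimal integer from buf[off .. off+len).
    static int parseInt(byte[] buf, int off, int len) {
        int value = 0;
        for (int i = off; i < off + len; i++) {
            byte b = buf[i];
            if (b < '0' || b > '9') {
                throw new NumberFormatException("not a digit at index " + i);
            }
            value = value * 10 + (b - '0');
        }
        return value;
    }

    public static void main(String[] args) {
        byte[] record = "id=31337;".getBytes();
        // Parse the digits between '=' and ';' without a substring allocation.
        int v = parseInt(record, 3, 5);
        System.out.println(v); // prints 31337
    }
}
```

In a tight parsing loop, the difference between this and `Integer.parseInt(new String(...))` is precisely the allocation and copying the String version does per call.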

Other alternatives do not provide all four of these features (C# misses out on #4; Ruby and Python miss out on #1 and #3; and so on). I am not much of a fan of some of the libraries that have been built on Java (such as J2EE). Google's and Eclipse's use of Java is much closer to how I think Java is supposed to be used for development projects. Because of the bad reputation that some Java libraries (such as J2EE and Swing) have earned, some people begin to associate Java with those libraries and rightfully believe that the world would be better off without them. But Java is used for much more than that. A lot of the more recent scripting languages are now written in Java or have popular ports to Java. As an example, some large portal applications use a variant of PHP ported to Java. And of course, there is Android. Remember that Oracle is suing Google right now over Android's use of Java, so Oracle is quite aware of its importance for the future.

Comment Alternate REAL number lines (Score 1) 1260

I remember back when I was a graduate student reading about an alternate real number line where there existed a new number called "delta".

It was defined as being smaller than any positive real number and bigger than zero. Of course, this was not our normal real number line, which comes from completing the rational numbers (using equivalence classes of Cauchy sequences) under the standard metric.

In these real numbers, 1/3 and .333... were not the same number, but were considered sufficiently close to be presented as the same answer to real world problems.

The advantage of this number system is that it did interesting things to calculus. All the complicated epsilon-delta limit theorems were trivialized, and a lot of operations became simple algebraic manipulations. Also, results like integration being the inverse of differentiation had interestingly simplified proofs.
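As a sketch of how the algebra works (using the familiar f(x) = x^2), the derivative computation reduces to:

```latex
\frac{(x+\delta)^2 - x^2}{\delta} \;=\; \frac{2x\delta + \delta^2}{\delta} \;=\; 2x + \delta
```

Since delta is smaller than any positive real, 2x + delta is reported as 2x, and no epsilon-delta limit argument is needed. (In modern nonstandard analysis this last step is called taking the standard part.)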

I also remember an argument that this approach is not so far from reality. In most cases we don't need more than 10 to 20 digits of precision. If we treated 10^-80 as this "delta" -- essentially the same thing as zero, but not zero for the purposes of calculus -- you would find that mathematics does not fall apart as quickly as you might think. It can still be manipulated to give you most of the theorems and proofs critical to real world applications (including such things as General Relativity), and in fact a lot of proofs become easier. This is not such a surprise to physicists, because they have been shortcutting these types of proofs from the very beginning (starting with Newton, who never really quite grasped limits).

Comment Re:As an Oracle DBA (Score 1) 237

I agree with the above comment, though I do not think the differences are so severe. But I did want to go into the particulars of why Oracle has the best database product out there. This is from somebody who has developed a product that can run on most databases and has been deployed at thousands of customers, some of whom have scaled up to billions of records. Here is my itemized list, in order of importance (at least for the product I worked on).

1. Oracle has true row level transaction isolation. You start a transaction and you do not interfere with anybody who may be reading the rows you are updating. You also do not interfere with updates to other rows that happen to be stored in the same "page". Databases that do not do this properly suffer from two problems. The first is dirty reads, where other transactions read rows that are temporarily in an inconsistent state relative to other rows (add a number to one row and subtract it from another, and the other transaction sees the add but not the subtract). But the real problem is transaction deadlocks, where transactions block each other because they locked rows they were not supposed to. If you write code that is constantly "transactionally" grooming statistical data about the relative ranking of one row versus others, you will find it impossible to avoid these deadlocks. Only Oracle gives you true full row level (not "page level", as in SQL Server) transaction isolation. Developers in my group have written test programs specifically to prove this.
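A toy, single-process sketch of the dirty read described in point 1 (this illustrates the concept only; it is not real database code, and all names are made up): a "transaction" adds 10 to one row and subtracts 10 from another, a read that lands between the two writes sees an inconsistent total, and a snapshot taken before the transaction stays consistent:

```java
import java.util.Arrays;

// Toy model of a dirty read. Not a real database -- just an illustration
// of why reading rows mid-transaction yields inconsistent data, and why
// a pre-transaction snapshot (what Oracle's read consistency gives you)
// does not.
public class DirtyReadDemo {
    // Returns {dirtyTotal, snapshotTotal} for a transfer of 10 between
    // two rows that each start at 50 (so a consistent total is 100).
    static int[] run() {
        int[] rows = {50, 50};                      // two account rows
        int[] snapshot = Arrays.copyOf(rows, rows.length); // pre-txn snapshot

        rows[0] += 10;                              // txn step 1: add to row 0
        int dirtyTotal = rows[0] + rows[1];         // dirty read between steps
        rows[1] -= 10;                              // txn step 2: subtract from row 1

        int snapshotTotal = snapshot[0] + snapshot[1]; // consistent view
        return new int[]{dirtyTotal, snapshotTotal};
    }

    public static void main(String[] args) {
        int[] totals = run();
        System.out.println(totals[0] + " " + totals[1]); // prints 110 100
    }
}
```

The dirty reader sees a total of 110 even though no committed state ever had anything but 100, which is exactly the add-seen-but-not-the-subtract scenario described above.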

2. Oracle has sophisticated diagnostic tools that can help diagnose things such as: Why is this query running slowly? Is this data possibly corrupted? Where are my indexes stored and how well are they working? You have a lot of control over these things and that can really help scale a system from a million records to a billion records in a reliable way.

3. It has ways to take advantage of multiple disks with fast, independent IO to each disk. You can write your transaction log to one disk, partition data based on a particular column's value across various disks (for example, if a column indicates the "branch" of a company, each branch can have its own disk to store its data), and write indexes to yet another disk. When you read or write, the database will read and write from these disks concurrently (really powerful on multi-CPU/core boxes). If the database determines that a query targets only a particular partition (for example, a particular "branch"), it may figure out that it does not have to read data from the other disks at all. So if two such queries come in targeting different partitions, they will be doing IO to independent disks over independent hardware IO streams. I have had systems that were non-functional for a customer become quite useful after applying clever tricks of this type (especially the partitioning, when you have more than 20 partitions).

You may argue that most applications that are out there do not need these things. This is true, but nobody is going to make money from selling products to deployers of those applications -- they can usually get by on the free stuff (or close to free). If you need something from a database for which you are willing to pay real money, then Oracle still has some unmatched features.

Comment A less ambiguous variation of the problem (Score 1) 981

Here is a less ambiguous problem that shows the same effects.

Take two decks of cards. Shuffle each deck. Deal a card from deck 1 and another card from deck 2. If one of the cards is a spade, stop. If neither card is a spade, put the cards back into their original decks, shuffle again, and repeat. Continue repeating until one of the cards dealt is a spade.

Question: At the time you stop, what is the probability that the other card is black?
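Under one natural reading of "the other card" (treating the both-spades case as the other card being a spade, and hence black), the answer can be settled by exact enumeration over the 52 x 52 equally likely deals. This is a sketch under that assumed reading, not the only possible one:

```java
// Exact enumeration of the two-deck puzzle. Cards are numbered 0..51;
// suit = card / 13, with suit 0 = spades and suit 1 = clubs, so a card
// is black exactly when card / 13 <= 1.
public class TwoDecks {
    // Returns {favorable, total}: deals with at least one spade, and,
    // among those, deals where "the other card" is black.
    static int[] count() {
        int total = 0, favorable = 0;
        for (int c1 = 0; c1 < 52; c1++) {
            for (int c2 = 0; c2 < 52; c2++) {
                boolean s1 = c1 / 13 == 0, s2 = c2 / 13 == 0;
                if (!s1 && !s2) continue;     // no spade: reshuffle and redeal
                total++;
                // "The other card" is the non-spade when exactly one card is
                // a spade; when both are spades, the other card is itself a
                // (black) spade, so either choice gives the same answer.
                int other = s1 ? c2 : c1;
                if (other / 13 <= 1) favorable++;  // spades and clubs are black
            }
        }
        return new int[]{favorable, total};
    }

    public static void main(String[] args) {
        int[] c = count();
        System.out.println(c[0] + "/" + c[1]); // prints 507/1183
    }
}
```

The count comes out to 507/1183, which reduces to exactly 3/7 -- noticeably different from the naive answer of 1/2, which is the whole point of the puzzle.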


Is LGP Going the Way of Loki Software? 124

An anonymous reader writes "After the demise of Loki Software, Linux Game Publishing sprouted up in its place, and for the past nine years has ported a number of games to Linux. But LGP may now be sharing the same fate as Loki. Linux Game Publishing hasn't updated its blog or news pages in months, has stopped responding to e-mails, and its only active ports are games they began work on in 2002/2003."

Comment Re:Why would I buy an iPad (Score 1) 1713

You do know that the general brevity and slanderous nature of your comment reduces its validity.

Could you identify the "giants" for me? My general claim is that Apple is winning because they have superior technology (not because of marketing). I also believed that was true of Windows 95/Windows NT/Microsoft Office (against the competitors available at that time). Do you disagree, and why? I also believe that there are fundamental architectural decisions at the foundations of products that greatly influence their success or failure. Successful (note -- I did not say "good"; as a programmer I find some choices made by Microsoft and Apple morally objectionable) software needs well designed foundations. Do you disagree with that? Or do you agree with my general assertions but think I have picked weaker arguments to support my statements? What would be stronger arguments?

Comment Why would I buy an iPad (Score 4, Interesting) 1713

There is a theme to some of the comments which I wish to rebut. Essentially the theme is that Apple products really are not that good and they sell well only because Apple does such good marketing. The implied assumption here is that if you buy Apple's products, such as the iPod, you are just a sucker fooled by Apple's marketing campaign. Since I have bought Apple products because I thought they were the best products available for my needs, I see these statements as declaring that I am also a sucker and lacking in any real tech smarts. Essentially I feel like I am being called an idiot.

I remember when this debate was between Linux and PCs, and the Linux crowd was trying to argue that nobody should need to run Microsoft software to do their jobs or get things done. This was at a time when you could not get Linux to legally read a DVD or use the algorithms for reasonable font rendering. Of course, these limitations existed because of licensing issues: some of the most useful software productivity features were protected by commercial licenses or patents. The Linux advocates would argue that I should not be running such software in the first place because it was not "open" software. But that is a different argument. I have far more sympathy for the argument that running Linux is a superior moral choice. But arguing that Linux was a better OS for getting my job done was nonsense.

I am going to come at my argument in a backwards way. Instead of touting features of the iPad, I am going to describe artfully chosen limitations. The biggest limitation is that a developer cannot write an application that runs as a persistent multi-threaded process. Any application that is not being used at the current moment is torn down, and a new instance is created when it is next needed. This is even more limited than the old Windows 3.x OS with its event driven model for task switching (for those of you who don't remember -- Windows 3.x had only one running thread, and all applications shared memory). Another limitation is that applications cannot use a shared file system or shared libraries. You cannot build an application out of other applications, or write applications whose purpose is to interact with other applications in useful ways. A user cannot even freely write code for their own application, build it, and run it.

For anybody who likes to tinker with their computers (I consider myself somewhat of that breed; I do programming for a living), this seems almost mind-bogglingly stupid. But there is a method to this madness.

So what do you get back for these choices?

1. A very stable device that does not need to worry about applications doing semi-permanent bad things to your computer that require a reboot. It is not stable just because applications have a hard time doing bad things; the basic logic of its behavior is so simple that you can "audit" and control it in a way that you cannot control a standard modern OS. This eliminates the tangled logic scenarios that come up when you have interactions between device drivers, OS interrupts, glitches in hardware, and complicated applications. Also, it is far easier to write protections against hostile software, especially if you control the distribution of all software for your device.

I think many in the Slashdot crowd underestimate the importance of stability in a portable device. I reboot computers all the time because of glitches of various sorts. It is true that the OS itself is rarely to blame; it might be the device driver for my mouse, a disk glitch, a misbehaving network router, or a bad application, but generally such issues are fatal. And because of the complexity of the OS, the OS really has no chance of diagnosing the true cause of the problem.

That is not something I will tolerate in a lightweight portable device used for limited but useful activities. I have heard rumors that Android phones, once you start trying to run some of the same applications that make the iPhone popular, have far more problems with various issues, such as unwanted battery drain from processes that do not properly terminate, or misbehavior from rogue or badly written applications. Now, I know some people love their Android based phones because they can program them the way you can program for any OS, but that is of no utility to those using these devices in boring (from a "techie" sense) but highly productive ways.

2. You are forced to push the real labor for GUI, multimedia, application deployment, event handling, and so on into the OS. You cannot insist that the application fulfill these functions, because applications are so tightly locked inside their software jail. This forces uniformity and consistency of behavior from all applications written for the device. There is a predictability in what the applications are unable to do. On a full OS, it is quite possible for end users to feel that they have no real clue what the software they are running is really doing, or what the consequences might be of running the software in unusual configurations or settings. Of course, you cannot have pleasant surprises either. But for smallish devices that are used constantly for limited but important activities, that is a small sacrifice to pay for consistency of usage patterns.

So though this may sound counter-intuitive, I feel the real genius of the iPhone, iPod, and iPad is in what they cannot do, not in what they can do. Of course, there are some things Apple's software can do that nobody has matched. For example, only Apple has found a way of accurately guessing your intent when you press your over-sized thumb on a crowded display of small letters. This may sound small, but in some ways it is everything. It places the iPhone (and its brethren) into an elite status, clearly differentiated from the rest.


Failed Games That Damaged Or Killed Their Companies 397

An anonymous reader writes "Develop has an excellent piece up profiling a bunch of average to awful titles that flopped so hard they harmed or sunk their studio or publisher. The list includes Haze, Enter The Matrix, Hellgate: London, Daikatana, Tabula Rasa, and — of course — Duke Nukem Forever. 'Daikatana was finally released in June 2000, over two and a half years late. Gamers weren't convinced the wait was worth it. A buggy game with sidekicks (touted as an innovation) who more often caused you hindrance than helped ... achieved an average rating of 53. By this time, Eidos is believed to have invested over $25 million in the studio. And they called it a day. Eidos closed the Dallas Ion Storm office in 2001.'"

First MySQL 5.5 Beta Released 95

joabj writes "While MySQL is the subject of much high-profile wrangling between the EU and Oracle (and the MySQL creator himself), the MySQL developers have been quietly moving the widely-used database software forward. The new beta version of MySQL, the first publicly available, features such improvements as semi-synchronous replication and more options for partitioning. A new release model has been enacted as well, bequeathing this version the title of 'MySQL Server 5.5.0-m2.' Downloads here."

NYT's "Games To Avoid" an Ironic, Perfect Gamer Wish List 189

MojoKid writes "From October to December, the advertising departments of a thousand companies exhort children to beg, cajole, and guilt-trip their parents for all manner of inappropriate digital entertainment. As supposedly informed gatekeepers, we sadly earthbound Santas are reduced to scouring the back pages of gaming review sites and magazines, trying to evaluate whether the tot at home is ready for Big Bird's Egg Hunt or Bayonetta. Luckily, The New York Times is here to help. In a recent article provokingly titled 'Ten Games to Cross off Your Child's Gift List,' the NYT names its list of big bads — the video games so foul, so gruesome, so perverse that we'd recommend you buy them immediately — for yourself. Alternatively, if you need gift ideas for the surly, pale teenager in your home whose body contains more plastic than your average d20, this is the newspaper clipping to stuff in your pocket. In other words, if you need a list like this to understand which games not to stuff little Johnny's stocking with this holiday season, you've got larger issues you should concern yourself with. We'd suggest picking up an auto-shotty and taking a few rounds against the horde — it's a wonderful stress relief and you're probably going to need it."

Comment Re:Disambiguation reveals the simple answer (Score 1) 221

Let us suppose for a moment that mathematics is invented and not discovered. Just how far do you want to push this?

Was the concept of 0 and 1 discovered or invented? Because of quantum mechanics, there is a nonzero (though immeasurably small) chance that any particle (or group of particles) could exist at any point in space and time, so you can argue that the ideas of 0 and 1 cannot truly be represented in nature (if you tried to show me 0 blocks and 1 block, quantum mechanics would say there is a small probability that my 0 blocks might actually be a small fraction of a block -- the odds against this are ridiculously small, but that is not the point). So even for something this simple, you can claim that mathematics is only a model that is not truly represented in nature.

The existence of 0 and 1 is an "axiom" of mathematics (set theorists usually describe this as the existence of the empty set -- the empty set is 0, and 1 is the set that contains the empty set). It is not provable, but that does not mean we cannot assume it to be true and work from there.

I defy you to find any rational being that believes there can be "intelligence" of any reasonable complexity and sophistication that does not intuitively understand the difference between the absence of something and something. I will go further and say that this idea was discovered and not invented. The symbols and notations we use to represent the idea were invented, but the underlying idea exists and would be true even if all of existence were to vanish and nothing ever existed anywhere.

Once you believe in 0 and 1, there is a nicely built up sequence of logic that will lead you to circles and pi. Some of it requires advanced graduate mathematics to fully understand, but there is an unavoidable, discoverable chain of logic. Take, for example, the existence of the numbers following 1. This is one of the Peano axioms. Again you have to assume it is true, but nothing in logic breaks down if you do and you work from there. Again, in set theory, which is one way to build up the axiomatic foundations of mathematics, if 1 is the set that contains the empty set, then 2 is the set that contains the set that contains the empty set. Once you accept the positive integers as discovered and not invented, the rest of the big construct called mathematics follows and was "discovered" just as much as 0 and 1 were "discovered".
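The nesting described above can be written out explicitly. This is Zermelo's encoding of the naturals (von Neumann's convention, where each number is the set of all its predecessors, is the more common modern choice, but either works as a foundation):

```latex
0 = \varnothing,\qquad 1 = \{\varnothing\},\qquad 2 = \{\{\varnothing\}\},\qquad n+1 = \{n\}
```

In von Neumann's encoding one instead takes $n+1 = n \cup \{n\}$, so that $2 = \{\varnothing, \{\varnothing\}\}$; the point either way is that the entire tower of integers unfolds from the single assumption that the empty set exists.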
