
Comment Not completely useless in practice. (Score 1) 336

While it wasn't C++, I have used multiple inheritance as a mixin mechanism to implement a common protocol (e.g. propagating events) across classes that have ownership relationships but no other inheritance relationship. The non-contrived examples of multiple inheritance I've seen all tend to be this sort of thing. Multiple inheritance isn't completely useless: you would probably find it useful for implementing default behaviours in environments where multiple sources of program control come from a framework that an object has to play nicely with. One example might be a widget library for a GUI toolkit.
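A minimal sketch of the idea (all names here are hypothetical, and I'm using Python rather than C++): a mixin supplies the event-propagation protocol, and any class in the ownership hierarchy picks it up regardless of its other base classes.

```python
# Hypothetical sketch: a mixin providing a common event protocol
# across classes related only by ownership, not inheritance.

class EventPropagatingMixin:
    """Supplies the shared protocol: handle an event, then pass it on."""
    def __init__(self):
        self._children = []

    def add_child(self, child):
        self._children.append(child)

    def handle_event(self, event):
        self.on_event(event)
        for child in self._children:
            child.handle_event(event)  # propagate down the ownership tree

    def on_event(self, event):
        pass  # hook for subclasses to override


class Document(EventPropagatingMixin):
    def on_event(self, event):
        print(f"Document saw {event}")


class Section(EventPropagatingMixin):
    def on_event(self, event):
        print(f"Section saw {event}")


doc = Document()
doc.add_child(Section())
doc.handle_event("refresh")  # owner handles first, then owned objects
```

In C++ the same shape works with a plain (or virtual) base class mixed into otherwise unrelated class hierarchies.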

Comment Representativeness heuristic (Score 1) 133

I suspect that for every distributed OSS project that succeeds, many fail. You see the successful projects, while the failed ones slip back into obscurity. You can possibly learn something from the ones that succeed, but the lesson shouldn't be 'OSS projects successfully manage distributed teams all the time.' I think the majority of such projects actually fail. However, reaching out for advice or examples from the ones that do work might be helpful. You should be able to find folks on Slashdot who are involved with successful OSS projects; maybe they will have something worthwhile to say.

Comment Re:I'd be willing to pay (Score 1) 167

You can still buy high-quality PCs these days - they're called workstations. If you just want a good-quality PC, take a look at an entry-level workstation like an HP Z230 or a Dell T1700. One of the markets these machines are sold into is desktops running mission-critical apps that have to meet SLAs, so they tend to be built to higher standards than mainstream PCs.

Comment Re:so, I'm in the more than 8 yrs ago camp (Score 1) 391

I suggest you look for a suitable secondhand workstation (e.g. an HP Z800) on eBay. Either find one in the right configuration or add memory, disk, etc. to suit. Ex-lease workstations go for a small fraction of their new price on eBay, and parts for mainstream models (typically HP or Dell) are also quite readily available on the open market.

I did this with XW9300s a few years ago and a couple of SFF Z210s more recently. Works fine for me.

Comment Don't get too worked up about resource management (Score 1) 98

I did my undergrad degree in a lab not unlike this (actually Sun workstations using NIS/NFS to mount home directories - this was the 1990s). Those machines were likely 1-2 orders of magnitude less powerful than even your smallest desktop - desktops with 32MB of RAM and servers with 128-256MB. There was no resource management aside from disk quotas, and the lab worked fine.

Depending on what you mean by high-usage, I would have thought even modest desktop systems would be powerful enough for just about anything people get up to in a university lab (unless you mean Z800s with 192GB of RAM and somebody with an application for a machine that big). You could try goosing your smaller desktops by searching for 20-40GB SSDs to use as system disks (plenty for the OS, installed applications and swap) or upgrading the memory; SSDs like that go for peanuts on eBay.

Comment The fallacy of total database independence (Score 1) 425

I see Java/database arguments and flamewars pop up from time to time. They usually centre on the performance vs. portability argument. I'll admit to some bias towards the smart-database school. Here's my $0.02 worth on this particular debate.

The fallacy of total database independence

For political reasons Sun would like us to treat Java as a platform. Therefore Java is designed to be all things to all people, a layer between you and the operating system. This is at odds with what it actually gets used for. The same is arguably true for .NET, though for slightly different reasons.

Neither of these platforms is widely used for developing shrinkwrap software. I'll argue that both are essentially irrelevant to the shrinkwrap software community outside of companies that make Java or .NET development tools and infrastructure. Most Java (or .NET, for that matter) development is done for bespoke applications, where the client is paying for all the development and maintenance work. Most importantly, the client also owns and therefore specifies the hardware on which the application will be deployed.

I believe that the key fallacy of the portability argument lies in these four points:

Point 1: M:M application:database relationship

The database is not your personal persistent object store. Other people have to use the data.

Outside the trivial case, databases and applications live in a M:M relationship: the data will be used by more than one application. Tying your data integrity to the middle tier forces every update to go through the middle tier. You have not escaped platform dependence; you have just moved it up a layer.

Any database has multiple stakeholders, including business users who want to analyse the data, so application developers are not the only constituency a database serves. Developers often do not realise this until it is pointed out to them.
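To make the point concrete, here is a small sketch (sqlite3 used purely for portability; the table names are hypothetical) of integrity rules declared in the database itself. Because the database enforces them, every application touching the data gets the same guarantees, with no middle tier in the loop.

```python
# Sketch: integrity constraints living in the database, not the
# middle tier. Any client, in any language, hits the same rules.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # sqlite needs this enabled
conn.execute("""
    CREATE TABLE account (
        id      INTEGER PRIMARY KEY,
        balance INTEGER NOT NULL CHECK (balance >= 0)
    )""")
conn.execute("""
    CREATE TABLE transfer (
        id         INTEGER PRIMARY KEY,
        account_id INTEGER NOT NULL REFERENCES account(id)
    )""")

conn.execute("INSERT INTO account (id, balance) VALUES (1, 100)")

# A rogue update is rejected by the database itself:
try:
    conn.execute("UPDATE account SET balance = -5 WHERE id = 1")
except sqlite3.IntegrityError:
    print("CHECK constraint rejected negative balance")

# So is a dangling reference:
try:
    conn.execute("INSERT INTO transfer (account_id) VALUES (99)")
except sqlite3.IntegrityError:
    print("FK constraint rejected unknown account")
```

If those rules lived only in application code, the second application to write to the schema would have to reimplement them (and would eventually get them wrong).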

Point 2: You still need to regression test

Any other platform (hardware, OS, etc.) on which you intend to support deployment needs to be regression tested. Even if the application is write-once-run-anywhere portable, you still have to devise, maintain and find the warm bodies to execute a comprehensive regression test. Java's portability is not good enough to avoid this.

Point 3: Java is mostly used for bespoke development

Remember that Java is mostly used for bespoke development. The practical difference between "write-once-run-anywhere" and "relatively easy to port" is not an overwhelming issue on a bespoke project. There is a significant investment in development, and a set of regression tests has to be carried out.

If you have spent $1,000,000 on an application then the difference between $100,000 to regression test it on your new platform and $150,000 to port and regression test it is not such a big deal. The difference between "zero development effort" and "reasonable development effort" is not so significant in this light.

Point 4: O-R mappers are not a substitute for competent developers

If you're developing a database app without reasonable SQL skills on the team you've got bigger problems than portability. The notion of abstracting away data access into a black box is utopian. Data access performance is actually subject to mechanical constraints of disk performance. NOTHING else in the application stack has to actually move chunks of metal and ferrite (or whatever they make GMR heads from these days) around to work.

For this reason, database independence is just as big a myth as network transparency. Hands up those of us who design EJB components without regard to network round trips. You cannot ignore what the database is doing behind the scenes.

My argument is thus:

An application using POJOs and a reasonably competent data layer gives you the flexibility to use database features (and drop out into stored procedures where that is sensible). Most work can be done with generic table/record classes. See Fowler's PoEAA book for deeper discussions of such architectures.

Lightweight data access libraries get you most of the practical benefits (reduced boilerplate code, etc.) that a heavyweight O-R mapping layer does, and they give you more control over this part of your application.

With a little thought to the architecture, the DB-specific SQL code can be modularised so it is relatively easy to port. For the applications people actually use Java to write in practice, 'easy to port' is good enough. 'Write-once-run-anywhere' is an asymptotic goal at best, and even if it weren't, the baggage you take on to implement it might be more expensive than the cost of porting the application in the first place.
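Here is one way that modularisation can look (a sketch with hypothetical names, in Python rather than Java for brevity): vendor-specific SQL is quarantined in a dialect module and injected into a thin table gateway, so porting means replacing one module rather than combing through the application.

```python
# Sketch: DB-specific SQL confined to one swappable dialect module,
# consumed by a thin table gateway. Hypothetical names throughout.
import sqlite3

# -- dialect module: the only place vendor-specific SQL lives --
SQLITE_DIALECT = {
    "upsert_user": """
        INSERT INTO users (id, name) VALUES (?, ?)
        ON CONFLICT(id) DO UPDATE SET name = excluded.name
    """,
}

class UserGateway:
    """Table gateway; all SQL text comes from the injected dialect."""
    def __init__(self, conn, dialect):
        self.conn = conn
        self.dialect = dialect

    def upsert(self, user_id, name):
        self.conn.execute(self.dialect["upsert_user"], (user_id, name))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
gw = UserGateway(conn, SQLITE_DIALECT)
gw.upsert(1, "alice")
gw.upsert(1, "bob")   # same id: exercises the DB-specific upsert syntax
print(conn.execute("SELECT name FROM users WHERE id = 1").fetchone()[0])
```

Porting to, say, Oracle would mean writing an `ORACLE_DIALECT` with that vendor's MERGE syntax and changing nothing in the gateway or the application above it.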

For these reasons, I am not convinced that O-R mapping is all it's cracked up to be. I have worked on a J2EE project using an in-house O-R mapper, and they do reduce database-related boilerplate code. However, they are not the only way to do that.

O-R mappers are more complex than you think and add a layer of hard-to-understand baggage to your application. Most of their benefits can be gained from simpler infrastructure (I'm a big fan of simple infrastructure, but that's a whole different flamewar ;-). The question I put is whether the 'portability' side of this argument is relevant to the actual portability requirements of people who use Java in practice.
