Linux Software

E-commerce and Linux

paRcat asks: "My company is using a proprietary system for letting our customers order online. It takes the order and, as soon as they click submit, sends it on to our main system via a serial connection. Both systems are running on NT, and they die constantly. I've been pushing to get Linux in here, and I think replacing the online ordering server is the best way to start. Our catalog has around 25,000 items in it. It's held in an Access database right now... around 14 megs. I suppose it could be converted, but every time a pricing update comes out, it's distributed in mdb format. What tools exist for Linux that can do what I need? It just needs to allow access to the database, take the order, and send it down the serial line. I was contemplating setting up mod_perl and just writing a bunch of code, but I'm still a bit new to Perl, and I'm not even sure if that's the best choice." Apache, Perl/mod_perl and MySQL/mSQL/PostgreSQL all sound perfect for this application, but getting the Access information (both the existing data and the future updates) out might be a problem. Any ideas?
This discussion has been archived. No new comments can be posted.

  • by fatboy ( 6851 )
    I think MySQL can import MS Access databases; if not, you can write an ASP script that will do it.
  • I don't know of any tool to directly import an Access database into MySQL but it's trivial to
    export the data from Access as CSV or tab-delimited and then import it with mysqlimport.

    I think the hardest part of all this would be programming the serial interface.
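
    For what it's worth, on the Linux side something like Device::SerialPort would be a starting point. A minimal sketch, assuming /dev/ttyS0, 9600-8N1 and a newline-framed record format (all guesses - check what the main system actually expects):

    use Device::SerialPort;

    # Port, speed and framing below are assumptions, not the real protocol.
    my $port = Device::SerialPort->new("/dev/ttyS0")
        or die "can't open serial port: $!";
    $port->baudrate(9600);
    $port->databits(8);
    $port->parity("none");
    $port->stopbits(1);
    $port->write_settings;

    my $order = "ORDER|12345|SKU-001|2\n";   # made-up record format
    my $count = $port->write($order);
    warn "short write!" unless $count == length $order;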
  • Perhaps some advice is in order...

    First: Check out http://www.mysql.org. It answers a fair number of the questions you might already have.

    Second: As far as I know there is middleware to convert Access databases to MySQL data. This should take care of the problem with two DBs. The only problem is that it does a total DB dump, so it could become more and more unwieldy as the DB grows.

    Third: If needed, you can also connect the Access DB to MySQL using MyODBC (http://www.mysql.org/Manual_chapter/manual_ODBC.html#ODBC), which should make things a tad easier, but then again....

    Hope this helps...
    /Lindus
  • by Anonymous Coward
    Just create a similar table on your MySQL server. Then link the table into your Access database (create a new, linked table w/ the ODBC drivers for MySQL).

    Then just copy and paste.

    You can automate this in about 5 lines of Visual Basic.

    On the down side, you do need to do this on 95 or NT.
  • Use ODBC to pull your information from the Access database. Use the DBI module under Perl. Installing mod_perl is pretty simple, but if you are not really familiar with it, you might want to just set it up as a standard CGI.
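
    A minimal sketch of the DBI route (the DSN name "catalog" and the items table are made up):

    use DBI;

    # "catalog" is a hypothetical ODBC DSN pointing at the Access database
    my $dbh = DBI->connect("dbi:ODBC:catalog", "", "", { RaiseError => 1 });

    my $sth = $dbh->prepare("SELECT sku, description, price FROM items");
    $sth->execute;
    while (my ($sku, $desc, $price) = $sth->fetchrow_array) {
        print "$sku\t$desc\t$price\n";
    }
    $dbh->disconnect;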

    Lando
  • You might want to try exporting from Access as some form of ASCII delimited text. I've done Access->PostgreSQL like this with CSV (comma-separated values), but there were some glitches with regard to fields that were null. In any event, it appears that you'll have to keep a Windows box around, if they send updates as .mdb files.
  • I agree that the serial interface would be the biggest pain. I wonder if the API to their interface is available. For that matter, I have found very little information on how to _PROPERLY_ speak to serial devices. Guess I haven't looked hard enough ;)
  • Doh!! I had to reread that, sorry.
  • by Anonymous Coward
    You can handle this problem in a number of ways.
    1. You can export the Access table to a CSV file (comma-separated values) and then import the file into the database (with database-supplied tools or a script).
    2. PostgreSQL, and probably all the other open source databases, have ODBC drivers available, which allows MS Access to link directly to the database server.
  • Well, the original problem seems to be that pricing updates come in mdb format. Although I can certainly agree with the original poster that I wouldn't want to put an e-commerce server on NT, it seems to me that one NT workstation with a copy of MS Access shouldn't be too much of a problem. You create an mdb with linked tables, one linked to the file location where you will put your pricing updates, and one linked through the MySQL ODBC drivers to the MySQL (or PostgreSQL or whatever) database. Then a little AutoExec macro, and you've got yourself an "update application".
  • MySQL is a good database; as others have mentioned, you can use Access to connect to it...

    Perl or PHP3 would probably be good choices for the language, as both are nice high-level things that interact w/ databases easily.

    I strongly recommend not attempting to use MiniVend though, as it's a major PITA to get working, and the documentation is old, brief, and in some cases wrong.
  • by Gene77 ( 90233 ) on Thursday November 04, 1999 @01:48PM (#1562048) Homepage
    I would suggest looking to use MySQL, and as you receive updates in your MDBs, use the ExportSQL [cynergi.net] script by Cynergi. If you're comfortable in VBA, you'll want to tweak it a little if you are to use it a lot. I've used it at work, so I know that it works fairly well.

    Access allows you to get away with somewhat sloppy data modeling, so you'll need to revisit the generated SQL and make sure you have all the "NOT NULL" and different data types in place as you need them (this can all be tweaked in the VBA without too much effort, but you should be knowledgeable in VBA and SQL). You no doubt know that the MySQL data types are more specific than the data types used by Jet (the actual DB engine Access uses). This means you'll want to make sure that the column that was exported as "REAL" shouldn't really be "MEDIUMINT" instead.

    The script generates SQL files which can then be used against MySQL in batch mode. So with it, you can set up the VBA function to create the appropriate SQL for refreshing your tables whenever you receive a new MDB.
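
    For example, a tiny post-pass over the generated SQL could tighten up the guessed types before you batch-load it (the column and file names here are made up):

    use strict;

    open(my $in,  "<", "export.sql")       or die $!;
    open(my $out, ">", "export_fixed.sql") or die $!;
    while (<$in>) {
        # quantities were exported as REAL but are really integers
        s/\bqty\s+REAL\b/qty MEDIUMINT NOT NULL/i;
        print $out $_;
    }
    close $in; close $out;
    # then, in batch mode:  mysql catalog < export_fixed.sql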

    My suggestion is to dump Access. It's nice for rapid development, but if you're running an e-commerce thing with it, then I pity you. Not only would NT be your problem, but you'll have Access' sloppy page-level record locking and its tendency to not release connections and record locks with the consistency e-commerce apps should have the right to expect.

  • I know a lot of people are going to advocate Perl, but you should also look into the PHP ( http://www.php.net/ ) and MySQL combination.
    With PHP4/Zend on the horizon, PHP is looking to be an increasingly robust and viable alternative to Perl.
    ----- --- - - -
    jacob rothstein
  • by msuzio ( 3104 ) on Thursday November 04, 1999 @01:49PM (#1562051) Homepage
    Well, it sounds like a lot more research is needed before you're going to be able to present a convincing case. The NT side has a few things going for it:

    - It's there. He who gets there first has the home-field advantage
    - It handles the Access data import needs without any problems (or at least you didn't mention any)
    - The usual PHB tendency to swallow MS FUD(tm) will probably work against you.

    That having been said, here's a start on countering it and working up a case:

    - Definitely look into ODBC or some sort of easy export from Access. I'm not familiar enough with the MS world to know a sure solution, but I imagine the worst case is some sort of pretty simple VBA scripting in Access to open the file and dump out selected records (or all records) in a nice format MySQL can import.

    - I agree that MySQL is probably a good client database. Don't rule out other options, however; you don't want to find MySQL doesn't fit your needs and then have to propose *another* change to management. Postgres has worked well for us in some applications, and is a little more full-featured than MySQL (although not as fast, and feature-wise it's actually a pretty close race). Oracle on Linux is even a possible choice, but you haven't mentioned budget or database size. Since it's coming from Access, I'll assume it's a small database -- in that case, MySQL looks pretty good.

    - Web server: Apache or Stronghold if you need SSL. We just started using Stronghold, and so far it seems dead easy (much better than Netscape Enterprise Server which was the only SSL solution we had tried before this). It's not expensive as far as SSL solutions go, and it seems to track Apache releases pretty well.

    - Application coding:
    Perl is great, but you say you're not experienced yet. That's not a showstopper at all, but consider carefully how to proceed. If you're willing to pick up the Camel book, the Ram book, and a printout of the CGI.pm and mod_perl documentation, you may be ready to be a Perl Web Programmer :-). But plan on it taking some time to get it right. Be sure to steal as much code as possible from the people who have gone before you -- I suggest a thorough reading of the WebTechniques archives to see some excellent solutions to common problems.
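
    To give a taste, a minimal CGI.pm handler looks something like this (the parameter name and the validation rule are invented; a real order page would do much more):

    use strict;
    use CGI qw(:standard escapeHTML);

    print header(-type => 'text/html');
    my $sku = param('sku') || '';
    if ($sku =~ /^[\w-]+$/) {             # never trust form input
        print "<p>Added item ", escapeHTML($sku), " to your order.</p>\n";
    } else {
        print "<p>Sorry, that item number doesn't look right.</p>\n";
    }
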
    As far as non-Perl solutions: Python is great, and Zope seems to be getting more and more attention. Check out Zope and scour freshmeat for other web application architectures; you may find those solve some of your problems well. Java servlets are an excellent choice also, but expect those to require much more programming savvy.

    Anyway, the first step really is to analyze the current system and figure out all the components. For each one, pick a few possible substitutes. Play around and convince yourself that the pieces you select play nice together. Then show *that* to the PHB and get the go-ahead, they tend to be pretty easy to convince if you know your stuff...
  • Yeah, I know everyone wants Linux these days because it's so stable. But the simple fact is that a system is as stable as the programmers and administrators creating it. I know many NT gurus who have had systems up for years with NT, and I know the same with Linux. It might be smarter to try to solve the problem than to convert for conversion's sake and then end up having problems with Linux and giving Linux a bad name. We don't need that.

    --MD--
  • I have a client who manages a database in Access, and they upload changes from it via the Windows ODBC driver to the PostgreSQL system that runs the web site. There's a bit of munging involved to reconcile their inconsistently maintained data, but it works great and I never have to look at Windows.

    For another client I eventually made a web-based editing interface and eliminated their Access database altogether. Works even better :)

  • Post your resume. Show us the questions you would ask. If you can't even think of better questions, then your real imagination exceeds your imaginary one.

    Show us why you are better qualified than anyone for the job.

    --
  • I've been using PostgreSQL for a web-style application and it works just fine. My co-worker on the project has used MySQL, and that also works just fine. IMHO, MySQL is probably better (it has some neat features such as LIMIT/OFFSET on queries), but I can't afford the licensing restrictions of MySQL for this.


    Also, if you're going to be learning Perl for this application, might I suggest using Python instead? It can do anything Perl can, but it's a considerably cleaner language (read: more readable), and there's also an Apache module for it.

    Perl's fine for programs around 50-100 lines or so, but larger than that and it starts to become really messy.
  • I am not as familiar with Access as I am with SQL Server, from which I just converted a fairly large database over to MySQL as our Y2K backup plan. I used Perl and ODBC on the WinNT side to extract all of the data out of the database and write it out to a comma-delimited file. This file is copied over to the Linux side, and mysqlimport then inserts the data into the database. It is not too bad: two Perl scripts and a cron job, and it is completely automated. I do not trust Access as a scalable multi-user database at all; it was not made for that. I would definitely move to MySQL, or to SQL Server if you stay on the NT side.
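
    The NT-side extract script amounts to little more than this sketch (DSN, table and column names are placeholders; a real one should also escape any delimiters inside the data):

    use DBI;

    my $dbh = DBI->connect("dbi:ODBC:catalog", "", "", { RaiseError => 1 });
    open(my $out, ">", "items.txt") or die $!;

    my $sth = $dbh->prepare("SELECT sku, description, price FROM items");
    $sth->execute;
    while (my @row = $sth->fetchrow_array) {
        print $out join(",", map { defined $_ ? $_ : "" } @row), "\n";
    }
    close $out;
    $dbh->disconnect;
    # Linux side:  mysqlimport --local --fields-terminated-by=',' catalog items.txt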
  • YAMS is a total e-commerce solution, and is open source. It uses MySQL as the backend and mod_perl. It seems pretty nice; you might want to look into it.
    http://yams.screamdesign.com/
  • I don't think there is anything out there to plug into Access from Linux, but you can certainly export from Access into MySQL.
    Either use MyODBC as someone pointed out or you can use a script available here [informate.com] to dump an Access DB into nicely formatted MySQL compatible SQL.
    There are a couple of these utilities available on the MySQL web site [mysql.com].
  • There are existing tools (VB scripts) for MS Access to export data to MySQL... a couple are listed on the MySQL home page. Plus, I'm sure it would be a trivial hack to customize the VB to your purposes (i.e. generate exactly the field types or the special insert statements you need.) So at least you are down to using MS tools just to transfer data.

    FYI: The choice I have made for my company for end-to-end web based ordering is Linux, Apache, mod_ssl, PHP, MySQL, Cybercash. PHP allows you to link right against the Cybercash MCK and even comes with some nice pre-written interface functions.

    Good luck. Moving away from Access (even toward MS SQL Server or otherwise) will help you A TON. Win NT ODBC drivers are really, really poor.
    ---
    Don Rude - AKA - RudeDude

  • by SheldonYoung ( 25077 ) on Thursday November 04, 1999 @01:53PM (#1562060)
    Okay, I can say I've done this exact thing. We started out prototyping on IIS and Access because we didn't have a decent Linux box yet, and Access makes it easy to prototype databases. Our plan was always to go to Linux, but we were forced into it by IIS doing some freaky things.

    The Perl was a piece of cake to move over - we even switched to mod_perl along the way. The database, however, was a bit of a pain. We moved it first from Access to SQL 7, then used SQL 7's data export function to stuff it into Informix on the Linux box. It was a nightmare; there are so many things that just don't move across: views, identifiers longer than 18 characters, etc.

    My suggestions, from hard-won experience, are:
    1. Run a multi-platform web server, not IIS. Unless you're using ASP or such, it should be simple. This is by far the best move we made - it was almost bearable to develop on NT once IIS was out of the way.

    2. Clean up your database architecture and make your SQL portable. Don't assume TRUE == 1 or that you can escape table names with [].

    3. Move in stages. Get MySQL, Informix, Oracle, whatever running under NT and move the data to it first. Then move it to the Linux box when things have settled down. Same with switching web servers.

    This is a good lesson in why to create portable applications. Just move in pieces and you'll gradually see your system get more and more stable, without getting above your head in new things to learn.

    Good luck.
  • As of v6.5 (or 6.4.2+patch) PostgreSQL also has LIMIT/OFFSET.

    I disagree about your assessment of Perl - I say it's fine for programs of about 500 lines after which you are probably done. Hooray for mod_perl!

  • by Anonymous Coward
    Everyone with an NT server who has problems blames Microsoft. I can tell you this: I run a very complex transactional website, using NT, SQL-Server, and ASP. We had a lot of problems for a while. But gradually we learned the right way to do things, and now we run smooth as silk. Now and then we reboot a webserver due to a memory leak that shows up in SSL (which we use for the entire site). That's it. We'll deal with that before long by clustering web servers.

    I don't know what technologies you're using other than Access, which is seldom a good solution for a website. But NT works well for us. As with any public server, you'll want to pare it down to run exactly what you need, no more.

    I do root for Linux, though, so if you go that way, best of luck!

  • Wow, that's real helpful. Let me summarize your post:
    "Linux sucks, Windows rules!!! Especially the version coming out in 3 months that's been delayed for over 2 years! It will be the greatest OS ever!"

    Linux, BSD, and NT are all scalable and reliable OSes as long as you have the right equipment and a competent sysadmin, but none of them is the best solution for everything. Hell, even Microsoft uses UNIX over NT when it's better for the job (like at Hotmail).



  • a few easy steps (if you know Java):
    1. get Apache + JServ (java.apache.org [apache.org])
    2. re-write your scripts as servlets
    3. get the Java COMM package (http://java.sun.com/products/javacomm/index.html [sun.com]) to talk to the backend server.
    4. put your data in any database you want and connect to it via JDBC. One way would be to keep the Access DB on an NT box and, whenever you get new data, connect to it via JDBC, grab the data, and stick it in a more robust database (also via JDBC)


    advantages over Perl:
    1. easier to maintain and debug
    2. see #1

    henri
  • ftp://ftp.cdrom.com/pub/perl/CPAN/modules/00modlist.long.html#7)DatabaseInterfac [cdrom.com]

    contains DBD::ODBC, which should allow you to connect to the ODBC driver on the NT box.

  • The course of action I would probably take would be to export the data from Access into a format that your favorite SQL database can handle.

    Access exports into some spreadsheet formats, text, rich text format, FoxPro, dBase, and even outputs to ODBC databases (though you'd have to see if you can get a driver for Access from the manufacturer).

    dBase is probably your best bet as far as real database formats are concerned.

    As for ODBC drivers, it would be interesting to see if MS Access would accept PostgreSQL drivers, or if it would revolt. Anyway, the Access help file has some info re. installing ODBC drivers.

    If nothing else, most DBMSs should import a text-format table.

    Good luck.
    Hail the Penguin!
  • by wdr1 ( 31310 ) <wdr1&pobox,com> on Thursday November 04, 1999 @01:56PM (#1562068) Homepage Journal
    You'll find that radical shifts are often resisted in the business world. A phased implementation might be the best way to go & would be easy to do.

    You're on the right path... Perl can certainly do what you need. In fact, most of the difficult code that you would need is already written: DBI & DBD::ODBC. What you'd want to do is establish an ODBC connection to your Access database & do all your data manipulation that way.

    Using this, you can move the dynamic component of your site to Perl-based CGIs running on your existing NT servers. (Take care to avoid anything platform-specific, and avoid ASP/PerlScript; that will only further anchor you to NT.) Not a radical shift, and one that should both help your current situation and increase everyone's confidence in what you're doing.

    Next, on the side, implement your Linux box running MySQL. Write a Perl script using DBI, DBD::ODBC, and DBD::mysql to periodically refresh the MySQL database from the Access table. This will be the trickiest part. I recommend keeping the MySQL data model the same as the Access data model to keep things simple. Unfortunately, there isn't an easy/inexpensive way to read Access files on a Linux box, but that's not a big obstacle. You can use DBD::Proxy. Alternatively, have two Perl scripts: one on the NT box to export data to a tab-delimited flat file, and another script on the Linux side to import the file. With some smart scripting and the use of Net::FTP & Net::Telnet, you could integrate this into one script. (Simpler, but less cool: just do some intelligent automated scheduling on both sides via NT's scheduler & Linux's cron.)
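
    A rough sketch of that refresh script, run from the NT box (the DSNs, host, credentials and table are all hypothetical):

    use DBI;

    my $src = DBI->connect("dbi:ODBC:pricing", "", "", { RaiseError => 1 });
    my $dst = DBI->connect("dbi:mysql:catalog;host=linuxbox", "web", "secret",
                           { RaiseError => 1 });

    $dst->do("DELETE FROM items");    # full refresh, same data model on both sides

    my $get = $src->prepare("SELECT sku, description, price FROM items");
    my $put = $dst->prepare(
        "INSERT INTO items (sku, description, price) VALUES (?, ?, ?)");

    $get->execute;
    while (my @row = $get->fetchrow_array) {
        $put->execute(@row);
    }
    $src->disconnect;
    $dst->disconnect;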

    Now, throw mod_perl into the mix. Present each solution side by side to your superiors. Explain the performance difference, the scalability, the cost difference, the seamless integration, and the added functionality of the Linux/Apache/Perl/mod_perl/MySQL solution, and *presto* you have your solution (and you look like a superstar. :)

    -Bill
  • but I can't afford the licensing restrictions of MySQL for this.

    There's a GPL MySQL out there, perfect for you then.

    Yours Truly,

    Dan Kaminsky
    DoxPara Research
    http://www.doxpara.com
  • How about using a real database to handle a business app instead of Access? Just because there is an ODBC driver for it doesn't mean it's gonna be robust enough for e-commerce use. Upgrade to whatever DB server product you want. Chances are it's going to perform better than Access.

    Also, learn SQL; you'll soon see that importing data from any ODBC-compliant DB is either simple or logical or both. If you don't want to, then install MS SQL on your NT boxes and run the Access upgrade wizard to import the data into new tables.

    Hire an NT administrator. I work on both Linux and NT servers, and I can make a Linux server as unstable as most NT servers. Remember, most people who call NT unstable do not fully know how to configure and maintain an NT server. To make matters worse, there are also many NT administrators who don't know how to properly manage an NT server. It sounds like your NT administrators fall under the latter category.

  • Export the Access tables to a comma-delimited format and import it into your favourite Unix SQL server. In MySQL you'd use a statement like:

    load data local infile '/home/me/table1.txt' into table table1 fields terminated by ',' optionally enclosed by "'" escaped by '\\' ignore 1 lines (col1, col2, col3, col4);

    But check the docs to see if you need anything else. sed and awk are good if you need/want to sanitize the table a bit first.

    Then use an ODBC driver on the client machines to allow updates. MyODBC works well with Access 97 but I've heard of some problems with Access 2000 (I think someone at Microsoft is working on a patch for the next service pack?). Make sure all of the tables you want to update have a primary key defined.

    Read your SQL server's docs for access control - with MySQL, I added a user name, password and IP address triplet to the mysql.user table with 'N' for all of the global perms, and added a line to the mysql.db table allowing the user select, insert, update and delete privs to the given database from their IP address. I'm sure there are other, better ways of doing it though.

    I prefer PHP (http://www.php.net/) for the WWW frontend, but that's just me.
  • Although I agree that Access is the problem, using Microsoft SQL Server is not the best thing, even though it is 1000 times better than Access. I would suggest looking into a "real" SQL server, such as Oracle. Microsoft SQL Server has too many bugs and limitations. SQL 7 is not ready for the enterprise; it is bugged out beyond what should be tolerated (just check the Knowledge Base!)

    SQL 6.5 is much more stable. It has limitations, such as no more than 16 tables in a view/query, but it is preferable to 7 at this service pack.

    The suggestion of MySQL and Linux with ODBC using Access is not bad, as long as Access doesn't crash and/or hang, as it's famous for doing. But if you use ODBC with MySQL, you could use lots of front ends/middlemen: Perl on Linux or ASP on NT, etc.

    Setting up a MySQL server on Linux shouldn't be any more difficult than anything else, as this poster is suggesting. It may save you money up front, and you will learn something in the end. The cost of Linux is the cost of learning--a price always worth paying.
  • I'd recommend PHP over Perl for the programming language if you do convert. PHP4 looks like it's going to blow both Perl and ASP away in terms of performance, and you'll probably be able to convert most of your current system over to PHP more easily than to Perl (there is an ASP->PHP converter available if you are using ASP at the moment.)

    For the database server, I'd recommend PostgreSQL over MySQL (transactions are nice), or you could just keep the database on the NT box and use ODBC (although I'd recommend upgrading to a database server like Oracle or MS SQL 7; MS Access is NOT meant for what you are using it for.)
  • The easiest thing you can do to help reduce the frequency of those crashes is to get the data off of Access. That poor little DBMS was just never designed to put up with the abuse that is constantly heaped upon it. You should be able to move the data to any ODBC-compliant DBMS with little effort. I would suggest picking one that is cross-platform, so that you only have to move the database once. I suggest you take a look at Progress, as well as Oracle, Sybase, Ingres and, of course, MySQL. It's been a few years since I used Progress, but from what I have heard, it's just gotten better. MySQL is free (as in beer) on Unices, but not on Windows, and that's just one of many reasons it would make sense to put the database on Linux. Moving to a more robust DBMS will buy you time to port the application correctly.

    Make full use of that time to do the job right. I strongly suggest that you do not attempt to do the port yourself in a language you are in the process of learning. Everybody writes really horrible code while learning any language, and you do not want that code enshrined forever in an application that's critical to your business. Hire someone who knows how to do this type of app, and perform a real analysis and design before doing the coding. You will be building the second system, so you have a pretty good idea what the requirements are. You will, of course, follow my advice and bring in someone who knows how to write this kind of application; and when you understand the problem, the technology, and the data, there is no better model to follow than the top-down model: analyse, then design, then code. Of course, even when you understand all the elements of your application, the creation of the application itself changes the environment it lives in, and so some adjustments to the design, and perhaps even the analysis, will be made. But these adjustments will always be more expensive if done after the coding has started than if done before.

    You have a working system. Fix it then replace it with a well-engineered second system; and your management will be pleased. Replace it with another hack, and nobody will be happy.

  • Run an untested beta for a server OS! I know, he could just use Windows 3.1. After all, it was made by the same "wonderful" company that wrote 2000.

    Ryan Minihan
    ryan@linuxhardware.com [mailto]
  • Perl's fine for programs around 50-100 lines or so, but larger than that and it starts to become really messy.

    If your Perl programs are unreadable after 100 lines, you're doing something very wrong.

  • by exeunt ( 22854 )
    Mysql.org [mysql.org] and Mysql.com [mysql.com] are the USA mirrors of Mysql.net [mysql.net], the German site. So if .com or .org is down, try .net. I cannot remember their other domain off the top of my head.
    Also, as their site says: Our connection to the world has been down between Saturday 19991030 GMT 22.32 to Monday 19991101 GMT 17.10
    ---
  • I'm currently in the throes of the same type of project as you.... The only addition is that I need to include an accounting system written in Visual FoxPro. My requirements are to access a MySQL system that's our main database system, with some information needing to be handled via MS Access, and needing to query the FoxPro-based system as well. To make it even more challenging, the FoxPro-based system HAS to be on NT Server per our support contract with the vendor! Nasty, but there's a way to make it all play nice!

    I decided to use Perl instead of PHP mainly because of Perl's flexibility: it runs on NT and Linux with few changes, excellent performance, etc. I'm not trying to start up a holy war here; use what you're comfortable with. The biggest hurdle was accessing the FoxPro system on NT. For that I used OpenLink Multi-Tier 3.2 [openlinksw.com] on the NT box. All you'll need to do there is configure the ODBC driver for Access. If you can read the *.mdb file with Excel, then the ODBC driver is configured properly. Install OpenLink on the NT box - read the docs that come with OpenLink; it's also straightforward. You'll also need to grab the UDBC & ODBC stuff from OpenLink as well. That needs to be used to compile the DBD::ODBC Perl module. Again, RTFM - it's all in there. I did run into one glitch on an unrecognized command when compiling DBD::ODBC with UDBC. I only tested the module against FoxPro tables, and it did generate some errors. They were all due to SQL commands being longer than 80 chars - not a problem in my environment though.

    Once that's up and running, and you've decided to use Perl, head over to the DBI/DBD FAQ [hypermart.net]. Sections 3 & 4 cover what you need to do. It's really not as hard as you think to connect Perl to ODBC, even on a separate NT server. Connecting to any other ODBC-compatible database works the same way.

    From what I've read, PHP is just as easy. I first started developing our databases using PHP, but switched because I have other projects where Perl can be used but PHP cannot. Perl looks to be slightly more complex than PHP for straight database access, but so far it hasn't been that bad. Some of the PHP code that I have looks really close to Perl, so the switch isn't as painful as I thought.

    I haven't tried this, but if you dump NT you could use iODBC to access the Access DB. All the parts are there, I've just never tried it. That way, you could still distribute your database in MDB format. Another option would be to keep a skeleton MDB around, and run everything on MySQL. When needed, you could just dump the MySQL tables out to the MDB tables and send it on its way. The second thing I did was set up a test network and just started working on it. I've been able to connect all the different databases together after about 12 hours of work.... Just to give you an idea of time and effort involved.

    HTH

    .mark

  • Isn't mod_perl the Perl Apache module? So that's not a point of comparison then?
  • Reasons not to use servlets/Java:
    1. Slow.
    2. The object orientation forced upon you is massive overkill for the types of things that people do with HTML and databases.
    3. PHP and Perl are both designed to do exactly these kinds of "texty" things.
    4. PHP and Perl can both do anything that Java servlets could do, themselves or by system calls (if you just felt like you needed to use Java somewhere, then you could do it here).
    5. See number 1, it's really true.

      Java is not the answer for everything. It does some things well, and has loads of functions built into it, but it doesn't fit well with CGI stuff.

  • If you can't convert the whole damn thing to Linux, I strongly suggest you look into MSDE. MSDE ("MicroSoft Data Engine") was released with Office 2000 and is essentially the core of Microsoft SQL server without all the admin stuff. It also lets you access *.mdb files without Access. (Or so I'm told. I haven't tried that part of it myself.) Access is truly a dog, but SQL Server performs pretty well and from what you've said of your app, should easily handle your load. (From what I understand, Oracle is superior, but it isn't exactly cheap.) MSDE does not have any runtime charge like SQL Server, so there is really no cost to use it. (More Info [microsoft.com])

    (Of course, being Microsoft, there's already been one service pack...)

  • Regardless of anything else that you do, make sure users aren't initiating direct accesses to Access.

    If there are a lot of Access-based tools that are being used to "massage" the data internally, that is arguably not wonderful; what is crucial is that this not be used for the online copy.

    You probably should consider using something like MySQL for handling the online data access for the web server; it would be entirely appropriate to build a process that synchronizes the online data with what's in your "back office" systems. This synchronization can add substantially to the overall robustness of your systems, as this can allow you to detect both:

    • Discrepancies that might happen on the "Web Server" side, and
    • Discrepancies that might happen on the "Back Office" side.
    The online system that the public has access to doesn't need to have access to all the accounting information that the back office needs, and by clearly separating them, you can strengthen the security and robustness of both.
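
    As a sketch of the kind of cross-check that synchronization makes possible (the connection strings and items table are hypothetical - a cheap signature of the catalog is computed on both sides, and drift means something needs investigating):

    use DBI;

    my %side = (
        web  => DBI->connect("dbi:mysql:catalog",   "web", "pw", { RaiseError => 1 }),
        back => DBI->connect("dbi:ODBC:backoffice", "",    "",   { RaiseError => 1 }),
    );
    for my $name (sort keys %side) {
        my ($rows, $sum) = $side{$name}->selectrow_array(
            "SELECT COUNT(*), SUM(price) FROM items");
        print "$name: rows=$rows price_sum=$sum\n";
    }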

    It might be useful to build some abstractions behind the scenes, like message queues (such as IBM's MQSeries [ibm.com], on which Microsoft's MSMQ is based); a free, commercially used tool that does this sort of thing is Isect [netcom.com].

  • You mention only that the catalogue is in Access... You could use MySQL for it, but I would never trust any financial data to a database server that does not support elementary database operations like transactions, or the ability to keep a transaction log on a separate physical device...
    I personally like Sybase (11.0.3.3 is free to use in deployment on Linux), but Oracle and the other big names have also proven themselves in years of use..
  • We run multiple e-commerce sites at my company, and after about 5 megs of DB data and an average amount of site hits, Access looks like a squirrel under an 18-wheeler. An easier move would definitely be to go from Access to SQL. Even on a Linux box, Access would be an unwieldy poltergeist crashing your servers.
  • 1. Use PHP3 on top of PostgreSQL.
    2. Create an HTML UI for sales people to update the prices in the pg database.
    3. No dumping or syncing required.

    There really is no technological reason for a business not to run on Open Source software these days.
  • troll!!! Moderate this down please
  • And I suppose MS Access is a Serious Database... In that case, why not just use Perl or PHP with dbf files... (By the way, to the guy who had the original question: Postgres, for one, has a dbf-to-pg convertor... although Postgres lacks autoinc, which sucks, but there are add-in functions that can accomplish it)

    --
    Marques Johansson
    displague@linuxfan.com
  • I agree with what Bill's suggested. By essentially running in parallel, you're being cautious about not breaking the flow of the business (a good trait), and you can get comfortable with Perl et al. and confirm that data imports are happening correctly.
    Another selling point to your boss is that not only are you replicating and improving on existing functionality under NT, but once on a Linux/Apache platform, a lot of zero-cost *additional* functions can be added as needed. You'll also be much better poised to start taking advantage of XML parsers, etc. when that need comes along.
  • If folks don't understand why atomicity and transactions are important for financial applications, including e-commerce web sites, they should probably hire someone who does before implementing such sites.

    I use PostgreSQL for my personal web work, but think there's a lot to be said for using a tried-and-proven commercial database server when other folks' money is involved, as is the case with e-commerce. As is mentioned in this thread, Sybase has a no-cost version, yesterday's edition so to speak, available for Linux and it's good.
  • Er... since when do you need transaction rollback mechanisms for read-only databases, which 99% of all of the web database access I have done is? Most of the time you are not doing the standard a->b->c->a mechanism (the bank example) in a web environment...

    Every time I hear someone say that you need transaction rollback for a web environment, I cringe...
  • Moderate this, please.
  • Access is far more of a problem than NT. A good comparison would be Access:SQL Server :: Win95:WinNT. Access is horrendous. I wouldn't trust it with my todo list.

    You certainly can run an e-commerce server on NT. My company sells an NT payments system, which is essentially the brick-and-mortar version of an e-commerce system. I'm not about to get into an argument as to whether or not NT is better for this because, quite frankly, I have my own doubts about that. (More than doubts, actually.) But that does not mean that NT isn't workable. We've got numerous installations that stay up for months, and given the load they get, I see no reason why an NT e-commerce system wouldn't be similar.

    One thing people need to remember is that "not as good" is not the same as "unworkable".
  • Strangely enough, I have the impression that servlets are actually faster than Perl when using IBM's JDK.
  • I just played around with this yesterday - ODBC access to MySQL data works very well. So all you have to do for the update is write a VBA (the Access language) program to take the data from the Access database and append it to a copy of it in MySQL. This should work very nicely.

    I definitely want to emphasize what many others have: DO NOT use Access for a multi-user application. It will work just well enough to fool you into committing resources, and then it will fall on its face. You are much better off getting the data into MySQL as soon as humanly possible, and then going from there.

    D

    ----
  • If you're using Active Server Pages, then I would suggest that they might be your problem. I've seen ASPs fall over constantly, and not from programming error, but because the dumb-ass ASP engine falls over all the time; has anyone seen the infamous:

    A trappable error occurred in an external object?

    This usually means the "World Wide Web Publishing Service" has fallen over (like IIS does only too well).

    I've converted over to Perl under NT, and it's a dream compared to ASPs or compiled CGI programs!

    Though, if I had my way, we'd be using Oracle or MySQL under Linux instead of that lame excuse for an SQL server from Microsoft - it's so dumb that it won't let you create a relationship between two tables within different "databases" on the same server! Thanks, Microsoft!

  • While I've not run servlets, I do have experience with IBM's Java. While it is much faster than Blackdown or Kaffe, it has a much larger startup time than Perl does, from what I've seen. Not that this means anything, but for simplicity's sake, try writing "hello world" in Perl and in Java, and then run them. You'll see that Java is slower to start up (and that's assuming that you'd be using Perl as a CGI - which you probably wouldn't be doing).

    It just seems to me that the majority of web scripts are more like "hello world" than they are like complex scripts that would benefit from OO.

  • by Anonymous Coward
    I agree most heartily. I also like Linux and use it at home, but I have also built solid e-commerce solutions on NT and MS SQL. From the start we had to constantly hound our developers to produce good code that runs well; the problem is our developers were not trained in the NT area and needed a lot of help to learn how to write good code. NT is not very tolerant of sloppy code, but well-written programs run very well. We have found most of our code problems were in the area of data access. MS has changed a lot with MDAC (to fix many problems that they have had), and some of the calls our developers were making were explicitly not supported in the newest version of MDAC. Unfortunately, using these old unsupported calls would still work most of the time, but cause other problems some of the time; after showing our developers the documentation for the new MDAC, they changed their code to use the supported ways of making calls, and now all works quite well. The biggest problem is that the developers can't keep up with the amount of changes being made by MS, so on our sites we have also picked a level of code and don't change: SP6 is now out, but it won't go onto any of our servers till there is a damn good reason. Now for the no-brainer stuff: if you are using Access for anything mission-critical, you don't deserve to be working. For god's sake, use a real database, not an imaginary one.
  • Speaking of portability, I'm currently doing something similar:

    I designed an inventory system using MS access, also because I had no SQL servers set up and I needed the database immediately (pen-and-paper sucks).

    While the MS forms and database I set up were being used for initial data gathering, I set up Roxen's web server on the windows machine and started developing a front end and some queries using the wonderful RXML enhancements, including the SQL tags Roxen includes (damn, Roxen's nice).

    After getting some basic front-end parts up, I set up a linux box, and installed Roxen and MySQL (SuSE ships with both, but I rolled my own just to be on the bleeding edge and to get 128-bit encryption). Moving the web content to the linux box was 100% painless, since Roxen is virtually identical on both platforms.

    Moving the database over was almost as simple, using one of the scripts a few others have mentioned, which I grabbed from the MySQL home page. I did have to change a memo field to a blob, and change the default "Yes" in an int field to a "1", but that was it. Then a simple mysql -u me -p database < oldaccessdata was all it took to bring the old data over to the world of Real SQL. :)

    It sounds like a lot, but in reality it only took me about a week to do the conversion - including the time it took me to learn how to set up MySQL and teach myself the needed SQL (in addition to doing what they actually pay me to do). The little extra effort is well worth it for the significant fine-grained access control gains I got from MySQL's grant tables, etc. - and from a data access point of view, since everything's web-enabled now.

  • Your solution (cluster machines) is sensible.

    I think it _is_ NT's fault, though - else why reboot? Just kill the process, and a decent OS will reclaim the memory. If your OS doesn't, then it's only doing half of its job!
  • I love the php3/mySql combination. PHP makes the task of turning your data into web pages almost trivial, and both MySQL and PHP have excellent documentation. As for getting the data out of access, I'd save the data as delimited ascii, and then import it using perl with the DBD/DBI modules. That's also useful later on for various types of database maintenance.
  • I hate to see this guy go to all the trouble of making server OS changes (to Linux) when it isn't really necessary. He doesn't need Linux or Perl scripts or MySQL. All of this just adds unnecessary complexity to the situation. To solve this problem, first we should look at his situation carefully:
    1) He's got an NT server.
    2) He's got an Access MDB file, about 14 MEGS.

    His problem: reliability.

    Analysis of the problem:
    1) This is probably pushing Access too much. He needs a real database server. Access is fine for small databases with no more than one user accessing them at a time. This is well known.
    2) He needs a reliable, inexpensive database server.
    3) The OS he is using (NT) may not be his (or my) OS of choice, but the facts are it is more than reliable enough for what he is trying to do, and it is already there.
    4) MS SQL Server 6.5 or 7.0 (I would prefer 7.0) is a decent database and would probably work extremely well in this situation. Access also supports upsizing to SQL Server directly in the Access app.

    Solution to the problem:
    A) Create an ODBC SQL Server DSN.
    B) Upsize all the tables in the Access database to SQL Server using the DSN created in A). Access will replace the old tables with copies linked to the SQL Server.
    C) Then compact and repair the database and enjoy!

    Isn't this easier than creating a bunch of interface scripts using DBI with PostgreSQL??? The answer is: yes, it's as simple as A-B-C. So don't go overengineering your problems when it is not necessary, as it is not in this case!
  • How fast/robust is an ODBC connection going to be
    to pump over 14 MB from Access?
  • Yes, definitely. Access is a desktop-based, single-user database; even then, I wouldn't use it for more than 10,000 records. Moving to MS SQL Server would be a good option, as the administration is 10x easier unless you have DBA experience or a full-time DBA. Admin on Unix-based DBs is a bit more difficult if you do not have prior DBA experience -> the most important factor is database backup and recovery. Make sure that whichever database you choose, you have a good and well-tested disaster recovery plan. We have our e-commerce site using SQL Server 6.5 SP5 (not moving to SQL 7.0 yet) and have over 20,000 records. So far (almost 2 years in operation), there has never been a problem on the database end, but the main e-commerce web server running on NT occasionally hangs (due to the instability of NT 4). If you do have a DBA, go for Oracle 8 - definitely more stable (but more $$$!)
  • PHP is awesome. I've just started getting into it. It has tons of functionality; it can get stuff out of almost any database, including Access under the right conditions. It's a very easy language, and PHP4, which is in beta now, supports secure transactions. www.php.net is a good starting place, along with www.phpbuilder.com
  • by Anonymous Coward
    This is unrelated to the question, but a not-so-widely known fact is that one of the major early e-commerce companies providing third party credit card processing for online merchants was running their entire financial processing system (everything but the webserver) on Linux back in early 1995, before you could get secure servers for Linux. Best yet, it was done for technical reasons that couldn't be solved on any Microsoft or other Unix products. (posting anonymously without details for legal reasons, sorry)
  • I'm PROVING it every day on my machine for three weeks now! Win 2k pro is going to smoke NT4 and 98

    A single person sitting at a single system cannot, of course, "prove" the power or stability of any operating system. This scenario does not even rule out the possibility that some other person sitting at the same system might crash every ten minutes.

    I say "of course" because this is absolutely obvious to me. The fact that it's not obvious to our AC might indicate that he has only ever been exposed to a single operating system, and does not need much in the way of proof.
  • As you noted, you are concerned - quite rightly - about the conversion of your database. There are a couple of things to note here, and other posters have already made one of these points.
    1. There are both middleware tools to convert Access databases, and direct imports in the databases you want.
    2. It is possible to create a new database from the old one *****if***** you wish to write a program to do the conversion. Your new preferred databases are widely used in the open source community (MySQL is nice, and solid when properly configured), and one can convert from Access almost trivially, as the SDK is available for free from M$, showing the exact underlying format of the database.
    As others have also noted, Access is rather sloppy in its data rules, so be careful here.

    Another solution is to use an open e-commerce package and port your data. There was an article on our very own /. about a Linux e-commerce package not so long ago. Perhaps you might want to check the archives.

    Just my two cents.

    ***Unix is user friendly. It's just very particular about who its friends are***
  • Problem 1: require stable OSes
    Having 2 NT machines dying may point to bad configuration (I can hear the sniggers now)... but rip off anything you don't need on both machines and re-install to get a clean system. This should stop the problem of the NT operating system crashing, but not rogue software. Generally, if the machine is properly configured (HW & software), they don't crash as often as reported.

    Problem 2: database problems
    Our catalog has around 25,000 items in it. It's held in an Access database right now... around 14 Megs.

    Been there, done that. At my old work [ringtail.com.au] we had similar problems. I think it's a bit premature to write NT off for this kind of a problem just yet. It's not the environment (caveat: well-configured NT system with min 128MB RAM, fast disk drives, etc.); it's your database engine.

    Access is good for playing with, and possibly was good for serious applications a couple of years ago. But for a catalog of, say, 25,000 records and more than a half dozen users, Access will bog down under its own limitations (file-sharing database, not relational, memory leaks, your configuration settings, code access methods). Before you start changing the operating system, code base and, more importantly, *mindset*, investigate an MS SQL Server (or other politically correct DB's) license.

    If management is balking at the cost... then cost out the hours to rebuild, test and get the system working using a Linux/db/scripting approach. I say this because it's as much a business problem as a technical one. Change takes time, and time equals cost - the same money could buy a new 'relational' database for your existing code base. With NT SQL Server you have the ability to upsize the DDL, re-insert the data, etc...

    <rant> - sig to noise
    Beware of suggestions to change your current setup from those preaching alternative OSes without fully investigating your current problems.
    Changing to Linux, for instance, is not the immediate fix.
    I'm a bit dubious about posts just throwing around languages, OSes and DBs without working the problem first. Don't get me wrong, I use/worship Linux as much as the next nutter... but don't do yourself and your company a disservice until you are 100% sure NT can't handle your business needs, rather than acting on some *geek-lust* need to install a new OS and code Perl :)
    </rant>
  • Write a frickin' trigger! Geez.
  • I'm still not sure why nobody has mentioned middle-of-the-road options like Sybase Adaptive Server Anywhere (formerly SQL Anywhere) or Borland InterBase. No, they are not free, but they are both full-featured SQL databases that can handle a fairly large load (we have almost 10GB total in several databases running on SQLA right now). I personally don't know anything about the free DB engines, but SQL Anywhere is definitely a good choice for NT.
  • This was released a few weeks ago or so. The thing about it that makes it so cool is the fact that you can create specific task-based roles. Roles are great because they let you get past the limitations of groups.

    I would hope that more people give this excellent system a chance and test it out in their environments.

    (Been doing Sybase administration for almost 4 years now; I have never worked with a more extensible, reliable and powerful system...)

    -Dextius Alphaeus
  • Actually, instead of Stronghold, I think Raven SSL for Apache would be the better way to go. It's $357 vs. $995 for Stronghold. Covalent (the company that sells Raven SSL) also does technical support for both the Raven SSL module and Apache in general, which should go over well with the suits.

    You can find its website here: http://www.covalent.net/ [covalent.net]

    Or if you live in a free country, you can use mod_ssl at http://www.modssl.org [modssl.org]

    Also, I wouldn't really call it a close race between Postgres and MySQL features. MySQL doesn't plan to do SQL Transactions, for instance, while Postgres does. MySQL, on the other hand, has much friendlier SQL extensions, particularly for date formatting and such. Both have commercial support options.

  • I've had excellent luck with this free product.

    It takes some effort to pick up, but it's very flexible, and you get a lot of cool stuff without much effort. Like automatic shipping pricing based on zone, credit card encryption, database compatibility, awareness of accessories and properties (like color), quantity pricing, etc.

    My catalog only has about 300 items, but it is reputed to work well up into the millions.

    The basic concept is that you set it up to generate pages on the fly based on what's in the database. You can also link to it from static product pages if you want (which we do).

    The programming is done by writing Perl code and sticking it in web pages to be run as the page loads. You can also call static functions that you put in configuration files, where they only have to be parsed once.

    The only downside is that while there is about an inch thick stack of documentation, it is very poorly written and hard to understand. Fortunately there is an active mailing list for support.

    Good Luck,
    -Loopy

  • by Anonymous Coward
    I would suggest that you check out using an application server mod for any major web server. The application server will allow you to use JSP and servlets with Java. This is a good alternative to CGI. With Java being a popular term, you could have an easier time getting the funding for the upgrade to a new system.

    Java Apache Page [apache.org]

    This site will point you in the right direction for getting everything that you need. Should you need a developer, I would suggest contacting Web Programmers, Inc. [webprogrammers.net] We have used them for our development, and they've done a great job on our upcoming project for Ruptime.com [ruptime.com]
  • Your comment about moving Access to MySQL to get to a "REAL" SQL made me laugh.

    MySQL does the barest of bones SQL. There is no replication (pending however) and no transactions (status?). If this guy's got a 25k-item catalogue and is expecting any kind of web traffic and wants data inegrity, he NEEDS transactions. Table locks suck.

    I suggest Postgres (maybe; it's what I use, but not in a heavy production environment) or something meatier like Informix or Oracle. You do NOT want to be doing anything involving money or inventory without transactions and row-level locks.
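    To make the point concrete, here's a minimal sketch of a transactional order insert with Perl DBI against Postgres; the schema and credentials are assumptions:

        #!/usr/bin/perl -w
        # Minimal sketch: both statements commit together or not at all.
        use strict;
        use DBI;

        my $dbh = DBI->connect('dbi:Pg:dbname=shop', 'shop', 's3cret',
                               { AutoCommit => 0, RaiseError => 1 });
        eval {
            $dbh->do('INSERT INTO orders (customer_id, total) VALUES (?, ?)',
                     undef, 42, 19.95);
            $dbh->do('UPDATE inventory SET qty = qty - 1 WHERE item_id = ?',
                     undef, 12345);
            $dbh->commit;
        };
        if ($@) {
            warn "order failed, rolling back: $@";
            $dbh->rollback;
        }
        $dbh->disconnect;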
  • Wait one bloody minute... You've got an e-commerce solution running over a serial cable against an Access database and you're wondering why it's unstable? It has nothing to do with Linux vs. NT; it has everything to do with your application architects being idiots.
  • MySQL is indeed not a particularly potent "relational" database, with the focus on relational [hex.net]; it is, instead, a fairly thin veneer of SQL layered on top of a functionally pedestrian but nonetheless fast B-tree database system.

    If what you really need is transactions, that's still workable: build up sets of updates that together form the transactions, and apply them as a batch.

    Keep in mind that the web server is not the back office; the data should get pushed over to the "heavy-duty SQL box" when it comes time to do the accounting for either money or inventory.
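    A minimal sketch of such a push job with Perl DBI; the two DSNs, the "pushed" flag, and the schema are all assumptions:

        #!/usr/bin/perl -w
        # Minimal sketch: copy unpushed web orders to the back-office box.
        use strict;
        use DBI;

        my $web  = DBI->connect('dbi:mysql:database=shop', 'web', 's3cret',
                                { RaiseError => 1 });
        my $back = DBI->connect('dbi:Pg:dbname=accounting;host=backoffice',
                                'batch', 's3cret',
                                { RaiseError => 1, AutoCommit => 0 });

        my $get = $web->prepare(
            'SELECT id, customer_id, total FROM orders WHERE pushed = 0');
        my $put = $back->prepare(
            'INSERT INTO orders (web_id, customer_id, total) VALUES (?, ?, ?)');

        $get->execute;
        while (my @row = $get->fetchrow_array) {
            $put->execute(@row);
            $web->do('UPDATE orders SET pushed = 1 WHERE id = ?',
                     undef, $row[0]);
        }
        $back->commit;   # the back office sees the batch as one transaction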

    Consider the MySQL database to be an "embedded" database system, intended to support just the web application. Make that robust, and leave the "heavy production environment" stuff for the other server.

    After all, you don't want outside customers directly hitting your master database, do you? That's not what I'd spell "security."

  • Please don't listen to the "go to MS SQL Server" people. A vendor said that to a company I worked for, and yes, it had more stability (but what doesn't, compared to Access?), but it came with its own set of stupid problems. Keep it simple: use the MySQL/Linux combo and suck the Access file into MySQL on a daily basis. I think you'll be pleased with the results, especially since your database is only 14 MB (upsize that to SQL Server?!? Ridiculous!). Good luck!
  • By your standards, lots and lots and lots of morons. There are LOADS of such implementations out there, and for a while every computer magazine I picked up had an article on how to get your Access DB onto the web. It's not difficult, and it actually works well enough if throughput is low and the app is not mission-critical.

    No one's going to take you seriously. Anyone who has abuse but no better solutions to offer can and will safely be ignored. Anyway, the choice of Access may not have been his; he may just have had to pick it up where it was. Don't go away mad... just go away, OK?

    As plenty of people have said, exporting data from an Access database is trivial. If you don't want to get a copy of Access, use ActivePerl (www.activestate.com) with DBI and DBD::ODBC and roll your own. You could even use another DBD for the database you plan to move to and do the entire data migration in a single script, as in the sketch below.
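    A minimal sketch of that single-script migration, assuming an ODBC DSN named "catalog_mdb" for the Access file and a made-up items table on both sides:

        #!/usr/bin/perl -w
        # Minimal sketch: pull rows out of Access via ODBC, push into MySQL.
        use strict;
        use DBI;

        my $src = DBI->connect('dbi:ODBC:catalog_mdb', '', '',
                               { RaiseError => 1 });
        my $dst = DBI->connect('dbi:mysql:database=catalog', 'web', 's3cret',
                               { RaiseError => 1 });

        my $get = $src->prepare('SELECT sku, description, price FROM items');
        my $put = $dst->prepare(
            'INSERT INTO items (sku, description, price) VALUES (?, ?, ?)');

        $get->execute;
        while (my @row = $get->fetchrow_array) {
            $put->execute(@row);
        }

        $src->disconnect;
        $dst->disconnect;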


  • Hey all,

    I was facing the exact same problem just last month. In my experience, if you want to develop scalable server-side software using OO technology, you need a layer of database objects to encapsulate all your data-access logic. You then build service-oriented objects on top of those. This is very similar to the EJB model (minus containers/transactions/etc.), but you can't afford EJB, because a good EJB server runs >$10k.

    But this object layer can prove to be a high maintenance piece of software. Every time a data structure changes, both the object layer and the database scripts have to incorporate the change.

    My software is written to read a single specification file, and from that file it generates:
    • A mySQL DDL (CREATE TABLE) script file.
    • Import scripts to load data from delimited text files (any delimiter).
    • Java class files that encapsulate database access logic.
    • Java collection class files to represent arbitrary database queries.
    • JSP pages that interact with the class files:
      • Pages to browse database tables
      • Pages to add and edit records
      • Pages to search the tables based on any criteria

    So, yes, the specification file you feed the script can be quite complex, but it's nice to maintain your database AND your object layer from one place.

    Well, my software is really just a hack I put together, I wouldn't be surprised at all if many other people have done the same thing already. I dunno.

    Anyway, if anyone's interested in using my software, it's still not 100% foolproof, so I'd like to give it some nice stress testing in an open source environment.

    Any takers?


    - jonathan.

    "Every tool is a weapon, if you hold it right." - ani difranco
  • Excellent advice Sheldon, a few additional thoughts.

    1. mySQL is fast and stable; we have been running it for 6 months on a system averaging 80-100k transactions a week. The one problem is transactions, specifically the lack of COMMIT and ROLLBACK (the app has 8-10 tables to keep in sync per transaction). We ended up simulating COMMIT and ROLLBACK with Perl plus DBI, with no significant hit to performance (see the sketch just after this list).

    2. Attaching Access frontends to mySQL is trivial: ODBC and away you go. As pointed out, data migration, once the schemas are migrated, is pretty easy.

    3. Schemas can be the problem. There is no easy way (that we have discovered, at least) to export schemas. To date this has been a 'by-hand job' (excuse the pun :) ).
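    Here's a rough sketch of the kind of simulated COMMIT/ROLLBACK mentioned in point 1. This is not our production code; the table names and the undo steps are simplified assumptions, but the shape is right: lock the tables, remember enough to undo, apply, unlock.

        # Rough sketch: simulated COMMIT/ROLLBACK on a transactionless MySQL.
        use strict;
        use DBI;

        my $dbh = DBI->connect('dbi:mysql:database=shop', 'web', 's3cret',
                               { RaiseError => 1 });

        $dbh->do('LOCK TABLES orders WRITE, inventory WRITE');
        my ($old_qty) = $dbh->selectrow_array(
            'SELECT qty FROM inventory WHERE item_id = ?', undef, 12345);

        eval {
            $dbh->do('INSERT INTO orders (customer_id, total) VALUES (?, ?)',
                     undef, 42, 19.95);
            $dbh->do('UPDATE inventory SET qty = qty - 1 WHERE item_id = ?',
                     undef, 12345);
        };
        if ($@) {
            # the "ROLLBACK": restore what we saved, remove the half-write
            $dbh->do('UPDATE inventory SET qty = ? WHERE item_id = ?',
                     undef, $old_qty, 12345);
            $dbh->do('DELETE FROM orders WHERE customer_id = ? AND total = ?',
                     undef, 42, 19.95);
        }
        $dbh->do('UNLOCK TABLES');
        $dbh->disconnect;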

    A few final ramblings:

    We love Linux/Apache (mod_perl)/mySQL for infrastructure problems. Access is a great little quick-and-dirty tool for building entry and reporting screens for our MS-wedded customers. (NOTE: we did just recently convince one client to re-equip their 12-person data entry dept with little penguins.) We have started to use PostgreSQL for some more complex data problem spaces. The short-term results have been very positive.

    mitd
  • Looks like the astroturfers have come out of the woodwork for this one. ;) I wonder how far up the chain this article got escalated at Microsoft today.

    Anyway, regarding picking a Unix-type OS and database: it's obviously a complex issue. The machine crashing may be the immediate problem, but there are long-term issues to face. For instance, I have loaded machines (loaded as in doing something, i.e., not idle) that stay up hundreds of days, coming down only for an upgrade. That's without reading between the lines, reloading the OS, hacking the configuration, or random parts of the OS breaking between upgrades.

    People may say that Unix systems require less effort to run, but what they really require is more knowledge. For instance, the primary webserver I run for an "e-business" is a single Debian machine on a Pentium Pro 200. Through several Debian upgrades (including libc5 -> libc6) it has always been stable and reliable: no service packs that break half the stuff, no middle-of-the-night crashes, nothing. The amount of administration effort to run the box (which does hundreds of thousands of dollars of business a year) is a few hours a month. The cost of the setup was around $5000.

    • Pentium Pro 200 server machine (with premium parts)- $2000
    • DPT Raid controller & 4 4GB IBM SCSI drives - $2500
    • Seagate Scorpion (DDS3 12/24GB) $600

    Fast-forward to my day job at a Fortune 500

    There, they recently migrated our mail server from a single (1) machine running Netscape mail server to a farm of NT servers running Exchange. The Netscape mail server was on a Sun Enterprise and was rock solid. The Exchange servers, on the other hand, are on a weekly reboot schedule. Our Exchange/NT team had done all it could and came to the conclusion that the machines could either be rebooted every 7 days or crash on their own every 10.

    Also of interest are the management capabilities inherent in Unix-based systems vs. Exchange. For instance, on the Netscape mail server, if a user wanted files from his mailbox restored, a few files were restored from backup, and presto! On the Exchange server, the entire mail database has to be rolled back to the state where the files still existed (for *everyone*).

    Another item of interest: when doing the mail server migration, the postmaster box ended up with over 60,000 pieces of warning mail in its box. With the server on Solaris, I was able to write a quick Perl script that would delete the files with a specific subject line. The Exchange team's answer to a similar problem (this time with 100,000 emails) was to pull them up in Outlook and delete the messages 10 at a time. Of course, that wasn't possible, as the machine would just freeze due to the insane amount of RAM required (not to mention the time needed to do this 10 at a time). Luckily, one of the up-and-coming Unix geeks had an MS background and mentioned that Outlook delete filters would do the trick (which they did, but it had to be done from the client side).
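    For the curious, a minimal sketch of that kind of purge script, assuming a one-message-per-file spool; the spool path and subject pattern here are made up:

        #!/usr/bin/perl -w
        # Minimal sketch: delete one-message-per-file mail whose Subject
        # header matches. Spool path and pattern are assumptions.
        use strict;

        my $spool   = '/var/spool/postmaster.d';           # hypothetical
        my $pattern = qr/^Subject: .*delivery warning/i;   # hypothetical

        opendir my $dh, $spool or die "can't open $spool: $!";
        for my $file (readdir $dh) {
            next unless -f "$spool/$file";
            open my $fh, '<', "$spool/$file" or next;
            while (my $line = <$fh>) {
                last if $line =~ /^\s*$/;   # headers end at first blank line
                if ($line =~ $pattern) {
                    unlink "$spool/$file" or warn "can't delete $file: $!";
                    last;
                }
            }
            close $fh;
        }
        closedir $dh;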

    Anyway, the moral of the story is that NT server installations, as a rule, will cost more, require more maintenance, and make it difficult for you to fix things when it really counts.

    Anyway, no matter what you do this time, I'd recommend you at least set up an experimental server doing similar things to familiarize yourself with a Unix-like environment. And learn enough Perl (or Python/Zope or PHP or Java...) to put together the kind of web application you'd want. My personal favorite is Perl, as CPAN has modules to do just about anything, and it's been invaluable to me as a system administrator and web programmer, but I know people have done very cool things with the others, too. Also, learn SQL. I'd recommend Postgres and MySQL (whichever fits the job) and possibly Sybase (though its proprietary nature can be a pain at times). Don't forget about FreeBSD, either. Its scaled-down nature can make administration easier when you only want specific things on a box, and at present it has some large-file advantages over Linux.

  • I really can post coherent messages. Honest, I can! I blame this copy of Netscape leaking huge amounts of RAM on large pages, preventing me from previewing. O:)
    People may say that Unix systems require less effort to run, but what it really requires is more knowledge.
    Should be "may say that Unix systems require more effort" .. Freudian slip. It's hard to say "Unix/more effort" together after the days I've wasted helping my g/f with her unweidly windows partition. ;)

    Btw, the "requires more knowledge" part mostly applies to getting started with the OS. It could require less knowledge to get a unix box running stably than it does to get an NT box running stably (and said knowledge is easier to come by) For example, you may have to do a bit of reading and compiling to get things how you want them, but the compiling is documented. Also, at least in the open source realm, mysterious failures tend not to be a problem. In the event something *does* fail, it's usually easy (or at least straightforward) to figure out why it's failing and how to fix it (as opposed to trying 25 different driver version, service pack version and system resource configurations) Even in the Unix world, Netscape Enterprise vs. Apache is a case in point here. When our corporate Netscape Enterprise server mysteriously failed to start one day due to a "sig6 pipe closed" error, it turned out that the search engine's index had gotten corrupted, and disabling the search engine prior to starting the netscape server would fix it. It took a call to netscape support to figure this out, though - running truss (equiv of strace on Linux) on the server didn't help, nor did the system or server logs.

    Another thing I think I should mention is that if you put Linux on a webserver and the application outgrows the machine (or the OS), it's trivial to port the application up to a more "enterprise-class" OS such as Solaris. At the Fortune 500 I mentioned in my last post, the enterprise webserver was recently moved from a big Irix machine to a big Solaris machine. Since all the in-house apps were coded in Perl, nobody's applications required any porting. The same would be true of PHP or Python/Zope (and even Java, if you're lucky). Of course, you could also design the application with real scalability in mind and, for instance, use 2000 FreeBSD webservers like Microsoft does with Hotmail, and let something like Solaris handle the back-end mass storage. (Or like deja.com and google.com do with Linux, or yahoo.com does with FreeBSD.)

  • Check out http://members.aol.com/bbirthisel/alpha.html. Bill Birthisel has provided a Win32::SerialPort module that is compatible with Device::SerialPort (on CPAN), so you can now talk to serial devices using the same code even in Dark Side of the Force environments.
    Using SerialPort and Perl is very flexible and powerful; the package lets you do exactly what you want/need to do. I used it recently to prototype a project, presuming that looping to satisfy reads, timing out, and calculating CRCs on the fly with Perl would require a rewrite in C once I got it all working. Not a problem. It's fast and flexible, and very robust, because Perl makes it easy to do things right.
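    For anyone curious what that looks like, a minimal sketch of sending an order record and reading a reply with Device::SerialPort; the device, line settings, and record format are assumptions:

        #!/usr/bin/perl -w
        # Minimal sketch: push a record down the serial line, wait for a
        # reply. /dev/ttyS0, 9600-8-N-1, and the record format are assumed.
        use strict;
        use Device::SerialPort;   # use Win32::SerialPort on NT

        my $port = Device::SerialPort->new('/dev/ttyS0')
            or die "can't open /dev/ttyS0: $!";
        $port->baudrate(9600);
        $port->databits(8);
        $port->parity('none');
        $port->stopbits(1);
        $port->write_settings;

        my $order = "ORDER|12345|3 x widget|19.95\r\n";  # hypothetical format
        my $count = $port->write($order);
        warn "short write ($count bytes)\n" unless $count == length $order;

        $port->read_const_time(2000);        # 2-second read timeout
        my ($len, $reply) = $port->read(255);
        print "got reply: $reply\n" if $len;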
  • If your NT is constantly dying, then you probably have a misconfigured item somewhere in your system.

    Well-configured NT systems are as stable as Unix machines.

    Of course, it is MS's own mistake that there are so many 'bad' sysadmins out there. Unix has thousands of capable sysadmins who learned the trade over ten years or more of experience. NT admins, however, often just follow a course, take an NT certified whatever exam, and hup, they are promoted to full-blown NT sysadmins.

    If you compare NT to unix, don't forget to compare their sysadmins as well.

    But... NT is harder to administer than Unix, so it is quite natural to have fewer good NT admins than Unix admins.

  • by Anonymous Coward
    A couple of notes about Oracle.

    I've installed Oracle on NT, Linux and Solaris. Oracle for Linux is riddled with bugs - both 8.0.5 and especially 8i. There are numerous patches and workarounds you must use, or it simply will not work.

    I was aghast at the quality level (or the lack thereof) of such an expensive piece of software. Even if the install goes well, you must understand that Oracle, by nature, uses up a LOT of resources. I've just finished installing 8i on a dual-500MHz system with 1 GB of RAM and RAID 0/RAID 5 (45 GB total). But once installed, it's faster than hell, and you have a huge number of tools to do anything you want.

    The easiest install and most stable operation is on Solaris, where I was able to get it to work with 128 MB of RAM and a 16 GB RAID.

    Oracle on NT is as painful as it is unstable. Most of the fault lies with NT, I'm sure. But we do use it, because some third-party tools require NT.

    Oracle's nice if you've got the money and the resources (no, maybe just money). But what I like best is a simple Linux/MySQL combo. This is also faster than a greased pig, but much, much, much easier than Oracle to implement. I mean, I can install Linux in about 15 minutes and MySQL in 5-10 minutes. Throw in Apache and PHP, and I can have a development platform in less than an hour. And despite all the nifty tools Oracle has for storage management, data integrity, and backup, I can easily write my own for MySQL (see the sketch below).
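    By way of example, a minimal sketch of a home-rolled backup, suitable for cron; the output path is an assumption, and credentials are assumed to live in ~/.my.cnf:

        #!/usr/bin/perl -w
        # Minimal sketch: a dated mysqldump, run nightly from cron.
        # Output path is an assumption; credentials assumed in ~/.my.cnf.
        use strict;
        use POSIX qw(strftime);

        my $stamp = strftime('%Y%m%d', localtime);
        my $out   = "/var/backups/shop-$stamp.sql";

        system("mysqldump --opt shop > $out") == 0
            or die "mysqldump failed: $?";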

  • Postgres has a very mature Windows ODBC driver; while I use it only for relatively small DBs (2-5 MB), it works over a modem (33.6k) connection without a hitch (it's naturally slow, though). I don't know about MySQL, but your question sounds like Postgres would be a very good solution; after all, you don't seem to need the fastest database available, more a very reliable one. This is not to say that MySQL isn't reliable, but it doesn't do transaction logging.
  • ...I need to include an accounting system written in Visual Foxpro.

    If you are going to do anything involving both Visual FoxPro and the internet, you should really check out Web Connection [west-wind.com]. It's a framework that allows you to create HTML pages on the fly directly from VFP code. It's really fast and the toolkit is highly scalable and very well supported.

    Visual FoxPro doesn't get much press from Microsoft because they want to push SQL Server and seat licenses. But people who use it know that VFP is very fast and reliable. If you are going to mess with databases, you should be coding in a database language.

    WebConnection runs Egghead's SurplusDirect site (among others [west-wind.com]), which gets up to 55,000 hits an hour during peak bidding times. VFP can handle it, as long as you're not writing crappy and inefficient code.

    There is support for XML, MSMQ, COM, sending data using HTTP (with a VFP app on the client side), email generation, PDF doc creation, etc. You can also access other databases from within FoxPro using ODBC.

    At work, I developed an e-commerce project that does all the online work for several clients that we support, all running from one base class, all coding done in FoxPro. One of the best things is that you can debug it just like a regular VFP app.

    I'm not associated with this company but I have used the products and can tell you it's very powerful. The price is way cheap and a shareware version is available for downloading. Check it out. Note: This is for Windows only, but you are not tied to IIS.

  • Just show them the CNET story [cnet.com] on Windows 2000 pricing and tell them you'll be bankrupt if you ever upgrade. Good luck on the conversion!
  • Hi, I didn't have the chance to read all the comments, so I hope I'm not repeating what people have said before. If so, I'm sorry. I'm setting up an e-shop at the moment as well, and I can advise you two things.

    Use an SQL server you know. I read some people telling you MySQL isn't that good. I prefer Oracle, but I have noticed that if you know how to handle MySQL, it is as good as Oracle or Sybase or PostgreSQL. Use your knowledge; don't use something you think is better if you don't know how to handle it. And of course, try not to use Access.

    For the web pages I use ColdFusion (Allaire, www.coldfusion.com) running on a Solaris machine with Apache. ColdFusion is very good for 'programming' e-commerce pages. If you want more info, check the Allaire site; they have a pretty large amount of information about e-commerce online. I think ColdFusion 4.5 runs on Linux as well.

    Pepijn (Pepijn@kangaroot.net)
    www.kangaroot.net [kangaroot.net] will go 'live' soon now.
  • If your NT is constantly dying, then you probably have a misconfigured item somewhere in your system.

    Well-configured NT systems are as stable as Unix machines.


    Possibly, but Unix machines tend to be well-configured by default. Performance optimizations may be possible, but I've never heard of a Unix box "constantly dying" because of a sloppy setup.
  • by Anonymous Coward
    Absolutely correct, if data quality and database reliability are important to your company.

    I think it's clear from this thread that Access just isn't designed to run your enterprise. At best, it's for non-DBAs to run simple databases for a relatively small number of users. So you're going to want to convert to a full-strength DBMS.

    The real problem with converting from Access to a real DBMS like MySQL is not converting the data itself, but converting the stuff that's not the data. An MDB file is not just tables of data; it's an all-encompassing file that contains tables, queries, forms, reports, macros, VB modules, etc.

    Any good database has rules that prevent invalid data from being entered, just as any good program has error handling to prevent any possible user entry from crashing it. When you get bad data in your database, the results are worse, because you won't have any indication that your data is bad until it gets REALLY bad, or painfully obvious. Not a good thing for business.

    The problem with Access is that it doesn't have a tool for managing all of these rules. They are hidden: partly in the table definitions, partly on your forms, partly in macros and VB modules, and partly in the "Relationships" view. A business-class DBMS will have a tool to manage all these rules and mechanisms in one place. It will also allow flexible rules; for example, you can relax the rules in order to import a table that you know contains errors, then find all the errors, separate or fix them, and start enforcing the rules for all new data entry. Access just isn't that sophisticated, and what controls it does have, as I mentioned, are not all in one place.

    The bottom line is that while it's easy to copy the data, it's very difficult to port over all of these hidden mechanisms that control data quality. That would require someone who knows a lot about Access, which I assume from your question you are not (no offense). Your alternative is to copy the data to your favorite DBMS (MySQL, Oracle, or whatever) and then set up all your data integrity rules from scratch on the new system. That's probably what you'd end up doing anyway, even if you could find all the hidden stuff in Access. It's also not a bad exercise, because it will force you to review and reconsider what data you are storing, how it is stored, how everything relates, what your data entry processes are, etc. That process is always a good thing. You would be wise to get help from a DBA.
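    For a flavor of what "from scratch" means, a minimal sketch of integrity rules declared in the new DBMS (Postgres syntax; the schema is an assumption, and note that MySQL of this vintage does not enforce CHECK constraints):

        # Minimal sketch: declare the rules in the database itself,
        # so bad rows are rejected no matter which app inserts them.
        use strict;
        use DBI;

        my $dbh = DBI->connect('dbi:Pg:dbname=shop', 'dba', 's3cret',
                               { RaiseError => 1 });
        $dbh->do(q{
            CREATE TABLE items (
                sku         varchar(20)   PRIMARY KEY,
                description varchar(200)  NOT NULL,
                price       numeric(10,2) NOT NULL CHECK (price >= 0)
            )
        });
        $dbh->do(q{
            CREATE TABLE order_lines (
                order_id integer     NOT NULL,
                sku      varchar(20) NOT NULL REFERENCES items (sku),
                qty      integer     NOT NULL CHECK (qty > 0)
            )
        });
        $dbh->disconnect;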

    If you feel confident hacking a DBMS without a DBA, then more power to ya, but you would be wise to proceed cautiously with your company's data. Build the tables and rules in your new DBMS and test it extensively with test data. Make sure it not only works, but also robustly handles errors and bad data entry before you stake your business on it. If you don't believe that it's important to address data integrity issues, take a look at the Y2K problem/response/expense and realize that you can have the same problems with ALL of your data, not just the dates.

    If you aren't familiar with what data integrity/quality is, or how to maintain it, try the following links:

    Database Central [baobabcomputing.com]

    DM Review [dmreview.com]

    Best of Luck.

  • But in a servlet environment, the JVM has already been started. That nullifies the startup-time difference, except in cases where a new process has to be launched to run the Perl script or CGI, and then we probably shouldn't even compare, should we?

    One nice thing servlets have going for them is that all of the session management is handled for you. This is a big benefit.

    Usually the only people who complain about Java's speed in a server-type environment have never even compared it. Sort of like all those Windows users who think Linux sucks but can hardly spell it.

    I've done both CGI (even in C) and servlet development. And although I would have to be dragged kicking and screaming into full-time web development, I would still recommend Java servlets in the right situation.

    My (stagnating) progeeks.com web site uses servlets and I was able to code the entire thing in one weekend. (Well, the parts I have finished anyway.) The company I actually work for is switching all of their web apps to servlets.

    Fun stuff,
    -Paul (pspeed@progeeks.com)
  • The reason I consider the "anecdotal FUD" worth mentioning is that I have personally (as have many people I know) been able to get Unix systems to stay stable for months or years on a regular basis, with no special effort.

    I haven't met any NT admins who can demonstrate the same. It's always a story about how a friend of a friend had a cousin who worked at some important place that had a cream-of-the-crop admin who was able to push NT to a whopping 60 days of uptime or somesuch.

