Microsoft to End DLL Confusion

MankyD writes "ZDNet is reporting that Microsoft is attempting to do away with DLL version conflicts in its next version of Windows with a technology it calls 'Strong Binding'. When new programs attempt to overwrite old versions of DLLs, Strong Binding will index the DLLs and allow programs to reference them by a unique ID rather than by file name. Hopefully it will prevent a new program from breaking an old one. I would think this might add to DLL clutter, however."
  • Auto-DLL Management? (Score:5, Interesting)

    by MSTCrow5429 ( 642744 ) on Friday March 07, 2003 @09:06AM (#5457673)
    Sounds like a great idea. While there will be more DLLs in the registry, at least each and every program will have its "own" DLL and can't be broken. Although I wonder if the software will default to the newest DLL and then go back if it doesn't function correctly.
    • by Anonymous Coward
      What you don't realize is that the process will also prevent a 'modified' (read: hacked) DLL from operating.

      Those of us who use MS products, and rely on modified versions of dlls for proper functionality, (Macrovision removal in powerdvd, for instance) will be screwed.

      • by WindBourne ( 631190 ) on Friday March 07, 2003 @09:59AM (#5457973) Journal
        While a sibling poster pointed out that the admin could change it, you have probably hit on where MS is going. I suspect that MS will simply drop the admin capability.
        The versioning problem could have been handled with an external naming approach (name-version.major.minor.dll) and then using a resolver program. They are electing to give an internal ID, which implies that DLLs will not be in one place or will have different names. This will make it hard to swap out a DLL or to know which ones to transport.
        Actually, just thinking about it, I've heard that Longhorn is supposed to have a SQL-based file system. It would make it easy to suck this into it and use the ID as a way to find the DLL. You will simply need to know it.
      • by gilesjuk ( 604902 ) <<giles.jones> <at> <zen.co.uk>> on Friday March 07, 2003 @10:28AM (#5458229)
        They're probably doing it to break WINE as well. DLL problems aren't that bad with Win2k and XP, system DLLs are protected and with XP you can go back to a restore point.

        If an app installer is so badly written that it messes up your installation then the software can't be much good either.
        • by molarmass192 ( 608071 ) on Friday March 07, 2003 @12:00PM (#5459129) Homepage Journal
          I really don't think WINE was high on their list of priorities here. I think their idea had several desired outcomes:

          1 - force all non-longhorn users to upgrade
          2 - force all software vendors to code to the new .NET API, and
          3 - integrate SQL*Server into the OS

          Also, imagine what a nice kick in the teeth to Java (which I'm sure is a bigger radar blip than WINE) this will be. I think this will backfire on them, lack of full backwards compatibility is *one* of the reasons why XP never took off. This one lacks any backwards compatibility so you can just extrapolate the barriers to adoption.
        • by e2d2 ( 115622 ) on Friday March 07, 2003 @12:06PM (#5459186)
          They're probably doing it to break WINE as well. DLL problems aren't that bad with Win2k and XP, system DLLs are protected and with XP you can go back to a restore point.

          Gimme a break. DLL Hell has been a problem for a long, long time, so when they actually try to fix it, they are now only trying to fix it to break WINE? That's a stretch and I'm sure you know it. Yes, you can go back to a restore point, but that does not solve the problem: now one app works but the other doesn't, because they are both calling the same DLL expecting different versions. Restore points are a temporary solution to a legit problem.

          If an app installer is so badly written that it messes up your installation then the software can't be much good either.

          So using the current system, how do you propose that app developers deal with this? Say when I compile against a DLL I am expecting version 1, and it works fine. Three years go by, and the DLL has gone through 2 new versions released with the latest service pack. Someone installs my application and it copies the old DLL over the new one. The system is now screwed. My other option would be to link dynamically, and since the new version is on the machine, my application simply wouldn't work. BUT using the new system, both my application and other applications would work fine, using their appropriate DLLs. So how is this a bad thing again?

          Can anyone propose a better solution? If so I'm all ears.

        • If an app installer is so badly written that it messes up your installation then the software can't be much good either.

          I don't think that's so much the issue. One thing I *hate* is a seemingly simple installation that then requires a reboot.

          But, as a bit of a Windows programmer myself, I understand it. If a program happens to require a newer version of some DLL, that is currently in use, it can't replace it. And with Windows' current system, you can't just use a local copy of it either.

          When a program loads up a DLL, you can't specify a full path, just the filename. Windows has a specific search order, and the first place it checks is memory. "someapp.dll" is already loaded, so it uses that code -- even if you have a newer version in your own program directory.

          I've always wondered why the hell they went with this approach. You have to watch for name conflicts between private DLLs (my program may happen to have "mp3.dll", which is completely unrelated to some other program's "mp3.dll"). And of course if an application uses a newer version of a system DLL (common controls library is a commonly-upgraded one), replacing the system-wide DLL is required, and naturally a reboot is required. And there's always some chance this upgrade may break an older application...

          You can also forget about renaming a copy for private use, too; many of the system DLLs reference each other by name. It works if you hex-edit them to reference the new names (I did this only as a proof-of-concept with VB runtimes)...

          In my opinion, the best/easiest solution for developers would be to change the search order to start in the application directory -- or better yet, only do this if some registry flag is set, so older apps can have the current default behavior...

          However, anything to alleviate this would be nice...
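
          To make that concrete, here is a minimal C sketch (an illustration, not anyone's production code) of the name-based lookup described above, using standard Win32 calls. The "mp3.dll" name is the hypothetical private DLL from the example, not a real library.

          /* Ask Windows for a DLL by bare file name. If a module with that
             name is already loaded, the loader hands back that one -- not
             necessarily the copy sitting in our own program directory. */
          #include <windows.h>
          #include <stdio.h>

          int main(void)
          {
              HMODULE h = LoadLibraryA("mp3.dll");   /* hypothetical name */
              if (h == NULL) {
                  printf("mp3.dll not found (error %lu)\n", GetLastError());
                  return 1;
              }

              /* Show which file actually backs the module we were given. */
              char path[MAX_PATH];
              if (GetModuleFileNameA(h, path, MAX_PATH))
                  printf("loaded: %s\n", path);

              FreeLibrary(h);
              return 0;
          }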
      • by Pxtl ( 151020 ) on Friday March 07, 2003 @10:38AM (#5458315) Homepage
        I can definitely see that that is where this is going. Perfect example: XP is skinnable. But, only MS certified skins can be run on XP. MS has not made any avenues for making certified skins. The result? Download a hacked DLL and then use non-certified skins.

        There is a massive community of XP-Styles, all centered around a hacked uxtheme.dll. If Explorer looks for a specific version of uxtheme.dll instead of just the filename, they're SOL.
        • by arkanes ( 521690 )
          Unless the "unique id" is encrypted, both in the DLL and in the executable, and in the file system, with keys you can't recover, then it will still be possible to replace DLLs with modifed versions. It'll just be awkward enough that it (probably) won't happen as part of a program install.
    • by bourne ( 539955 ) on Friday March 07, 2003 @09:32AM (#5457851)

      Sounds like a great idea.... each and every program will have its "own" DLL

      Call me old-fashioned, but I thought the point of dynamic libraries was to reduce program size and duplication of effort by allowing multiple programs to load common functions out of libraries.

      So, it's a great idea, insofar as it completely negates the advantage of having DLLs in the first place. Why don't they just compile statically instead?

      • by AltControlsDelete ( 642641 ) on Friday March 07, 2003 @10:17AM (#5458103)
        I have not read the article, but I think this simply works by allowing different versions of the same DLL. If you have two programs that require the same version of the same DLL, they'll both use that DLL. However, if you install a new program that wants to install a different version of an already-installed DLL, that program will use its version of the DLL while allowing other programs to continue using the pre-existing DLL.
        • by bourne ( 539955 )

          I think this simply works by allowing different versions of the same DLL.

          Right - given N programs using foo.dll, how many different versions of foo.dll do you expect will be needed? Based on my experience with Windows, I would expect somewhere between N and 2*N (allowing for "upgrades" of N[i] which won't remove the "old" DLL version).

          Yes, theoretically, you'll have a smaller number of DLLs than you do programs. Realistically, I don't see it happening. The various versions of Microsoft's XML DLL (which comes to mind because of the security patch which requires you to figure out which versions you have installed, which is always more than 1) illustrate this perfectly.

      • hilarious... (Score:3, Informative)

        by kingkade ( 584184 )
        I think it works by 'indexing' the DLLs, which are required only based on version. Call me old-fashioned, but I like to read the article first. ;-)
      • by cgenman ( 325138 )
        Agreed. When I first saw this article last night, I immediately thought they had decided to can that archaic, unnecessary system. Obviously there is plenty of disk space. And the RAM savings exists but isn't significant either, as only static and not dynamic data is sharable. When was the last time you saw an executable that didn't contain every library known to man, and whose static RAM footprint was over 1 meg? Now, compare that to the number of times your system broke / crashed due to a DLL conflict.

        Get rid of it already! This patchwork system is, well, I guess it's putting a roof over my head. But it is also unnecessarily complicating an already excessively complicated system.

        It really is getting to be time for Microsoft to put Windows in an emulator and rewrite the functionality of the OS. It will never happen, but it is time.

        -C

    • by n3k5 ( 606163 ) on Friday March 07, 2003 @09:32AM (#5457855) Journal
      ... at least each and every program will have its "own" DLL ...
      Ah, yes, what a wonderful idea; let each and every program have its own DLLs, at the mere cost of rendering the whole DLL system totally useless.

      You see, if you just wanted to split your executable code over several files for one reason or another, you have always been able to include DLLs with your program (in its own directory). Those wouldn't ever be changed by Windows, but this has nothing to do with the registry. The whole idea behind registered DLLs that reside in a centralised place is that you have shared libraries: you don't have to store code used by several programs in multiple places, but only once, where you can do easy updates.

      However, now that there are so many versions of Windows out there, Microsoft is experiencing compatibility issues with DLLs and they're doing something about it. I'm not familiar with the details of their solution and don't want to say it's a bad one at all. But your ideas are a little too extreme; saving one copy of a DLL per program is just absurd.
    • Yes, but this raises a new security threat. Say there is a remote root hole in a certain DLL. If you get the newest version, some programs will still use the old DLL, with the hole staying behind. That could be a big problem.
    • Unix has had library versioning from the very beginning. Shared library filenames specify what version of the shared library the file contains, and when programs load they can request a specific version through the file name.

      And here comes M$ taking the same idea, and adding a point of failure in the form of some binary index of DLLs. Jeez, this is just another thing I'm gonna have to fix when my Windows friends start having trouble with their computer. Really unnecessary. Couldn't they have just outright copied the Unix method? At least then they would have done it right.
  • DLL vs static libs (Score:5, Insightful)

    by selderrr ( 523988 ) on Friday March 07, 2003 @09:07AM (#5457675) Journal
    I never really understood the advantages of a DLL over a static lib in modern times.

    In the old days, when disk space & memory were precious goods, it made sense to share code. But today, what's the burden of 4MB extra app size compared to the DLL misery?

    Except for plugins, I see no reason why developers would need DLLs. Can anyone shed some light here?
    • by Anonymous Coward
      What, a little 26k app should now be 4 megs in size? All of them?
      Plus, if you're going to support them for plug-ins, then the system is in place anyway.
      Plus, if you know what you're doing, there *is* no DLL hell - install them in the directory where you install the app, rather than in a shared directory, give them a sensible name, and you're sorted.
    • by fruey ( 563914 ) on Friday March 07, 2003 @09:11AM (#5457713) Homepage Journal
      Memory is still precious when you've got your OS eating up 128M of it (WinXP) and you have slightly older hardware.

      When 1GB of memory becomes standard, then maybe. In any case, it is downright inefficient to have the same code in 3 or 4 places in memory, even if you have got loads to spare.

      • by wiggys ( 621350 ) on Friday March 07, 2003 @09:16AM (#5457746)
        Agreed. Windows XP gobbles up about 110 megabytes of RAM when it's just booted, without any background images, screensavers, system tray apps running etc.

        Dropping the fancy XP theme frees up about 5 megs of RAM, but if your system only has 128 or 256 megs of RAM you don't have a lot of room to load apps.

        • by override11 ( 516715 ) <cpeterson@gts.gaineycorp.com> on Friday March 07, 2003 @09:32AM (#5457853) Homepage
          Try using a different shell. It seems to me that explorer takes up a ton of ram, especially when doing anything besides looking at the background. Aston shell does great, and when paired up with the newest Phoenix build, everything seems to just zip and open up fast, and memory usage is a lot lower. :)
        • by oconnorcjo ( 242077 ) on Friday March 07, 2003 @11:24AM (#5458822) Journal
          Agreed. Windows XP gobbles up about 110 megabytes of RAM when it's just booted, without any background images, screensavers, system tray apps running etc. Dropping the fancy XP theme frees up about 5 megs of RAM, but if your system only has 128 or 256 megs of RAM you don't have a lot of room to load apps.

          Actually I think the reason XP takes up so much RAM is because it is loading up a ton of USELESS DLLs and COM objects into RAM so that IE will load faster and Word XYZ will open immediately, plus the .NET Framework, VB Framework, and the list goes on. A whole bunch of junk that really should not reside in RAM just so that MS's stuff loads faster than competitors' products. DLLs are a great idea, but Microsoft has found that they have too many DLLs and they are disorganized.

          The idea that an operating system has a standard core set of DLLs that all programs can use to add functionality is a WONDERFUL IDEA! Microsoft is only finding out that there is such a thing as too much of a wonderful thing. They have been writing DLLs and COM components for Windows for almost ten years, and they need to keep backward compatibility with all of them, no matter how ugly or badly implemented an original DLL was. This method will allow them to stop having to keep backward compatibility (but leads to other problems, such as bloat). They will still have too many DLLs that are useful to only one or two programs (and this solution won't make that better) and really should be statically linked or put in a private directory of the XYZ program (and uninstalled when the program is), but this is a good step in the future management of DLL cruft, because they can at least start the process of breaking backward compatibility of badly designed libraries.

      • When 1GB of memory becomes standard.

        Unfortunately, even when 1GB is standard, the problem is that people will be running Windows KAE-T (Kick Ass Experience - Trusted) which requires about 927MB of memory without themes.

    • by Anonymous Coward on Friday March 07, 2003 @09:12AM (#5457722)
      It is not only for preserving disk space.

      It also preserves RAM by sharing common code between different applications.

      It also makes upgrades and bugfixes easier (think openssl, for example).
    • by IWannaBeAnAC ( 653701 ) on Friday March 07, 2003 @09:14AM (#5457739)
      Well, on a typical system there might be a hundred or so processes. Add 4MB to all of them and it's suddenly not insignificant anymore.

      Plus, if two running processes are sharing a shared library, then a task swap doesn't completely blow away the cache.

      Finally, I think with Windoze there is a mode in which DLLs can have global data, shared among all programs using it. Not sure though, it's years since I did any Windoze programming.
    • by infront314 ( 598911 ) on Friday March 07, 2003 @09:16AM (#5457751)

      Bugs.

      If there is a bug or a security problem in a DLL you only have to replace that DLL instead of all statically linked programs.

      Remember the problems with zlib [gzip.org] a year ago? Would you like to replace 500 applications [gzip.org] or one library?

      • by Steeltoe ( 98226 )
        If there is a bug or a security problem in a DLL you only have to replace that DLL instead of all statically linked programs.

        This new fix seems to break that. You'd need 10-20 different versions of the DLL to overwrite the "unique IDs", if that is at all possible. Another "fix" that fixes the symptoms and introduces new "features", not the real problem (lack of code-proofing and immature languages).

        Can be quite entertaining in the future. I sure hope my company steers clear of Longhorn.
    • by Anonymous Coward on Friday March 07, 2003 @09:17AM (#5457758)
      There are plenty of reasons to want dynamic linking. Library code is code you didn't write. Therefore, you have no control over its quality or security. If there was a buffer overflow exploit in a system library, and all programs statically linked that library, you would need to update your ENTIRE SYSTEM to remove the vulnerability. If the code is saved in a shared library (DLL), you just need to update the single shared library.

      It's a classic case of the elimination of duplication in computer code. By duplicating (statically linking) the library code into every app, you only increase the burden when you want to update the library.

      Furthermore, on Debian GNU/Linux I never have "DLL misery" because my operating system is not brain-dead. Multiple versions of shared libraries can coexist, there is a consistent versioning scheme, and a competent team of people who check to make sure that apps use the correct versions of the shared libraries.
    • by oolon ( 43347 )
      If I statically link an executable, and the library I link against has a bug in it, I have to rebuild the executable to get rid of the bug. If the library is dynamically linked and the library interface (key point) is the same, I can upgrade my complete system by installing a new library....

      Think about it: if there is a problem with, say, the resolving library which means your machine is insecure, with static linking you have to rebuild everything. Also, with static linking, you have to work out what is statically linked with it too...

      The key thing however is having a rock solid library interface with no "special" features which require XX version of YY. The incompatibilities only start happening when the interface subtly changes. Some programs get bitten and not others due to which parts of the interface they use....

      James
    • by Arethan ( 223197 ) on Friday March 07, 2003 @09:30AM (#5457841) Journal
      Plenty.

      Each static library you use in your code adds to the overall program size, as you've stated, thus increasing load time when you run the app. A 4MB exe loads a lot faster than a 400MB exe. (At least when the bulk of the difference is actual code/runtime-data, rather than generic data tacked on to the end of the file.) When you have DLLs, the OS will reuse already loaded DLLs for your app, and will keep newly loaded DLLs in memory for a while even after the app shuts down. This tremendously speeds up application loading.

      Plus it allows for version upgrades in the DLLs without requiring recompiles of the applications. Of course, this feature really isn't used in Windows: since MS can't seem to make an application run according to spec, most developers work around, and sometimes even depend on, the bugs that are in MS's code, thus creating DLL-version-dependent apps. If MS goes back and fixes the broken DLLs, they will break a bunch of third party apps in the process. Probably some of their own as well.

      Plus there is security. Applications do not have the same level of access as DLLs. In fact, if you want to do some low level hardware access in Windows, you MUST use a DLL as a thunking layer, since you can't access the hardware from code within a regular process.

      There are other reasons, but I'm getting lazy, so I'll let someone else finish if they feel like it is needed.

      In short, there are plenty of reasons to use dynamically loaded libraries. But the generally poor coding practices that occur on the Windows platform pretty much ruins almost all of them. It's actually pretty sad. Windows had a lot of potential to become a great OS. Too many fuck ups and shortcuts have wrecked that chance. Fixing it would require an entirely new code base. One that doesn't have native ABI support for old apps. Which, of course, MS will never do. At least there's open source software to do things right. :)
    • Except for plugins, I see no reason why developers would need DLLs. Can anyone shed some light here?

      Say you're releasing a suite of programs. These programs are not necessarily bundled - some people might only need 1 program - others might need all 3 of them. All these programs need to use the $foo protocol. So you write a DLL for implementing the $foo protocol, and then all 3 of your apps use it. This is a good thing - say a security flaw is discovered - patch the DLL and send it out there. You don't have to patch each application.

      The real issue is the poor use of DLLs and lack of coordination between developers and Microsoft. How many copies of the Visual Basic Runtime DLLs do you have on your system? (VBRUN*.DLL) What about the MS Visual C Runtime? (MSVCRT*.DLL). Or the Microsoft Foundation Classes (MFC*.DLL)? Granted there is some version skew in these DLLs, but I found three identical copies of MSVCRT.DLL - one from MGI PhotoSuite, one from WinVNC, and one from Adaptec Easy CD Creator. And there are lots more identical ones too. I think what Microsoft is doing is a good idea, although the whole problem could have been avoided with better developers and smarter installer programs...

    • by GooberToo ( 74388 )
      Don't discount the amount of memory that you can save from shared libraries and DLLs. If you have a DLL which consumes, on average, 5M of RAM and it's used by three applications at the same time (e.g. Word, Excel, Outlook), that's a savings of 10M (3x5-5; one copy still has to be loaded). It doesn't take too many applications (think GUI widget sets) running concurrently for this to significantly add up.

      It also helps reduce application start-up times on MS platforms. In the case of Windows, they don't generally use load-on-demand as one thinks of it from the Unix world. Instead, they load all references into the page file and then demand-load from the page file as needed. This means shorter start times for applications which share DLLs, as they only have to be loaded once (into the page file).

    • by gibber ( 27674 )
      There are still a number of good reasons for using DLLs. Here are a few examples:

      • Many applications often share crypto routines in a DLL. In order to add a new algorithm/protocol to all the applications on a system which use crypto (which is an unknown quantity) you really want to have these routines in a dynamically linked library.
      • Often (in the proprietary, closed source world) routines are kept in a DLL to keep their sources private. This doesn't prevent users of the routines from linking them statically, but it does keep control of the content in the hands of the creator of the DLL. I'm not saying that I think this is a great reason, but it is compelling to a lot of proprietary software vendors.
      • In the open source world, the makers of libraries and the libraries' consumers (e.g. applications) are often different groups in weak synchronization. Because of the rapid development and improvement, the applications can benefit from a library upgrade. Having to relink every statically linked application would require keeping a list of applications, which is a messy prospect and more overhead than most users (packagers/distros) want to deal with. It would require keeping around all of the original object files or compiling from sources. This would probably work best in Gentoo Linux but would still be a major pain and more time than it's worth.

      Really most UN*X systems (in particular those which are glibc based) already implement Microsoft's "new" scheme. For example, in /usr/lib/:

      lrwxrwxrwx 1 root root 20 Sep 23 17:58 libgtkhtml.so -> libgtkhtml.so.20.1.3
      lrwxrwxrwx 1 root root 20 Feb 9 2002 libgtkhtml.so.19 -> libgtkhtml.so.19.0.1
      -rwxr-xr-x 1 root root 549996 Feb 9 2002 libgtkhtml.so.19.0.1
      lrwxrwxrwx 1 root root 20 Sep 23 17:58 libgtkhtml.so.20 -> libgtkhtml.so.20.1.3
      -rwxr-xr-x 1 root root 549996 Feb 9 2002 libgtkhtml.so.20.0.0
      -rwxr-xr-x 1 root root 633193 Sep 23 17:58 libgtkhtml.so.20.1.3
      Note that this scheme can create a "unique identifier", allowing an application to use a particular function using "libname.particular.version" and the name of the function.

      If an application just wants to use libgtkhtml but doesn't care what version it uses, it will use "/usr/lib/libgtkhtml.so" and will get the most current version. If the application requires version 19 of libgtkhtml, it would use "/usr/lib/libgtkhtml.so.19" and get the latest version of that, libgtkhtml.so.19.0.1. If an application wanted version 20.1.3 it would merely use "/usr/lib/libgtkhtml.so.20.1.3". At any rate, you get the idea (a minimal sketch follows at the end of this comment).

      It amuses me every time Microsoft comes out with a "new" twenty year old technology (such as say, symlinks). It's a move in the right direction but I think they lack the humility to say that they've been rejecting the right answer all along.
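
      To make the version selection concrete, here is a minimal C sketch of the three cases above, using dlopen. The libgtkhtml file names are the ones from the listing; which versions actually exist on a given system is an assumption.

      /* Pick a shared library version by file name (link with -ldl). */
      #include <dlfcn.h>
      #include <stdio.h>

      int main(void)
      {
          /* "Don't care": the unversioned symlink yields the current version. */
          void *any = dlopen("libgtkhtml.so", RTLD_LAZY);

          /* "Major version 19, any patchlevel": the .19 symlink. */
          void *v19 = dlopen("libgtkhtml.so.19", RTLD_LAZY);

          /* "Exactly 20.1.3": the fully versioned file itself. */
          void *exact = dlopen("libgtkhtml.so.20.1.3", RTLD_LAZY);

          printf("any=%p v19=%p exact=%p\n", any, v19, exact);

          if (any)   dlclose(any);
          if (v19)   dlclose(v19);
          if (exact) dlclose(exact);
          return 0;
      }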

  • More bloat (Score:4, Insightful)

    by wiggys ( 621350 ) on Friday March 07, 2003 @09:07AM (#5457679)
    This is just going to create more bloatware. Although hard-disk space is now plentiful and cheap, RAM is still precious - do we really need 5 or 6 versions of the same DLL file in memory eating up resources and clock cycles?

    I'm sure the hardware manufacturers will be pleased, as usual.

    • Alternative? (Score:5, Insightful)

      by dachshund ( 300733 ) on Friday March 07, 2003 @09:17AM (#5457762)
      do we really need 5 or 6 versions of the same DLL file in memory eating up resources and clock cycles?

      One could argue that this is better than having five or six broken applications trying to use the wrong version of a DLL.

    • Re:More bloat (Score:5, Informative)

      by jonathanclark ( 29656 ) on Friday March 07, 2003 @10:11AM (#5458047) Homepage
      DLL Sharing doesn't save that much memory in the typical case. Consider this:

      - Unless a DLL is rebased, it will likely have conflicting load addresses with other processes and have to be reloaded at a different address anyway.

      - Only the static, unwritten pages of a DLL can be shared across processes. For DLLs consuming a lot of memory, it's usually data, not code.

      - DLL sharing can cause application slowdown because of page faults for copy-on-write and the page-mapping setup that occurs initially. My loader [thinstall.com], which doesn't do page sharing, usually loads DLLs much faster than Windows.

      Jonathan

  • by psychofox ( 92356 ) on Friday March 07, 2003 @09:08AM (#5457681)
    This sounds similar to Windows File Protection (which has been available since Windows 2000). WFP prevents applications from altering system DLLs - automatically restoring them if they are altered.

    http://www.serverwatch.com/tutorials/article.php/1548951 [serverwatch.com]

  • by pummer ( 637413 )
    by the end of this, I'm gonna have 1,000,000 files in system32.
  • Now Microsoft is hoping to end what has become known as DLL Hell by building into Windows Server 2003 a system that will stop updated DLLs installed by new applications from overwriting older versions of the same DLLs that may still be used by existing applications.

    W2003 - W95 = 8 years to have that brilliant idea!
  • Welcome to VMS (Score:5, Insightful)

    by beldraen ( 94534 ) <{moc.liamg} {ta} {risialptnom.dahc}> on Friday March 07, 2003 @09:10AM (#5457705)
    I always find it interesting that Microsoft gets to announce and shake up the world with "a new feature" that they caused and that at least one other major OS had long since solved. Versioning the library API? "Who would've thunk it?!?"

    In other news, Microsoft invents a journaling file system to prevent data loss.. oh, wait..
    • Re:Welcome to VMS (Score:3, Insightful)

      by Metaldsa ( 162825 )
      When 3% of desktop computers change a feature it isn't news. When 90% of desktop computers change their DLL system it can become a worthy news post. I think it's not about the feature but how it will affect the moms and dads who don't follow techie news. No longer will they become confused (or irate) when they install an app and another breaks.

      Though this could bring slower performance, I think it will be negligible to most Windows users.
    • Re:Welcome to VMS (Score:4, Insightful)

      by vocaro ( 569257 ) <trevor@vocaro.com> on Friday March 07, 2003 @10:47AM (#5458433)
      Microsoft Windows has had versioning of DLLs for as long as I can remember, way back in the days of 3.x. The article says, "Windows and Windows applications have no notion of DLL version numbers," but this is false. You can see the version numbers for DLLs on your system by starting Windows Explorer and navigating to the c:\windows\system directory. Right-click on a DLL, then click Properties, then click the Version tab. You'll see the version numbers there, including the company name, copyright info, and other important details. Windows applications can retrieve this data by calling the GetFileVersionInfo [microsoft.com] function.
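
      For the curious, here is a minimal C sketch of reading that same version resource programmatically, with the GetFileVersionInfo family the parent names (link with version.lib). The comctl32.dll path is just an example target.

      #include <windows.h>
      #include <stdio.h>
      #include <stdlib.h>

      int main(void)
      {
          const char *file = "c:\\windows\\system32\\comctl32.dll";

          DWORD handle = 0;
          DWORD size = GetFileVersionInfoSizeA(file, &handle);
          if (size == 0) { printf("no version info\n"); return 1; }

          void *block = malloc(size);
          if (block && GetFileVersionInfoA(file, 0, size, block)) {
              VS_FIXEDFILEINFO *ffi = NULL;
              UINT len = 0;
              /* "\\" queries the root block for the fixed version numbers. */
              if (VerQueryValueA(block, "\\", (void **)&ffi, &len) && ffi)
                  printf("version %u.%u.%u.%u\n",
                         HIWORD(ffi->dwFileVersionMS), LOWORD(ffi->dwFileVersionMS),
                         HIWORD(ffi->dwFileVersionLS), LOWORD(ffi->dwFileVersionLS));
          }
          free(block);
          return 0;
      }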
  • by psychofox ( 92356 ) on Friday March 07, 2003 @09:10AM (#5457706)
    I wonder how they deal with upgrades to DLLs where the upgrade represents a security fix. In such circumstances one would definitely not want an application to use an old version of a particular DLL...
    • by sward ( 122951 ) on Friday March 07, 2003 @09:16AM (#5457749)
      Apparently, if you have 10 applications that each make use of the same DLL, not only will you now have 10 copies of that DLL, but in order to upgrade that DLL (e.g. for a security fix), you'll have to wait for each application's vendor to release its own upgrade package. You would then have a variety of security holes in your set of applications, despite their supposed common use of the same DLL.

      And the way most companies work, this upgrade package may also include changes to the application itself instead of being more focused as most here on Slashdot would prefer. The DLL fix may be the "main course", but you'll get a side of new bugs in the app, and possibly an upgrade fee as a cover charge.
      • It's a tradeoff. (Score:3, Insightful)

        by dmaxwell ( 43234 )
        Sure, if this capability were overused it would be a mess. On the other hand, judicious use of it would solve a lot of problems. Normally I'm not a fan of MS, but if they do it right (big if, I know) it's a GOOD idea.

        Maybe some policy would help a bit. I'm thinking that things like services, especially public net-facing ones, should be forced to use the latest DLL whether it breaks anything or not. If it's compatible, the service stays up. If it's not, the service dies and doesn't make a public nuisance of itself. Reporting tools would help too. If it were easy to get a list of which apps were using which DLL, it would be possible to intelligently manage the situation. For that matter, make apps use the newest by default and then fall back to the oldest only if that doesn't work and it isn't a public-facing service.

        Yes, this can cause problems, but if they include the right tools and sane policy they're manageable. This isn't intrinsically new; VMS did something like this, and Unix admins have been doing it manually for years. MS just wants to automate it.
  • Umm Security. (Score:5, Insightful)

    by jellomizer ( 103300 ) on Friday March 07, 2003 @09:11AM (#5457710)
    I could see some possible security problems with this. When a DLL needs to be patched for a security issue, it will only fix programs with the correct version of the DLL, and the old versions of the DLL won't be fixed. This is basically defeating the purpose of DLLs, in that most applications will be using similar but slightly different versions of the DLL. At this point, why not just compile everything statically with the DLL? That way it is no longer an issue, as well as quicker and easier to install and uninstall.
  • by joshv ( 13017 ) on Friday March 07, 2003 @09:11AM (#5457711)
    Microsoft re-invents static linking.

    Christ, if you are going to do this, why not create a recompiler that bundles the executable and all of its referenced DLLs into one EXE and be done with it.

    As a nice side effect it'd make it a hell of a lot easier to move programs around your disk and between machines... Oh, ok, now I get it.

    -josh
    • Re:In other news.... (Score:4, Informative)

      by jonathanclark ( 29656 ) on Friday March 07, 2003 @10:07AM (#5458019) Homepage
      I've been working on this problem. I created an application that creates a compressed virtual filesystem containing the EXE, DLLs, and datafiles that can be deployed in a "pre-installed" state. When the application runs, a virtual machine technology similar to WINE is used to allow the application to load DLLs directly from the archive rather than go to the operating system. Unlike VMWare/WINE, however, the virtual filesystem is transparently merged with the real filesystem, so the application can access files from either place with no source changes. Also, 90% of Windows API calls do not need to be replaced, so there are very few compatibility problems. The next version will have support for a merged virtual registry as well, so you can deploy ActiveX/COM controls without having to register them or even write to the system registry at all.

      Check it out:
      http://thinstall.com [thinstall.com]
  • by magiccap22 ( 318891 ) on Friday March 07, 2003 @09:11AM (#5457718)
    This is a capability of the .NET framework. It has nothing to do with the next version of Windows.

    It isn't really DLLs either, but rather .NET assemblies. .NET assemblies are the equivalent of DLLs for the .NET system, but this won't solve the problem of "DLL hell" unless all applications are re-written in .NET.

    This is a great feature of .NET, but this article is just the Microsoft PR/marketing machine, and is nothing new technically.
    • by wadetemp ( 217315 ) on Friday March 07, 2003 @10:24AM (#5458178)
      Thank you for pointing this out. There've been other good comments around the board about DLLs and how sharing should work, but you're evidently the only one who's read the article and understands the technology.

      The feature was present in .NET 1.0 as well.

      I've always wondered, though... why did MS choose to keep the extension ".DLL" for .NET assemblies? That choice alone has caused more issues than I have ever seen (name conflicts between a COM DLL and a corresponding .NET interop assembly DLL, people trying to register a .NET assembly DLL with regsvr32, VB6 allowing you to attempt to use a .NET DLL as a component when it doesn't work, etc. etc. etc.). While they may be "dynamic link libraries" in a traditional sense, they're not ".DLLs." Chalk the reporter up as one more of the confused.
  • Uh.... (Score:3, Interesting)

    by simetra ( 155655 ) on Friday March 07, 2003 @09:13AM (#5457727) Homepage Journal
    Wasn't that the point of the Registry?
  • Garbage collection (Score:5, Interesting)

    by dachshund ( 300733 ) on Friday March 07, 2003 @09:15AM (#5457742)
    Just as long as Strong Binding knows when to garbage-collect the old DLLs that aren't being used anymore, even if the uninstaller fails to explicitly remove them.

    In addition, the OS should be intelligent enough to know when an EXE's been manually deleted (thrown in the Recycle Bin). The current practice of placing all uninstallation responsibilities on the vendor tends to result in DLL buildup when the uninstaller doesn't work right or isn't provided (not uncommon). There should be a unified process for adding a DLL that links it to the executable file that requires it.

    • This is impossible unless there were a major change and all software developers conformed to it. But we are stuck with Windows 95 on steroids. Windows XP addresses the problem to a small extent by providing a "side-by-side" DLL loading system.

      There is no way for the OS to know which DLLs an application might use - since DLLs are "Dynamically" loaded. The application may only load the DLL in rare cases, or maybe the application is run for the first time 1 year after it's installed.

      To further complicate the problem, software vendors themselves cannot be 100% sure that no other applications are going to use DLLs they installed - so to be safe they leave them behind on uninstall. So you are stuck with a growing system32 directory. The only way to garbage collect is to refresh the entire machine.

      I have been working on a virtual machine technology [thinstall.com] that allows applications to run in a semi-isolated environment (DLLs, files, and registry keys are not shared with the system). So they have zero impact on other applications, and other application installs cannot cause them to fail. Also, uninstall becomes a simple "del program.exe".
  • Again? (Score:4, Insightful)

    by Darth Daver ( 193621 ) on Friday March 07, 2003 @09:16AM (#5457745)
    I thought Microsoft already claimed to have fixed "DLL Hell" once or twice before with Windows 2000 and Windows XP. How many times are they going to "fix" the same problem?

    One of the really annoying things about Microsoft is they always promise to fix something in the next version. It's always "next time" with them, but the problems never seem to go away.
    • Re:Again? (Score:3, Insightful)

      by Jugalator ( 259273 )
      Yes, it's indeed fixed as long as you use .NET assemblies. I guess Longhorn will be much more .NET-oriented than XP for example, since .NET was still a bit too new when XP was released. So I guess this is what might be happening:

      Some time before XP is released, Microsoft claims that .NET will solve DLL Hell and be included in coming versions of Windows. Microsoft thinks this of course needs an announcement.

      Visual Studio .NET is released, and Microsoft says programs developed for the .NET Framework will automatically avoid DLL Hell through "assemblies". Since everything since Windows 98 supports the .NET Framework, the problems are solved in all these OS's as long as it's a .NET program. Microsoft thinks this of course needs an announcement.

      Some time before Longhorn is released (i.e. now), Microsoft says that it will solve these DLL conflicts in the OS itself, since Longhorn will be their first OS with good .NET integration out of the box. Microsoft thinks this of course needs an announcement.
  • Mac OS X Frameworks? (Score:5, Informative)

    by image ( 13487 ) on Friday March 07, 2003 @09:17AM (#5457757) Homepage
    Sounds a lot like what Apple did with OS X's Frameworks.

    Read more about that here [apple.com]. Be sure to read through the section on Framework Versioning [apple.com].

    Also note that MacOS has long done a great job at packaging applications together so that the installer is unnecessary.
    • Also note that MacOS has long done a great job at packaging applications together so that the installer is unnecessary.

      I might as well note that this approach has some serious problems as well as the benefit of added simplicity, like:

      • Only Apple can update the OS. That suits Apple's goal of selling more copies of MacOS X, but shafts the customer slightly when they find that a few months after a new release comes out, applications start requiring it. See DirectX for contrast.

        Note that there is nothing wrong with a 3rd party application install upgrading parts of the OS, it's just that historically Windows has sucked at it. Look at Debian for an example of how this can be done well.

      • Code reuse is stifled. On Windows, for the longest time JabberCOM was the de facto Jabber library. Most clients used it, and if you had more than one client installed, they'd share the same copy. There is no real mechanism for doing that on MacOS X, save shipping the framework inside every app, causing enormous bloat. So it doesn't happen very much.

      • Modules are much "coarser" than say on Linux. I can't imagine MacOS developers getting together to share say 60k of code in a framework, it's just not worth it. That sort of thing happens all the time on Linux, and to a slightly lesser extent on Windows. Again, there's nothing wrong with depending on lots of shared code, it just has to be done properly.

      Eliminating some of the key code sharing technologies (true dependency management, a strong component model) from MacOS X was, I think, a pretty big mistake - it bought them short term simplicity while making sacrifices in the longer term.
  • by broothal ( 186066 ) <christian@fabel.dk> on Friday March 07, 2003 @09:20AM (#5457783) Homepage Journal
    I don't get it. The whole idea of DLLs is that they are dynamically linked libraries (hence the name ;) shared by applications. This new idea suggests that, in theory, every program can have its own unique DLL. What's wrong with this picture? It's a workaround because today, programmers create DLLs that are not backwards compatible, thus breaking older programs once the DLL has been overwritten. Yes, the workaround will work, but at the same time it undermines the idea of sharing libraries, and it doesn't exactly urge programmers to write nice code that doesn't break existing functionality.
    /Christian
    • It adds versioning to DLLs -- that's all. Multiple programs can use the same version. If a new version of the DLL is added (even if it has the same name), the system will direct old applications to the old DLL and new ones to the new DLL (a conceptual sketch follows below).

      See this [microsoft.com]
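
      Purely as a conceptual sketch (not Microsoft's actual implementation), the redirect can be pictured as a table keyed on (name, version), so an old app asking for version 1 and a new app asking for version 2 of the same DLL each get their own copy. All names and paths below are made up.

      #include <stdio.h>
      #include <string.h>

      struct binding {
          const char *name;     /* logical DLL name         */
          int         version;  /* version the file carries */
          const char *path;     /* where that copy lives    */
      };

      static const struct binding cache[] = {
          { "foo.dll", 1, "c:\\dllcache\\foo-1\\foo.dll" },
          { "foo.dll", 2, "c:\\dllcache\\foo-2\\foo.dll" },
      };

      static const char *resolve(const char *name, int version)
      {
          for (size_t i = 0; i < sizeof cache / sizeof cache[0]; i++)
              if (strcmp(cache[i].name, name) == 0 && cache[i].version == version)
                  return cache[i].path;
          return "(no matching version installed)";
      }

      int main(void)
      {
          printf("old app -> %s\n", resolve("foo.dll", 1));  /* built against v1 */
          printf("new app -> %s\n", resolve("foo.dll", 2));  /* built against v2 */
          return 0;
      }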

  • by Jezza ( 39441 ) on Friday March 07, 2003 @09:21AM (#5457791)
    Well, this makes one start to wonder: if each program is going to have a unique DLL (or at least one shared with VERY few others), why bother to have DLLs at all? Why not just roll the library up into the application binary?!

    It'll be interesting to see how this adds to the bloat, I imagine it won't take long for the average user to amass quite a number of these things, mostly doing the same job!

    There must be a better solution than this!
  • by Raphael ( 18701 ) on Friday March 07, 2003 @09:23AM (#5457797) Homepage Journal

    According to Microsoft's Ivo Salmre, quoted in the article: "When a .Net component is installed onto the machine, the Global Assembly Cache looks at its version, its public key, its language information and creates a strong name for the component."

    In a few cases in the past, backwards compatibility has been (slightly) broken by service packs and security fixes. How will they deal with that? Presumably, the public key of a library can be affected by a patch. If an application uses the strong binding to request a specific version of a DLL, does that mean that it will keep its own copy of the DLL without the patches? Or will it have to deal with potential incompatibilities introduced by the patches? How "strong" will this binding be?

    And by the way, this idea looks rather similar to the usual UNIX shared libraries that allow an application to bind to libpng.so (version doesn't matter), libpng.so.1 (version 1 required) or libpng.so.1.0.89 (specific version and patchlevel required). The proposed system for DLLs does not seem to be very different from that.

  • DLLs (Score:4, Interesting)

    by chenry007 ( 211197 ) on Friday March 07, 2003 @09:28AM (#5457829) Homepage
    Correct me if I am wrong, but wouldn't that slow Windows down even more? Your computer will now have to search through your registry to find the correct DLL for your application. And what happens if your registry gets corrupted (not like that ever happens)? Then that puts you in an even worse position. Would it be easier to make the DLLs backwards compatible?

    • Re:DLLs (Score:3, Interesting)

      by Lethyos ( 408045 )
      Would it be easier to make the DLLs backwards compatible?

      You're right. It would. The trouble is, Microsoft's API documentation is a closely guarded secret, and if you've ever done Windows development using resources from MSDN, you'd know that it's utterly pathetic. All the most useful library calls are kept hidden from you under penalty of DMCA I imagine.

      So, what happens if you either A) have no access to powerful APIs or B) you have crappy documentation for a crappy set of APIs? Simple! You code it yourself, reinventing the wheel. Dozens upon dozens of application developers for Windows have done this, making changes and additions to what may be standard libraries. One vendor creates a library with a call foo(bar b), and distributes the library. Another makes a call like bar(foo f) and distributes the library under the same moniker. You can't just merge them... so one version has got to go.

      And of course, as for backwards compatibility, since when has Microsoft ever gotten that right? I've run into so many issues with a Windows application not working on XP because I developed it on 2000. A common solution to this problem is to distribute the version of a library that came with 2000 and overwrite whatever existed on the XP box with it.

      All in all, Windows is a pathetic, shoddy state of affairs. I ask myself every day: "how the hell does this criminal organization con so many sheeple into using their products!?"
  • by Spencerian ( 465343 ) on Friday March 07, 2003 @09:28AM (#5457830) Homepage Journal
    Almost all non-Windows operating systems avoid this DLL problem.

    For one, Mac OS X uses bundles. Each application has its code as well as libraries all wrapped up in a single package. Only that app uses the libraries there. Clean, simple.

    I doubt Microsoft's solution will solve the problem because their operating systems rarely show the cojones to stop an errant application from taking advantage of "features" placed within Windows that are self-compromising (e.g., Visual Basic Scripts, ActiveX). Some programming yahoo would just write something to override Microsoft's effort.

    Windows could use a DLL manager similar to the old Mac OS Extensions Manager. Actually giving the DLLs easily recognizable names and clear version numbers wouldn't hurt either.

    Aw, fuck--just chuck the damned thing and run a *nix, for cryin' out loud.
    • and abandons the advantage of shared libraries: one DLL, ABC, which is used by several applications.

      Sharing a library has one disadvantage: the interface of the library should not change, otherwise applications using it will crash. When an interface changes, you have to update the version. Now, you can do this in several ways; most likely it is doable by using a filename version scheme, or, as in .NET, metadata with a version number.

      The central point where you register shared DLLs shouldn't be based on a directory though, but a central repository which holds IDs that refer to files on disk. This is implemented in COM somewhat: COM objects are mostly stored in DLLs, and when you register a COM DLL, its COM objects are registered in the registry: each CLSID is stored with the physical file in which to find the object. If you now store the files locally with the app, as it should be, you can register the COM DLLs, and each application using a COM object with a given CLSID that is stored in the locally stored DLL can use them.

      The problem arises when you install 2 applications which use the same DLL with the same objects, only application A uses v1.0 and application B uses v2.0. v2.0 of the library has all the objects of v1.0 but also newer objects. You first install A. All DLLs are stored locally. Then you install B. A works; it will probably use the DLL of B, because B registered all the objects with its locally stored DLL. When you UNinstall B, A will not work anymore, unless you keep the DLL with the objects around. Most people don't do this ("Hey, the uninstaller left some dll files behind!" *executing del command*). That's DLL hell.

      What's best is thus a central repository (be it the GAC or the registry) and a central store which allows multiple versions of a DLL to be stored. I'm pretty sure that's what MS is heading toward, and neither MacOS X nor any Unix does this.
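
      For illustration, here is a minimal C sketch of that CLSID-based lookup: the caller names only a CLSID, and COM finds the registered DLL through the registry (link with ole32.lib and uuid.lib). The CLSID below is a made-up placeholder, not a real component.

      #include <windows.h>
      #include <objbase.h>
      #include <stdio.h>

      int main(void)
      {
          CoInitialize(NULL);

          /* Hypothetical CLSID; a real app would take one from a vendor SDK. */
          CLSID clsid = { 0x12345678, 0x1234, 0x1234,
                          { 0x12, 0x34, 0x12, 0x34, 0x12, 0x34, 0x12, 0x34 } };

          IUnknown *obj = NULL;
          HRESULT hr = CoCreateInstance(&clsid, NULL, CLSCTX_INPROC_SERVER,
                                        &IID_IUnknown, (void **)&obj);
          if (SUCCEEDED(hr)) {
              /* COM located and loaded the registered DLL for us. */
              obj->lpVtbl->Release(obj);
          } else {
              printf("not registered (hr=0x%08lx)\n", (unsigned long)hr);
          }

          CoUninitialize();
          return 0;
      }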
  • by Lethyos ( 408045 ) on Friday March 07, 2003 @09:29AM (#5457838) Journal
    In order to solve "DLL confusion" in Windows, Microsoft are going to increase the number of DLLs on the system, potentially by as many as there are applications installed, and give each DLL a unique but symbolic-only identifier (xyz123:pdq098 versus msvcrt.dll, for example).

    One result is that from one machine to the next, not only will you not be sure which applications are using which DLLs, you will also have applications that use radically different identifiers for accessing their libraries.

    This eliminates library confusion... how? I can't wait to have to troubleshoot it. Here's another solution, Microsoft: document your standard libraries so that idiot application developers don't feel they need to re-invent the wheel and dump custom libraries all over the place.

    Of course, the rest of us will continue using open source software.
  • by somethingwicked ( 260651 ) on Friday March 07, 2003 @09:32AM (#5457850)
    It's funny reading through the different replies to this story-

    Reply 1- "This is a horrible idea! Look at all the RAM/disk space this is going to use. M$ programmers are idiots!"

    Reply 2- "VMS/Apple/*enter slanted fav OS here* already does this! This was a good idea when it was done 10 years ago by ____."

    Reply 3- "An idea like this is so stupid, it will NEVER work right."

    So, it's a stupid idea that will never work, EXCEPT it has already been done, BUT it will take up too many resources, UNLESS it is done by our fav OS, AND then that's okay

    *grin*

  • Unsolvable? (Score:5, Interesting)

    by Hard_Code ( 49548 ) on Friday March 07, 2003 @09:40AM (#5457893)
    Well, it is one thing to say that applications can now obtain the version of the DLL they want if they *explicitly ask for it*. It is another thing to say that they always and forever, until the end of time, get the DLL they were "compiled against" or "packaged with". It is not clear from the article which of these two situations is the case.

    The point of shared libraries is that you CAN upgrade one single library and have many applications "automatically" inherit the changes. This is how you can update a file dialog, for instance, without recompiling every single one of your GUI applications. This is a Good Thing. The question then becomes "why is this shit breaking so much". The right solution is a proper combination of carefully-followed deprecation and backwards compatibility rules (preferably married with some sort of standard version naming convention), and the ability for applications to explicitly choose the library version they want (or better yet, a runtime configuration directive that can be set by the user or administrator) in the cases where *it is known that the new shared library is not backwards compatible*.
  • by the_germ ( 146623 ) on Friday March 07, 2003 @09:40AM (#5457899) Homepage
    This has nothing to do with DLLs, but with .NET components. Read the article carefully!
    A .NET application can bind to a specific version of a component, but that's not even news - it already exists in .NET 1.0 and some aspects of it were already available in COM years ago.
    The only aspect of it that has to do with DLLs is that .NET components are usually exported from DLLs, but this won't solve any of the problems with 'normal' DLLs.
  • by CrazyLegs ( 257161 ) <crazylegstoo@gmail.com> on Friday March 07, 2003 @10:17AM (#5458107) Homepage
    Great... MS has discovered versioning and makes a big announcement. But what is really going on here is that MS is trying to design their way out of a more fundamental design problem that has plagued Windows since Day One. That is, they allow system-level software (DLLs) to co-mingle with software others have written. How many Windows shrink-wrap products have taken advantage of this feature by modifying base Windows DLLs, tripping the light fantastic over system-level directories, etc.?

    OS/2 - as an example only - had a much better scheme where o/s stuff lived in its own space and the stuff you built/bought lived in its own space (and never the twain shall meet). On top of that, they implemented the idea of a LIBPATH env variable so that you could set the path OS/2 would take when looking for DLLs. Consequently, screwups were minimized, versioning was not an issue, built/bought software could be maintained easily, and (wait for it...) you could upgrade the o/s without blowing away all your apps!

    Can't wait to hear what MS 'discovers' next!

  • by edxwelch ( 600979 ) on Friday March 07, 2003 @10:34AM (#5458282)
    The article doesn't make it very clear, but I think it only applies to applications developed using the .NET framework. So, for all those other applications out there, it's DLL hell as usual ;)
  • by g4dget ( 579145 ) on Friday March 07, 2003 @10:41AM (#5458366)
    Binding applications to DLLs tightly doesn't solve the problem either because, as people have pointed out, often you do want applications to use a new version.

    The real solution is to indicate whether a DLL is a bug-fix release or whether it represents a significant and incompatible change to the APIs. You know, like Linux major/minor dynamic library versioning, for example.

  • by suitti ( 447395 ) on Friday March 07, 2003 @10:47AM (#5458430) Homepage
    The physician takes an oath "to do no harm". I'm considering putting together a Linux distribution with everything compiled statically.

    First, what are the advantages of DLLs?

    • Less Memory Footprint
    • Less Disk Footprint
    • Global Security Fixes
    • Use of third party binaries
    • Plug ins
    Less Memory Footprint

    In Unix, when you have two instances of an application running, say, vi, the executable code between the two is automatically shared. The shared library gains you nothing there. To save memory, you need to use the same shared library from two applications at the same time. For example, libc might be used by vi and cc.

    However, if you compile statically, you bind in only the routines that are needed. For shared libraries, you need to have all routines available, since you don't know which of them are used. Now, your virtual memory system may notice that a shared libary page isn't used, and page it out. Yes, this requires additional run time execution time. The upshot is that you save memory only when you have enough different programs use the same shared library to overcome the overhead. I claim that this happens with libc, libX11, and not a whole lot else.

    Less Disk Footprint

    If you have 50 programs that use the same shared library, you can save some disk space because that library code does not need to be duplicated 50 times. However, shared libraries need to carry the symbol information required to perform the dynamic binding. The savings isn't that much.

    In the old days, when an entire Unix distribution fit on a 150 MB tape, the libc shared library savings amounted to about 30%. You could get more reduction in size by using compression.

    In fact, programs could be compressed on disk. The loader could decompress the image as it reads it into RAM. For slow disks, this may be faster than loading the uncompressed version into RAM. The down side here is that you then may not be able to use the original file on disk for virtual memory paging.

    In any case, it's getting hard to get a disk drive under 20 GB. 30% overhead reduction for the most common shared library doesn't amount to much.

    Global Security Fixes

    So, your libzlib.so.5 has a bug. You whip up a quick fix, create a new libzlib.so.5, and drop it into your system. You've just fixed all of your libzlib dependent programs, right? In fact, you fixed programs you didn't even know used libzlib. You may also have broken programs that you didn't know use libzlib. And, short of testing every program on your system, you don't know. The more complex the patch, the more likely you are to have broken many things.

    Quick. What is a utility which will tell you all the shared libraries that an application uses?

    Use of third party binaries

    Third party binaries can just as easily be distributed in source form or as a statically bindable library. Static binding is preferable, since you are unlikely to use a large fraction of a kitchen-sink shared library, where the authors have no idea how programmers will use it. Source is preferable, since the documentation rarely specifies enough semantic detail to allow proper use.

    Plug ins

    OK. Your application is Apache, and you want some real flexibility. If Apache is compiled so that modules can be loaded at run time, then the administrator can add a new module and turn on its use in the configuration. This doesn't save any RAM or disk, but it may allow the admin to change a line of config, restart the web server, and start using some new feature.

    For Apache, the admin can also recompile with the new module compiled statically. I've done it both ways. My measurements show a small run time advantage to static compilation.

    Granted, if you can't recompile IIS, then DLLs will give you the same flexibility in exchange for a small performance penalty.
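    To make the Apache case concrete, the run-time flavor is literally a line of config (module name illustrative):

    # httpd.conf: load the module at startup instead of compiling it in
    LoadModule rewrite_module modules/mod_rewrite.so

    Comment the line out and restart, and the module is gone; no rebuild either way.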

    The Dark Side of Shared Libraries

    If you compile your application statically, then upgrade your OS, you can copy the old application to the new OS, and it just runs.

    If your app has shared libraries, you have to track them down on your old OS, and copy them to your new OS. If you make a mistake, and copy your old libc.so over your new one, you run the risk of trashing every program on your new system. Brilliant.

    Take Netscape as an example. It comes installed in its own /usr/local/lib directory tree. In /usr/local/bin, netscape is a script which sets up the shared library search path to include the libraries Netscape needs, then runs the binary. This introduces script overhead and shell dependencies on a complicated package. And when you upgrade your OS, you still need to find the old libc.so and copy it forward.

    RPMs

    Many seem to think that RPMs solve all these problems. However, many packages have bugs in their dependencies, etc. Many RPMs use different versions of the same shared libraries. I find that I have to override the dependencies to get stuff to install, even though the required package IS installed. Not just once in a while; much of the time. The difference between theory and practice is that, in theory, they are the same.

    Conclusions

    Shared libraries seldom save RAM or disk space. The problem with using them to fix bugs globally is that you don't know what you fixed, or whether you broke something else. Third party binaries should invariably be statically linked. In an open source environment, plug-ins are not strictly needed. Shared libraries make OS upgrades more painful.

    So, what I'd like is a Linux distribution with no shared libraries. The compiler, gcc, would be configured to compile statically by default. Then, after some years of running the system in production, and after adding hundreds of applications to it, I'd be able to upgrade to a new distribution without having to recompile or do the shared library search.
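    The build side of that is trivial, and it's easy to check that a binary really is static (session sketched from memory, output approximate):

    $ gcc -static -o myapp myapp.c
    $ file myapp
    myapp: ELF 32-bit LSB executable, Intel 80386, statically linked
    $ ldd myapp
            not a dynamic executable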

    • Quick. What is a utility which will tell you all the shared libraries that an application uses?


      ldd?
      :~> ldd /bin/ls
      librt.so.1 => /lib/librt.so.1 (0x4001f000)
      libc.so.6 => /lib/libc.so.6 (0x40031000)
      libpthread.so.0 => /lib/libpthread.so.0 (0x40141000)
      /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
  • by Fastolfe ( 1470 ) on Friday March 07, 2003 @11:13AM (#5458710)
    Even though this article isn't really about DLLs, why do we have NEWER versions of DLLs breaking applications anyway? The whole point of making a new minor release of a piece of software (DLL, component, whatever) is to fix things, NEVER to change the API or the behavior of existing functions. It's those changes that break existing applications.

    A new major release should be considered a completely different DLL/component, since it conceivably has a different API or changes its behavior in some incompatible way.

    It seems to me that DLLs/components need to be treated as self-contained applications. They need to go through a rigorous testing and QA cycle (even though they generally expose their interfaces to other applications rather than directly to users), and they need to be installed as if they were their own application. Windows applications that depend on DLLs can, during installation, tell the OS which DLLs they need and what the minimum version should be.

    Bundle these with the application if you need to, but to suggest that DLLs/components need to be kept at the same *minor* version to avoid breaking applications indicates a bad problem with how you build and test DLLs. I'd rather fix that problem than introduce this layer of version matching.
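    Windows XP's side-by-side manifests are at least a gesture in this direction. A rough sketch of an application manifest declaring such a dependency (the assembly name, version, and token are all illustrative):

    <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
      <dependency>
        <dependentAssembly>
          <assemblyIdentity type="win32" name="Vendor.FooControls"
                            version="1.0.0.0" processorArchitecture="x86"
                            publicKeyToken="0000000000000000" />
        </dependentAssembly>
      </dependency>
    </assembly>

    The loader then resolves that name/version pair instead of blindly taking whatever foo.dll happens to be first on the path.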
  • by Anonymous Coward on Friday March 07, 2003 @11:26AM (#5458841)
    I code in assembly and a few other languages. I can understand that it can be a very good thing to reuse one piece of code in several different places. I understand also that it can save space to reuse. No news here...

    As for the idea of "Strong Binding", I wonder what Billy G. expects to accomplish by adding yet more poorly designed, poorly documented LIBs to the programming mess that Windows has evolved into. On top of that, I wonder why I would need to save EVERY SINGLE VERSION of a DLL that makes it to release...

    Version tracking will become a nightmare.

    Consider:
    + User installs program COOL_PROGRAM.EXE
      - COOL_PROGRAM uses MS_COOLNESS.DLL
    + User gets an update to MS_COOLNESS.DLL: MS_COOLNESS_V2.DLL
      - The fix in V2 repairs a buffer overflow in a function that COOL_PROGRAM used from MS_COOLNESS.DLL.

    Question: Does the installer for V2 know that COOL_PROGRAM is dependent on it? If so, Billy G. is gonna have his hands full trying to keep track of what goes where with third party devs.

    If not, perhaps COOL_PROGRAM will default to the newest version of MS_COOLNESS.DLL. OK, now Billy will likely contend with tracking and modifying functions that have previously been used in highly specialized ways for security/system-critical functionality that Windows does not provide, either by accident or by intention. So NOW third party devs writing well organized, functional code are forced to keep up with the madness of Windows development just to save space. Hmmm... Guess it got the better of the buffer overflow this time. Or maybe they introduced a new bug into the system (par for the course with MS)...

    Better yet, how about people developing security/system-critical environments use their own code to avoid this whole mess? OK, now you don't need DLLs, do you? Or how about 3-5 times as many? Wait for the next Windows release? Then the effort YOU made during XP to keep up with DLLs and other updates is pointless, right? Or XP+1? The style of MS defines itself...

    Security/system critical programs?...
    That's only one side?...
    Ok. Try this:

    Graphics, network communication, encryption, file editing, database editing? Or maybe drivers, file converters, scripts, inter-app communication, diagnostics?

    The list goes on. The problems generated by and complications arising from this framework are not worth the hassle.

    Instead of building a system where things get more complicated, I would recommend a redesign of the system itself. Current and past states of instability/insecurity are more than I care to witness again. Billy has enough money to sit around daydreaming for the rest of his days while still paying his programmers for doing nothing but daydreaming themselves for the rest of theirs... Perhaps they could get up off their butts and design a system from the ground up that is easy to use, safe, fast, and reliable for users old and new... Logical?

    I love C programming. C++ and Java are lots of fun. But IF you want something done right the first time, assembly and careful thought are the only answer...

    S-()-u-|-s-!-|)-E
  • by supabeast! ( 84658 ) on Friday March 07, 2003 @11:51AM (#5459064)
    Given the current cost of massive hard disks, why are programs still putting DLLs anywhere outside of ...\Program Files\Foo\dlls? I think most people would be happier losing some extra disk space per program to DLL redundancy than keep dealing with shared libraries!
  • here (Score:4, Informative)

    by rabtech ( 223758 ) on Friday March 07, 2003 @12:31PM (#5459448) Homepage
    Let me explain this because many people seem not to understand.

    When a program installs a "shared" DLL, the assembly manager looks at the DLL version. One of three things will happen:

    1. The DLL does not exist in the assembly cache - it is added.

    2. The DLL exists, but all other instances of it are a different major/minor revision. (X.Y.0.0) In this case, the DLL is added to the assembly cache as a separate version.

    3. The DLL exists in the cache, and the major/minor versions are the same. In this case, if the installing DLL has a newer revision (0.0.X.Y), then it will overwrite the old DLL. Otherwise, it is thrown away.

    When a program executes, its manifest specifies what major/minor version of the DLL it needs, and the assembly cache will fetch it. HOWEVER, bug fixes etc. are supposed to change the revision numbers only, so if a bug-fixed version of the DLL is installed, the app will use that version.

    The assembly cache also keeps track of what set of DLLs go together. If version 1.2.7.X of FOO.DLL needs to run with 1.2.7.X of BAR.DLL, then the assembly cache can make sure a program never uses a mismatch, which has been a MAJOR cause of difficult-to-track instability over the years.

    The "new" .NET way of doing things is different though, and Microsoft won't logo or certify apps unless they follow these practices:

    1. If you have a DLL only used by your application, install it in your application's folder.

    2. If you have a suite or many apps that work together and use the same DLL, install it into program files\common\yourname.

    3. ONLY install DLLs into the System folder if they are very, very widely used, or are actual system objects or libraries. (E.g., your app needs a newer version of the Microsoft common dialog runtime. In that case, you ship the MSM that has the latest version of common dialog and related libs that are all known to work together, for EACH version of Windows. The Windows Installer knows how to read the MSM and pick the appropriate set of files for the OS/service pack level you are on. That way, developers running Windows 2000 don't b0rk a Win9x user by shipping the w2k libraries.)

    3a. An even easier way of handling things is to write your app for a specific service pack level on each OS (or possibly hotfix if a bug was fixed that is affecting your app.) In this way, you just tell your users "you need service pack X on OS Y, or service pack Z on OS A" to run the app.
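    Put together, the on-disk layout those rules produce looks roughly like this (every name below is illustrative):

    C:\Program Files\CoolApp\CoolApp.exe
    C:\Program Files\CoolApp\CoolLib.dll          <- rule 1: private to the app
    C:\Program Files\Common\YourCo\SuiteLib.dll   <- rule 2: shared by your suite
    C:\Windows\System32\comdlg32.dll              <- rule 3: genuinely system-wide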
  • by delus10n0 ( 524126 ) on Friday March 07, 2003 @04:20PM (#5461884)
    This whole Slashdot story thread is retarded. Obviously no one read the article to see that this really only applies to .NET (ASP/VB/C#/etc.)

    I can't tell you what an improvement assemblies are compared to a "component"/COM object. You'd have to build your DLL, then regsvr32 it into the system... and if you ever needed to update the DLL, you'd have to stop the service/app that uses it, regsvr32 -u it, overwrite the existing file, regsvr32 it again, and start your service/program back up.

    And now? You just overwrite the DLL. .NET takes care of the rest. Piece of cake.

    If you have the .NET Framework installed, I'd suggest checking out %systemroot%\Assembly\ [to see all the assemblies you currently have installed, and their versions] and also the "Microsoft .NET Configuration" program under Administrative Tools. In there is an entire interface dedicated to managing the GAC.
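    The SDK also ships a command-line route to the same place, handy for scripting (assembly name illustrative):

    C:\> gacutil /l
      (lists every assembly and version currently in the GAC)
    C:\> gacutil /i MyLib.dll
      (installs a strongly named assembly into the cache)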

    It's pretty cool stuff. .NET rocks the hizzy.
