Big (and Small) Developments In Storage 27

louismg writes "On the same day that BlueArc released the Titan 2000 family, with performance more than three times higher than rivals EMC and NetApp, including a global namespace and scalability to 512 terabytes, EMC took a low-end approach by unveiling a line for the SMB market, dubbed Insignia. Red Herring claims that BlueArc's announcement changes the storage game, while The Register says that small means beautiful. What makes sense for today's IT infrastructures, with data growth showing no sign of slowing?"
  • by TallMatthew ( 919136 ) on Tuesday February 07, 2006 @08:41AM (#14658958)
    It sounded like EMC came out with a product in a small footprint, the way it was contrasted with high-capacity gear - but not so much. From TFA:

    The Insignia line consists of a basic, low-cost version of the VisualSRM management package, an SMB edition of the eRoom collaboration software, an SMB edition of the RepliStor replication package, and an SMB edition of Storage Administrator for Exchange.

    How can we get more from our customers who already shelled out top dollar for our storage? With a variety of cleverly-named software packages, of course. Ugh, I *despise* EMC.

  • by Philip K Dickhead ( 906971 ) <folderol@fancypants.org> on Tuesday February 07, 2006 @08:53AM (#14658991) Journal
    Both SAN and NAS environments will proliferate in both density and diversity. I foresee that, as these become more and more commonplace, we will have at least one locally attached to each computing device in the near future - giving birth to LA-SAN/NAS.
  • Better organization! (Score:5, Interesting)

    by Myself ( 57572 ) on Tuesday February 07, 2006 @09:06AM (#14659030) Journal
    More bulk storage only solves half the problem. As the volume of email and document archives grows, organizing and searching them becomes more interesting than just storing them.

    Personally, I'm facing a minor storage crisis with a 15-gig music tree, a 20-gig photos tree, and dozens of gigs of other useful stuff that I only need on an occasional basis.

    When I offload the camera, the files go to the laptop. To avoid duplicating files, the most recent few weeks worth of photos should always be on the laptop. But anything older should move off, to free up space on the drive. I'd like to keep as much of my music in both places as possible, but I need a way to replicate file moves, metadata changes, and deletions so the same bad rips and inaccurate tags don't persist after I fix them once.

    There are a set of notes files, mileage logs, and other small files which I'd like to synchronize between the desktop and laptop. However, there's a possibility of both copies changing simultaneously (or at least between syncs), so I'd need a way to reconcile changes.

    These little dilemmas have me wasting a lot of space until I resolve them. Stale data and useless files are all over the place, and I don't have the notes I need when I need them. It sounds like I might be able to do most of this with rsync, but for the notes files, CVS maybe?

    Has anyone tackled these issues before in a useful way, perhaps a sensible-storage-organization HOWTO? More space is not the issue right now, I need more sense.
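A minimal sketch of the one-way "replicate my fixes" idea described above, in the spirit of `rsync -a --delete`: the master copy is authoritative, and anything deleted there (a bad rip, a mis-tagged file) is also removed from the mirror. Directory names and filenames here are made up for illustration; real rsync additionally handles timestamps, partial transfers, and remote hosts.

```python
# Toy one-way mirror: copy master -> mirror, then prune anything in the
# mirror that no longer exists in the master, so fixed/deleted files
# don't come back on the next sync.
import os
import shutil
import tempfile

def mirror(src, dst):
    os.makedirs(dst, exist_ok=True)
    src_files = set(os.listdir(src))
    # Copy (or refresh) everything from the master.
    for name in src_files:
        shutil.copy2(os.path.join(src, name), os.path.join(dst, name))
    # Delete anything in the mirror that the master no longer has.
    for name in set(os.listdir(dst)) - src_files:
        os.remove(os.path.join(dst, name))

master = tempfile.mkdtemp()   # authoritative copy (the desktop)
laptop = tempfile.mkdtemp()   # the mirror
with open(os.path.join(master, "track01.mp3"), "w") as f:
    f.write("good rip")
with open(os.path.join(laptop, "track02.mp3"), "w") as f:
    f.write("bad rip, already deleted upstream")

mirror(master, laptop)
print(sorted(os.listdir(laptop)))  # ['track01.mp3']
```

This only covers the one-way case (laptop as a read-mostly mirror); the two-way notes-file problem needs real reconciliation, which is what tools like Unison or a version-control system provide.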
    • Closely tied into this is redundancy of data - you might have all the HDDs in the world, but it won't help if you don't have a usable distributed fs to run on 'em.
    • There's some data which makes sense to have categorized. Say, work by client or project, music by genre/album/artist, photo by subject or date. But at some point you get to documents which you don't need to update often or plain just don't have their own archiving system. I stick all that stuff into a "Miscellaneous" folder which is then organized by month. It's sort of a giant junk drawer. The difference is that I remember when I worked on something and I can find it pretty easily. Anything else I can just
    • As slow as it can be, it seems you want something like iDisk so that you can have anything you want available to you anywhere. I think the issue is at the presentation layer rather than the storage layer, although I wonder what it would take to have roving volumes on a distributed SAN...
    • Off topic so be warned!

      I feel that half the problem in file management (i.e. search and retrieval) lies in metadata. With the right metadata attached to each file, classification, organisation and search become far, far easier. The problem lies in how the metadata is generated and stored.

      For example, I'm building a specific CRM/Helpdesk hybrid thingy (technical term!) to replace our current system and massively bloated spreadsheets. Now because it's work based there's no issue in forcing some minor metadat
      • I think that for personal use, metadata is a lost cause. Unless you can find some way for the computer to figure out the metadata on its own (as in CDDB for MP3 tags), files that people create aren't going to have very good metadata, or any at all. I think a much better approach is just to keep your stuff organized by making a good tree of folders. Type of document, project, date, and lots of other things can be used to make these folders. Then I find that it's easy to find the stuff you need. You can even
        • Haha. I'm just going to have a few thousand of my closest friends manually key in all the metadata, in exchange for sharing the content. Of course, it only works for specific kinds of content...
          • Exactly. It works with things like MP3s because everyone has the same songs. You can't do that for things like Word documents, digital photos, or anything else you are actually creating yourself. If you are the only one, or one of few, with the document, then it's not very likely that anybody other than you is going to be generating the metadata.
        • If any user spends the up-front time to create metadata, s/he can take advantage of such a search and storage mechanism.

          For the rest of us, we'll still have the old, slow 'guess the filename, hope you find it somewhere' approach.

        • My fault - I was rambling on.

          Some metadata is transparent. After all, MIME/file type is a piece of metadata present in every file, even if some OSes have to rely on file extensions. Yes, I'm talking to you, my monolithic friend....

          I don't think metadata is a lost cause. I think it's the future. The point I was arguing is how to hide it from the user. You're right - the system needs to be better able to supply extra metadata itself. Having MP3s automatically embellished with CDDB info is a fantastic idea. The
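The "transparent" MIME-type metadata mentioned above is easy to see in action with Python's stdlib `mimetypes` module, which (like the extension-reliant OSes being teased) guesses purely from the filename. The filenames here are hypothetical:

```python
# MIME type is metadata every file effectively carries; on
# extension-based systems it is guessed from the name alone.
import mimetypes

for name in ["song.mp3", "notes.txt", "photo.jpg"]:
    mime, _encoding = mimetypes.guess_type(name)
    print(name, "->", mime)
# song.mp3 -> audio/mpeg
# notes.txt -> text/plain
# photo.jpg -> image/jpeg
```

Rename `photo.jpg` to `photo.dat` and the guess is gone - which is exactly the fragility the thread is complaining about.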
    • by Anonymous Coward on Tuesday February 07, 2006 @09:55AM (#14659249)
      You may find FolderShare [foldershare.com] of some use. It's an ad-hoc peer-to-peer directory synchronisation application.

      You set up one (or more) directories on at least two machines and tell it to keep those directories in sync - and magic happens. There's a central internet server that your machines connect to, which reads and stores only the file checksums. If more than one computer is online (either on the same LAN or anywhere on the internet), files will be copied and the directories synchronised. It's a brilliant idea.

      I personally have several FolderShare shares, including a large folder off my home directory that I keep organised and know that it will be automatically replicated to my other machines for distributed backup. *And* it works well for my MP3 directory and my digital photos.

      It used to be a pay service, but Microsoft bought it and immediately opened the service for free to everyone. I believe their primary intent is for it to be a part of Windows Live when Windows Vista launches. FYI, there are Windows and Mac clients.

    • How about a document management system like Moodle or Microsoft SharePoint?
    • I wrote a script on my computers to facilitate the transfer of data from remote sources to local trees. Basically, the script accesses a specific directory common to all my media and moves the data to a timestamped directory somewhere in my home directory. I originally wrote it to make working with flash drives easier, as I was frequently downloading stuff at work to use at home.

      I'd have a /transfer directory on a CDROM or flash drive, and all the data in it would be moved to $HOME/transfer/$timestamp whene
    • You are looking for Unison [upenn.edu], at least for your notes and non-audio/video data.
    • by Anonymous Coward
      and dozens of gigs of other useful stuff that I only need on an occasional basis.
      Pr0n, eh?
  • "What makes sense for today's IT infrastructures, with data growth showing no sign of slowing?"

    Both, really. Alright, it's really cool that they can now store up to 512 terabytes. But for many businesses, that storage capacity isn't needed - and won't be for a while. By the time they do need it, they'd be at the point of needing to replace the old system anyway.
