Microsoft Adds Risky System-Wide Undelete to Vista

douder writes "Windows Vista will have a new 'previous versions' feature when it ships next year. According to Ars Technica, the feature is built on the volume shadow copy technology from Windows XP and Windows Server 2003. Now turned on by default, the service stores modified versions of a user's documents, even after they are deleted. They also report that you can browse folders from within Explorer to see snapshots of what they contained over time. It can be disabled, but this seems like a privacy concern." From the article: "Some users will find the feature objectionable because it could give the bossman a new way to check up on employees, or perhaps it could be exploited in some nefarious way by some nefarious person. Previous versions of Windows were still susceptible to undelete utilities, of course, but this new functionality makes browsing quite, quite simple. On the other hand, it should be noted that 'Previous Versions' does not store its data in the files themselves. That is, unlike Microsoft Office's 'track changes,' files protected with 'Previous Versions' will not carry their documentary history with them."
  • by Anonymous Coward on Sunday July 30, 2006 @09:48PM (#15814374)
    The security risks could be eliminated by encrypting the user's home directory, a la Mac OS X.

    It's a fantastic feature. I remember Novell Netware had this and we used it a lot to roll back changes to code. It was better than version control when only one person was working on the project.

    I wonder if OS X 10.5 was going to have such a feature and it leaked out. This is actually a quasi-innovative idea from Microsoft. Maybe they stole it from Apple via corporate spying.
  • by zxnos ( 813588 ) <zxnoss@gmail.com> on Sunday July 30, 2006 @09:55PM (#15814415)
    eh? diyertfa?

    With Windows Vista, the operating system will make "shadow" (that is, backup) copies of files and folders for users who have "System Protection" enabled (the default setting).

    In Windows Vista, each partition that is protected by "System Restore" requires at least 300MB of space, and may use up to 15 percent of the available space on a partition to store previous versions of files. In the event that more space is required, the service will delete older restore points to make room for new ones.
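
The eviction policy described above can be sketched as a toy model. The fixed byte budget here stands in for the "15 percent of the partition" cap; the class and numbers are illustrative, not Windows code:

```python
from collections import deque

class RestorePointStore:
    """Toy model of the space policy described above: restore points
    share a fixed byte budget, and the oldest points are evicted first
    to make room for new ones."""
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.points = deque()       # (label, size) in creation order
        self.used = 0

    def add(self, label, size):
        # Evict the oldest restore points until the new one fits.
        while self.points and self.used + size > self.budget:
            _, old_size = self.points.popleft()
            self.used -= old_size
        self.points.append((label, size))
        self.used += size

store = RestorePointStore(budget_bytes=300)
store.add("mon", 150)
store.add("tue", 150)
store.add("wed", 100)   # evicts "mon" to stay under budget
```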

  • Um, no. (Score:3, Interesting)

    by HotNeedleOfInquiry ( 598897 ) on Sunday July 30, 2006 @09:56PM (#15814422)
    In a normal office environment, assuming you can keep the porn and MP3s under control, people don't create enough bits in the course of a day to be an issue. Remember that this is the age of 300GB hard drives for $100.
  • MS DOS and Undelete (Score:5, Interesting)

    by Prien715 ( 251944 ) <agnosticpope@nOSPaM.gmail.com> on Sunday July 30, 2006 @10:10PM (#15814476) Journal
    Just out of curiosity: the ability to undelete things efficiently depends on the filesystem. In the old days of MS-DOS, the first character of the filename was simply changed to a reserved character, which was actually faster than going through and deleting the whole file. When the file system wanted to create a new file, it could reuse the entries marked with that "it's OK to delete me" flag. That's why MS-DOS 6.22 and its brethren required you to type in the first character of the filename when you undeleted a file. So no, there's essentially no overhead in a comprehensive file undelete system built this way. A third party implementing the same thing on top of the filesystem might well be slower.

    If it could be fast in MS-DOS 6.22, I don't see why XP would make the feature inherently slower.
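
The MS-DOS scheme described above can be sketched as a toy model (not real FAT code; names are illustrative). Deletion rewrites only the first byte of the directory entry, which is why UNDELETE had to ask for the missing first character:

```python
# Toy model of a FAT-style directory table: deleting a file only
# overwrites the first byte of its name with a reserved marker (0xE5
# on real FAT); the data blocks are untouched until reused.
DELETED = "\xe5"

class ToyDirectory:
    def __init__(self):
        self.entries = {}          # name -> file contents

    def create(self, name, data):
        self.entries[name] = data

    def delete(self, name):
        # O(1): rewrite one byte of the name instead of wiping the data.
        data = self.entries.pop(name)
        self.entries[DELETED + name[1:]] = data

    def undelete(self, marked_name, first_char):
        # Like UNDELETE in MS-DOS 6.x: the user supplies the lost
        # first character, because only that byte was destroyed.
        data = self.entries.pop(marked_name)
        self.entries[first_char + marked_name[1:]] = data

d = ToyDirectory()
d.create("README.TXT", b"hello")
d.delete("README.TXT")
d.undelete(DELETED + "EADME.TXT", "R")
```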
  • by bersl2 ( 689221 ) on Sunday July 30, 2006 @10:12PM (#15814484) Journal
    I wonder, could an existing open filesystem be modified so that a file marked with some attribute will store its contents as a log, rather than as a working copy, able to be rolled back and forward (probably by some utility) until squashed, yet have the current copy be worked with transparently, without making (invasive) changes to the VFS? Does something like this already exist? Maybe something using FUSE?
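
As a rough sketch of the idea being asked about (a hypothetical log-structured file, not any existing filesystem or FUSE module), the log of versions with roll-back/roll-forward until squashed might look like:

```python
class LoggedFile:
    """Minimal sketch of the comment's idea: a file whose writes are
    kept as an append-only log of versions, with the current contents
    readable transparently and history walkable until squashed."""
    def __init__(self):
        self.log = [b""]           # version 0 is the empty file
        self.head = 0              # index of the working version

    def write(self, data):
        # Discard any rolled-back tail, then append the new version.
        self.log = self.log[:self.head + 1]
        self.log.append(data)
        self.head += 1

    def read(self):
        return self.log[self.head]

    def rollback(self):
        if self.head > 0:
            self.head -= 1

    def rollforward(self):
        if self.head < len(self.log) - 1:
            self.head += 1

    def squash(self):
        # Collapse history down to the current version only.
        self.log = [self.read()]
        self.head = 0
```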
  • by cmason ( 53054 ) on Sunday July 30, 2006 @10:12PM (#15814485) Homepage
    I still don't understand how a non-transactional interface such as a filesystem can be used to record what is essentially a transaction: a version. In other words: how does the filesystem know to record a version? Presumably this operates without modifying the application (which would be necessary to provide a true transactional versioning system, such as that provided by the "track changes" feature). Does this thing assume that file closes are transactions? Do users get presented with a slew of "versions" of files based on when the file was closed (assuming applications even close the file on "save")? Are these "versions" actually valid files? Can someone explain how this works?

    We toyed for a while with implementing something like this in our scientific data management application and decided in the end that it just wasn't possible because the (instrument vendor provided) applications would have to be modified to deliver information about when to create a "version" of a file. Instead, we require users to provide us with this information manually.

    -c
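
One possible answer to the question above, sketched as a toy model: treat close() as the transaction boundary and snapshot the file's bytes at that point. Whether VSS actually commits on close is an assumption here; real shadow copies snapshot whole volumes on a schedule rather than per-file:

```python
class VersionOnClose:
    """Sketch of one answer to the question above: the close() call is
    the 'commit', and a version is recorded at that moment. This is an
    assumption for illustration, not how VSS is documented to work."""
    def __init__(self):
        self.current = b""
        self.versions = []
        self._open = False

    def open(self):
        self._open = True

    def write(self, data):
        assert self._open, "file must be open"
        self.current = data

    def close(self):
        # Closing the file commits a snapshot of its contents.
        self.versions.append(self.current)
        self._open = False
```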

  • by Jerry Coffin ( 824726 ) on Sunday July 30, 2006 @10:12PM (#15814486)
    I wonder if OS X 10.5 was going to have such a feature and it leaked out. This is actually a quasi-innovative idea from Microsoft. Maybe they stole it from Apple via corporate spying.

    Microsoft got this one much more directly. Windows NT started out as basically the next version of VMS, designed and written almost entirely by former DECies (one rumor has it that the "NT" came from taking VMS and adding one to each letter to get WNT...) VMS has had a feature like this for years. It predates not only OS X, but the Macintosh in general. I can remember using it in about 1981 or so -- I don't remember for sure, but VMS 3 is what sticks in my mind -- and I don't think it was new then (it seemed pretty cool to me after dealing with Control Data mainframes, but the people who'd been using VMS longer didn't seem to think of it as new or exciting).

  • Sounds kind of like (Score:5, Interesting)

    by ZorbaTHut ( 126196 ) on Sunday July 30, 2006 @10:14PM (#15814493) Homepage
    a built-in versioning system. Want to roll back to a previous version? Bam, done. Want to fork? Just make a copy of the "old version" and move on.

    I'd like directory-by-directory control over this, some way of controlling when the old versions "go away" (I don't want mass-id3'ing of my MP3 collection to clobber my old documents, for example), as well as efficient move operations. But, as many are saying, this sounds like basically a good thing.

    It's a feature, and a pretty cool one. I wouldn't mind this in Linux. This is not a bad thing.
  • File versioning (Score:3, Interesting)

    by SiliconEntity ( 448450 ) on Sunday July 30, 2006 @10:15PM (#15814501)
    I used an OS back in the 70s, Twenex from Digital Equipment Corp, that had file versioning. Every time you wrote out a file it kept the previous N versions, typically 5. It wasn't oriented towards deletion so much as recovering old versions after you screwed one up. It was a pretty nice feature, although it tended to fill up disk space which was in short supply in those days.

    Today, I thought undeleting was what the trash can was for. With today's big disks you shouldn't have to Empty Trash very often.
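
The DEC-style versioning described above (each write creates FILE;1, FILE;2, ..., and only the newest N are kept) can be sketched like so; the class is illustrative, not TOPS-20 code:

```python
class VersionedName:
    """Toy sketch of DEC-style file versioning: every write creates a
    new numbered version and versions older than the newest `keep`
    are purged (keep=5 was a typical default, per the comment)."""
    def __init__(self, keep=5):
        self.keep = keep
        self.versions = {}         # version number -> contents
        self.latest = 0

    def write(self, data):
        self.latest += 1
        self.versions[self.latest] = data
        # Purge everything older than the newest `keep` versions.
        for v in list(self.versions):
            if v <= self.latest - self.keep:
                del self.versions[v]

    def read(self, version=None):
        return self.versions[version or self.latest]
```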
  • by SilentChris ( 452960 ) on Sunday July 30, 2006 @10:26PM (#15814558) Homepage
    Truly, MS is damned if they do, damned if they don't.

    How many times has your mother/father/other family member called you over because they deleted "that one file" they never backed up (it's usually never just "that one file", but that's the typical excuse)? So you head over and, sure enough, the thing is gone. The only recourse is to buy some overpriced Norton Utilities or whatnot (that will probably slow the system to a crawl) and cross your fingers.

    So, Microsoft enables a feature that's been built into the OS for a while and the reaction is instantly negative? Never mind that, daily, petabytes upon petabytes are backed up using VSS around the world, as almost all decent backup software uses it on Windows. Never mind that, if "privacy concerns" get in the way, you can always remove versions in VSS or disable it entirely.

    Seems much ado about nothing, personally. Don't like it? Turn it off.

    And if you're in a company, well, you don't get a choice. I'm not really sure I understand the "bossman" comments -- in most big companies, the "bossman" has been backing up every file you create, every site you visit, etc. for decades. Granted, 99.99% of it will never be looked at, but in these post-SOX days, you're pretty much mandated to catch that 0.01%. And if you don't like it, well, I guess you can always start a company with your own rules.

    Personally, I think this thing is going to be a tremendous blessing. When a relative calls me still using Windows (I've been trying to push them all to Mac), and says "My god, I deleted this crumb cake recipe! I'm doomed!" I'll be able to get it back after a couple clicks. Sounds great to me.
  • by SilentChris ( 452960 ) on Sunday July 30, 2006 @10:28PM (#15814568) Homepage
    System Protection will probably include System Restore, which is pretty much an invaluable feature. It's saved my ass (and people's asses I know) more than a few times when we're working on Windows boxen. You never know when the registry is going to crap out.
  • I don't think it was new then

    The VMS filesystem (Files-11) was an evolution of earlier DEC filesystems and had versioning built in from the start. There's also a more user-oriented versioning filesystem which has been in development for Linux for the past few years.
    http://sourceforge.net/projects/versionfs/ [sourceforge.net]

  • by Registered Coward v2 ( 447531 ) on Sunday July 30, 2006 @10:33PM (#15814602)
    Amazing that a Good Thing gets turned into a big-brother or privacy issue just because it's Microsoft. Shadow copy has saved my ass twice in the past year and the more it's available, the better. If employees are worried about the boss checking up on them, then maybe they should just do their job.

    Actually, I'd be more worried about what can be discovered in a lawsuit - the raw ruminations of some employee could be very damaging - whether or not they were correct. This makes it harder to destroy working papers. In the old days, we kept all our working papers on a disk and then destroyed the disk along with our hard copy working papers - that way no one had to worry about what could be dredged up in a lawsuit.
  • No, not really. (Score:5, Interesting)

    by Ayanami Rei ( 621112 ) * <rayanami&gmail,com> on Sunday July 30, 2006 @10:40PM (#15814627) Journal
    This versioning in NT is based on a generic disk-snapshot system (similar to Linux's LVM, FreeBSD's gvinum stuff, Solaris DiskSuite, NetApp, etc. etc.)
    The VMS versioning was done in the file system itself. This system (and many related systems) are done at a layer underneath the filesystem, and are often filesystem agnostic.

    People like to say that Windows NT borrows a lot from VMS. That's like saying Linux borrows a lot from Multics. There isn't really _anything_ in common, but they are in the same spirit.
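
The block-level, filesystem-agnostic snapshotting described above (LVM-style copy-on-write) can be sketched as a toy model; the class is illustrative, not LVM code:

```python
class Volume:
    """Toy block device with copy-on-write snapshots: the snapshot
    layer sits below any filesystem and saves a block's old contents
    the first time that block is overwritten after the snapshot."""
    def __init__(self, nblocks):
        self.blocks = [b"\x00"] * nblocks
        self.snapshots = []        # each: dict of block index -> old data

    def snapshot(self):
        self.snapshots.append({})

    def write(self, idx, data):
        # Copy-on-write: preserve each pre-snapshot block exactly once.
        for snap in self.snapshots:
            if idx not in snap:
                snap[idx] = self.blocks[idx]
        self.blocks[idx] = data

    def read_snapshot(self, snap_id, idx):
        snap = self.snapshots[snap_id]
        # Blocks unmodified since the snapshot read through to the
        # live volume; modified ones come from the saved copies.
        return snap.get(idx, self.blocks[idx])
```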
  • by SilentChris ( 452960 ) on Sunday July 30, 2006 @10:41PM (#15814631) Homepage
    All modern filesystems use this technique already. When you delete files in NTFS (or do a "quick format") all it's doing is changing the allocation bits in the file table. The data is then free to be written over when the OS chooses. You can, in theory, restore files by scanning the table, comparing it to what's on the disk, and seeing if the data has been written over. This is how 3rd-party recovery utilities work on XP.

    The trouble is a UI issue, not a technical one. Many users in the DOS era (who knew about the utility) started to rely on it to recover data. It was really just a crapshoot, though. Since computing was going mainstream, you needed something easier to understand that would "undelete" the file every time.

    When MacOS and Windows came about, they introduced the concept of a temporary holding area for "deleted" files (the Trash Can and Recycling Bin respectively). The companies told users "put stuff here when you want to delete it, drag it out when you want to 'undelete'". It was a concept much easier to understand and "always worked" (as long as people put stuff in the can/bin and didn't delete fully the first time).

    Now, users in work situations are getting used to regular backups and being able to call the IT guys any time a file goes missing (or they need an earlier version). Many people, however, do not have a reliable backup system at home. All MS is doing is shifting the onus of backups from the user to an automated system. (One that, coincidentally, can still be controlled by the IT guys).

    VSS, as a tech, actually works quite well. It's used in almost every major backup solution on Windows because it reliably "freezes" files that are locked before backup. I think VSS will be a good solution to the "mother calling son because she deleted a recipe" problem.
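
The table-scan recovery described in this comment can be sketched as a toy model (the table layout is invented for illustration; real NTFS recovery tools read the MFT):

```python
# Toy sketch of the recovery technique described above: deletion only
# flips a "free" flag in the file table, so a recovery tool scans the
# table for freed entries whose data blocks have not yet been reused.
def recover(table, disk):
    """table: list of (name, block, deleted) entries.
    disk: block index -> data, or None once the block was overwritten.
    Returns the deleted files that can still be undeleted."""
    recovered = {}
    for name, block, deleted in table:
        if deleted and disk.get(block) is not None:
            recovered[name] = disk[block]
    return recovered

table = [("a.txt", 0, True), ("b.txt", 1, True), ("c.txt", 2, False)]
disk = {0: b"still here", 1: None, 2: b"live"}   # block 1 was reused
```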
  • by Ayanami Rei ( 621112 ) * <rayanami&gmail,com> on Sunday July 30, 2006 @11:48PM (#15814862) Journal
    A better question would be why we need the following files at all anymore:
    autoexec.bat
    config.sys
    msdos.sys
    io.sys
    ntdetect.com

    Those five files exist so that this NTFS disk, if copied to a FAT partition, won't be booted by anything other than Windows NT. Never mind that they are tiny files, stored inline in the MFT and marked hidden/system so you can't even see them normally.
    They implement a sort of null Windows 98 boot telling you that you need to boot from an NT bootloader... which is *fanfare* NTLDR!

    ntldr and boot.ini are what make a drive bootable. They need to be in the drive root.

    Now Recycler and System Volume Information are ALSO both hidden and system, so you don't normally see them. There is one of each per drive (they have nothing to do with a specific Windows installation, but store your Recycled Files and Restore Points for a drive, respectively).

    So everything is marked hidden. After install, all you see are:
    Documents and Settings
    Program Files
    Windows

    I was just trying to be complete and figured the parent poster probably was annoyed after disabling "hide operating system protected files" and looking in his C:\...

    And what does it matter where they are? If the sector location on disk mattered, putting them inside Windows wouldn't help, because you'd have no way to migrate the Windows folder itself anyway.

    In fact you can move around those folders (and you can use more than one) if you are smart about things.

    Typically I install a small system in C: using the original paths. I install all the SPs I can find, and update all my drivers.

    Then I change the UserProfiles folder to point to a RAID-0 volume so that all local/remote user profiles get created on the faster volume at first login. Thus some profiles are located on C: (Default User, Administrator) and everyone else's are in C:\UserData or D: or whatever.

    You can also choose to install Programs in _any_ folder you want at install time. C:\Program Files is just a default, and one that Windows uses internally to house things like Internet Explorer. Programs in the registry can store whatever path they want.

    So make yourself an E:\programs network folder after boot and install all your software there.
  • Re:No, not really. (Score:3, Interesting)

    by Archtech ( 159117 ) on Monday July 31, 2006 @05:22AM (#15815774)
    "People like to say that Windows NT borrows a lot from VMS. That's like saying Linux borrows a lot from Multics. There isn't really _anything_ in common, but they are in the same spirit".

    That turns out not to be the case. Anyone familiar with VMS internals found the NT internals practically identical in many cases, to an extent that was quite laughable. Moreover, established VMS internals experts were able to start teaching and writing about NT within months - they had very little new to learn.

    Of course there were some significant differences. The Hardware Abstraction Layer (HAL) was meant to allow NT to run on any hardware with a minimum of porting effort. Arguably, if DEC had done something similar for VMS it would still be riding high and profitable today - the fatal drawback of VMS was that it had the VAX (and subsequently Alpha) tied like a millstone round its neck.

    Then there was the famous Windows GUI. Legend has it that Dave Cutler wanted to make some changes to that, in the interests of security, stability, and performance. But Gates and his crew strictly forbade any changes, understanding the value of keeping it as compatible as possible to ease the migration path for their millions of customers.
  • by Bazzargh ( 39195 ) on Monday July 31, 2006 @05:41AM (#15815819)
    That feature is seriously screwed up. Microsoft are *still* trying to sell people on the idea that it's OK to share around the editable document, when in reality it's hardly ever OK. All it takes is for one person to forget to remove hidden data and you're on the news.

    Look at the list of Office products it integrates with - there's one missing: Outlook. Why isn't Outlook set up to prompt you to ask if it should strip the documents before sending? Why is there no feature in Exchange to block emails leaving the domain with unstripped attachments? Why doesn't IIS block access to unstripped files? Now those would make it a feature worth having.

    Stepping back from MS for a moment, the same problem actually exists in many other file types - even HTML (meta tags and comments). It's why the microformats movement thinks metadata should be presentable and parsable [microformats.org] rather than hidden in 'document properties'. Their solution isn't complete, though - we need to separate the notions of 'Save As' and 'Publish'. One way to achieve this in a corporate/government environment would be for servers to require digital signatures on outgoing documents - this would introduce publication into a document lifecycle for the purpose of integrity, at which point we can hook in 'strip doc' wizards to minimize risk.

    Just thinking out loud.
