Most efficient/quickest way to set NTFS permissions in PowerShell

Hello all,
Trying to figure out what the most efficient/quickest way to set NTFS permissions via PowerShell is. I am currently using ICACLS but it is taking FOREVER as I can't figure out how to make inheritance work with this command.
This has prompted me to begin looking at other options for setting NTFS permissions in PowerShell, and I wondered what everyone here likes to use for this task in PowerShell?

Ah ok. Unfortunately, my ICACLS is taking FOREVER. Here is the code I'm using:
ICACLS "C:\users\[user]\Desktop\test" /grant:r ("[user]" + ':r') /T /C /Q
However:
1.  I can't figure out how to make the inheritance parameter work with ICACLS
2. If I do make the inheritance parameter work with ICACLS, I still need a way to add the permission to child objects that aren't inheriting.
Any tips on how to improve performance of ICACLS?
1. icacls folder /grant GROUPNAME:(OI)(CI)(F)  (I will post corrected code later; this works in CMD but not in PowerShell because of the parentheses - see the sketch after this list)
2. Get-ChildItem -Recurse -Force | ?{ $_.PSIsContainer } | %{ icacls .... }  (or you can list only the folders where inheritance is disabled and apply icacls just to them)
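To make the inheritance flags from item 1 work from PowerShell, the grant argument just needs to be passed as a single quoted string so PowerShell doesn't try to parse the parentheses. A minimal sketch (the path and group name are placeholders):
$folder = "D:\TestFolder"
$group  = "DOMAIN\GROUPNAME"
# Quoting the whole argument keeps PowerShell from interpreting (OI)(CI)(F)
icacls.exe $folder /grant "${group}:(OI)(CI)(F)"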
I think jrv and Mekac answered the first question about inheritance flags. I would just add that you probably don't want to use the /T switch with icacls.exe, because that appears to set an explicit entry on all child items (that's probably why it's taking so long).
For your second question, I'd suggest using the Get-Acl cmdlet. It throws terminating errors, so I usually wrap it in a try/catch block. Something like this might work if you just wanted the paths to files/folders that aren't inheriting permissions:
dir $Path -Recurse | ForEach-Object {
    try {
        Get-Acl $_.FullName | where { $_.AreAccessRulesProtected } | ForEach-Object { Convert-Path $_.Path }
    }
    catch {
        Write-Error ("Get-Acl error: {0}" -f $_.Exception.Message)
        return
    }
}
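To then cover the second question (adding the permission to child objects that aren't inheriting), one option, sketched here with a placeholder principal and untested, is to capture those paths and run the icacls grant against just them:
$Principal = "DOMAIN\SomeGroup"   # placeholder
$protected = dir $Path -Recurse | ForEach-Object {
    try {
        Get-Acl $_.FullName | where { $_.AreAccessRulesProtected } | ForEach-Object { Convert-Path $_.Path }
    }
    catch {
        Write-Error ("Get-Acl error: {0}" -f $_.Exception.Message)
    }
}
foreach ($item in $protected) {
    # Folders get the inheritable flags; files just get the right itself
    if (Test-Path $item -PathType Container) {
        icacls.exe $item /grant "${Principal}:(OI)(CI)(R)"
    }
    else {
        icacls.exe $item /grant "${Principal}:(R)"
    }
}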
If you're looking for speed/performance, you don't want to just use the PowerShell Access Control (PAC) module that Mike linked to above by itself. It's implemented entirely in PowerShell, so it's incredibly slow right now (unless you use it along with Get-Acl; see below for an example). I'm slowly working on creating a compiled version that is much faster, and I think I'm pretty close to having something that I can put in the gallery.
Since I wasn't sure which command would give you the best results, I used Measure-Command to test a few different ones. Each of the following four commands should do the exact same thing. Here are my results (note that I just ran the commands a few times and averaged the results on a test system; this wasn't very rigorous testing):
# Make sure that this folder and user/group exist:
$Path = "D:\TestFolder"
$Principal = "TestUser"
# Native PowerShell/.NET -- Took about 15 ms
$Acl = Get-Acl $Path
$Acl.AddAccessRule((New-Object System.Security.AccessControl.FileSystemAccessRule(
    $Principal,
    "Read",                            # [System.Security.AccessControl.FileSystemRights]
    "ContainerInherit, ObjectInherit", # [System.Security.AccessControl.InheritanceFlags]
    "None",                            # [System.Security.AccessControl.PropagationFlags]
    "Allow"                            # [System.Security.AccessControl.AccessControlType]
)))
(Get-Item $Path).SetAccessControl($Acl)
# PAC Module 3.0 w/ PowerShell/.NET commands -- Took about 35 ms
$Acl = Get-Acl $Path | Add-AccessControlEntry -Principal $Principal -FolderRights Read -PassThru
(Get-Item $Path).SetAccessControl($Acl)
# icacls.exe -- Took about 40ms
icacls.exe $Path /grant "${Principal}:(OI)(CI)(R)"
# PAC Module 3.0 w/o Get-Acl -- Took about 350 ms
Add-AccessControlEntry -Path $Path -Principal $Principal -FolderRights Read -Force
Unless I messed something up, it looks like the native PowerShell/.NET commands are faster than icacls.exe, at least for modifying a single folder's DACL.

Similar Messages

  • Unable to set NTFS permissions on share using PowerShell. The user shows up with no rights checked off.

    I am having a little problem here with setting NTFS permissions via PowerShell. 
    Basically I am able to make a new directory on the share and assign a user NTFS permissions, however it just assigns the selected user without any permissions set.
    $username = "test.user"
    $directory = "\\testlab-sv01\Share\newfolder"
    New-Item -Path $directory -ItemType Directory
    $colRights = [System.Security.AccessControl.FileSystemRights]"FullControl"
    $InheritanceFlag = [System.Security.AccessControl.InheritanceFlags]::ContainerInherit
    $PropagationFlag = [System.Security.AccessControl.PropagationFlags]::InheritOnly
    $objType =[System.Security.AccessControl.AccessControlType]::Allow
    $objUser = New-Object System.Security.Principal.NTAccount("$username")
    $objACE = New-Object System.Security.AccessControl.FileSystemAccessRule($objUser, $colRights, $InheritanceFlag, $PropagationFlag, $objType)
    $objACL = Get-ACL $directory
    $objACL.AddAccessRule($objACE)
    Set-ACL $directory $objACL
    A side question, why isn't this native in Powershell? Is it for security reasons? I expected there to be a cmdlet for it. 
    Thanks. 
    Kyle

    When you say there are no permissions, do mean that the ACL Editor is showing 'Special permissions' and none of the other boxes are checked?
    Try changing the inheritance and propagation flags to this:
    $InheritanceFlag = [System.Security.AccessControl.InheritanceFlags] "ContainerInherit, ObjectInherit"
    $PropagationFlag = [System.Security.AccessControl.PropagationFlags]::None
    That sets the ACE to apply to the folder (the InheritOnly propagation flag isn't set), subfolders (the ContainerInherit inheritance flag is set), and files (the ObjectInherit inheritance flag is set), which is necessary for the ACE not to be considered 'special' in the ACL Editor.
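    As a quick check (just a sketch, using the variable names from the question above), the resulting ACE should now report FullControl with both inheritance flags instead of showing up as 'Special permissions':
    (Get-Acl $directory).Access |
        Where-Object { $_.IdentityReference -like "*$username" } |
        Format-List FileSystemRights, InheritanceFlags, PropagationFlags, AccessControlType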
    Awesome. Thanks. That did work. 
    And yes I did mean that it was showing special permissions with nothing checked. 
    Kyle

  • PowerShell - what is the most efficient/fastest way to find an object in an arraylist

    Hi
    I work with a lot of array lists in PowerShell when working as a SharePoint administrator. I first used arrays but found them slow and switched to array lists.
    Often I want to find a specific object in the array list, but the response time varies a lot. Does anyone have code for doing this the most efficient way?
    Hope for some answers:-)
    brgs
    Bjorn

    Often I want to find a specific object in the array list, but the response time varies a lot. Does anyone have code for doing this the most efficient way?
    As you decided to use an ArrayList, you must keep your collection sorted and then use the BinarySearch() method to find the objects you're looking for.
    Consider using a dictionary, and if your objects are of string type, a StringDictionary.
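    A minimal sketch of both suggestions (the sample values are made up): BinarySearch() only returns correct results on a sorted ArrayList, while a hashtable gives near constant-time lookups when you can key on the value you search for.
    $list = [System.Collections.ArrayList]@('web01', 'sql02', 'app03', 'web04')
    # BinarySearch requires the collection to be sorted first
    $list.Sort()
    $index = $list.BinarySearch('sql02')
    if ($index -ge 0) { "Found at index $index" }
    # Hashtable (dictionary) lookup keyed on the value you search by
    $lookup = @{}
    foreach ($item in $list) { $lookup[$item] = $item }
    $lookup['sql02']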
    You still fail to understand that the slowness is not in the arraylist. It is in the creation of the arraylist, which is completely unnecessary. Set up a SharePoint server and create a very large list and test. You will see. An arraylist with 10000 items takes forever to create. A simple SharePoint search can be done in a few milliseconds.
    Once created, the lookup in any collection depends on the structure of the key and the type of collection. A string key can be slow if it is a long key.
    The same rules apply to general database searches against an index.
    The main point here is that SharePoint IS a database, and searching it as a database is the fastest method.
    Prove me wrong devil!    Submit!  Back to your cage! Fie on thee!
    ¯\_(ツ)_/¯
    You seem to be making a lot of assumptions about what he's doing with those arraylists that don't seem justified, given no more information than what's in the posted question.

  • Set folder permissions by Powershell

    Hello there.
    I'm trying to set permissions to a shared folder by Powershell.
    I used the following script:
    $Acl = Get-Acl "C:\MySharedFolder"
    $Ar = New-Object system.security.accesscontrol.filesystemaccessrule("username","FullControl","Allow")
    $Acl.SetAccessRule($Ar)
    Set-Acl "C:\MySharedFolder" $Acl
    So, what happens is:
    1. The permission doesn't work;
    2. The permission appears in the "Security" tab of the folder as "Special permissions".
    Do I need to do something else?
    Is there a special way to set permissions to shared folders?
    Thanks in advance.
    Regards
    Lucas Gustavo

    The FileSystemAccessRule constructor that you're using is creating an ACE that only applies to the folder (there are no inheritance and propagation flags set). Changing the line that creates your ACE to the following should fix the issue:
    $Ar = New-Object System.Security.AccessControl.FileSystemAccessRule ("username","FullControl","ContainerInherit, ObjectInherit", "None", "Allow")
    If you do a lot of access control stuff interactively (or if you would like to be able to use Desired State Configuration to configure access control), I've got a module that helps make doing this stuff more "PowerShelly" (ish?):
    PowerShell Access Control module
    Creating that ACE would look like this:
    New-AccessControlEntry -Principal UserName -FolderRights FullControl
    You could also simplify your code above to one of the following examples:
    # This uses the native Get-Acl and Set-Acl
    $Acl = Get-Acl "C:\MySharedFolder"
    $Acl | Add-AccessControlEntry -Principal UserName -FolderRights FullControl
    $Acl | Set-Acl
    # Or this example lets the module get and set the security descriptor:
    Add-AccessControlEntry -Path C:\MySharedFolder -Principal UserName -FolderRights FullControl
    My favorite function, though, is Get-AccessControlEntry:

  • Moving LR and pics to a new computer/HD, most efficient/painless way...

    Hello people of the Adobe LR forum!
    The primary HD on my computer is crapping out (after only 6 mos!), and it's the same HD in which all my pictures reside as well as my LR install.
    What is the recommend way to transfer my images and all of my precious LR metadata image develop information to another HD? All of my pictures are in one big catalogue.
    I read through the Help but couldn't really find anything that clearly addressed this.
    If I install LR on a new primary HD, copy over the folders of images, and then open of my catalogue, will that work? Is it critical that my .lrcat file and "Lightroom" folder remain in the same relative file location to the RAW pictures/folders?
    Any help would be appreciated!
    Thanks,
    Minh
    www.minhternet.com

    Yes, you can copy everything directly over and it will work. It is time consuming if you have a large library, but not at all painful. To avoid "finding" your photos in Lr later you will want to keep the folder structure the same.

  • Removing advanced ntfs permissions in powershell

    Hi,
    I am trying to remove special permissions from a folder.
    I found a TechNet article that helps me understand the concept, but I couldn't get it to work for special permissions.
    I am trying to remove the 'Create files' special permission for c:\temp:
    $colRights = [System.Security.AccessControl.FileSystemRights]"CreateFiles" 
    $InheritanceFlag = [System.Security.AccessControl.InheritanceFlags]::None 
    $PropagationFlag = [System.Security.AccessControl.PropagationFlags]::None 
    $objType =[System.Security.AccessControl.AccessControlType]::Allow 
    $objUser = New-Object System.Security.Principal.NTAccount("BUILTIN\Users") 
    $objACE = New-Object System.Security.AccessControl.FileSystemAccessRule `
    ($objUser, $colRights, $InheritanceFlag, $PropagationFlag, $objType) 
    $objACL = Get-ACL "c:\temp" 
    $objACL.RemoveAccessRule($objACE) 
    Set-ACL "C:\temp" $objACL

    Thank you for taking the time to explain and provide sample code.
    The cool thing about your code is that no matter how many special permissions I grant, it removes them all even though I pass only one right, like AppendData.
    After running code like this:
    $acl = get-acl C:\temp
    $rules=$acl.GetAccessRules($true,$true,[Security.Principal.NTAccount])|?{$_.IdentityReference -eq 'BUILTIN\Users'}
    $rules|%{$acl.RemoveAccessRule($_)}
    $newrules=$rules|Remove-AcePermission -FileSystemRight 'AppendData'
    I run $acl | Set-Acl.
    It then removes all the special permissions and keeps only the standard ones. Works as expected.
    But when I tried this for C:\, for some reason it didn't work; I get this error when I run Set-Acl:
    Set-Acl : The security identifier is not allowed to be the owner of this object.
    At line:20 char:8
    + $Acl | Set-Acl
    +        ~~~~~~~
        + CategoryInfo          : InvalidOperation: (C:\:String) [Set-Acl], InvalidOperationException
        + FullyQualifiedErrorId : System.InvalidOperationException,Microsoft.PowerShell.Commands.SetAclCommand
    But the rule modification does work as expected, as shown here:
    Without special permissions:
    PS C:\WINDOWS\system32> $rules
    FileSystemRights  : ReadAndExecute, Synchronize
    AccessControlType : Allow
    IdentityReference : BUILTIN\Users
    IsInherited       : False
    InheritanceFlags  : ContainerInherit, ObjectInherit
    PropagationFlags  : None
    With Special permissions CreateFiles and AppendData:
    PS C:\WINDOWS\system32> $rules
    FileSystemRights  : CreateFiles, AppendData, ReadAndExecute, Synchronize
    AccessControlType : Allow
    IdentityReference : BUILTIN\Users
    IsInherited       : False
    InheritanceFlags  : ContainerInherit, ObjectInherit
    PropagationFlags  : None
    After using the code:
    PS C:\WINDOWS\system32> $newrules
    FileSystemRights  : CreateFiles, ReadAndExecute, Synchronize
    AccessControlType : Allow
    IdentityReference : BUILTIN\Users
    IsInherited       : False
    InheritanceFlags  : ContainerInherit, ObjectInherit
    PropagationFlags  : None
    But the only thing is that I couldn't set this new rule on C:\ ...
    Am I missing anything while setting it on C:\?
    I've seen that you did mention "The above makes some assumptions and is not designed to work with all scenarios." But I was wondering what I was missing.
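    One thing that might be worth trying (just a sketch, along the lines of the SetAccessControl() calls used earlier in this thread, and assuming Windows PowerShell): read and write only the DACL through the DirectoryInfo object instead of using Set-Acl, since Set-Acl also tries to write the owner of the security descriptor, which is what the error is complaining about:
    # Sketch: modify only the DACL of C:\ without going through Set-Acl
    $item = Get-Item 'C:\'
    $acl  = $item.GetAccessControl('Access')   # read the DACL section only
    $rules = $acl.GetAccessRules($true, $true, [System.Security.Principal.NTAccount]) |
        Where-Object { $_.IdentityReference -eq 'BUILTIN\Users' -and -not $_.IsInherited }
    $rules | ForEach-Object { [void]$acl.RemoveAccessRule($_) }
    $item.SetAccessControl($acl)   # writes the modified DACL back; the owner is left alone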

  • Making NTFS permissions read/write without ability to create/delete folders

    Out at one of our job sites we have a server running Windows Server 2012 R2 that's got a file share accessible to our onsite people. Our project managers have devised a very strict folder structure for this file share, and for auditing purposes they want
    to stick as close to this structure as possible.
    Therefore, although people onsite must have read/write access to create, modify and delete files, they do not want them to be able to create or delete folders. They want them to use the existing folders and not tuck stuff away into folders that no one knows
    exists except the person who created them.
    The closest way I've found to do this is to deselect the advanced permissions 'Create folders / append data' and 'Delete subfolders and files.' This has a few side effects however, the most noticeable being that due to not being able to append data to files,
    certain files (such as CAD drawings) can't be edited by anyone except the person who created them.
    Is there a way using just NTFS permissions to accomplish what the project managers want? And if not, are there any useful third-party utilities that will help us do this?
    Thanks in advance for any assistance.

    Hi,
    I'm not very familiar with AutoCAD. What's the exact behavior that is stopped by the restricted folder permission?
    For example, if AutoCAD creates a folder while editing, we will not have a solution, since users would need to create folders for AutoCAD to run properly.
    If AutoCAD works like Office files, which create a temp file for editing, this will be the solution:
    1. Give Domain Admins - Full Control - This folder, subfolders and files.
    This allows all admins to access and edit all data.
    2. Give the SpecificUsers group (a group containing all normal users) - Full Control without "Change permissions" and "Take ownership" - Files only.
    This gives that group most permissions to create, edit and delete files, but not folders.
    3. Give the SpecificUsers group these additional permissions:
    Traverse folder
    List folder
    Read attributes
    Read extended attributes
    Create files - this is important; without it you will not be able to save Office files.
    Read permissions
    Apply the above permissions to "This folder, subfolders and files".
    This allows users to access all subfolders. A rough PowerShell sketch of ACEs 2 and 3 follows below.
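    Here's what ACEs 2 and 3 might look like in PowerShell; the path and group name are placeholders, and the scopes are mapped as follows: "Files only" = ObjectInherit + InheritOnly, "This folder, subfolders and files" = ContainerInherit + ObjectInherit with no propagation flags.
    $path  = "D:\ProjectShare"          # placeholder path
    $group = "DOMAIN\SpecificUsers"     # placeholder group
    $acl = Get-Acl $path
    # ACE 2: Full Control minus Change Permissions / Take Ownership, applied to files only
    $fullControl = [System.Security.AccessControl.FileSystemRights]::FullControl
    $excluded    = [System.Security.AccessControl.FileSystemRights]'ChangePermissions, TakeOwnership'
    $fileRights  = [System.Security.AccessControl.FileSystemRights]($fullControl -band (-bnot $excluded))
    $aceFiles = New-Object System.Security.AccessControl.FileSystemAccessRule($group, $fileRights, "ObjectInherit", "InheritOnly", "Allow")
    # ACE 3: browse/read rights plus Create files, applied to this folder, subfolders and files
    $folderRights = [System.Security.AccessControl.FileSystemRights]'Traverse, ListDirectory, ReadAttributes, ReadExtendedAttributes, CreateFiles, ReadPermissions'
    $aceFolders = New-Object System.Security.AccessControl.FileSystemAccessRule($group, $folderRights, "ContainerInherit, ObjectInherit", "None", "Allow")
    $acl.AddAccessRule($aceFiles)
    $acl.AddAccessRule($aceFolders)
    Set-Acl $path $acl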

  • Most efficient way to delete "removed" photos from hard disk?

    Hello everyone! Glad to have this great community to come to for help. I searched for this question but came up with no hits. If it's already been discussed, I apologize and would love to be directed to the link.
    My wife and I have been using LR for a long time. We're currently on version 4. Unfortunately, she's not as tech-savvy or meticulous as I am, and she has been unknowingly "Removing" photos from the LR catalogues when she really meant to delete them from the hard disk. That means we have hundreds of unwanted raw photo files floating around in our computer and no way to pick them out from the ones we want! As a very organized and space-conscious person, I can't stand the thought. So my question is, what is the most efficient way to permanently delete these unwanted photos from the hard disk
    I did find one suggestion that said to synchronize the parent folder with their respective catalogues, select all the photos in "Previous Import," and delete those, since they will be all of the photos that were previously removed from the catalogue.
    This is a great suggestion, but it probably wouldn't work for all of my catalogues since my file structure is organized by date (the default setting for LR). So, two catalogues will share the same "parent folder" in the sense that they both have photos from May 2013, but if I synchronize May 2013 with one, then it will get all the duds PLUS the photos that belong in the other catalogue.
    Does anyone have any suggestions? I know there's probably not an easy fix, and I'm willing to put in some time. I just want to know if there is a solution and make sure I'm working as efficiently as possible.
    Thank you!
    Kenneth

    I have to agree with the comment about multiple catalogs referring to images that are mixed in together... and the added difficulty that may have brought here.
    My suggestions (assuming you are prepared to combine the current catalogs into one)
    in each catalog, put a distinctive keyword onto all the images so that you can later discriminate these images as to which particular catalog they were formerly in (just in case this is useful information later)
    as John suggests, use File / "Import from Catalog" to bring all LR images together into one catalog.
    then in order to separate out the image files that ARE imported to LR, from those which either never were / have been removed, I would duplicate just the imported ones, to an entirely separate and dedicated disk location. This may require the temporary use of an external drive, with enough space for everything.
    to do this, highlight all the images in the whole catalog, then use File / "Export as Catalog" selecting the option "include negatives". Provide a filename and location for the catalog inside your chosen new saving location. All the image files that are imported to the catalog will be selectively copied into this same location alongside the new catalog. The same relative arrangement of subfolders will be created there, for them all to live inside, as is seen currently. But image files that do not feature in LR currently, will be left behind by this operation.
    your new catalog is now functional, referring to the copied image files. Making sure you have a full backup first, you can start deleting image files from the original location, that you believe to be unwanted. You can do this safe in the knowledge that anything LR is actively relying on, has already been duplicated elsewhere. So you can be quite aggressive at this, only watching out for image files that are required for other purposes (than as master data for Lightroom) - e.g., the exported JPG files you may have made.
    IMO it is a good idea to practice a full separation of image files used in your LR image library, from all other image files. This separation means you know where it is safe to manage images freely using the OS, vs where (what I think of as the LR-managed storage area) you need to bear LR's requirements constantly in mind. Better for discrete backup, too.
    In due course, as required, the copied image files plus catalog can be moved bodily to another drive (for example, if they have been temporarily put on an external drive, and you want to store them on your main internal one again). This then just requires a single re-browsing of their parent folder's location, in order to correct LR's records inside this catalog, as to the image files' changed addresses.
    If you don't want to combine the catalogs into one, a similar set of operations as above, can be carried out for each separate catalog you have now. This will create a separate folder structure in each case, containing just those duplicated image files. Once this has been done for all catalogs, you can start to clean up the present image files location. IMO this is very much the laborious and inflexible option, so far as future management of the total body of images is concerned... though there may still be some overriding reason for working that way.
    RP

  • What is the best, most efficient way to read a .xls File and create a pipe-delimited .csv File?

    What is the best and most efficient way to read a .xls File and create a pipe-delimited .csv File?
    Thanks in advance for your review and am hopeful for a reply.
    ITBobbyP85

    You should have no trouble doing this in SSIS. Simply add a data flow with connection managers to an existing .xls file (Excel connection manager) and a new .csv file (flat file). Add a source for the xls and a destination for the csv, and set the destination csv's "delay validation" property to true. Use an expression to define the name of the new .csv file.
    In the flat file connection manager, set the column delimiter to the pipe character.

  • Most efficient way to globally shift page content

    Hi folks,
    I decided to start my InDesign career with something nice and simple: a 326-pp book divided into 57 sections.  I just got my first POD proof back, and it all looks good – except that I would like to shift everything down the page by a pica or a quarter of an inch.
    The master page setup in my book is not the most efficient.  I have an A-master in each story document (= chapter = section) with nothing on it but a single column with margin settings. Then I have two chapter title page masters (recto & verso) and two body text masters (recto & verso).  I can modify the masters on my sync source story document, but when I try to sync the changes through the other story documents, I get erroneous placement of headers, and I lose the story- (i.e., chapter-) specific title text (which is entered in the title master pages).
    I have been looking for a global fix. I tried adjusting the top margin in the A-master page, but the margin doesn't seem to push page elements down.  Another possibility is to set up a global crop routine in Acrobat (my final output is PDF), and then add the material chopped from the bottom back to the top (also in Acrobat).  I'd like to find some way of pulling off the necessary shift in InDesign, however.  I have a gut feeling it's possible, but a search of the InDesign Help material hasn't turned up anything yet.
    Any thoughts?
    TIA
    Richard Hurley
    Grass Valley MultiMedia

    "Use the AdjustLayout.jsx, a sample script that ships with InDesign. Go the the Scripts Panel and find it there and double-click it,"
    Neat. 
    The Acrobat crop is the way to make this happen in a hurry, but I had a good time playing with the Script Panel. Thanks for letting me know about it.

  • Most Efficient Way to Populate My Column?

    I have several very large tables, some of them are partitioned tables.
    I want to populate every row of one column in each of these tables with the same value.
    1.] What's the most efficient way to do this given that I cannot make the table unavailable to the users during this operation?
    I mean, if I were to simply do:
    update <table> set <column>=<value>;
    then I think I'll lock every row for writing by others until the commit. I figured there might be another way that makes better sense in my case.
    2.] Are there any optimizer hints I might be able to take advantage of here? I don't use hints much but with such a long running operation I'll take any help I can get.
    Thank you

    1. Maybe a better solution exists.
    Since you do not want to lock the table:
    Save the ROWIDs of all the rows in that table in a temporary table.
    Write a routine which loops through this temporary table and uses the ROWIDs to update the main table, issuing a commit at regular intervals.
    However, this does not take into account rows added to the main table after the temporary table has been created.
    2. Not that I am aware of.

  • Most efficient way to consume log files

    Hello everyone,
    I've been absent from the forums for a while, but I'm back at it now...
    I have a question about the most efficient way to consume log files. I read in PowerShell in Action, by Bruce Payette, that using a switch statement with a regex works pretty well; that being said, I haven't tried it yet. Select-String is working pretty well for me, but I have about 10 different entry types that I need to search the logs for every 5 minutes, and I'm scanning about 15 GB of logs at every interval. Anyway, if anyone has information about how to do something like that as quickly as possible, I'd appreciate it.
    1.  piping log files that meet my criteria to select-string
       - This seems to work well but I don't like searching the same files over and over again
    2. running logs through get-content and then building a filter statement
      - This is ok but it seems to use up a fair bit of memory
    3. Some other approach that I haven't thought of yet.
    Anyway, I know this is a relatively nebulous question, sorry about that.  I'm hoping that someone on here knows a really good way to find strings in logs files quickly.
    Hope that helps! Jason

    You can sometimes squeeze out more speed at the expense of memory usage, but filters are pretty fast. I don't see a benefit to holding the whole file in memory, in this case.
    As I mentioned earlier, though, C# code will usually blow PowerShell away in terms of execution time.  Here's a rewrite of what I just did (just for the INI Section pattern, to keep the post size down):
    $string = @'
    #Comment Line
    [Ini-Style Section Line]
    Key = Value Line
    192.168.0.1 localhost
    Some line that doesn't match anything.
    '@
    Set-Content -Path .\test.txt -Value $string

    Add-Type -TypeDefinition @'
    using System;
    using System.Text.RegularExpressions;
    using System.Collections;
    using System.IO;

    public interface ILineParser
    {
        object ParseLine(string line);
    }

    public class IniSection
    {
        public string Section;
    }

    public class IniSectionParser : ILineParser
    {
        public object ParseLine(string line)
        {
            object o = null;
            Match match = Regex.Match(line, @"^\s*\[([^\]]+)\]\s*$");
            if (match.Success)
            {
                o = new IniSection() { Section = match.Groups[1].Value };
            }
            return o;
        }
    }

    public class LogParser
    {
        public static IEnumerable ParseFile(string fileName, ILineParser[] lineParsers)
        {
            using (StreamReader sr = File.OpenText(fileName))
            {
                string line;
                while ((line = sr.ReadLine()) != null)
                {
                    foreach (ILineParser parser in lineParsers)
                    {
                        object result = parser.ParseLine(line);
                        if (result != null)
                        {
                            yield return result;
                        }
                    }
                }
            }
        }
    }
    '@

    $parsers = @(
        New-Object IniSectionParser
    )
    $results = [LogParser]::ParseFile("$pwd\test.txt", $parsers)
    $results
    Instead of defining separate classes for each type of line and output object, you could probably do something more generic with delegates (similar to how I used ScriptBlock.Invoke() in the PowerShell example), but it might sacrifice some speed to do so.

  • Most efficient way to get document names?

    I was wondering what the most efficient way is to get the document names in a container? Use the built-in 'name' index somehow, or is there an 'efficient' XPath/XQuery?
    We've been using the XPath /*, which is fine with small instances but causes a Java heap out-of-memory error on large XML instances, i.e. /* gets everything, which is not ideal when all we want are document names.
    Thx in advance,
    Ant

    Hi Antony,
    Here is an example of retrieving the document names in C++:
    void doQuery(XmlContainer &container,
                 XmlQueryContext &context,
                 const std::string &XPath)
    {
        XmlResults results(container.queryWithXPath(0, XPath, &context));
        // Iterate through the result set as is normal
        XmlDocument theDocument;
        while (results.next(theDocument))
        {
            std::cout << "Found document named: "
                      << theDocument.getName() << std::endl;
        }
    }
    Regards,
    Bogdan Coman

  • Fast/most efficient way to cut out of multiple tracks at the same time

    I try to select by dragging around, but it lights up the entire length of each track. I can use the scissors to manually cut the tracks, but there has to be a better way... any ideas?
    Thanks, Don

    I don't know if this is most efficient but here's my "home grown" way:
    I place the section locator bars located at the top in the area I want to cut. You can fine tune this by zooming in close. I then right click on the track and use the split at locator function. I then can cut, drag or copy just that new, separate area. If you want to cut multiple tracks you can leave the locators where they are and repeat on all the tracks you want to split. I haven't tried selecting multiple tracks, and I'm not at my computer to try it right now.
    Maybe there's a quicker way but that has worked for me!

  • What is the most efficient (in terms of cost) way to transfer 35 mm slides to my iMac hard drive?

    What is the most cost efficient way to transfer 35mm slide images to my iMac hard drive?

    This gives you the basics
    http://www.tech-faq.com/how-to-copy-slides-to-disk.htm
    The first option, the professional one, isn't the cheapest - but the results should be good.
    The second option is very appealing. Some printers come with brackets that can be used to scan film or slides.
    The third option is a slide scanner. I haven't used this option, but there's a startup cost, though it's under $100.
    In short, a scanner is the cheapest and most efficient option.
