Import-Csv and Export-Csv

Hi everyone,
How do I modify the script so that I see all the entries in the computers_result.csv file? As it stands, only the last entry in computers.csv shows up in the computers_result.csv file. Thanks so much.
Import-Csv C:\computers.csv | ForEach-Object {
    $computername = $_.computer
    $result = Get-ADComputer $computername -Properties Description | Select-Object Name,Description
    $result | Export-Csv C:\computers_result.csv
}

Try this and let me know. The trick is to build up $result inside the loop and export once after the loop, so each iteration doesn't overwrite the file:
$result = @()
Import-Csv C:\computers.csv | ForEach-Object {
    $computername = $_.computer
    $result += Get-ADComputer $computername -Properties Description | Select-Object Name,Description
}
$result | Export-Csv C:\computers_result.csv -NoTypeInformation
Regards, Chen V [MCTS SharePoint 2010]
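The underlying mistake is language-independent: an export that runs inside the loop rewrites the whole output file on every iteration, so only the last row survives. A minimal sketch of the broken and fixed patterns, in Python with hypothetical sample rows:

```python
import csv
import io

rows = [{"Name": "PC1", "Description": "Lab"},
        {"Name": "PC2", "Description": "Office"}]

def export(rows_to_write):
    """Write rows to an in-memory 'file', like Export-Csv without -Append."""
    buf = io.StringIO()
    w = csv.DictWriter(buf, fieldnames=["Name", "Description"])
    w.writeheader()
    w.writerows(rows_to_write)
    return buf.getvalue()

# Broken: exporting inside the loop overwrites the file each time,
# so after the loop only the last row is present.
for row in rows:
    output = export([row])

# Fixed: accumulate first, export once after the loop.
results = []
for row in rows:
    results.append(row)
output_fixed = export(results)
```

The same fix in any language: move the write outside the loop, or open the output in append mode.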

Similar Messages

  • Import, edit, and export QuickTime document: wise? how to maintain quality?

Can iMovie HD 6.0.3 be used to import a reasonably-high-quality QuickTime file (Apple Graphics compression), edit it, and export to QuickTime, all resulting in a QuickTime file with the same clarity as the original?
I have a QuickTime file that captures a demonstration on my desktop. It was recorded with SnapZ Pro (with movie option) into this format: Apple Graphics, 889 x 500, 256, IMA 4:1, Mono, 22.050 kHz. (Note: I got 889x500 when I constrained SnapZ to a 16x9 format.)
I need to clip out sections, rerecord some of the audio, and insert snippets of replacement video (re-recorded). I have QuickTime 7.1.3 Pro, but I had this idea it would be easier to do this editing with iMovie HD 6.0.3.
My problem is that everything I export has noticeably worse quality than the original source. I have tried all sorts of projects (DV, HDV, MPEG-4, iSight). I've tried exports at full quality, lower quality, various expert settings.
    For example:
    - Create a DV-Widescreen project.
    - Import the movie. What I see on the screen is slightly smaller (in both dimensions) than the QuickTime Player shows me, and quite noticeably blurrier.
    - Picking Expert export, I use Compression: Graphics, Depth: Color, Quality: Best, key frame rate 24, Dimensions 889x500, Scale: Letterbox.
    - The QuickTime file displays as the right size, with a tiny black border for letterboxing, and with crummy quality (text has shadows, making it worse than just blurry.) All this despite being 30MB from a 3.2MB original.
    Quite likely, I'm using the wrong tool. I don't have a camcorder - I don't even have a TV - so I don't know any of the jargon. What would be the right tool for this task? Stick to QuickTime Pro?
    MacBook   Mac OS X (10.4.8)  

    How would Final Cut Express work for editing QuickTime as above?
    same answer:
    FCE handles DV/NTSC/PAL only.
    For sure, you can import "any format/res", but with loss of quality if the import doesn't fit those standards. And FCE is the mightier tool, meaning you can easily arrange two or many videos in one picture, arrange them, position them, and so on.
    If you want to stay with your original settings, you have to use QT Pro (less convenient) or FCP (less cheap).

  • Import rename and export rename

    Hi,
    I've figured out how to rename images on import to "literal"_"yymmdd"_sequence (and that works fine), but I'd also like to use a directory structure of bucket_{job number}, where job number is a number that I increase from time to time. Can I do that with the filename templates?
    Also (and more importantly!) I need to rename the files on export to the main players' names (I shoot sports) followed by the original yymmdd_sequence number from the original name. I figure if I put the main players' names into the Title IPTC field, I can then put that at the start of the existing file name, but I need a way to use only the yymmdd_sequence part of the original name, and not the original literal.
    This is the naming sequence for most of the published shots, so I'd like to automate it, as it's the one that takes the time!
    I tried to find documentation on the filename templates, but cannot find the detailed stuff. Pointers for a newbie would be appreciated!
    Thanks.
    Merv.

    I have a related issue: My Panasonic camera (DMC GH-1) uses filenames like TDDDNNNN.EEE, where "EEE" is the extension (usually jpg, maybe rw2), T is the type ("P" (sRGB) or "_" (AdobeRGB)), DDD is the directory number (initially "100"), and NNNN is the sequence number. That sounds fine, but there is a problem: the shot following P1000999.jpg is not P1001000.jpg, but P1010000.jpg. That is, only three digits of NNNN are used, and the directory number is incremented after 1000 shots (not after 10000). Well, it could be a firmware bug in the Panasonic, but I'd like to be able to fix those names using sequence numbers like "DDD * 1000 + NNNN". I could not find out how to do that.
    BTW: My older Canon PowerShot Pro 1 used a new directory every 100 shots, but it used all four digits in the file's sequence number (the directory is not part of the image file name there).
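The "DDD * 1000 + NNNN" renumbering the poster wants is easy to sanity-check. A small sketch, assuming the TDDDNNNN.EEE layout described above (the helper function is hypothetical, not a feature of any renaming tool):

```python
def global_sequence(filename):
    """Derive a continuous sequence number from a Panasonic-style
    TDDDNNNN.EEE name, assuming only three digits of NNNN are ever
    used: global = DDD * 1000 + NNNN."""
    stem = filename.split(".")[0]   # drop the extension
    ddd = int(stem[1:4])            # directory number, e.g. 100
    nnnn = int(stem[4:8])           # per-file sequence number
    return ddd * 1000 + nnnn

# The shot after P1000999.jpg is P1010000.jpg; with this formula
# they map to consecutive numbers (100999 and 101000).
```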

  • Import pef and export dng

    I can import .pef files into Elements 5.0 and make changes fine. I would like to know if I can export those altered files to the DNG file format. When I try to export, the DNG file format does not appear. Do I need another plugin? I have already installed the latest (dated 03/02/2007) camera raw.8bi into the c:\program files\adobe\photoshop elements 5.0\plug-ins\file formats directory.

    I am trying to save PEF as DNG in the Organizer portion, not the Editor. When I click on Export I see options for original format, JPG, PNG, TIFF and PSD. I do not see a DNG option. I am also evaluating Lightroom 1.0 and it does have DNG as a save-as option. Is there another plugin I need for Elements 5.0 to save a DNG file?

  • MAXL Import Database (and export data?)

    I have an import database MAXL command that is working great (sql account and password substituted).  The rule file has a SQL connection to our ERP. 
    MAXL> import database PLANNING.DetView data connect as sql_account identified by "Password" using rules_file "'\\\oceanus-d13\\Oracle\\Middleware\\user_projects\\epmsystem3\\EssbaseServer\\essbaseserver1\\app\\PLANNING\\DetView\\NLIncur.rul'" on error write to "'G:\\DB_Process\\DetView_Process\\errors\\DetView_NLIncur_data_load.mxl.err'";
    We are going to run this on a nightly basis and our requirement is to have a historical record of what was "pulled" from our ERP (even if it didn't make it in to Hyperion).
    Here are the options that I've thought of:
    Have a separate SSIS package use the same SQL to dump the data out of the ERP database into a flat file.  The downside to this is that I'm using subvars from Essbase that don't exist in SQL.  So I'd have to write the SQL in a different way potentially causing discrepancies.
    A separate Data Export of what was loaded.  The downside here is that it doesn't have the errors that never made it in.
    I can't find another option in the import database command to spool the output of the SQL to a file. 
    Does anybody have thoughts on how to do this or are there other ways to achieve what I'm trying to do?
    Thanks-
    Cameron McClurg

    The SQL run via the load rule understands my subvars.  The load rule SQL looks like this:
    select...
    WHERE
      AND (('1' = '&IncSpanYr' --IF THE PERIODS SPAN YEARS
      AND (('12' = '&CurMoNm' --IN JUNE LOOK INTO NEXT YEAR PERIOD 1
      AND ((A.FISCAL_YEAR = '&CurYrNm' AND A.ACCOUNTING_PERIOD IN ('&PriMoNm', '&CurMoNm'))
    This turns into:
    WHERE
      AND (('1' = '1' --IF THE PERIODS SPAN YEARS
      AND (('12' = '12' --IN JUNE LOOK INTO NEXT YEAR PERIOD 1
      AND ((A.FISCAL_YEAR = '2014' AND A.ACCOUNTING_PERIOD IN ('2', '3'))
    My issue is getting a dump of all that data that the SQL is pulling.  I don't see a way to do that in the import database command.  A separate job on the SQL side (like SSIS) could pull the data, but can't access the subvars easily.  Does that make sense?  Or did I misunderstand your suggestion?
    Thanks-
    Cameron
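The expansion the load rule performs on those &-prefixed substitution variables is plain token replacement, so if a historical record of the exact SQL is needed, the same expansion could be reproduced outside Essbase before archiving it. A sketch (variable names taken from the post; the helper itself is hypothetical):

```python
def expand_subvars(sql, subvars):
    """Replace &Name tokens with their current values, longest names
    first so a long variable is not clobbered by a shorter prefix."""
    for name in sorted(subvars, key=len, reverse=True):
        sql = sql.replace("&" + name, subvars[name])
    return sql

template = "A.FISCAL_YEAR = '&CurYrNm' AND A.ACCOUNTING_PERIOD IN ('&PriMoNm', '&CurMoNm')"
expanded = expand_subvars(template, {"CurYrNm": "2014", "PriMoNm": "2", "CurMoNm": "3"})
```

Running the expanded text through a separate SQL job would then pull exactly what the load rule pulled, without having to re-derive the subvar logic on the SQL side.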

  • How do I get my videos imported 1080, and exported as 1080?

    I have no idea what I am doing wrong. I have a GoPro Hero3 Black Edition. I import into iMovie, make my movie, and whenever I want to export to YouTube, QuickTime, as a movie, etc., it never gives me a 1080 option. I upload it to YouTube and the 720 looks like crap. Grainy and not what I see when I am editing the movie. I want to be able to have it be clear in 1080 but this is really getting on my nerves...
    Anyone?

    Oh, unless you shot tall.
    If you shoot vertical/tall video it cannot be zoomed all the way out on export.
    Shoot wide, or edit in a different editing app.

  • Import AVCHD and Export in Full 1080 HD

    Hello,
    I have a Sony AVCHD Handycam that records to a memory stick. Up until now, I have been using iMovie 08 on my iMac Intel Core Duo, but I am limited to iMovie's 'Large' (960x540) setting for my export options. I have tried exporting using QuickTime (tried several different option settings), but for whatever reason I cannot get a good result... it produces jerky and unusable video. The 960x540 looks OK, but I am trying to preserve my original 1080 HD resolution.
    Would Final Cut Express be the answer, or does it use the same type of export settings as iMovie 08?
    Thanks,
    Kenton

    You're probably exporting to a format and data rate that's too high for your computer's capabilities. The exporting application will make no difference. If your drives aren't fast enough to sustain high definition material at a high data rate it's always going to look choppy when played back full screen.

  • Function module which imports belnr and exports vbeln?

    hi all,
    Can anyone tell me if there is a function module to which I can pass an accounting document (belnr) and get the corresponding billing document (vbeln)?

    You can also try:
      DATA: v_reference    TYPE bkpf-awtyp,           "Reference procedure
            v_objectkey    TYPE bkpf-awkey.           "Object key
        v_objectkey = billing_doc.       
        v_reference  = 'VBRK'.
        SELECT SINGLE bukrs belnr gjahr               "Accounting Doc Header
          FROM bkpf
          INTO doc_int
          WHERE awtyp =  v_reference
            AND awkey =  v_objectkey.
    Rob

  • Can the screen be cropped to and exported at a custom aspect ratio?

    I think I'm needing Premiere to do something it's just not designed to do. What I need to do is to take five or six different screen and audio captures of the same event (a 3D virtual reality instructional session), each representing a different person's view of that event, assemble those in a split screen, synchronize them to the designated "main" capture's audio, delete the other audio tracks, and add CG titles labeling each one. The effect is like looking at a display showing six security camera images. From what I can determine, it is possible to do those things in Premiere.
    However, what I then need to do is to crop out any extra space to the left and right of that split-screen image and export the file as a .mov with the aspect ratio produced by the cropping so that the videos are as large as possible and are not distorted from resizing them to fit the standard aspect ratio. Depending on the size of the screencaps, the resulting .mov might have nearly a 1:1 ratio. What is critical for this is that the video be as large as possible and the "black" space be as small as possible so that the person who comes in later to analyze these composite videos can see everything  as clearly as possible. I also need to crop on the fly; because these images aren't always captured at the same size, it is impossible to preset a size for the resulting video.
    We've been using a screen capture tool to do this that works but is a pain; clips can only be moved and resized by mouse-drags, and the software has to import to and export from its own proprietary format, which is very slow. However, every piece of "real" video editing software I've looked at only seems to export in TV or film aspect ratios. QuickTime Pro looked for a bit like the best bet, but it doesn't seem to be able to handle multiple audio tracks (which are necessary since I have to synchronize tracks by ear much of the time).
    Is what I'm describing--exporting .mov files while preserving custom aspect ratios that are created by cropping during editing--something Premiere can do? If so, would it be something fairly easy for complete novices to do?

    medeamajic wrote:
    On a Mac based system ScreenFlow might work best but on the PC side FRAPS might work best. You can do what you want to do with Premiere Pro once you record the screen capture. As Stephen_Spider mentioned you might need to crop and even resize the images. FRAPS can record at 1/4 the screen resolution and still have decent results. PP CS 5.5 can play several layers (6 PIPs) of the FRAPS codec at 1/4 resolution in realtime.
    Thanks, but we cannot change the screen recording process. According to the study's protocols, the virtual world has to be full-screen or very nearly so during the session, and that full screen has to be captured for analysis. Resizing the screen captures is not a problem (especially since I already said I'm doing that), but if you're saying that Premiere can only play back up to six screen captures simultaneously if they're captured at 1/4 screen, then that's a deal-breaker right there.
    By the way, here's approximately what each of the completed composite videos looks like:
    Screencap 1
    Screencap 2
    Screencap 3
    Screencap 4
    Screencap 5
    Screencap 6
    Each screen capture is resized to the largest consistent size that will fit into this format, and then the resulting video is cropped to the outside border. If there are only five screen captures, I simply center the single one on the bottom on the "center line."
    And, to be clear (though I've said this several times already), this is something that is already being done. We probably have more than 60 of these .mov files, each around 45 minutes long, with five or six synchronized screen captures in each. Frame rate etc. has not been any issue with these, and neither has playback of the .mov file from HD or DVD.
    Message was edited by: singerm2

  • How to enable Validate and Export in FDM task

    Hi everyone
    I am working with setting up an FDM Task to Import, Validate and Export data to HFM.
    The Task is able to import data to FDM, but I cannot make the process start the Validate and Export functions. If I do the process manually, then data is imported to HFM successfully.
    Am I missing something in the scripts or do I need to enable some feature in FDM?
    Action Script:
    Sub BatchLoadLedgerTrans()
    'Declare Local Variables
    Dim lngProcessLevel
    Dim strDelimiter
    Dim blnAutoMapCorrect
    'Initialize Variables
    lngProcessLevel = 50   'Import
    strDelimiter = "_"
    blnAutoMapCorrect = 0
    'Create the file collection
    Set BATCHENG.PcolFiles = BATCHENG.fFileCollectionCreate(CStr(strDelimiter), FileName)
    'Execute a Standard Serial batch
    BATCHENG.mFileCollectionProcess BATCHENG.PcolFiles, CLng(lngProcessLevel), , CBool(blnAutoMapCorrect)
    End Sub
    Integration Script:
    Function LedgerTrans_EDW_to_HFM(strLoc, lngCatKey, dblPerKey, strWorkTableName)
    Dim cnSS 'ADODB.Connection (declared as cnSS to match the Set/open calls below)
    Dim strSQL 'SQL String
    Dim rs 'Recordset
    Dim rsAppend 'tTB table append rs Object
    'Initialize objects
    Set cnSS = CreateObject("ADODB.Connection")
    Set rs = CreateObject("ADODB.Recordset")
    Set rsAppend = DW.DataAccess.farsTable(strWorkTableName)
    cnss.open "Provider=SQLNCLI11; Data Source=10.250.200.10; Initial Catalog=EDW; User ID=FDM-user; Password=MyPassword"
    cnss.CommandTimeout = 0
    'Create query String - VIEW
    strSQL = "Select * "
    strSQL = strSQL & "FROM EDW.hfm.ledgertranstable "
    'Get data
    rs.Open strSQL, cnSS
    'Check For data
    If rs.bof And rs.eof Then
    RES.PlngActionType = 2
    RES.PstrActionValue = "No Records To load!"
    Exit Function
    End If
    'Loop through records And append To tTB table In location’s DB
    If Not rs.bof And Not rs.eof Then
    Do While Not rs.eof
    rsAppend.AddNew
    rsAppend.Fields("PartitionKey") = RES.PlngLocKey
    rsAppend.Fields("CatKey") = lngCatKey ' PlngCatKey
    rsAppend.Fields("PeriodKey") = dblPerKey ' PlngCatKey
    rsAppend.Fields("DataView") = "YTD"
    rsAppend.Fields("CalcAcctType") = 9
    rsAppend.Fields("Account") = rs.fields(2).Value
    rsAppend.Fields("Desc1") = rs.fields(2).Value
    rsAppend.Fields("Entity") = rs.fields(6).Value
    rsAppend.Fields("ICP") = rs.fields(8).Value
    rsAppend.Fields("Amount") = rs.fields(14).Value
    rsAppend.Fields("UD1") = rs.fields(5).Value
    rsAppend.Fields("UD2") = rs.fields(9).Value
    rsAppend.Fields("UD3") = rs.fields(10).Value
    rsAppend.Fields("UD4") = rs.fields(7).Value
    rsAppend.Fields("UD5") = rs.fields(13).Value
    rsAppend.Update
    rs.movenext
    Loop
    End If
    'Records loaded
    RES.PlngActionType = 6
    RES.PstrActionValue = "SQL Import successful!"
    'Assign Return value
    LedgerTrans_EDW_to_HFM = True
    End Function
    Best regards
    Frederik
    Edited by: Frederik Andersen on Jun 1, 2013 1:32 AM

    Found the solution.
    The variable lngProcessLevel should be set to "12" to execute the full Import, Validate and Export.
    Best regards
    Frederik

  • Import-CSV and Takeown.

    I'm currently trying to come up with a way to search an entire folder directory to find all objects that a particular user owns, export that list to CSV, then import that CSV and use takeown to grant ownership to the local Administrators group.
    Scouring the internet, I've been able to come up with this Powershell script to scan a directory and export the findings to a csv.
    param(
        [string]$username = "NameofUserSearchingFor",
        [string]$logfile
    )
    Set-ExecutionPolicy Unrestricted
    if ($logfile -eq "") {
        $logfile = "c:\" + $username + "-Owner.csv"
        Write-Host "Setting log file to $logfile"
    }
    #Path to search in
    [String]$path = "c:\TestFolder"
    [String]$AD_username = "Domain\" + $username
    #check that we have a valid AD user
    if (!(Get-ADUser $AD_username)){
        Write-Host "ERROR: Not a valid AD User: $AD_username"
        Exit 0
    }
    Write-Output "$username" | Out-File -FilePath $logfile
    $files = Get-ChildItem $path -Recurse
    ForEach ($file in $files) {
        $f = Get-Acl $file.FullName
        if ($f.Owner -eq $AD_username) {
            Write-Output $file.FullName | Out-File -FilePath $logfile -Append
        }
    }
    exit 0
    That script exports data in the form of:
    NameofUserSearchingFor
    C:\TestFolder\TestFolder1
    C:\TestFolder\TestFolder2
    C:\TestFolder\TestFolder1\test1.txt
    C:\TestFolder\TestFolder2\test2.txt
    I'd like to use takeown to read each line of text and take ownership for local Administrators.  
    The script I'm trying to use doesn't do anything, though.
    #Local Administrator Take Ownership
    $rows = Import-Csv "c:\NameofUserSearchingFor-Owner.csv"
    ForEach ($row in $rows) {
        takeown /A /F $row
    }
    Perhaps I'm going about this all wrong.  I'm relatively new to Powershell and have been trying to come up with a way to do this for the past 3 days.
    Any assistance would be greatly appreciated!
    *Update*
    If I reconfigure the takeown portion a bit to this:
    #Local Administrator Take Ownership
    $Path = "c:\NameofUserSearchingFor-Owner.csv"
    $rows = Import-Csv $Path
    ForEach ($line in $rows) {
        takeown /A /F $Path
    }
    The result is:
    SUCCESS: The file (or folder): "c:\NameofUserSearchingFor-Owner.csv" now owned by the administrators group.
    But it will repeat as many times as there are lines of text in the CSV. So if there are 4 lines of text in the CSV, that line will repeat 4 times. I find it interesting that it knows how many lines there are, but instead of granting the local Administrators group ownership of the path specified in each line, it grants it ownership of the CSV file itself.

    That is because you are taking ownership of the CSV file itself.
    ForEach ($line in $rows){
        takeown /A /F $Path
    }
    should be:
    ForEach ($line in $rows){
        takeown /A /F $line
    }
    ¯\_(ツ)_/¯
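One more wrinkle worth noting: Import-Csv yields row objects keyed by the header line, not bare strings, so what gets passed to takeown should be the field value of each row, never the path of the CSV itself. The record-vs-file distinction, sketched in Python using the hypothetical log layout from the post (the first line, the username, acts as the header):

```python
import csv
import io

# The log file's first line is the username (acting as a CSV header),
# followed by one owned path per line.
content = """NameofUserSearchingFor
C:\\TestFolder\\TestFolder1
C:\\TestFolder\\TestFolder1\\test1.txt
"""

commands = []
reader = csv.DictReader(io.StringIO(content))
for row in reader:
    # Passing the log file's own path here would re-take ownership of
    # the CSV once per row; the per-row field value is what takeown needs.
    commands.append(["takeown", "/A", "/F", row["NameofUserSearchingFor"]])
```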

  • Querying SQL db based off of values in a CSV and exporting filtered list to a new CSV

    Hello, we use Office 365 for our student e-mail accounts and are trying to appropriately license those accounts based on whether they are currently enrolled or not (currently enrolled get the full license with OneDrive, SharePoint, Lync, Office download etc., while non-currently enrolled get e-mail only). I have exported a list of UserPrincipalNames from Office 365 to a CSV that have a particular license applied. I want to compare that list against a SQL view that will identify if those UserPrincipalNames (found under the "Email" column in the SQL database) are currently registered for classes or not (we have a "CurrentlyRegistered" column populated with a "0" for not registered and a "1" for registered).
    I then need to export the filtered list of UserPrincipalNames to a different CSV so that I can use that second CSV to change the license in Office 365.
    For example, I export a list of accounts from Office 365 that have the e-mail-only license applied to a CSV named "Exchange.csv" that has a single column of data with a header of "UserPrincipalName". I want to compare the accounts in "Exchange.csv" to our SQL view (which, among other columns, has one named "Email" that matches the values of "UserPrincipalName", and a "CurrentlyRegistered" column) to see if any have changed status to being currently enrolled, so that I can flip their license to the full license. I would know this by returning any accounts that have a "1" in the "CurrentlyRegistered" column and saving those to a CSV named "Full.csv" that would have two columns with headers "Email" and "CurrentlyRegistered".
    I need to know how to connect to SQL from PowerShell and return the filtered list of accounts that have changed status. I am hoping for something like:
    $constring = "Server=MyServer\Instance;Database=StudentEmails;Trusted_Connection=True"
    $SqlConnection = New-Object System.Data.SqlClient.SqlConnection
    $SqlConnection.ConnectionString = $constring
    $SqlCommand = New-Object System.Data.SqlClient.SqlCommand
    $SqlCommand.CommandText = $query
    $SqlCommand.Connection = $SqlConnection
    $SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
    $SqlAdapter.SelectCommand = $SqlCommand
    $DataSet = New-Object System.Data.DataSet
    $SqlConnection.Open()
    $SqlAdapter.Fill($DataSet)
    $SqlConnection.Close()
    $DataSet.Tables[0]
    Remove-Item C:\O365Operations\CSVs\Full.csv
    $query = Import-Csv C:\O365Operations\CSVs\Exchange.csv | ForEach-Object {Invoke-Sqlcmd -Query "SELECT [Email],[CurrentlyRegistered] FROM [StudentEmails].[dbo].[v_StudentEmailAccounts] WHERE CurrentlyRegistered = '1'"} | Export-Csv C:\O365Operations\CSVs\Full.csv -Append
    Am I anywhere close with this?
    Thanks,
    Jeremy Hawks
    Systems Administrator @ Green River College

    My CSV has a single column, with the #TYPE line first, the header of "UserPrincipalName" on the second line, and each remaining line holding a single e-mail address, similar to:
    #TYPE Selected.Microsoft.Online.Administration.User
    UserPrincipalName
    [email protected]
    [email protected]
    [email protected]
    The "Email" column in SQL will have some matching values to the "UserPrincipalName" column in the CSV. 
    So if any of the (UserPrincipalName in CSV) values match the values found in the column (Email in SQL) then I want to export those e-mail addresses to a CSV so that I can use the new CSV to change the licenses in Office 365.
    When I run your modified code I get the following:
    PS C:\Windows\system32> $csv=Import-Csv C:\O365Operations\CSVs\Test.csv
    $csv|select email
    foreach($User in $csv){
         # find user
         if($DataTable|?{$_.Email -eq $user.Email}){
             Write-Host "User found: $($user.email)" -fore green
             #Select Email,CurrentlyRegistered | Export-Csv C:\O365Operations\CSVs\IW2.csv -Append
         }else{
            Write-Host "User NOT found: $($user.email)" -fore red
         }
    }
    email
    User NOT found:
    User NOT found:
    User NOT found:
    (... "User NOT found:" repeated once per row in the CSV, with an empty address each time ...)
    PS C:\Windows\system32>
    If I change the command from
    $csv|select email
    to
    $csv|select UserPrincipalName
    Then I get a full listing of the e-mails in the CSV with none filtered out (so it is not comparing them to the returned data from the SQL table).
    Suggestions?
    Jeremy Hawks Systems Administrator @ GRC
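Whatever fixes the SQL side, the comparison itself is just a membership test between the CSV's UserPrincipalName values and the view's registered Email values. The join logic, sketched in Python with column and file shapes from the post but entirely hypothetical sample data:

```python
import csv
import io

# Stand-ins for Exchange.csv and the rows returned from the SQL view.
exchange_csv = "UserPrincipalName\nalice@example.edu\nbob@example.edu\n"
sql_rows = [{"Email": "alice@example.edu", "CurrentlyRegistered": "1"},
            {"Email": "bob@example.edu", "CurrentlyRegistered": "0"}]

# Index the currently-registered addresses from the view.
registered = {r["Email"] for r in sql_rows if r["CurrentlyRegistered"] == "1"}

# Keep only the CSV accounts that are now registered, in Full.csv's shape.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["Email", "CurrentlyRegistered"])
writer.writeheader()
for row in csv.DictReader(io.StringIO(exchange_csv)):
    if row["UserPrincipalName"] in registered:
        writer.writerow({"Email": row["UserPrincipalName"],
                         "CurrentlyRegistered": "1"})
full_csv = out.getvalue()
```

Note the set is built from the SQL column ("Email") and probed with the CSV column ("UserPrincipalName"); if either side is empty, as in the output pasted above, every lookup misses.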

  • Import-csv consisting of permissions and users

    I have exported a CSV via PowerShell and I am looking to use that CSV to import and edit some AD objects. Any assistance would be appreciated. This is what I have cobbled together so far:
    Get-ReceiveConnector <RelayName> | import-csv c:\scripts\permissions.csv | % {Add-ADPermission –User $_.user –ExtendedRights $_.Permissions}
    TIA

    When I use your Get-ReceiveConnector script, my CSV file looks like this:
    PSComputerName
    RunspaceId
    AccessRights
    ExtendedRights
    ChildObjectTypes
    InheritedObjectType
    Properties
    Deny
    InheritanceType
    User
    Identity
    IsInherited
    IsValid
    exchangereceive.com
    d767dfd3-b03d-4fa9-8c78-b38a80dda06c
    System.DirectoryServices.ActiveDirectoryRights[]
    Microsoft.Exchange.Configuration.Tasks.ExtendedRightIdParameter[]
    TRUE
    All
    DOMAIN1\ExchangeLegacyInterop
    exchangerecev\Default receive
    FALSE
    TRUE
    If I were to use your second script, wouldn't it have to be modified so that the user that requires the rights is included?
    That is because it is a real CSV file. It needs to be loaded with Import-Csv.
    ¯\_(ツ)_/¯

  • Import CSV, Edit Data, Export CSV

    Hello,
    I am a powershell novice and have been given a task to complete. I need to scan through AD, find all of the servers that are supposed to be active, "ping" them and, if active, get their IP address and verify that a number of services are running.
    Once all of the data is collected, I need to write all of this out to a .CSV file. So far, what I have is this:
    Using Import-Module ActiveDirectory, I am able to get the full list of servers with appropriate information and export that to a CSV
    I import the CSV file above back in and create new headers (columns) for more information
    The full list of servers are pinged individually. If pingable, I get the IP address and service information needed
    Where I am stuck is writing all of this back out to a .CSV file. Right now, I am just writing the information back out to the screen for verification. I'm sure it's something small, but being a novice I'm not sure where to go. Any help?

    Here is the full code. The Write-Host statements are just for me during testing to see if the information is correct. The last statement was what I tried to write the information out with, but it came up empty and I really haven't found a way to get it to work.
    Import-Module ActiveDirectory
    Get-ADComputer -LDAPFilter "(OperatingSystem=*Server*)" -Property * |
        Select-Object Name,Created,LastLogonDate,OperatingSystem,OperatingSystemServicePack,OperatingSystemVersion,CanonicalName |
        ? {$_.CanonicalName -notlike "*Archived Servers*" -and $_.CanonicalName -notlike "*Sun Prod*"} |
        Export-CSV "c:\temp\AllADComputers4.CSV" -NoTypeInformation -Encoding UTF8
    #Add other headers
    $csv = Get-Content "c:\temp\AllADComputers4.CSV"
    $csv[0] += ",Pingable,IPAddress,SCCM Status,AV Name,AV Version,AV Status"
    $csv | Out-File "c:\temp\AllADComputers4.CSV"
    $MachineInfo = Import-Csv -Path "c:\temp\AllADComputers4.CSV"
    $Processed = ForEach ($objitem in $MachineInfo) {
        If (Test-Connection -ComputerName $Objitem.Name -Quiet -Count 1) {
            If (Get-Service -Name "Symantec AntiVirus" -ComputerName $Objitem.Name) {
                $AVStatus = Get-Service -Name "Symantec AntiVirus" -ComputerName $Objitem.Name
                $AVName = "Symantec AntiVirus"
            } ElseIf (Get-Service -Name SepMasterService -ComputerName $Objitem.Name) {
                $AVStatus = Get-Service -Name SepMasterService -ComputerName $Objitem.Name
                $AVName = "SepMasterService"
            } ElseIf (Get-Service -Name "Norton AntiVirus Server" -ComputerName $Objitem.Name) {
                $AVStatus = Get-Service -Name "Norton AntiVirus Server" -ComputerName $Objitem.Name
                $AVName = "Norton AntiVirus Server"
            } Else {
                $AVStatus = "Stopped"
                $AVName = "None"
            }
            $SCCMAgentStatus = Get-Service -Name ccmexec -ComputerName $Objitem.Name
            $Networks = Get-WmiObject Win32_NetworkAdapterConfiguration -ComputerName $Objitem.Name | ? {$_.IPEnabled}
            $objitem.Pingable = "Yes"
            foreach ($Network in $Networks) {
                $IPAddress = $Network.IpAddress[0]
            }
            Write-Host $Objitem.Name,"...Success...",$IPAddress,"   SCCM Status = ",$SCCMAgentStatus.Status,"Anti-Virus = ",$AVName,"Status = ",$AVStatus.Status
            $service = gwmi -ComputerName $objItem.Name -Class Win32_Service | ? {$_.Name -eq $AVName}
            $path = ($service | Select -Expand PathName)
            $path = $path.Replace(":","$")
            $path = $path.Replace('"','')
            $FullPath = "\\" + $objitem.Name + "\" + $path
            $ver = Get-ItemProperty -Path $FullPath | Select-Object -Expand VersionInfo | Select-Object -Expand ProductVersion
            Write-Host $FullPath
            Write-Host "Version = ",$ver
        } Else {
            Write-Host $Objitem.Name"...Failed"
            $objitem.Pingable = "No"
        }
    }
    $Processed | Export-Csv -Path "c:\temp\AllADProcessed.CSV"
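As to why the final export comes up empty: Write-Host prints to the console without emitting anything into the pipeline, so $Processed never collects any objects. The general fix is to build one record per server inside the loop and write the whole collection once at the end. That pattern, sketched in Python with field names from the post and entirely hypothetical server data:

```python
import csv
import io

# Hypothetical per-server results gathered during the loop.
servers = [{"Name": "SRV1", "up": True, "ip": "10.0.0.5"},
           {"Name": "SRV2", "up": False, "ip": ""}]

records = []
for s in servers:
    # Instead of printing, emit one record per machine...
    records.append({"Name": s["Name"],
                    "Pingable": "Yes" if s["up"] else "No",
                    "IPAddress": s["ip"]})

# ...and export the whole collection once at the end.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["Name", "Pingable", "IPAddress"])
writer.writeheader()
writer.writerows(records)
processed_csv = out.getvalue()
```

In PowerShell terms, that means emitting an object (e.g. the updated $objitem) as the last statement of each loop iteration rather than calling Write-Host, so the ForEach assignment actually captures it.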

  • Need to import csv and manipulate one column

    I have a monthly report output by an FTP program into CSV format. The CSV file has headers and 3 columns:
    1) a group name
    2) the number of transfers that group made
    3) the total size of all their transfers (in bytes)
    Example (with headers and 6 rows of data for 7 rows total):
    "monthly transfers","sum of monthly transfers","sum of file size"
    "group1 (Prod to Dev)","26","11556381672"
    "group2 (Prod to Dev)","5","348197396"
    "group3 (Prod to Dev)","14","1272913379"
    "group1 (Dev to Prod)","0","0"
    "group2 (Dev to Prod)","1976","2036311426"
    "group3 (Dev to Prod)","1","131"
    I have been asked to write a script that will convert the bytes to megabytes and then sort based on group name and export the new data so auditors (non-technical people) can review it.
    I've worked with Excel a bunch in PowerShell (i.e. opening a COM object and writing directly to a cell) and I'd prefer to do it that way, but the server this script needs to run on doesn't have MS Office installed and can't have anything else installed on it, so I'm trying to learn about Import-Csv and such. I've come up with this:
    $file = "C:\ftp\ftp_report.csv"
    $csv = import-csv $file
    $csv | sort "monthly transfers"
    The part I'm stuck on is converting that third column from bytes to megabytes. I know that in Powershell I can do "X / 1mb" to convert a number from bytes to megabytes, but I'm not sure of how best to do this in a specific column of a CSV file.
    Should I create a hash table? Should I count the number of rows and then do a "foreach" on
    $csv[0].'sum of file size' / 1mb
    Am I overthinking this and there's a really easy way to manipulate all values in a column?
    Thank you in advance.
    [email protected]

    Probably like this, right at the parameter:
    ("{0:N2}" -f ($row.'sum of file size' / 1mb))
    There's no place like 127.0.0.1
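The whole transformation (convert the third column to megabytes, then sort by group name) fits in a few lines; here it is sketched in Python using a subset of the sample rows from the post:

```python
import csv
import io

report = '''"monthly transfers","sum of monthly transfers","sum of file size"
"group1 (Prod to Dev)","26","11556381672"
"group2 (Prod to Dev)","5","348197396"
"group1 (Dev to Prod)","0","0"
'''

rows = list(csv.DictReader(io.StringIO(report)))
for row in rows:
    # Same arithmetic as PowerShell's  X / 1mb,  formatted to two places.
    mb = int(row["sum of file size"]) / 1048576
    row["sum of file size"] = "{:.2f}".format(mb)

# Sort on the group-name column before writing back out.
rows.sort(key=lambda r: r["monthly transfers"])
```

The same per-row loop-and-overwrite works in PowerShell with foreach over the imported rows, assigning the formatted value back to each row's property before piping to Export-Csv; no hash table or row counting is needed.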
