How do I compare two CSV files and not disable the user if the username is found in the 2nd file, using PowerShell?

Hi Guys
I have two CSV files with the headers listed below. In my DisableAccount script I need to import both files and check whether the StaffCode from the termination file is also present in the Creation/Renewal of Contract file; if it is,
the staff member has renewed their contract with the company, so the account should not be disabled.
However, my accounts are still being disabled. I am not sure how to construct the check so that the account is left alone when the StaffCode is present in both files.
The script does recognise that $StaffCodeN in the renewal file matches $StaffCode in the termination file,
but it still proceeds to disable the account or set an expiry date anyway, based on the termination file.
How do I stop it from doing that?
1) In the Creation/Renewal of Contract file the following headers are present:
     -  TranCode,StaffCode,LastName,FirstName,SocialSecurityNo,DateJoin,Grade,Dept,LastUpdateDate,EffectiveDate
2) In the Disable of Contract file the following headers are present:
    - TranCode,StaffCode,LastName,FirstName,SocialSecurityno,LastDateWorked,Grade,Dept,LastUpdateDate,
My data is not very clean: I have a lot of special characters such as =, ', / and \ to remove first before I can compare the data.
Thanks for the help in advance.
Yours sincerely,
Vicki
The following is a short snippet of the code 
$opencsv  = Import-Csv "D:\scripts\Termination.csv"
$opencsv2 = Import-Csv "D:\scripts\RenewContractandNewStaff.csv"

# (snippet only - variables such as $samaccountname, $UserMailforwarding, $displayname,
#  $accountexpirydate, $todaysdate and $ChangeUser are set elsewhere in the full script)
foreach ($usertoaction in $opencsv)
{
    $Trancode         = $usertoaction.TranCode
    $StaffCode        = $usertoaction.StaffCode.Replace("=","").Replace('"','')
    $LastName         = [string]$usertoaction.LastName.Replace("/","\/").Replace(",","\,")
    $FirstName        = [string]$usertoaction.FirstName.Replace("/","\/").Replace(",","\,")
    $SocialSecurityNo = $usertoaction.SocialSecurityNo.Replace("=","").Replace('"','')
    $DateJoin         = $usertoaction.DateJoin.Replace("=","").Replace('"','')
    $LastDateWorked   = $usertoaction.LastDateWorked.Replace("=","").Replace('"','')
    $Grade            = [string]$usertoaction.Grade
    $Dept             = [string]$usertoaction.Dept
    $LastUpdateDate   = $usertoaction.LastUpdateDate.Replace("=","").Replace('"','')
    $AccountExpiry    = [datetime]::Now.ToString($LastDateWorked)

    foreach ($usertoaction2 in $opencsv2)
    {
        $TrancodeN         = $usertoaction2.TranCode
        $StaffCodeN        = $usertoaction2.StaffCode.Replace("=","").Replace('"','')
        $SocialSecurityNoN = $usertoaction2.SocialSecurityNo.Replace("=","").Replace('"','')
        $DateJoinN         = $usertoaction2.DateJoin.Replace("=","").Replace('"','')
        $GradeN            = [string]$usertoaction2.Grade
        $DeptN             = $usertoaction2.Dept
        $LastUpdateDate    = $usertoaction.LastUpdateDate.Replace("=","").Replace('"','')
        $EffectiveDate     = $usertoaction.EffectiveDate.Replace("=","").Replace('"','')
        $LastName2         = [string]$usertoaction2.LastName.Replace(",","").Replace("/","").Trim()
        $FirstName2        = [string]$usertoaction2.FirstName.Replace("/","").Trim()

        # Use DirectorySearcher to find the DN of the user from the sAMAccountName.
        $Domain   = [System.DirectoryServices.ActiveDirectory.Domain]::GetCurrentDomain()
        $Root     = $Domain.GetDirectoryEntry()
        $Searcher = [System.DirectoryServices.DirectorySearcher]$Root
        $Searcher.Filter = "(sAMAccountName=$samaccountname)"
        $doesuserexist1 = $Searcher.FindAll()

        if ($doesuserexist1 -eq $null)
        {
            Write-Host $samaccountname "account does not exist"
        }
        elseif ($StaffCodeN -match $StaffCode)
        {
            Write-Host "user has renewed the contract, no action taken"
        }
        else
        {
            if (($LastUpdateDate -ne $null) -or ($LastDateWorked -ne $null))
            {
                Write-Host "Setting Account Expiry to" $accountexpirydate
                # $ChangeUser.AccountExpires = $accountexpirydate
                # $ChangeUser.SetInfo()
            }
            if ($UserMailforwarding -ne $null)
            {
                # Set account expiry date to Last Date Worked
                # $ChangeUser.AccountExpires = $accountexpirydate
                # $ChangeUser.SetInfo()
                Write-Host "staff" $displayname "with staff employee no" $samaccountname "has mailforwarding"
                Write-Host "Please disable the account manually via Active Directory Users & Computers"
            }
            elseif ($accountexpirydate -lt $todaysdate)
            {
                # disable the account
            }
        }
    }
}

Hi Vicki,
This forum has an insert-codeblock function. Using it will make your script far more readable.
Your script is missing some parts, which makes the problem hard to follow.
You are performing the same string cleaning action on $opencsv2 for each element in $opencsv, when doing it once should suffice. Why not start it all by cleaning the values and storing the cleaned values in new arrays?
The Compare-Object cmdlet is great, why not take it out for a stroll on these lists? It might just save you lots of unnecessarily complicated code ...
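Something along these lines, for example (a rough, untested sketch that assumes StaffCode is the only column you actually need for the skip/disable decision):

$terminations = Import-Csv "D:\scripts\Termination.csv" |
    ForEach-Object { $_.StaffCode = $_.StaffCode.Replace("=","").Replace('"','').Trim(); $_ }
$renewals = Import-Csv "D:\scripts\RenewContractandNewStaff.csv" |
    ForEach-Object { $_.StaffCode = $_.StaffCode.Replace("=","").Replace('"','').Trim(); $_ }

# quick look at which staff codes appear in only one of the two files
Compare-Object ($terminations | Select-Object -ExpandProperty StaffCode) ($renewals | Select-Object -ExpandProperty StaffCode)

# build a lookup of renewed staff codes once, then test each termination record against it
$renewedCodes = @{}
foreach ($r in $renewals) { $renewedCodes[$r.StaffCode] = $true }

foreach ($user in $terminations)
{
    if ($renewedCodes.ContainsKey($user.StaffCode))
    {
        Write-Host $user.StaffCode "has renewed the contract, no action taken"
        continue   # skip straight to the next termination record
    }

    Write-Host $user.StaffCode "was not found in the renewal file, proceeding with expiry/disable"
    # ... your existing DirectorySearcher / disable logic goes here ...
}

That way each termination record is checked against the renewal list exactly once, and nothing below the continue statement can run for a renewed staff member.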
You are creating a new $Domain, $Root and $Searcher object each iteration, when doing it once should suffice. Probably not much of a time-saver, but every little thing contributes.
Try pinpointing the problem by doing extensive logging: not only write which action was taken, but also write the individual pieces of information (variables, mostly) before each evaluation occurs. Your if/elseif/else looks sound, so if it's still not doing what you
want, the incoming data must be different from what you think should be there.
Cheers,
Fred
There's no place like 127.0.0.1

Similar Messages

  • How would I merge two csv files in Powershell

    Hello,
    I have two CSV Files.  Each has a list of virtual machines in them. One CSV file has the name of the VM as well as details about the RAM, CPU, VM Details (from VMM).  The other CSV File has the name of the VM as well as the IP address.
    How would I merge these two together? For any VMs that exist in only one of the files, I wouldn't want to drop those VM names either.
    Thanks.

    Thank you mjolinor for your suggestion.  This is not quite working for me.  What I'm seeing in the csv3.csv file is all the column details from the $firstCSV; the headings from the $secondCSV are included, but all the details
    under those headings from the $secondCSV are empty.  Here is some more information for you:
    My $firstCSV file has the following headings:
    VMName  ChargedWA  BillingWA  VMOperatingSystem  VMHost  vCPUs  MemoryGB  StorageGB 
    My $secondCSV file has the following headings:
    VMName  IPAddress  SubnetMask  Gateway  DNSServers  MACAddress  device0  Totalspce0  freespce0  device1  Totalspce1  freespce1  device2  Totalspce2  freespce2
    Each CSV input file has in common the column of VMName.  Most values in VMName are common to both input files but there are some values in VMName that are not in the other input file.  I'm looking to find the following:
    When the same VMname exists in both files, write out all columns for that record from both files into the merged file.
    When a VM name is in $firstCSV but not the second...still write out this record with the values from the $firstCSV.  The column headings from the $secondCSV would be empty for this record.
    When a VM name is in $secondCSV but not the first...still write out this record with the values from the $secondCSV.  The column headings from the $firstCSV would be empty for this record.
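    In other words, roughly this kind of full outer join on VMName is what I'm after. A rough, untested sketch (assuming PowerShell 3.0 or later; the file paths are just placeholders):
    $first  = Import-Csv C:\firstCSV.csv
    $second = Import-Csv C:\secondCSV.csv

    # index the second file by VMName
    $secondByName = @{}
    foreach ($row in $second) { $secondByName[$row.VMName] = $row }

    # column names other than the join key
    $firstProps  = $first[0].PSObject.Properties.Name  | Where-Object { $_ -ne 'VMName' }
    $secondProps = $second[0].PSObject.Properties.Name | Where-Object { $_ -ne 'VMName' }

    $merged = @(foreach ($row in $first) {
        $out = [ordered]@{ VMName = $row.VMName }
        foreach ($p in $firstProps)  { $out[$p] = $row.$p }
        foreach ($p in $secondProps) {
            if ($secondByName.ContainsKey($row.VMName)) { $out[$p] = $secondByName[$row.VMName].$p }
            else                                        { $out[$p] = '' }
        }
        [pscustomobject]$out
        $secondByName.Remove($row.VMName)    # mark this VM as already merged
    })

    # whatever is left in the index was only in the second file
    $merged += @(foreach ($row in $secondByName.Values) {
        $out = [ordered]@{ VMName = $row.VMName }
        foreach ($p in $firstProps)  { $out[$p] = '' }
        foreach ($p in $secondProps) { $out[$p] = $row.$p }
        [pscustomobject]$out
    })

    $merged | Export-Csv C:\csv3.csv -NoTypeInformation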
    It's funny how I think I've described my issue in my initial post but as I describe it more I further define the details.  I hope I've described my issue for all to understand.
    Thank you.

  • How to get Document Set property values in a SharePoint library in to a CSV file using Powershell

    Hi,
    How to get Document Set property values in a SharePoint library into a CSV file using Powershell?
    Any help would be greatly appreciated.
    Thank you.
    AA.

    Hi,
    According to your description, my understanding is that you want to get the document set property values in a SharePoint library and then export them to a CSV file using PowerShell.
    I suggest you get the document set properties with PowerShell commands like the ones below:
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
    $siteurl = "http://sp2013sps/sites/test"
    $listname = "Documents"
    $mysite = New-Object Microsoft.SharePoint.SPSite($siteurl)
    $myweb = $mysite.OpenWeb()
    $list = $myweb.Lists[$listname]
    foreach ($item in $list.Items)
    {
        if ($item.ContentType.Name -eq "Document Set")
        {
            if ($item.Folder.ItemCount -eq 0)
            {
                Write-Host $item.Title
            }
        }
    }
    Then you can use Export-Csv PowerShell Command to export to a CSV file.
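    For example, something roughly like this (just a sketch; "Column1" and "Column2" are placeholders for your real Document Set column names):
    $results = foreach ($item in $list.Items)
    {
        if ($item.ContentType.Name -eq "Document Set")
        {
            New-Object PSObject -Property @{
                Title   = $item.Title
                Column1 = $item["Column1"]   # placeholder column name
                Column2 = $item["Column2"]   # placeholder column name
            }
        }
    }
    $results | Export-Csv "C:\DocumentSets.csv" -NoTypeInformation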
    More information:
    Powershell for document sets
    How to export data to CSV in PowerShell?
    Using the Export-Csv Cmdlet
    Thanks
    Best Regards
    TechNet Community Support
    Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact
    [email protected]

  • Compare two pdf files using adobe acrobat through command line

    Does anyone know how to compare two pdf files using adobe acrobat through command line. I want to do this via command line because we want to compare hundreds of file every day through some automated windows tasks.
    If command line option is not available in acrobat, then is it feasible to make use of acrobat javascript API to do this task?
    Any kind of help will be greatly appreciated.

    Command-line: Not possible.
    JavaScript: Possible, but very limited. Basically the only thing you can do is simulate clicking the Compare Documents button. The rest has to be done manually.
    However, it *might* be possible to automate this process a bit more using a plugin. Ask over at the Acrobat SDK forum for more information...

  • How to get DocSet property values in a SharePoint library into a CSV file using Powershell

    Hi,
    How to get DocSet property values in a SharePoint library into a CSV file using Powershell?
    Any help would be greatly appreciated.
    Thank you.
    AA.

    Hi AOK,
    Would you please post your current script and the issue for more efficient support.
    In addition, to manage document sets in SharePoint please refer to this script to start:
    ### Load SharePoint SnapIn
    if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null)
    {
        Add-PSSnapin Microsoft.SharePoint.PowerShell
    }

    ### Load SharePoint Object Model
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    ### Get web and list
    $web = Get-SPWeb http://myweb
    $list = $web.Lists["List with Document Sets"]

    ### Get Document Set Content Type from list
    $cType = $list.ContentTypes["Document Set Content Type Name"]

    ### Create Document Set Properties Hashtable
    [Hashtable]$docsetProperties = @{"DocumentSetDescription"="A Document Set"}
    $docsetProperties["CustomColumn1"] = "Value 1"
    $docsetProperties["CustomColumn2"] = "Value2"
    ### Add all your Columns for your Document Set

    ### Create new Document Set
    $newDocumentSet = [Microsoft.Office.DocumentManagement.DocumentSets.DocumentSet]::Create($list.RootFolder,"Document Set Title",$cType.Id,$docsetProperties)
    $web.Dispose()
    http://www.letssharepoint.com/2011/06/document-sets-und-powershell.html
    If there is anything else regarding this issue, please feel free to post back.
    Best Regards,
    Anna Wang
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Support, contact [email protected]

  • Is it possible to monitor State change of a .CSV file using powershell scripting ?

    Hi All,
    I just would like to know: is it possible to monitor a state change of a .CSV file using PowerShell scripting? We have the SCOM tool, which has that capability, but there are some drawbacks in it because of which we are not able to use it, so I would like
    to know whether this is possible using PowerShell.
    So if there is any number above 303 in the .CSV file, then I need an email alert / notification for the same.
    Gautam.75801

    Hi Jrv,
    Thank you very much. I modified the above and it worked.
    Import-Csv C:\SCOM_Tasks\GCC2010Capacitymanagement\CapacityMgntData.csv | ?{$_.Mailboxes -gt 303} | Export-csv -path C:\SCOM_Tasks\Mbx_Above303.csv;
    Send-MailMessage -Attachments "C:\SCOM_Tasks\Mbx_Above303.csv" -To "[email protected]" -From "abc@xyz" -SmtpServer [email protected] -Subject "Mailboxes are above 303 in Exchange databases" -Body "Mailboxes are above 303 in Exchange databases"
    Mailboxes is the column I want to monitor for values above 303. The first command extracts all the lines above 303 to another CSV file, and the second is a mail command to email me the same with the second extract attached.
    Gautam.75801

  • How to rename content type on web site level and underneath all the document libraries using powershell?

    Hello
    I want to rename a content type at the web site level and in all the document libraries underneath it using PowerShell. Please let me know how I can do this.
    I saw this url
    http://suryapulipati.blogspot.in/2011/08/rename-content-type-name-in-list-using.html, but I want to change everything from the web site, subsites and the document libraries underneath. If you rename it on a single document library, the change only applies
    to that document library; if you rename it at the web level, the content type names in the document libraries underneath are not updated automatically.
    Please advise
    Avi

    Not sure with powershell.
    I could help you out with Server Side or Web Services though.
    Brandon James SharePoint Developer/Administrator

  • How to Compare 2 CSV file and store the result to 3rd csv file using PowerShell script?

    I want to do the below task using powershell script only.
    I have 2 CSV files and I want to compare those two files and store the comparison result in a 3rd CSV file. Please look at the following snap:
    This image is just a CSV file.
    Could anyone please help me?
    Thanks in advance.
    By
    A Path finder 
    JoSwa
    If a post answers your question, please click "Mark As Answer" on that post and "Mark as Helpful"
    Best Online Journal

    Not certain this is what you're after, but this :
    #import the contents of both csv files
    $dbexcel   = Import-Csv c:\dbexcel.csv
    $liveexcel = Import-Csv C:\liveexcel.csv

    #prepare the output csv and create the headers
    $outputexcel = "c:\outputexcel.csv"
    $outputline  = "Name,Connection Status,Version,DbExcel,LiveExcel"
    $outputline | Out-File $outputexcel

    #Loop through each record based on the number of records (assuming equal number in both files)
    for ($i = 0; $i -le $dbexcel.Length - 1; $i++)
    {
        # Assign the yes / null values to equal the word equivalent
        if ($dbexcel[$i].isavail -eq "yes")   {$dbavail = "Available"}   else {$dbavail = "Unavailable"}
        if ($liveexcel[$i].isavail -eq "yes") {$liveavail = "Available"} else {$liveavail = "Unavailable"}

        #create the line of csv content from the two input csv files
        $outputline = $dbexcel[$i].name + "," + $liveexcel[$i].'connection status' + "," + $dbexcel[$i].version + "," + $dbavail + "," + $liveavail

        #output that line to the csv file
        $outputline | Out-File $outputexcel -Append
    }
    should do what you're looking for, or give you enough to edit it to your exact need.
    I've assumed that the dbexcel.csv and liveexcel.csv files live in the root of c:\ for this, that they include the header information, and that the outputexcel.csv file will be saved to the same place (including headers).
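    If the two files could end up with a different number of rows, or with the rows in a different order, a variation that first indexes the live file by name would be a bit more robust (untested sketch, same column names as above):
    $dbexcel   = Import-Csv c:\dbexcel.csv
    $liveexcel = Import-Csv C:\liveexcel.csv

    # index the live file by name so row order and row count no longer matter
    $liveByName = @{}
    foreach ($row in $liveexcel) { $liveByName[$row.name] = $row }

    $report = foreach ($row in $dbexcel)
    {
        $live = $liveByName[$row.name]

        if ($row.isavail -eq "yes") { $dbavail = "Available" } else { $dbavail = "Unavailable" }
        if ($live -and $live.isavail -eq "yes") { $liveavail = "Available" } else { $liveavail = "Unavailable" }
        if ($live) { $connstatus = $live.'connection status' } else { $connstatus = "Not in liveexcel" }

        New-Object PSObject -Property @{
            Name                = $row.name
            'Connection Status' = $connstatus
            Version             = $row.version
            DbExcel             = $dbavail
            LiveExcel           = $liveavail
        }
    }

    $report | Select-Object Name,'Connection Status',Version,DbExcel,LiveExcel |
        Export-Csv c:\outputexcel.csv -NoTypeInformation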

  • How can i compare two excel files with different no. of records.

    Hi
    I am on to a small project that involves comparing two Excel files. I am able to do it, but I am stuck at one point. When I compare two different .csv files with a different number of lines, I am only able to compare up to the point where the number of lines is the same in both files.
    E.g. if the source file has 8 lines and the target file has 12 lines, the difference is displayed only for the first 8 lines and the remaining 4 lines are not shown.
    Can you help me display those extra 4 lines as well? I am attaching my code snippet below.
    while (((strLine = br.readLine()) != null) && ((strLine1 = br1.readLine()) != null)) {
        String delims = "[;,\t,,,|]";
        String[] tokens  = strLine.split(delims);
        String[] tokens1 = strLine1.split(delims);
        if (tokens.length > tokens1.length) {
            for (int i = 0; i < tokens.length; i++) {
                try {
                    if (!tokens[i].equals(tokens1[i])) {
                        System.out.println(tokens[i] + "<----->" + tokens1[i]);
                        out.write(sno + " \t" + lineNo1 + " \t\t" + tokens[i] + "\t\t\t\t" + tokens1[i]);
                        out.println();
                        sno++;
                    }
                } catch (Exception exception) {
                    out.write(sno + " \t" + lineNo1 + " \t\t" + tokens[i] + "\t\t\t\t" + "");
                    out.println();
                }
            }
        }
    }
    Thanks & Regards

    A CSV file is not an Excel file.
    But apart from that your logic makes no sense.
    If the 2 files are of different sizes the files are different by definition, so further comparison isn't needed, you're done.
    If you want to compare individual records, you need to compare all records from one file with all records from the other, unless the order of records is important in which case your current system might work.
    That system however is overly complicated for comparing CSV files.
    As you assume a single record per line, and if one can assume those records to have identical layout (so no leading or trailing whitespace in or between columns in one file that's not in the other) comparing records is simply a matter of comparing the entire lines.

  • How do I call two csv files into ews?

    I have to call two csv files into a script with ews:
    Param([string]$calInputFile = "C:\Algemene agenda\Data docenten.txt" )
    Param([string]$calMailboxFile ="C:\Algemene agenda\Alle docenten.txt"
    # Load all Entries
    $calagenda = Import-Csv $calInputFile
    $calMailbox = Import-Csv $callMailboxfile
    This does not work.
    How can I do this???

    OH!!  Why didn't I see this before?  s-:  The problem is with your Param definition, not with pulling the data into the script.  You can only have a single Param definition for a script.  Try
    the following instead:
    Param(
        [string]$calInputFile   = "C:\Algemene agenda\Data docenten.txt",
        [string]$calMailboxFile = "C:\Algemene agenda\Alle docenten.txt"
    )

    ## Load all Entries
    $calagenda  = Import-Csv $calInputFile
    $calMailbox = Import-Csv $calMailboxFile   # note: the variable name must match the parameter name
    To define multiple parameters, you have your Param definition (with its parentheses) and all parameters are defined inside it, separated by commas.

  • How to append two CSV files using ftp

    Hi
    Please let me know the FTP command to append the two .CSV files into one .CSV file.
    e.g.
    Let me explain:
    one CSV file has the fields F1 , F2, F3 , F4 and has 5 records
    another CSV file has the same sequence of fields F1, F2, F3, F4 and has 10 records
    after appending both the files , I must get 15 records.
    Thanks
    Alok

    OK, I will try it out.
    I am illustrating my requirement as follows:
    File#1 (type .CSV)
    F1     F2     F3     F4
    100   566    89     86   
    235   256    56     12
    File#2 (type .CSV)
    F1     F2     F3     F4
    56     56     98     102
    12     23     36      523
    23      56     56    89
    Now we want to Append File#1 to File#2 as follow: 
    F1    F2    F3     F4
    100   566    89     86   
    235   256    56     12   
    56     56     98     102
    12     23     36      523
    23      56     56    89
    Please suggest which command would now be appropriate.
    Alok

  • How can i compare two XML files storeds in a LONG column

    Hi,
    I need to compare two XML files. My XMLs are stored in two tables like this:
    Table 1
    ID_COL number(5);
    XML1 LONG()
    Table 2
    ID_COL number(5);
    XML2 LONG()
    I need to compare the values of the tags of these XML files and list the differences.
    Tks,
    Fernando.

    Yes Odie, you are right... I think that my XML is wrong...
    I would like to compare every element/attribute...
    Below is another XML... this one is OK... thanks
    <?xml version="1.0" encoding="UTF-8" ?>
    - <nfeProc xmlns="http://www.portalfiscal.inf.br/nfe" versao="2.00">
    - <NFe xmlns="http://www.portalfiscal.inf.br/nfe">
    - <infNFe Id="NFe31121059106377000172550010003957681605366269" versao="2.00">
    + <ide>
    <cUF>31</cUF>
    <cNF>60536626</cNF>
    <natOp>VDAS PROD ESTABELECIMENT</natOp>
    <indPag>1</indPag>
    <mod>55</mod>
    <serie>1</serie>
    <nNF>395768</nNF>
    <dEmi>2012-10-03</dEmi>
    <dSaiEnt>2012-10-03</dSaiEnt>
    <hSaiEnt>18:30:00</hSaiEnt>
    <tpNF>1</tpNF>
    <cMunFG>3159605</cMunFG>
    <tpImp>1</tpImp>
    <tpEmis>1</tpEmis>
    <cDV>9</cDV>
    <tpAmb>1</tpAmb>
    <finNFe>1</finNFe>
    <procEmi>0</procEmi>
    <verProc>1.0</verProc>
    </ide>
    + <emit>
    <CNPJ>59106377000172</CNPJ>
    <xNome>METAGAL IND E COM LTDA</xNome>
    <xFant>METAGAL INDUSTRIA E COMERCIO LTDA</xFant>
    - <enderEmit>
    <xLgr>ROD BR 459</xLgr>
    <nro>333</nro>
    <xCpl>KM 121</xCpl>
    <xBairro>DISTRITO INDUSTRIAL</xBairro>
    <cMun>3159605</cMun>
    <xMun>SANTA RITA DO SAPUCAI</xMun>
    <UF>MG</UF>
    <CEP>37540000</CEP>
    <cPais>1058</cPais>
    <xPais>BRASIL</xPais>
    <fone>3534719100</fone>
    </enderEmit>
    <IE>5969141300009</IE>
    <IM>01183</IM>
    <CNAE>2949299</CNAE>
    <CRT>3</CRT>
    </emit>
    + <dest>
    <CNPJ>59275792000150</CNPJ>
    <xNome>GENERAL MOTORS DO BRASIL LTDA</xNome>
    - <enderDest>
    <xLgr>AV GOIAS</xLgr>
    <nro>1805</nro>
    <xBairro>BARCELONA</xBairro>
    <cMun>3548807</cMun>
    <xMun>SAO CAETANO DO SUL</xMun>
    <UF>SP</UF>
    <CEP>09501970</CEP>
    <cPais>1058</cPais>
    <xPais>BRASIL</xPais>
    </enderDest>
    <IE>636003724112</IE>
    <email>[email protected]</email>
    </dest>
    - <det nItem="1">
    + <prod>
    <cProd>XM20C9500PPR</cProd>
    <cEAN />
    <xProd>ESPELHO RETROVISOR EXTERNO</xProd>
    <NCM>70091000</NCM>
    <CFOP>6501</CFOP>
    <uCom>PC</uCom>
    <qCom>80.0000</qCom>
    <vUnCom>35.8700000000</vUnCom>
    <vProd>2869.60</vProd>
    <cEANTrib />
    <uTrib>PC</uTrib>
    <qTrib>80.0000</qTrib>
    <vUnTrib>35.8700000000</vUnTrib>
    <indTot>1</indTot>
    <xPed>XRW001RV</xPed>
    <nItemPed>000001</nItemPed>
    </prod>
    - <imposto>
    - <ICMS>
    - <ICMS00>
    <orig>0</orig>
    <CST>00</CST>
    <modBC>3</modBC>
    <vBC>2869.60</vBC>
    <pICMS>12.00</pICMS>
    <vICMS>344.35</vICMS>
    </ICMS00>
    </ICMS>
    - <IPI>
    <CNPJProd>00000000000000</CNPJProd>
    <cEnq>0</cEnq>
    - <IPINT>
    <CST>54</CST>
    </IPINT>
    </IPI>
    - <II>
    <vBC>0.00</vBC>
    <vDespAdu>0.00</vDespAdu>
    <vII>0.00</vII>
    <vIOF>0.00</vIOF>
    </II>
    - <PIS>
    - <PISNT>
    <CST>08</CST>
    </PISNT>
    </PIS>
    - <COFINS>
    - <COFINSNT>
    <CST>08</CST>
    </COFINSNT>
    </COFINS>
    </imposto>
    <infAdProd>PC.93378954-COMPL.PED.XRW001RV</infAdProd>
    </det>
    + <total>
    - <ICMSTot>
    <vBC>2869.60</vBC>
    <vICMS>344.35</vICMS>
    <vBCST>0.00</vBCST>
    <vST>0.00</vST>
    <vProd>2869.60</vProd>
    <vFrete>0.00</vFrete>
    <vSeg>0.00</vSeg>
    <vDesc>0.00</vDesc>
    <vII>0.00</vII>
    <vIPI>0.00</vIPI>
    <vPIS>0.00</vPIS>
    <vCOFINS>0.00</vCOFINS>
    <vOutro>0.00</vOutro>
    <vNF>2869.60</vNF>
    </ICMSTot>
    <retTrib />
    </total>
    + <transp>
    <modFrete>0</modFrete>
    - <transporta>
    <CNPJ>00980331000488</CNPJ>
    <xNome>THALE TRANSPORTES E LOG. LTDA</xNome>
    <IE>5963866160070</IE>
    <xEnder>ROD BR 459-KM 121 - DIST INDL, S/N</xEnder>
    <xMun>SANTA RITA DO SAPUCAI</xMun>
    <UF>MG</UF>
    </transporta>
    - <veicTransp>
    <placa>DPF8048</placa>
    <UF>SP</UF>
    </veicTransp>
    - <vol>
    <qVol>20</qVol>
    <esp>OUTROS</esp>
    <pesoL>64.000</pesoL>
    <pesoB>104.000</pesoB>
    </vol>
    </transp>
    + <cobr>
    - <fat>
    <nFat>000000395768</nFat>
    <vOrig>2869.60</vOrig>
    </fat>
    - <dup>
    <nDup>1</nDup>
    <dVenc>2012-11-20</dVenc>
    <vDup>2869.60</vDup>
    </dup>
    </cobr>
    - <infAdic>
    <infCpl>VIA DE TRANSPORTE RODOVIARIA CODIGO : 108061 PEDIDO NRO : ACIMA FABRICA :72480 REDESPACHO ATRAVES DE VELOCE LOGISTICA S/A ESTRADA DOS ALVARENGAS SAO BERNARDO CAMPO ASSUNCAO SP CNPJ : 10.299.567/0003-26 IE : 635.600.028.11 IPI - IMUNE CFE.ART.18, INCISO II, DO RIPI - DECRETO No.7.212/2010. REMESSA COM FIM ESPECIFICO DE EXPORTACAOMERC.A SER EXPORT.P/GENERAL MOTORS DO BRASIL LTDA.DECEX=3-0322/10-0007 ESTOCA-GEM TEMPORARIA NA VELOCE LOGISTICA S.A. ESTR.ALVARENGAS,4018 B.ASSUNCAO S.B.C. CNPJ10.299.567/0003-26IE.635.600.028.110 FT NR.431.183 ROMANEIO :131.588</infCpl>
    </infAdic>
    </infNFe>
    + <Signature xmlns="http://www.w3.org/2000/09/xmldsig#">
    - <SignedInfo>
    <CanonicalizationMethod Algorithm="http://www.w3.org/TR/2001/REC-xml-c14n-20010315" />
    <SignatureMethod Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha1" />
    - <Reference URI="#NFe31121059106377000172550010003957681605366269">
    - <Transforms>
    <Transform Algorithm="http://www.w3.org/2000/09/xmldsig#enveloped-signature" />
    <Transform Algorithm="http://www.w3.org/TR/2001/REC-xml-c14n-20010315" />
    </Transforms>
    <DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1" />
    <DigestValue>jN2ozPH3/GvAS8Q5lh/t9bzuXCw=</DigestValue>
    </Reference>
    </SignedInfo>
    <SignatureValue>GAXPLvMCtIYdwMxXDcyL0kr5hCDPCFw8/uNYHFcdTMqBhLgIcEtzHRf8qioWlUVSHNf5jnCLKGjhDV4bEJqkcBhWsKouMzojQ+Z6hkFQAWNuJfPIzutmtRy3AePc5tHK0lI3tF3ws9memboJ8sW21IOWHB6eB0jK2gmhcOlDejc=</SignatureValue>
    - <KeyInfo>
    - <X509Data>
    <X509Certificate>MIIGajCCBVKgAwIBAgIIaHrIAHUBA4wwDQYJKoZIhvcNAQEFBQAwdTELMAkGA1UEBhMCQlIxEzARBgNVBAoTCklDUC1CcmFzaWwxNjA0BgNVBAsTLVNlY3JldGFyaWEgZGEgUmVjZWl0YSBGZWRlcmFsIGRvIEJyYXNpbCAtIFJGQjEZMBcGA1UEAxMQQUMgU0VSQVNBIFJGQiB2MTAeFw0xMTEwMjQxMzQ3MThaFw0xMjEwMjMxMzQ3MThaMIHuMQswCQYDVQQGEwJCUjELMAkGA1UECBMCTUcxHjAcBgNVBAcTFVNBTlRBIFJJVEEgRE8gU0FQVUNBSTETMBEGA1UEChMKSUNQLUJyYXNpbDE2MDQGA1UECxMtU2VjcmV0YXJpYSBkYSBSZWNlaXRhIEZlZGVyYWwgZG8gQnJhc2lsIC0gUkZCMRYwFAYDVQQLEw1SRkIgZS1DTlBKIEExMRIwEAYDVQQLEwlBUiBTRVJBU0ExOTA3BgNVBAMTME1FVEFHQUwgSU5EVVNUUklBIEUgQ09NRVJDSU8gTFREQTo1OTEwNjM3NzAwMDE3MjCBnzANBgkqhkiG9w0BAQEFAAOBjQAwgYkCgYEA0E4tWimBp7BdqbUbNQLK8NDkxMsqeEnJILklbGp7e0MghfjADGcV9z07B0t2KsAhlPAtx22D885rycUzVehoUisyB3a3Xfu3FqRB9ItXvEPDaLM/DtJrMu3xIWq60RzoSgnFyw8cNJ3hYJxloPm5exTc5kOHcQlNhsiLzzJLk4ECAwEAAaOCAwYwggMCMAkGA1UdEwQCMAAwDgYDVR0PAQH/BAQDAgXgMB0GA1UdJQQWMBQGCCsGAQUFBwMCBggrBgEFBQcDBDAfBgNVHSMEGDAWgBSa3SK29nfpQm9IwlFAoFbi83Q/uzCBuQYDVR0RBIGxMIGugRVHTklMQ0VATUVUQUdBTC5DT00uQlKgIQYFYEwBAwKgGBMWR0VPVkFOSSBEQSBTSUxWQSBOSUxDRaAZBgVgTAEDA6AQEw41OTEwNjM3NzAwMDE3MqA+BgVgTAEDBKA1EzMwMTExMTk2NzA3MzY3NTg3ODAzMDAwMDAwMDAwMDAwMDAwMTUuNzcwLjg2NDBTU1AgU1CgFwYFYEwBAwegDhMMMDAwMDAwMDAwMDAwMFcGA1UdIARQME4wTAYGYEwBAgENMEIwQAYIKwYBBQUHAgEWNGh0dHA6Ly93d3cuY2VydGlmaWNhZG9kaWdpdGFsLmNvbS5ici9yZXBvc2l0b3Jpby9kcGMwgfMGA1UdHwSB6zCB6DBKoEigRoZEaHR0cDovL3d3dy5jZXJ0aWZpY2Fkb2RpZ2l0YWwuY29tLmJyL3JlcG9zaXRvcmlvL2xjci9zZXJhc2FyZmJ2MS5jcmwwRKBCoECGPmh0dHA6Ly9sY3IuY2VydGlmaWNhZG9zLmNvbS5ici9yZXBvc2l0b3Jpby9sY3Ivc2VyYXNhcmZidjEuY3JsMFSgUqBQhk5odHRwOi8vcmVwb3NpdG9yaW8uaWNwYnJhc2lsLmdvdi5ici9sY3IvU2VyYXNhL3JlcG9zaXRvcmlvL2xjci9zZXJhc2FyZmJ2MS5jcmwwgZkGCCsGAQUFBwEBBIGMMIGJMEgGCCsGAQUFBzAChjxodHRwOi8vd3d3LmNlcnRpZmljYWRvZGlnaXRhbC5jb20uYnIvY2FkZWlhcy9zZXJhc2FyZmJ2MS5wN2IwPQYIKwYBBQUHMAGGMWh0dHA6Ly9vY3NwLmNlcnRpZmljYWRvZGlnaXRhbC5jb20uYnIvc2VyYXNhcmZidjEwDQYJKoZIhvcNAQEFBQADggEBAD70onZUzYAAUjK/j3b+d1VULHGPxmJU9sjfAa1QiCt1JniRTZITjXcw08pT/DMDmZRHOkWM0amQZtKKa6Oz9fg2Mv+aBoh0ERuC2XMTpdB0Kq04cY90zMJbteMvCzpUKIsT2wJDRZok1my+GyR3rUxLyHTfnqt1+f3o1DeRiGmldHIHHlv6MeVZeL82jfrw3kZnFi8k+rDGfywcfum9M66qfNqUv9fL/ibLVogzwg8WyErbbW1cAMqxv8rWNJHvNs8dbJOCBKaW4ZJDkO/8CpuvyKxSdS3OUdjuI1RAx9R0RBMemuv4h4S7rhOEhjkBB5hHFT5IeDded+oVzY3lpIU=</X509Certificate>
    </X509Data>
    </KeyInfo>
    </Signature>
    </NFe>
    - <protNFe versao="2.00">
    - <infProt>
    <tpAmb>1</tpAmb>
    <verAplic>13_0_32</verAplic>
    <chNFe>31121059106377000172550010003957681605366269</chNFe>
    <dhRecbto>2012-10-03T17:35:55</dhRecbto>
    <nProt>131120853536488</nProt>
    <digVal>jN2ozPH3/GvAS8Q5lh/t9bzuXCw=</digVal>
    <cStat>100</cStat>
    <xMotivo>Autorizado o uso da NF-e</xMotivo>
    </infProt>
    </protNFe>
    </nfeProc>

  • Compare two text files in Powershell and if a name is found in both files output content from file 2 to a 3rd text file

    Is it possible using PowerShell to compare the contents of two text files line by line and if a line is found output that line to a third text file?
    Lets say hypothetically someone asks us to search a text file named names1.txt and when a name is found in names1.txt we then pair that with the same name in the second text file called names2.txt
    lets say the names shown below are in names1.txt
    Bob
    Mike
    George
    Lets say the names and contents shown below are in names2.txt
    Lisa
    Jordan
    Mike 1112222
    Bob 8675309
    Don
    Joe
    Lets say we want names3.txt to contain the data shown below
    Mike 1112222
    Bob 8675309
    In vbscript I used search and replace commands to get part of the way there like this
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    Set objFile = objFSO.OpenTextFile("testing.txt", ForReading)
    strText = objFile.ReadAll
    objFile.Close
    strNewText = Replace(strText, "Mike ", "Mike 1112222")
    Set objFile = objFSO.OpenTextFile("testing.txt", ForWriting)
    objFile.WriteLine strNewText
    objFile.Close
    That script works great when you know the name you are looking for and the correct values. But let's say someone gives you a list of 1000 employees and says "import these names into a list in the correct format", where one sheet has only the correct names and
    the other sheet has lots of extra names, say 200,000, and you only need the 1000 you are looking for in the format from names2.txt.

    Sure,
    Here's a simple one:
    $names1 = "C:\names1.txt"
    $names2 = "C:\names2.txt"
    $names3 = "C:\names3.txt"
    Get-Content $names1 | ForEach-Object {
    $names1_Line = $_
    Get-Content $names2 | Where-Object {$_.Contains($names1_Line)} | Out-File -FilePath $names3 -Append
    This basically just reads $names1 file, line by line, and then read $names2 file line by line as well.
    If the line being evaluated from $names2 file contains the line being evaluated from $names1 file, then the line from $names2 file gets output to $names3 file, appending to what's already there.
    This might need a few more tinkering to get it to perform faster etc depending on your requirements. For example:
    - If either $names1 or $names2 contain a lot of entries (in the region of hundreds) then it will be faster to load the whole content of $names2 into memory rather than opening the file, reading line by line, closing, and then doing the same for every single
    line in $names1 (which is how it currently works)
    - Make sure that your comparison is behaving as expected. The .Contains method always does a case sensitive comparison, this might not be what you are after.
    - You might want to put a condition to ignore blank lines or lines with spaces, else they'll also be brought over to $names3
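    For example, a version that loads names2.txt into memory once and skips blank lines could look roughly like this (untested sketch, reusing the variables above):
    $names2Content = Get-Content $names2

    Get-Content $names1 | Where-Object { $_.Trim() -ne "" } | ForEach-Object {
        $name = $_
        # -match is case-insensitive by default; the pattern anchors the name to the start of the line
        $names2Content | Where-Object { $_ -match ("^" + [regex]::Escape($name) + "\b") }
    } | Out-File -FilePath $names3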
    Hopefully this will get you started though and ask if you have further questions.
    Fausto

  • How to remove HTML encoding from csv file using powershell

    Hi guys, I am exporting data from a SharePoint list using PowerShell, which works fine. My problem is that some of the fields contain HTML markup. Is there a way to remove all of the HTML markup from the array before writing it to CSV?
    I have tried playing about with System.Web.HttpUtility.HtmlDecode but with no luck; the code runs but no HTML is removed.
    Below is a sample of my script.
    Set-Variable HOME $env:USERPROFILE -Force
    (Get-PSProvider FileSystem).Home = $HOME

    if (-not (Get-PSSnapin | where { $_.Name -eq "Microsoft.SharePoint.PowerShell" }))
    {
        Add-PSSnapin Microsoft.SharePoint.PowerShell
    }

    # imports assembly needed for url stuff to do
    Add-Type -AssemblyName System.Web

    $SPWeb = Get-SPWeb "http://site url"
    $SPList = $SPWeb.Lists["List Name"]
    $exportlist = @()

    $SPList.Items | foreach {
        $obj = New-Object PSObject -Property @{
            "Employee full name"  = $_["EMPLOYEEFNAME"]
            "Employee login name" = $_["EMPLOYEENAME"]
            "Department Name"     = $_["DEPARTMENT"]
            # decoding attempt - this runs, but HtmlDecode only decodes entities, so the markup remains
            "OBJECTIVESTOBEACHIEVED_0" = [System.Web.HttpUtility]::HtmlDecode($_["F1_S4a_OBJECTIVESTOBEACHIEVED_0"])
            "OBJECTIVESTOBEACHIEVED_1" = $_["F1_S4a_OBJECTIVESTOBEACHIEVED_1"]
            "OBJECTIVESTOBEACHIEVED_2" = $_["F1_S4a_OBJECTIVESTOBEACHIEVED_2"]
            "OBJECTIVESTOBEACHIEVED_3" = $_["F1_S4a_OBJECTIVESTOBEACHIEVED_3"]
            "OBJECTIVESTOBEACHIEVED_4" = $_["F1_S4a_OBJECTIVESTOBEACHIEVED_4"]
            "OBJECTIVESTOBEACHIEVED_5" = $_["F1_S4a_OBJECTIVESTOBEACHIEVED_5"]
            "OBJECTIVESTOBEACHIEVED_6" = $_["F1_S4a_OBJECTIVESTOBEACHIEVED_6"]
            "OBJECTIVESTOBEACHIEVED_7" = $_["F1_S4a_OBJECTIVESTOBEACHIEVED_7"]
        }
        $exportlist += $obj
    }

    $exportlist | select "Employee full name", "Employee login name", "Department Name", "OBJECTIVESTOBEACHIEVED_0", "OBJECTIVESTOBEACHIEVED_1", "OBJECTIVESTOBEACHIEVED_2", "OBJECTIVESTOBEACHIEVED_3", "OBJECTIVESTOBEACHIEVED_4", "OBJECTIVESTOBEACHIEVED_5", "OBJECTIVESTOBEACHIEVED_6", "OBJECTIVESTOBEACHIEVED_7" | Export-Csv ~/Export.csv -noType
    $SPWeb.Dispose()
    any help would be much appreciated.

    Should have googled before posting !
    "OBJECTIVESTOBEACHIEVED_0" = [Microsoft.SharePoint.Utilities.SPHttpUtility]::ConvertSimpleHtmlToText($_["F1_S4a_OBJECTIVESTOBEACHIEVED_0"],-1) -replace '\s+', ' '
    Clears all the additional white space. As always jrv you
    have saved the day, many thanks for all the help.
    Mal

  • How can I read every "number" value and not only the 1st?

    In the VI, only the first "number" value (from the data acquisition) is read in the small loop. How can I make every value be read? Every value must be compared with the "numeric" constant, and if it is greater the LED must turn on. Afterwards, when a value less than the "numeric" is found, the LED must turn off. Please answer or send mail to [email protected]
    Attachments:
    oximet5.vi ‏152 KB

    The more I look at your program, the less I understand it.
    Why are you setting number = number in a case if SO2 = ""? number always equals number.
    In your sequence frames 1 and 4, you have no control over which write (date or time) happens first. Just placing one function to the left of another doesn't make it happen first if your wiring doesn't create data dependency. Review the section on Data Dependency in chapter 5 of the LabView User's Manual.
    It looks like you're writing to and reading from the same file using Write File and Read Lines from File.vi. Why read back data you just wrote? You have the data on your diagram. If you want to convert it from string to numeric, use a String/Number Conversion function from the String palette.
    On Read Lines from File.vi (which I'm not sure you even need), you should use a shift register for the start of read offset rather than a local variable for mark after read (chars.). With a shift register, you can initialize it to 0 when the VI starts. Using a local, if you restart the VI, it will try to startup from where you left off the last time, but you just opened the file for create or replace.
    In your sequence frames 2 and 3, why do you wait for the number to be less than the numeric before writing a carriage return to the file? Also, LabView has an End of Line constant which adapts to the expected value for the operating system. That's generally more flexible than a Carriage Return.
    It looks to me like you're overusing control refnums. You don't need to use a control refnum and a property node to set or read a control's value if you can wire directly to the control's terminal.
    I really don't understand what you're doing with pause and variants. I may be missing the point, but it looks like you made this much more complicated than it needs to be. Why not keep it a boolean?
    For your pause-path, you open the file but never close it. You can lose data that way. You also open it using open function 3, create new file. You'll get an error if the file already exists.
    On a general note, your diagram would be easier to read if you were more selective in how you routed your wires: you have wires on top of wires, wires running under sub-VIs, wires running in the frame of a while loop, wires running under labels.
    I think there's a temptation to overuse sequences. I don't think you need one here. As I mentioned in my earlier message, case structures and shift registers will be more useful to you.
