Script -asjob (modified bgping script)

Hello,
I liked the original bgping script very much (http://poshtips.com/bgping-a-high-performance-bulk-ping-utility/).
I thought it would be nice to modify the script to query multiple servers (for example, to grab some information from them, like the serial number, as quickly as possible).
So..
I've cut the lines I didn't need (IP address checking, results, etc.) and left only the job-related lines:
Param ([int]$BatchSize=3)
#list of servers
$source = Get-Content .\servers_FQDN.txt
#scriptblock
$blok = {gwmi win32_bios | select serialnumber}
$elapsedTime = [system.diagnostics.stopwatch]::StartNew()
$result = @()
$itemCount = 0
## checking running jobs
if (get-job | ? {$_.name -like "Script*"}){
    write-host "ERROR: There are pending background jobs in this session:" -back red -fore white
    get-job | ? {$_.name -like "Script*"} | out-host
    write-host "REQUIRED ACTION: Remove the jobs and restart this script" -back black -fore yellow
    $yn = read-host "Automatically remove jobs now?"
    if ($yn -eq "y"){
        get-job | ? {$_.name -like "Script*"} | % {remove-job $_}
        write-host "jobs have been removed; please restart the script" -back black -fore green
    }
    exit
}
$offset = 0
## measure object
$itemCount = $source.count
Write-Host "Script will run against $itemcount servers!"
## Script start time mark
write-host " Script started at $(get-date -Format ("yyyy/MM/dd hh:mm:ss")) ".padright(100) -back darkgreen -fore white
write-host " (contains $itemCount unique entries)" -back black -fore green
$activeJobCount = 0
$totalJobCount = 0
write-host "Submitting background jobs..." -back black -fore yellow
## end of clean, verified code
for ($offset=0; $offset -le $itemCount; $offset += $batchSize){
    $activeJobCount += 1; $totalJobCount += 1; $HostList = @()
    $HostList += $source | select dnshostname | sort | get-unique | select -skip $offset -first $batchsize
    $j = invoke-command -computername $($source) -scriptblock $blok -asjob
    $j.name = "Script`:$totalJobCount`:$($offset+1)`:$($HostList.count)"
    write-host "+" -back black -fore cyan -nonewline
}
write-host "`n$totaljobCount jobs submitted, checking for completed jobs..." -back black -fore yellow
$recCnt = 0
while (get-job | ? {$_.name -like "Script*"}){
    foreach ($j in get-job | ? {$_.name -like "Script*"}){
        if ($j.state -eq "completed"){
            $temp = @()
            $temp += receive-job $j
            $result += $temp
            remove-job $j
            $ActiveJobCount -= 1
            write-host "-" -back black -fore cyan -nonewline
        }
    }
    if ($result.count -lt $itemCount){
        sleep 3
    }
}
write-host " "
write-host " Script finished at $(get-date) ".padright(60) -back darkgreen -fore white
write-host (" Number of hosts : {0}" -f $($result.count)) -back black -fore green
write-host (" Elapsed Time : {0}" -f $($ElapsedTime.Elapsed.ToString())) -back black -fore green
write-host "TEMP variable: $temp"
$result
write-host " Script completed all requested operations at $(get-date -Format ("yyyy/MM/dd hh:mm:ss")) ".padright(60) -back darkgreen -fore white
write-host (" Elapsed Time : {0}" -f $($ElapsedTime.Elapsed.ToString())) -back black -fore green
There is a bug I haven't tracked down yet: the $result table contains $temp multiplied by the number of jobs... why?
If I run this script against 10 servers divided into batches of 2 (BatchSize 2), it creates 5 background jobs. While $temp contains the serial number from every single server, $result multiplies it by 5.
I could use $temp as the result as well, but the original author used $result for a reason...
I think the problem starts with the line
foreach ($j in get-job | ? {$_.name -like "Script*"}){
... can you correct this script so that I get each result only once? In the case of 10 servers with a batch size of 2, it should create 5 jobs containing 2 queries each...
Elapsed time is now ~24 sec; I think it should be done in around 6.

.. ah.. figured it out..
wrong:
$HostList += $source |select dnshostname |sort |get-unique |select -skip $offset -first $batchsize
$j = invoke-command -computername $($source) -scriptblock $blok -asjob
correct:
$HostList += $source |sort |get-unique |select -skip $offset -first $batchsize
$j = invoke-command -computername $HostList -scriptblock $blok -asjob
(Get-Content returns plain strings, so "select dnshostname" produced empty objects, and "-computername $($source)" handed the whole server list to every job. That is why each of the 5 jobs queried all 10 servers and $result came back 5 times larger.)
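For reference, here is a minimal sketch of the corrected batching pattern as a whole (same file and variable names as above, untested beyond this thread): each Invoke-Command now receives only its own slice of the host list, and each job's output is received exactly once before the job is removed, so $result ends up with one serial number per server.

$source = Get-Content .\servers_FQDN.txt | sort | get-unique
$blok = {gwmi win32_bios | select serialnumber}
$BatchSize = 2
for ($offset = 0; $offset -lt $source.count; $offset += $BatchSize){
    # each job gets only its own batch of hosts, not the whole list
    $HostList = $source | select -skip $offset -first $BatchSize
    $null = invoke-command -computername $HostList -scriptblock $blok -asjob
}
# receive each job's output exactly once, then discard the jobs
$result = get-job | wait-job | receive-job
get-job | remove-job
$result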

Similar Messages

  • Please assist with modifying this script to read in a list of servers and output results to excel

    Hello,
    I have an excellent script written by Brian Wilhite to gather the last logged in user using Powershell.
    http://gallery.technet.microsoft.com/scriptcenter/Get-LastLogon-Determining-283f98ae/view/Discussions#content
    My PowerShell Fu is stumbling over modifying this script to meet the following requirements:
    Currently the script must first be loaded into PowerShell before it can be executed. I would rather edit the script in my PowerShell ISE, add the .txt file with the list of servers I need to scan to the script itself, and have it execute when I press Run Script in the ISE.
    I would also like to output the results to Excel (.xls or .csv will do). Currently the results look as follows:
    Computer : SVR01
    User : WILHITE\BRIAN
    SID : S-1-5-21-012345678-0123456789-012345678-012345
    Time : 9/20/2012 1:07:58 PM
    CurrentlyLoggedOn : False
    I would prefer it shows up like so:
    Computer  User           SID     Time                   CurrentlyLoggedOn
    SVR01     WILHITE\BRIAN  S-1xxx  9/20/2012 1:07:58 PM   False
    SRV02     WILHITE\BRIAN  S-2xxx  9/26/2014 10:00:00 AM  True
    Any help you can provide would be greatly appreciated.  I'll add the full script to the end of this post.
    Thank you.
    Function Get-LastLogon
    {
    <#
    .SYNOPSIS
    This function will list the last user logged on or logged in.
    .DESCRIPTION
    This function will list the last user logged on or logged in. It will detect if the user is currently logged on
    via WMI or the Registry, depending on what version of Windows is running on the target. There is some "guess" work
    to determine what Domain the user truly belongs to if run against Vista NON SP1 and below, since the function
    is using the profile name initially to detect the user name. It then compares the profile name and the Security
    Entries (ACE-SDDL) to see if they are equal to determine Domain and if the profile is loaded via the Registry.
    .PARAMETER ComputerName
    A single Computer or an array of computer names. The default is localhost ($env:COMPUTERNAME).
    .PARAMETER FilterSID
    Filters a single SID from the results. For use if there is a service account commonly used.
    .PARAMETER WQLFilter
    Default WQLFilter defined for the Win32_UserProfile query, it is best to leave this alone, unless you know what
    you are doing.
    Default Value = "NOT SID = 'S-1-5-18' AND NOT SID = 'S-1-5-19' AND NOT SID = 'S-1-5-20'"
    .EXAMPLE
    $Servers = Get-Content "C:\ServerList.txt"
    Get-LastLogon -ComputerName $Servers
    This example will return the last logon information from all the servers in the C:\ServerList.txt file.
    Computer : SVR01
    User : WILHITE\BRIAN
    SID : S-1-5-21-012345678-0123456789-012345678-012345
    Time : 9/20/2012 1:07:58 PM
    CurrentlyLoggedOn : False
    Computer : SVR02
    User : WILHITE\BRIAN
    SID : S-1-5-21-012345678-0123456789-012345678-012345
    Time : 9/20/2012 12:46:48 PM
    CurrentlyLoggedOn : True
    .EXAMPLE
    Get-LastLogon -ComputerName svr01, svr02 -FilterSID S-1-5-21-012345678-0123456789-012345678-012345
    This example will return the last logon information from svr01 and svr02, filtering out the SID supplied to -FilterSID.
    Computer : SVR01
    User : WILHITE\ADMIN
    SID : S-1-5-21-012345678-0123456789-012345678-543210
    Time : 9/20/2012 1:07:58 PM
    CurrentlyLoggedOn : False
    Computer : SVR02
    User : WILHITE\ADMIN
    SID : S-1-5-21-012345678-0123456789-012345678-543210
    Time : 9/20/2012 12:46:48 PM
    CurrentlyLoggedOn : True
    .LINK
    http://msdn.microsoft.com/en-us/library/windows/desktop/ee886409(v=vs.85).aspx
    http://msdn.microsoft.com/en-us/library/system.security.principal.securityidentifier.aspx
    .NOTES
    Author: Brian C. Wilhite
    Email: [email protected]
    Date: "09/20/2012"
    Updates: Added FilterSID Parameter
    Cleaned Up Code, defined fewer variables when creating PSObjects
    ToDo: Clean up the UserSID Translation, to continue even if the SID is local
    #>
    [CmdletBinding()]
    param(
        [Parameter(Position=0,ValueFromPipeline=$true)]
        [Alias("CN","Computer")]
        [String[]]$ComputerName="$env:COMPUTERNAME",
        [String]$FilterSID,
        [String]$WQLFilter="NOT SID = 'S-1-5-18' AND NOT SID = 'S-1-5-19' AND NOT SID = 'S-1-5-20'"
    )
    Begin
    {
        #Adjusting ErrorActionPreference to stop on all errors
        $TempErrAct = $ErrorActionPreference
        $ErrorActionPreference = "Stop"
        #Exclude Local System, Local Service & Network Service
    }#End Begin Script Block
    Process
    {
        Foreach ($Computer in $ComputerName)
        {
            $Computer = $Computer.ToUpper().Trim()
            Try
            {
                #Querying Windows version to determine how to proceed.
                $Win32OS = Get-WmiObject -Class Win32_OperatingSystem -ComputerName $Computer
                $Build = $Win32OS.BuildNumber
                #Win32_UserProfile exist on Windows Vista and above
                If ($Build -ge 6001)
                {
                    If ($FilterSID)
                    {
                        $WQLFilter = $WQLFilter + " AND NOT SID = `'$FilterSID`'"
                    }#End If ($FilterSID)
                    $Win32User = Get-WmiObject -Class Win32_UserProfile -Filter $WQLFilter -ComputerName $Computer
                    $LastUser = $Win32User | Sort-Object -Property LastUseTime -Descending | Select-Object -First 1
                    $Loaded = $LastUser.Loaded
                    $Script:Time = ([WMI]'').ConvertToDateTime($LastUser.LastUseTime)
                    #Convert SID to Account for friendly display
                    $Script:UserSID = New-Object System.Security.Principal.SecurityIdentifier($LastUser.SID)
                    $User = $Script:UserSID.Translate([System.Security.Principal.NTAccount])
                }#End If ($Build -ge 6001)
                If ($Build -le 6000)
                {
                    If ($Build -eq 2195)
                    {
                        $SysDrv = $Win32OS.SystemDirectory.ToCharArray()[0] + ":"
                    }#End If ($Build -eq 2195)
                    Else
                    {
                        $SysDrv = $Win32OS.SystemDrive
                    }#End Else
                    $SysDrv = $SysDrv.Replace(":","$")
                    $Script:ProfLoc = "\\$Computer\$SysDrv\Documents and Settings"
                    $Profiles = Get-ChildItem -Path $Script:ProfLoc
                    $Script:NTUserDatLog = $Profiles | ForEach-Object -Process {$_.GetFiles("ntuser.dat.LOG")}
                    #Function to grab last profile data, used for allowing -FilterSID to function properly.
                    function GetLastProfData ($InstanceNumber)
                    {
                        $Script:LastProf = ($Script:NTUserDatLog | Sort-Object -Property LastWriteTime -Descending)[$InstanceNumber]
                        $Script:UserName = $Script:LastProf.DirectoryName.Replace("$Script:ProfLoc","").Trim("\").ToUpper()
                        $Script:Time = $Script:LastProf.LastAccessTime
                        #Getting the SID of the user from the file ACE to compare
                        $Script:Sddl = $Script:LastProf.GetAccessControl().Sddl
                        $Script:Sddl = $Script:Sddl.split("(") | Select-String -Pattern "[0-9]\)$" | Select-Object -First 1
                        #Formatting SID, assuming the 6th entry will be the users SID.
                        $Script:Sddl = $Script:Sddl.ToString().Split(";")[5].Trim(")")
                        #Convert Account to SID to detect if profile is loaded via the remote registry
                        $Script:TranSID = New-Object System.Security.Principal.NTAccount($Script:UserName)
                        $Script:UserSID = $Script:TranSID.Translate([System.Security.Principal.SecurityIdentifier])
                    }#End function GetLastProfData
                    GetLastProfData -InstanceNumber 0
                    #If the FilterSID equals the UserSID, rerun GetLastProfData and select the next instance
                    If ($Script:UserSID -eq $FilterSID)
                    {
                        GetLastProfData -InstanceNumber 1
                    }#End If ($Script:UserSID -eq $FilterSID)
                    #If the detected SID via Sddl matches the UserSID, then connect to the registry to detect currently loggedon.
                    If ($Script:Sddl -eq $Script:UserSID)
                    {
                        $Reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey([Microsoft.Win32.RegistryHive]"Users",$Computer)
                        $Loaded = $Reg.GetSubKeyNames() -contains $Script:UserSID.Value
                        #Convert SID to Account for friendly display
                        $Script:UserSID = New-Object System.Security.Principal.SecurityIdentifier($Script:UserSID)
                        $User = $Script:UserSID.Translate([System.Security.Principal.NTAccount])
                    }#End If ($Script:Sddl -eq $Script:UserSID)
                    Else
                    {
                        $User = $Script:UserName
                        $Loaded = "Unknown"
                    }#End Else
                }#End If ($Build -le 6000)
                #Creating Custom PSObject For Output
                New-Object -TypeName PSObject -Property @{
                    Computer=$Computer
                    User=$User
                    SID=$Script:UserSID
                    Time=$Script:Time
                    CurrentlyLoggedOn=$Loaded
                } | Select-Object Computer, User, SID, Time, CurrentlyLoggedOn
            }#End Try
            Catch
            {
                If ($_.Exception.Message -Like "*Some or all identity references could not be translated*")
                {
                    Write-Warning "Unable to Translate $Script:UserSID, try filtering the SID `nby using the -FilterSID parameter."
                    Write-Warning "It may be that $Script:UserSID is local to $Computer, Unable to translate remote SID"
                }
                Else
                {
                    Write-Warning $_
                }
            }#End Catch
        }#End Foreach ($Computer in $ComputerName)
    }#End Process
    End
    {
        #Resetting ErrorActionPref
        $ErrorActionPreference = $TempErrAct
    }#End End
    }# End Function Get-LastLogon

     This should work:
    Get-LastLogon -Computername (Get-content .\Servers.txt) | Export-CSV .\Output.csv -NoTypeInformation
    I just tested it on my test domain and it did the trick.
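    If you also want it to run straight from the ISE without loading the function first, one approach (a sketch; the file paths are just examples) is to append the invocation at the very bottom of the same .ps1 file, after the closing brace of the function, so that pressing Run Script both defines and calls it:
    # at the end of the script file, after "}# End Function Get-LastLogon"
    Get-LastLogon -ComputerName (Get-Content 'C:\ServerList.txt') |
        Export-Csv -Path 'C:\LastLogonReport.csv' -NoTypeInformation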

  • I need help to modify my AS script

    I have the following script and I would like to modify it:
    1. In this file I currently have to type the names of the video albums that are displayed in the SWF file.
    2. What I want is for the file to read a specific folder and its subdirectories, which will become the names of the albums.
    How can I do this?
    One more thing: this file was created to work with Flash CS3, and I am trying to test it with CS5.
    I really appreciate all the help I can get.
    I don't know anything about AS2 or AS3; I only know how to modify the files by following comments and other samples from around the web.
    Thanks, and I hope someone can help me. I've been trying a few things but I'm just stuck, so I really need help.
    //  Set the path to the External Parameters file relative to the *.swf file.
    //  If this file cannot be found or if it contains errors, the
    //  Internal Parameters(the parameters below) will be used.
    //var ParametersFile = "MyControls.xml";
    var ParametersFile = "XML_Files/MyControls.xml";
    //  Set the path to the Theme file relative to the *.swf file.
    //  If this file cannot be found or if it contains errors, the
    //  Default Grey skin will be used instead.
    //  To learn how to edit Themes, please refer to the 'Help' folder.
    // next line commented by SAMY
    //var ThemeFile = "Theme.xml";
    // NEXT LINE ADDED BY SAMY
    var ThemeFile = "FLASH_DIR/3D_GALLERY/BlueTheme.xml";
    //  To learn more about how to add albums, please refer to the
    //  'Help' folder. This line says that replace and modify the name of the title album and the xml file which is as shown here
    var AlbumLabel_1 = "Pastor Alejandro Bullon";//<-- This is the typical line that I want to be input from the external folder name *
    //var AlbumDataFile_1 = "Videos/Alejandro_B/Alejandro_Bullon.xml";
    // next line for website configuration typical
    //var AlbumDataFile_1 = "Media/Media.xml";
    //next line works fine locally
    //var AlbumDataFile_1 = "Videos/Videos_website.xml";
    var AlbumDataFile_1 = "FLASH_DIR/3D_GALLERY/Videos/Alejandro_B/Alejandro_B.xml";
    var AlbumLabel_2 = "Pastor Stephen Bohr";// <--  *
    // next line commented by samy
    var AlbumDataFile_2 = "FLASH_DIR/3D_GALLERY/Videos/Stephen_B/Stephen_Bohr.xml";
    // next line added by samy for website configuration typical for all albums
    //var AlbumDataFile_2 = "Videos/Videos.xml";
    var AlbumLabel_3 = "Pastor Caleb Jara";
    var AlbumDataFile_3 = "FLASH_DIR/3D_GALLERY/Videos/Caleb_Jara/Caleb_Jara.xml";
    //var AlbumDataFile_3 = "City/City.xml";
    var AlbumLabel_4 = "Pastor Doug_Batchellor";
    var AlbumDataFile_4 = "FLASH_DIR/3D_GALLERY/Videos/Doug_B/Doug_Batchellor.xml";
    //var AlbumDataFile_4 = "FLASH_DIR/3D_GALLERY/City/City.xml";
    var AlbumLabel_5 = "Musica";
    var AlbumDataFile_5 = "FLASH_DIR/3D_GALLERY/Musica/Musica.xml";
    //var AlbumDataFile_5 = "Landscape/Landscape.xml";
    var AlbumLabel_6 = "Powerpoint";
    var AlbumDataFile_6 = "FLASH_DIR/3D_GALLERY/Powerpoint/Powerpoint.xml";
    var AlbumLabel_7 = "Escuela Sabatica '10";
    var AlbumDataFile_7 = "FLASH_DIR/3D_GALLERY/Escuela_Sab/Esc_Sab_2010.xml";
    var AlbumLabel_8 = "Escuela Sabatica '11";
    var AlbumDataFile_8 = "FLASH_DIR/3D_GALLERY/Escuela_Sab/Esc_Sab_2011.xml";
    var AlbumLabel_9 = "Test Nature";
    var AlbumDataFile_9 = "Nature/Nature.xml";
    //  Select whether to enable or disable error messages created
    //  due to 'file not found' , 'format not supported' or 'corrupted
    //  XML files' type of errors.
    //  Note: These error messages are automatically disabled when you
    //  export your *.swf file.
    var EnableErrorMessages = "yes";//[Yes  , No]
    //  Set parameters for items.
    var ItemWidth = 170;
    var ItemHeight= 130;
    var ShowItemNumber = "yes";
    //var ShowItemNumber = "no";
    //  Select fitting technique , stretch the thumb picture to fit the item
    //  or crop it from the top left.
    var ThumbFittingMethod = "stretch";
    //  Select what to do when the file preview is clicked, either to enlarge
    //  the preview or navigate to the URL provided for the current item in
    //  the XML data file of the current album
    var WhenPreviewIsClicked = "Enlarge";//[Enlarge  , GetUrl]
    //  Select the window target, '_blank' to open a new window or '_self' to
    //  navigate to the URL in the same window
    var WindowTarget = "_blank";
    //  Select whether to show the information of the item or not
    var ShowItemInfo = "yes";
    //  Select whether to show the albums menu or not
    var ShowAlbumsMenu = "yes";
    //  Select whether to show the video controller or not
    var ShowVideoController = "yes";
    //  Select whether to show the autoplay option or not
    //var ShowAutoplayButton="no";
    var ShowAutoplayButton="yes";
    //  Set the delay time for autoplay, this will be used for pictures only
    var AutoplayDelayTime = 5;
    //  Set the spinning speed of a single wheel
    //var WheelSpinningSpeed = 5;
    var WheelSpinningSpeed = 2;
    //  Select direction of scrolling of pages
    var DefaultDirection = "LeftToRight";
    //  Select whether you want to disable one of the wheels
    var DisableWheel = "none";
    //  Set the maximum number of items to be loaded on a single wheel
    var MaximumLoadOnEachWheel = 10;
    //  Select how you want the wheel to interact with the mouse
    //  Refer to the 'Help' folder for more information.
    var ScrollingStyle = "2";
    //  Select whether to enable tool tips or not.
    var EnableToolTips = "yes";
    //  Set the delay time for the tool tips to appear
    var ToolTipsDelayTime = 1;
    //  This is like a shortcut, set this parameter to 'Name' to display
    //  the name of the item as a tool tip.......
    var ToolTipsContent = "tooltips";//[ToolTips , Name , FileType]
    //  Select whether to enable or disable visual effects.
    var EnableDepthOfField = "yes";
    var EnableMotionBlur = "yes";
    Message was edited by: samy4movies

    This is a web-based app; it's for a carousel video gallery.
    I already figured out the automatic XML generator with PHP, but I want to get it all into one process, meaning I only want to upload my videos and let the PHP code run by itself, without having to add to or modify the *.fla file every time I insert a new folder ("Album").
    This is the link for the project I am working on:
    http://anaheimspanish.net/index.php?option=com_content&view=article&id=98&Itemid=124
    It's called 3D Video Gallery. I bought the component through a website for Flash components, but their support is not very good; that's why I want to fix as much as I can myself.
    Thanks in advance for your help.
    If you need a full zip project to test it, let me know.

  • Cannot modify logic script on BPC 10.0

    Hello All,
    I need to change some simple logic in our existing logic script. I changed it, validated it, and hit the save button. It says "The Script Logic file has been updated"; however, my change is not saved. When I open the script logic after saving, I see only the original.
    I talked to the BASIS team to check the PROCTIMEOUT on the server and found that it is set to 3600, which is twice the SAP-recommended value. There are about 200 lines in the logic scripts.
    Could you please help me understand why I cannot modify the logic scripts?
    Thank you very much in advance!!!
    Thanks
    Indra

    Hi Indra,
    I don't think it's related to the timeout settings of a DIA process. Could you please check whether you have the proper authorizations to update the script logic in BPC?
    Regards
    Ram

  • How can I modify this script to return only certain rows of my mySQL table?

    Hi there,
    I have a php script that accesses a mySQL database and it was generated out of the Flex Builder wizard automatically. The script works great and there are no problems with it. It allows me to perform CRUD on a table by calling it from my Flex app. and it retrieves all the data and puts it into a nice MXML format.
    My question: currently, when I call "findAll" to retrieve all the data in the table, it retrieves ALL the rows. That's fine, but my table is starting to grow really large, with thousands of rows.
    I want to modify this script so that I can pass a variable into it from Flex, so that it retrieves only the rows that match the "$subscriber_id" variable I pass. That way the results are not the entire table's data, only the rows that match 'subscriber_id'.
    I know how to pass a variable from Flex into php and the code on the php side to pick it up would look like this:
    $subscriberID = $_POST['subscriberID'];
    Can anyone shed light on the proper code modification in "findAll" that will take my $subscriberID variable, compare it to the 'subscriber_id' field, and return only the matching rows? I think it has something to do with lines 98 to 101.
    Any help is appreciated.
    <?php
    require_once(dirname(__FILE__) . "/2257safeDBconn.php");
    require_once(dirname(__FILE__) . "/functions.inc.php");
    require_once(dirname(__FILE__) . "/XmlSerializer.class.php");
    /**
     * This is the main PHP file that process the HTTP parameters,
     * performs the basic db operations (FIND, INSERT, UPDATE, DELETE)
     * and then serialize the response in an XML format.
     * XmlSerializer uses a PEAR xml parser to generate an xml response.
     * this takes a php array and generates an xml according to the following rules:
     * - the root tag name is called "response"
     * - if the current value is a hash, generate a tagname with the key value, recurse inside
     * - if the current value is an array, generated tags with the default value "row"
     * for example, we have the following array:
     * $arr = array(
     *      "data" => array(
     *           array("id_pol" => 1, "name_pol" => "name 1"),
     *           array("id_pol" => 2, "name_pol" => "name 2")
     *      ),
     *      "metadata" => array(
     *           "pageNum" => 1,
     *           "totalRows" => 345
     *      )
     * );
     * we will get an xml of the following form
     * <?xml version="1.0" encoding="ISO-8859-1"?>
     * <response>
     *   <data>
     *     <row>
     *       <id_pol>1</id_pol>
     *       <name_pol>name 1</name_pol>
     *     </row>
     *     <row>
     *       <id_pol>2</id_pol>
     *       <name_pol>name 2</name_pol>
     *     </row>
     *   </data>
     *   <metadata>
     *     <totalRows>345</totalRows>
     *     <pageNum>1</pageNum>
     *   </metadata>
     * </response>
     * Please notice that the generated server side code does not have any
     * specific authentication mechanism in place.
     */

    /**
     * The filter field. This is the only field that we will do filtering after.
     */
    $filter_field = "subscriber_id";

    /**
     * we need to escape the value, so we need to know what it is
     * possible values: text, long, int, double, date, defined
     */
    $filter_type = "text";

    /**
     * constructs and executes a sql select query against the selected database
     * can take the following parameters:
     * $_REQUEST["orderField"] - the field by which we do the ordering. MUST appear inside $fields.
     * $_REQUEST["orderValue"] - ASC or DESC. If neither, the default value is ASC
     * $_REQUEST["filter"] - the filter value
     * $_REQUEST["pageNum"] - the page index
     * $_REQUEST["pageSize"] - the page size (number of rows to return)
     * if neither pageNum and pageSize appear, we do a full select, no limit
     * returns : an array of the form
     * array (
     *      data => array(
     *           array('field1' => "value1", "field2" => "value2")
     *      ),
     *      metadata => array(
     *           "pageNum" => page_index,
     *           "totalRows" => number_of_rows
     *      )
     * )
     */
    function findAll() {
         global $conn, $filter_field, $filter_type;
         // the list of fields in the table. We need this to check that the sent value for the ordering is indeed correct.
         $fields = array('id','subscriber_id','lastName','firstName','birthdate','gender');
         $where = "";
         if (@$_REQUEST['filter'] != "") {
              $where = "WHERE " . $filter_field . " LIKE " . GetSQLValueStringForSelect(@$_REQUEST["filter"], $filter_type);
         }
         $order = "";
         if (@$_REQUEST["orderField"] != "" && in_array(@$_REQUEST["orderField"], $fields)) {
              $order = "ORDER BY " . @$_REQUEST["orderField"] . " " . (in_array(@$_REQUEST["orderDirection"], array("ASC", "DESC")) ? @$_REQUEST["orderDirection"] : "ASC");
         }
         //calculate the number of rows in this table
         $rscount = mysql_query("SELECT count(*) AS cnt FROM `modelName` $where");
         $row_rscount = mysql_fetch_assoc($rscount);
         $totalrows = (int) $row_rscount["cnt"];
         //get the page number, and the page size
         $pageNum = (int)@$_REQUEST["pageNum"];
         $pageSize = (int)@$_REQUEST["pageSize"];
         //calculate the start row for the limit clause
         $start = $pageNum * $pageSize;
         //construct the query, using the where and order condition
         $query_recordset = "SELECT id,subscriber_id,lastName,firstName,birthdate,gender FROM `modelName` $where $order";
         //if we use pagination, add the limit clause
         if ($pageNum >= 0 && $pageSize > 0) {
              $query_recordset = sprintf("%s LIMIT %d, %d", $query_recordset, $start, $pageSize);
         }
         $recordset = mysql_query($query_recordset, $conn);
         //if we have rows in the table, loop through them and fill the array
         $toret = array();
         while ($row_recordset = mysql_fetch_assoc($recordset)) {
              array_push($toret, $row_recordset);
         }
         //create the standard response structure
         $toret = array(
              "data" => $toret,
              "metadata" => array (
                   "totalRows" => $totalrows,
                   "pageNum" => $pageNum
              )
         );
         return $toret;
    }

    /**
     * constructs and executes a sql count query against the selected database
     * can take the following parameters:
     * $_REQUEST["filter"] - the filter value
     * returns : an array of the form
     * array (
     *      data => number_of_rows,
     *      metadata => array()
     * )
     */
    function rowCount() {
         global $conn, $filter_field, $filter_type;
         $where = "";
         if (@$_REQUEST['filter'] != "") {
              $where = "WHERE " . $filter_field . " LIKE " . GetSQLValueStringForSelect(@$_REQUEST["filter"], $filter_type);
         }
         //calculate the number of rows in this table
         $rscount = mysql_query("SELECT count(*) AS cnt FROM `modelName` $where");
         $row_rscount = mysql_fetch_assoc($rscount);
         $totalrows = (int) $row_rscount["cnt"];
         //create the standard response structure
         $toret = array(
              "data" => $totalrows,
              "metadata" => array()
         );
         return $toret;
    }


  • Can someone help me modify a script file?

    Hi Everyone!
    I am working on a couple videos in AE that need subtitles.
    I found this script that would work for me really well:
    // Subtitle generator by !Rocky
    // modified by Colin Harman ( http://colinharman.com/ ) to work on a Mac
    // Save this code as
    // "subtitles.jsx"
    // Create a text file with your subtitles.
    // Each line of text is one on-screen line.
    // To have several lines on-screen at the same time,
    // simply separate them with a pipe ( | ) character.
    // eg "Character 1 talks|Character 2 interrupts"
    // Create a new text layer in your comp, adjust its position,
    // make sure the text's centered, so it looks nice
    // Add markers (Numpad *) where each subtitle line must be shown/hidden.
    // With the text layer selected, run the script, and select the subtitles file.
    // Enjoy!
    function makeSubs() {
      var layer = app.project.activeItem.selectedLayers[0];
      if (layer.property("sourceText") != null) {
        var textFile = File.openDialog("Select a text file to open.", "");
        if (textFile != null) {
          var textLines = new Array();
          textFile.open("r", "TEXT", "????");
          while (!textFile.eof)
            textLines[textLines.length] = textFile.readln();
          textFile.close();
          var sourceText = layer.property("sourceText");
          var markers = layer.property("marker");
          for (var i = sourceText.numKeys; i >= 1; i--)
            sourceText.removeKey(i);
          var line = 0;
          var subTime, subText;
          for (var i = 1; i <= markers.numKeys; i++) {
            subTime = markers.keyTime(i);
            sourceText.setValueAtTime(0, " ");
            if ((i % 2) == 0) {
              subText = " ";
            } else {
              subText = textLines[line].replace("|", "\x0d\x0a");
              line++;
            }
            sourceText.setValueAtTime(subTime, new TextDocument(subText));
          }
        }
      }
    }
    makeSubs();
    Except there is one problem: with this script, the first marker makes the first line show, the second marker makes the first line disappear, and the third marker makes the second line show (etc.).
    Can this be modified so that the first marker makes the first line show, the second marker makes the second line show (etc.)? I don't need "empty" subtitles in this case.
    Thanks,
    Daniel

    Hi,
    I get an error at line 50 but it seems to be working fine. I will test it on one of the videos later today.
    Thanks for the help!

  • How to assign modified forms (SAP scripts etc.,) to Dunning texts

    Hello, how do I assign forms (SAP scripts etc.) to dunning texts according to step 5 in the list below?
    I've copied the form F150_DUNN_01 and modified it. Now I want to assign my modified form to a dunning text. In transaction FBMP I created a new procedure, and when I click on Dunning texts it takes me to the dunning text screen where I should be able to assign my modified form. But the screen is greyed out and not editable. Is there something I'm missing, or do I have a permission issue?
    Please help
    Regards
    Naz
    Most of the config for Dunning is done with TCode FBMP.
    Key Steps are :-
    1.     Define the dunning procedure or use the standard
    2.     Define the dunning levels (desired stages of reminder)
    3.     Define the dunning interval (frequency of reminder)
    4.     Line item grace periods & min. days in arrears (acct) govern the criteria for picking items to be dunned, along with minimum amounts
    5.     Assign forms (SAP scripts etc.) to dunning texts. Forms contain the contents of the reminder notice and the format of the notice.
    6.     You can include the Sp GL transactions as well.
    7.     Customer/Vendor master records need to be updated with the dunning procedure etc. (Dunning tab in the master data)
    8.     Use F150 to run.

    Hi Praveen,
    OK, let me go through the steps that I have done so far.
    1. A new dunning procedure 004 has been created in transaction FBMP.
    2. Then, choosing the assigned dunning procedure 004, I tried to update the form by selecting the company code and then Dunning texts. But the screen is grayed out.
    Please send me your email address and I will send you the screenshots.
    Thanks
    Naz

  • Modify and add new field in standard Purchase Order script

    Hi All,
    I have a problem with modifying and adding new fields in the standard Purchase Order script. Transaction ME22n displays the PO detail. In item detail, under the condition tab, all data is displayed in the current PO script output.
    But I want to display the ME22n -> Item Detail -> Invoice -> Taxes data. How do I display this data in the standard PO script? All of that data is fetched from structures (KOMV, KOMVD, etc.), so how can it be displayed?
    Waiting for a reply.
    Himanshu Patel.

    Work with an ABAPer: tell them your requirement (addition of a field) and ask them to include this field by using the "Field Exit" functionality.

  • Script to modify multiple files

    This script is supposed to open, modify, save, and close all files within a chosen folder.
    There are a couple of problems with the script:
    The files are "legacy" .eps and .ai, so a save dialog pops up. I want to save the file and move on, suppressing dialogs if possible.
    Only the first file is opened.
    If I comment out the line inside the for loop that runs the convertToUncoated() function, all of the files will open. If there is a way to upload the test files, I'll gladly do that.
    #target illustrator
    /*
     * Author: Jeff Hines
     * ThankYou: http://forums.adobe.com/people/elDudereno
     * Brief: Converts Pantone 143 and 287 to their uncoated equivalent
     */
    var useFolder = false;
    try {
        // If we have an open document, let's work on that
        sourceDoc = app.activeDocument;
        convertToUncoated(sourceDoc);
    }
    catch(err) {
        useFolder = true;
    }
    if(useFolder != false) {
        // Choose a folder, gets stored here
        sourceFolder = Folder.selectDialog("Please choose the folder.");
        // Array of files
        files = sourceFolder.getFiles();
        // Number of files
        filesLength = files.length;
        // Loop through files
        for(i=0; i<filesLength; i++) {
            // Current document
            sourceDoc = app.open(files[i]);
            // Run the conversion
            convertToUncoated(sourceDoc);
            // Try closing the document
            sourceDoc.close(SaveOptions.SAVECHANGES);
        }
    }
    function convertToUncoated(theDocument) {
        var docRef = theDocument;
        with (docRef) {
            // Unlock layers
            for(i=0; i<layers.length; i++) {
                layer = layers[i];
                if(layer.locked == true)
                    layer.locked = false;
            }
            try {
                // New color: 143 U
                newColor143 = new LabColor();
                newColor143.l = 74;
                newColor143.a = 30;
                newColor143.b = 56;
                // New spot color, gets added to swatches
                newSpot143 = spots.add();
                newSpot143.name = 'PANTONE 143 U';
                newSpot143.colorType = ColorModel.SPOT;
                newSpot143.color = newColor143;
            }
            catch(err) {
                newSpot143.remove();
            }
            // Get the replacement color for 143
            replacement143 = swatches.getByName('PANTONE 143 U').color;
            try {
                // New color: 287 U
                newColor287 = new LabColor();
                newColor287.l = 38;
                newColor287.a = 4;
                newColor287.b = -38;
                // New spot color, gets added to swatches
                newSpot287 = spots.add();
                newSpot287.name = 'PANTONE 287 U';
                newSpot287.colorType = ColorModel.SPOT;
                newSpot287.color = newColor287;
            }
            catch(err) {
                newSpot287.remove();
            }
            // Get the replacement color for 287
            replacement287 = swatches.getByName('PANTONE 287 U').color;
            // Loop through paths
            for (i=0; i<pathItems.length; i++) {
                with (pathItems[i]) {
                    if (filled == true && fillColor instanceof SpotColor) {
                        if (fillColor.spot.name == 'PANTONE 143') fillColor = replacement143;
                        if (fillColor.spot.name == 'PANTONE 287') fillColor = replacement287;
                    }
                    if (stroked == true && strokeColor instanceof SpotColor) {
                        if (strokeColor.spot.name == 'PANTONE 143') strokeColor = replacement143;
                        if (strokeColor.spot.name == 'PANTONE 287') strokeColor = replacement287;
                    }
                }
            }
            // Loop through stories
            for (j=0; j<stories.length; j++) {
                with (stories[j]) {
                    for (var k = 0; k < characters.length; k++) {
                        with (characters[k].characterAttributes) {
                            if (fillColor instanceof SpotColor) {
                                if (fillColor.spot.name == 'PANTONE 143') fillColor = replacement143;
                                if (fillColor.spot.name == 'PANTONE 287') fillColor = replacement287;
                            }
                            if (strokeColor instanceof SpotColor) {
                                if (strokeColor.spot.name == 'PANTONE 143') strokeColor = replacement143;
                                if (strokeColor.spot.name == 'PANTONE 287') strokeColor = replacement287;
                            }
                        }
                    }
                }
            }
            try {
                // Remove old swatches here
                swatches.getByName('PANTONE 143').remove();
                swatches.getByName('PANTONE 287').remove();
            }
            catch(err) {}
            return true;
        }
    }

    * ThankYou: http://forums.adobe.com/people/Muppet%20Mark && http://forums.adobe.com/people/elDudereno
    Mostly Mark. I just pointed the way. Good job Jeff.

  • Help modifying a powershell script

    Hello,
    I have recently been given a task to write or find a script capable of performing full and incremental backups. I found a script that does exactly what I need; however, it requires user input. I need this to run as a scheduled task, and therefore I need the input to be a static path. Here is the script I am talking about:
    #region Params
    param(
        [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
        [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
        [System.String]
        $SourceDir,
        [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
        [System.String]
        $DestDir,
        [Parameter(Position=2, Mandatory=$false,ValueFromPipeline=$false)]
        [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
        [System.String]
        $HashPath,
        [Parameter(Position=3, Mandatory=$false,ValueFromPipeline=$false)]
        [ValidateSet("Full","Incremental","Differential")]
        [System.String]
        $BackupType="Full",
        [Parameter(Position=4, Mandatory=$false,ValueFromPipeline=$false)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $LogFile=".\Backup-Files.log",
        [Parameter(Position=5, Mandatory=$false,ValueFromPipeline=$false)]
        [System.Management.Automation.SwitchParameter]
        $SwitchToFull
    )
    #endregion
    begin{
        function Write-Log
        {
            #region Params
            [CmdletBinding()]
            [OutputType([System.String])]
            param(
                [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
                [ValidateNotNullOrEmpty()]
                [System.String]
                $Message,
                [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$true)]
                [ValidateNotNullOrEmpty()]
                [System.String]
                $LogFile
            )
            #endregion
            try{
                Write-Host $Message
                Out-File -InputObject $Message -Append $LogFile
            }
            catch {throw $_}
        }
        function Get-Hash
        {
            #region Params
            [CmdletBinding()]
            [OutputType([System.String])]
            param(
                [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
                [ValidateNotNullOrEmpty()]
                [System.String]
                $HashTarget,
                [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateSet("File","String")]
                [System.String]
                $HashType
            )
            #endregion
            begin{
                try{ $objGetHashMD5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider }
                catch {throw $_ }
            }
            process{
                try {
                    #Checking hash target is file or just string
                    switch($HashType){
                        "String" {
                            $objGetHashUtf8 = New-Object -TypeName System.Text.UTF8Encoding
                            $arrayGetHashHash = $objGetHashMD5.ComputeHash($objGetHashUtf8.GetBytes($HashTarget.ToUpper()))
                            break
                        }
                        "File" {
                            $arrayGetHashHash = $objGetHashMD5.ComputeHash([System.IO.File]::ReadAllBytes($HashTarget))
                            break
                        }
                    }
                    #Return hash
                    Write-Output $([System.Convert]::ToBase64String($arrayGetHashHash))
                }
                catch { throw $_ }
            }
        }
        function Copy-File
        {
            #region Params
            [CmdletBinding()]
            [OutputType([System.String])]
            param(
                [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Any'})]
                [System.String]
                $SourceFile,
                [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateNotNullOrEmpty()]
                [System.String]
                $DestFile
            )
            #endregion
            try{
                #The script fails when folder being copied to file. So the item will be removed to avoid the error.
                if(Test-Path -LiteralPath $DestFile -PathType Any){
                    Remove-Item -LiteralPath $DestFile -Force -Recurse
                }
                #Creating destination if doesn't exist. It's required because Copy-Item doesn't create destination folder
                if(Test-Path -LiteralPath $SourceFile -PathType Leaf){
                    New-Item -ItemType "File" -Path $DestFile -Force
                }
                #Copying file to destination directory
                Copy-Item -LiteralPath $SourceFile -Destination $DestFile -Force
            }
            catch{ throw $_ }
        }
        function Backup-Files
        {
            #region Params
            [CmdletBinding()]
            [OutputType([System.String])]
            param(
                [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
                [System.String]
                $SourceDir,
                [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateNotNullOrEmpty()]
                [System.String]
                $DestDir,
                [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateNotNull()]
                [System.Collections.Hashtable]
                $HashTable
            )
            #endregion
            try{
                $xmlBackupFilesHashFile = $HashTable
                Write-Host "Backup started"
                Get-ChildItem -Recurse -Path $SourceDir|ForEach-Object{
                    $currentBackupFilesItem = $_
                    #Full path to source and destination item
                    $strBackupFilesSourceFullPath = $currentBackupFilesItem.FullName
                    $strBackupFilesDestFullPath = $currentBackupFilesItem.FullName.Replace($SourceDir,$DestDir)
                    #Checking that the current item is file and not directory. True - the item is file.
                    $bBackupFilesFile = $($($currentBackupFilesItem.Attributes -band [System.IO.FileAttributes]::Directory) -ne [System.IO.FileAttributes]::Directory)
                    Write-Host -NoNewline ">>>Processing item $strBackupFilesSourceFullPath..."
                    #Generating path hash
                    $hashBackupFilesPath = $(Get-Hash -HashTarget $strBackupFilesSourceFullPath -HashType "String")
                    $hashBackupFilesFile = "d"
                    #If the item is file then generate hash for file content
                    if($bBackupFilesFile){
                        $hashBackupFilesFile = $(Get-Hash -HashTarget $strBackupFilesSourceFullPath -HashType "File")
                    }
                    #Checking that the file has been copied
                    if($xmlBackupFilesHashFile[$hashBackupFilesPath] -ne $hashBackupFilesFile){
                        Write-Host -NoNewline $("hash changed=>$hashBackupFilesFile...")
                        Copy-File -SourceFile $strBackupFilesSourceFullPath $strBackupFilesDestFullPath|Out-Null
                        #Returning result
                        Write-Output @{$hashBackupFilesPath=$hashBackupFilesFile}
                    }
                    else{
                        Write-Host -NoNewline "not changed..."
                    }
                    Write-Host "done"
                }
                Write-Host "Backup completed"
            }
            catch { throw $_ }
        }
        function Backup-Full
        {
            [CmdletBinding()]
            [OutputType([System.String])]
            #region Params
            param(
                [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
                [System.String]
                $SourceDir,
                [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateNotNullOrEmpty()]
                [System.String]
                $DestDir,
                [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateNotNullOrEmpty()]
                [System.String]
                $HashFile,
                [Parameter(Position=3, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateNotNullOrEmpty()]
                [System.String]
                $ChainKey
            )
            #endregion
            try{
                #Creating an empty hash table
                $xmlBackupFullHashFile = @{}
                #Starting directory lookup
                $uintBackupFullCount = 0
                Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$ChainKey\Full_" + $(Get-Date -Format "ddMMyyyy")) -HashTable $xmlBackupFullHashFile|`
                ForEach-Object{
                    $xmlBackupFullHashFile.Add([string]$_.Keys,[string]$_.Values)
                    $uintBackupFullCount++
                }
                #Saving chain key.
                $xmlBackupFullHashFile.Add("ChainKey",$ChainKey)
                Write-Host -NoNewline "Saving XML file to $HashFile..."
                Export-Clixml -Path $HashFile -InputObject $xmlBackupFullHashFile -Force
                Write-Host "done"
                Write-Output $uintBackupFullCount
            }
            catch { throw $_ }
        }
        function Backup-Diff
        {
            #region Params
            [CmdletBinding()]
            [OutputType([System.String])]
            param(
                [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
                [System.String]
                $SourceDir,
                [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateNotNullOrEmpty()]
                [System.String]
                $DestDir,
                [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateScript({Test-Path -LiteralPath $_ -PathType 'leaf'})]
                [System.String]
                $HashFile
            )
            #endregion
            try{
                #Loading hash table
                $xmlBackupDiffHashFile = Import-Clixml $HashFile
                $chainKeyBackupDiffDifferential = $xmlBackupDiffHashFile["ChainKey"]
                $uintBackupDiffCount = 0
                #Starting directory lookup
                Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$chainKeyBackupDiffDifferential\Differential_" + $(Get-Date -Format "ddMMyyyy.HHmm")) -HashTable $xmlBackupDiffHashFile|`
                ForEach-Object{ $uintBackupDiffCount++ }
                Write-Output $uintBackupDiffCount
            }
            catch { throw $_ }
        }
        function Backup-Inc
        {
            #region Params
            [CmdletBinding()]
            [OutputType([System.String])]
            param(
                [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
                [System.String]
                $SourceDir,
                [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateNotNullOrEmpty()]
                [System.String]
                $DestDir,
                [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
                [ValidateScript({Test-Path -LiteralPath $_ -PathType 'leaf'})]
                [System.String]
                $HashFile
            )
            #endregion
            try{
                #Loading hash table
                $xmlBackupIncHashFile = Import-Clixml $HashFile
                $chainKeyBackupIncIncremental = $xmlBackupIncHashFile["ChainKey"]
                $uintBackupIncCount = 0
                #Starting directory lookup
                Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$chainKeyBackupIncIncremental\Incremental_" + $(Get-Date -Format "ddMMyyyy.HHmm")) -HashTable $xmlBackupIncHashFile|`
                ForEach-Object{
                    $xmlBackupIncHashFile[[string]$_.Keys]=[string]$_.Values
                    $uintBackupIncCount++
                }
                Write-Host -NoNewline "Saving XML file to $HashFile..."
                Export-Clixml -Path $HashFile -InputObject $xmlBackupIncHashFile -Force
                Write-Host "Done"
                Write-Output $uintBackupIncCount
            }
            catch { throw $_ }
        }
        #0 - is OK. 1 - some error
        $exitValue=0
    }
    process{
        try{
            $filesCopied=0
            $strSourceFolderName = $(Get-Item $SourceDir).Name
            $strHasFile = $("$HashPath\Hash_$strSourceFolderName.xml")
            $strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir started")
            #Automatically switch to full backup
            $bSwitch = $(!$(Test-Path -LiteralPath $strHasFile -PathType "Leaf") -and $SwitchToFull)
            Write-Log -Message $strMessage -LogFile $LogFile
            switch($true){
                $($BackupType -eq "Full" -or $bSwitch) {
                    $filesCopied = Backup-Full -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHasFile -ChainKey $("Backup_$strSourceFolderName" + "_" + $(Get-Date -Format "ddMMyyyy"))
                    break
                }
                $($BackupType -eq "Incremental") {
                    $filesCopied = Backup-Inc -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHasFile
                    break
                }
                $($BackupType -eq "Differential") {
                    $filesCopied = Backup-Diff -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHasFile
                    break
                }
            }
            $strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir completed successfully. $filesCopied items were copied.")
            Write-Log -Message $strMessage -LogFile $LogFile
            Write-Output $filesCopied
        }
        catch {
            $strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir failed:" + $_)
            Write-Log -Message $strMessage -LogFile $LogFile
            $exitValue = 1
        }
    }
    end{exit $exitValue}
    I have some experience writing PowerShell scripts, but I am lost as to how this script prompts for the source and destination paths. I tried modifying the param section, but that didn't work, and until now I thought the only way to get a prompt was with
    "Read-Host". Any and all education on this matter would be greatly appreciated. (Side note: I have posted this question on the forum where I found the script, but have not received an answer yet.)
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
    [System.String]
    $SourceDir,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
    [System.String]
    $DestDir,
    [Parameter(Position=2, Mandatory=$false,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
    [System.String]
    $HashPath,
    [Parameter(Position=3, Mandatory=$false,ValueFromPipeline=$false)]
    [ValidateSet("Full","Incremental","Differential")]
    [System.String]
    $BackupType="Full",
    [Parameter(Position=4, Mandatory=$false,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $LogFile=".\Backup-Files.log",
    [Parameter(Position=5, Mandatory=$false,ValueFromPipeline=$false)]
    [System.Management.Automation.SwitchParameter]
    $SwitchToFull
    )
    #endregion
    begin{
    function Write-Log
    {
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $Message,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$true)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $LogFile
    )
    #endregion
    try{
    Write-Host $Message
    Out-File -FilePath $LogFile -InputObject $Message -Append
    }
    catch {throw $_}
    }
    function Get-Hash
    {
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $HashTarget,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateSet("File","String")]
    [System.String]
    $HashType
    )
    #endregion
    begin{
    try{ $objGetHashMD5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider }
    catch {throw $_ }
    }
    process{
    try {
    #Checking hash target is file or just string
    switch($HashType){
    "String" {
    $objGetHashUtf8 = New-Object -TypeName System.Text.UTF8Encoding
    $arrayGetHashHash = $objGetHashMD5.ComputeHash($objGetHashUtf8.GetBytes($HashTarget.ToUpper()))
    break
    }
    "File" {
    $arrayGetHashHash = $objGetHashMD5.ComputeHash([System.IO.File]::ReadAllBytes($HashTarget))
    break
    }
    }
    #Return hash
    Write-Output $([System.Convert]::ToBase64String($arrayGetHashHash))
    }
    catch { throw $_ }
    }
    }
    function Copy-File
    {
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Any'})]
    [System.String]
    $SourceFile,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $DestFile
    )
    #endregion
    try{
    #Copy-Item fails when a folder is copied over an existing file, so remove any existing destination item first
    if(Test-Path -LiteralPath $DestFile -PathType Any){
    Remove-Item -LiteralPath $DestFile -Force -Recurse
    }
    #Create the destination file first if needed, because Copy-Item does not create the destination folder structure
    if(Test-Path -LiteralPath $SourceFile -PathType Leaf){
    New-Item -ItemType "File" -Path $DestFile -Force
    }
    #Copying file to destination directory
    Copy-Item -LiteralPath $SourceFile -Destination $DestFile -Force
    }
    catch{ throw $_ }
    }
    function Backup-Files
    {
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
    [System.String]
    $SourceDir,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $DestDir,
    [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNull()]
    [System.Collections.Hashtable]
    $HashTable
    )
    #endregion
    try{
    $xmlBackupFilesHashFile = $HashTable
    Write-Host "Backup started"
    Get-ChildItem -Recurse -Path $SourceDir|ForEach-Object{
    $currentBackupFilesItem = $_
    #Full path to source and destination item
    $strBackupFilesSourceFullPath = $currentBackupFilesItem.FullName
    $strBackupFilesDestFullPath = $currentBackupFilesItem.FullName.Replace($SourceDir,$DestDir)
    #Checking that the current item is file and not directory. True - the item is file.
    $bBackupFilesFile = $($($currentBackupFilesItem.Attributes -band [System.IO.FileAttributes]::Directory) -ne [System.IO.FileAttributes]::Directory)
    Write-Host -NoNewline ">>>Processing item $strBackupFilesSourceFullPath..."
    #Generating path hash
    $hashBackupFilesPath = $(Get-Hash -HashTarget $strBackupFilesSourceFullPath -HashType "String")
    $hashBackupFilesFile = "d"
    #If the item is file then generate hash for file content
    if($bBackupFilesFile){
    $hashBackupFilesFile = $(Get-Hash -HashTarget $strBackupFilesSourceFullPath -HashType "File")
    }
    #Checking that the file has been copied
    if($xmlBackupFilesHashFile[$hashBackupFilesPath] -ne $hashBackupFilesFile){
    Write-Host -NoNewline $("hash changed=>$hashBackupFilesFile...")
    Copy-File -SourceFile $strBackupFilesSourceFullPath $strBackupFilesDestFullPath|Out-Null
    #Returning result
    Write-Output @{$hashBackupFilesPath=$hashBackupFilesFile}
    }
    else{
    Write-Host -NoNewline "not changed..."
    }
    Write-Host "done"
    }
    Write-Host "Backup completed"
    }
    catch { throw $_ }
    }
    function Backup-Full
    {
    [CmdletBinding()]
    [OutputType([System.String])]
    #region Params
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
    [System.String]
    $SourceDir,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $DestDir,
    [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $HashFile,
    [Parameter(Position=3, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $ChainKey
    )
    #endregion
    try{
    #Creating an empty hash table
    $xmlBackupFullHashFile = @{}
    #Starting directory lookup
    $uintBackupFullCount = 0
    Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$ChainKey\Full_" + $(Get-Date -Format "ddMMyyyy")) -HashTable $xmlBackupFullHashFile|`
    ForEach-Object{
    $xmlBackupFullHashFile.Add([string]$_.Keys,[string]$_.Values)
    $uintBackupFullCount++
    }
    #Saving chain key.
    $xmlBackupFullHashFile.Add("ChainKey",$ChainKey)
    Write-Host -NoNewline "Saving XML file to $HashFile..."
    Export-Clixml -Path $HashFile -InputObject $xmlBackupFullHashFile -Force
    Write-Host "done"
    Write-Output $uintBackupFullCount
    }
    catch { throw $_ }
    }
    function Backup-Diff
    {
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
    [System.String]
    $SourceDir,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $DestDir,
    [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'leaf'})]
    [System.String]
    $HashFile
    )
    #endregion
    try{
    #Loading hash table
    $xmlBackupDiffHashFile = Import-Clixml $HashFile
    $chainKeyBackupDiffDifferential = $xmlBackupDiffHashFile["ChainKey"]
    $uintBackupDiffCount = 0
    #Starting directory lookup
    Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$chainKeyBackupDiffDifferential\Differential_" + $(Get-Date -Format "ddMMyyyy.HHmm")) -HashTable $xmlBackupDiffHashFile|`
    ForEach-Object{ $uintBackupDiffCount++ }
    Write-Output $uintBackupDiffCount
    }
    catch { throw $_ }
    }
    function Backup-Inc
    {
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
    [System.String]
    $SourceDir,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $DestDir,
    [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'leaf'})]
    [System.String]
    $HashFile
    )
    #endregion
    try{
    #Loading hash table
    $xmlBackupIncHashFile = Import-Clixml $HashFile
    $chainKeyBackupIncIncremental = $xmlBackupIncHashFile["ChainKey"]
    $uintBackupIncCount = 0
    #Starting directory lookup
    Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$chainKeyBackupIncIncremental\Incremental_" + $(Get-Date -Format "ddMMyyyy.HHmm")) -HashTable $xmlBackupIncHashFile|`
    ForEach-Object{
    $xmlBackupIncHashFile[[string]$_.Keys]=[string]$_.Values
    $uintBackupIncCount++
    }
    Write-Host -NoNewline "Saving XML file to $HashFile..."
    Export-Clixml -Path $HashFile -InputObject $xmlBackupIncHashFile -Force
    Write-Host "Done"
    Write-Output $uintBackupIncCount
    }
    catch { throw $_ }
    }
    #0 - is OK. 1 - some error
    $exitValue=0
    }
    process{
    try{
    $filesCopied=0
    $strSourceFolderName = $(Get-Item $SourceDir).Name
    $strHasFile = $("$HashPath\Hash_$strSourceFolderName.xml")
    $strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir started")
    #Automatically switch to full backup
    $bSwitch = $(!$(Test-Path -LiteralPath $strHasFile -PathType "Leaf") -and $SwitchToFull)
    Write-Log -Message $strMessage -LogFile $LogFile
    switch($true){
    $($BackupType -eq "Full" -or $bSwitch) {
    $filesCopied = Backup-Full -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHasFile -ChainKey $("Backup_$strSourceFolderName" + "_" + $(Get-Date -Format "ddMMyyyy"))
    break
    }
    $($BackupType -eq "Incremental") {
    $filesCopied = Backup-Inc -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHasFile
    break
    }
    $($BackupType -eq "Differential") {
    $filesCopied = Backup-Diff -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHasFile
    break
    }
    }
    $strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir completed successfully. $filesCopied items were copied.")
    Write-Log -Message $strMessage -LogFile $LogFile
    Write-Output $filesCopied
    }
    catch {
    $strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir failed:" + $_)
    Write-Log -Message $strMessage -LogFile $LogFile
    $exitValue = 1
    }
    }
    end{exit $exitValue}
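    For what it's worth, the prompting behaviour asked about above comes from the [Parameter(Mandatory=$true)] attributes in the param block, not from Read-Host: when a mandatory parameter is missing from the command line, the PowerShell engine itself prompts for a value before the script body runs. A quick illustration of what that looks like, assuming the script is saved as Backup-Files.ps1 (the file name is assumed):
    PS> .\Backup-Files.ps1
    Supply values for the following parameters:
    SourceDir: D:\Data
    DestDir: E:\Backups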

    Hi Ryan Blaeholder,
    Thanks for your posting.
    To schedule a PowerShell script with input values, instead of modifying the script above you can also add the input when creating the scheduled task, like this (save the script above as D:\backup.ps1):
    -command "& 'D:\backup.ps1' 'input1' 'input2'"
    For more detailed information, please refer to this article:
    Schedule PowerShell Scripts that Require Input Values:
    http://blogs.technet.com/b/heyscriptingguy/archive/2011/01/12/schedule-powershell-scripts-that-require-input-values.aspx
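    For example, the complete action command line for the scheduled task could look like the following sketch; D:\backup.ps1 comes from the reply above, while 'D:\Data' and 'E:\Backups' are placeholder values for the script's SourceDir and DestDir parameters:

    powershell.exe -NoProfile -ExecutionPolicy Bypass -Command "& 'D:\backup.ps1' 'D:\Data' 'E:\Backups'"

    Passing both mandatory parameters on the command line this way keeps the script from stopping to prompt when it runs unattended.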
    I hope this helps.

  • Batch Script to display last modified file from multiple directories

    Hello,
    I am trying to find the name of the latest (most recently modified) file from a set of given directories.
    I approached the problem by breaking it into two parts: (a) read all the directories from a .txt file and display the names of those directories and the files under them; (b) find the latest file present in any given directory and display its name.
    Up to this point all worked well, but when I try to combine these two batch scripts, I am unable to get the desired result.
    I am using BATCH SCRIPT only to complete this.
    The script I tried is below:
    a.  TO SEE ALL THE DIRECTORIES SPECIFIED IN "path_123.txt" FILE
    setlocal EnableDelayedExpansion
    for /f "delims=" %%i in ('type "C:\Users\sonkar\Desktop\path_123.txt"') do (
    set locall=%%i
    echo !locall!
    echo %%i
    )
    b.  TO FIND THE LATEST FILE PRESENT IN THE DIRECTORY
    set BUILDDROP="E:\"
    for /f "tokens=*" %%a in ('dir /A:-D /B /O:-D %BUILDDROP%') do set NEWEST=%%a&& goto :next
    :next
    echo %NEWEST%
    But when I combine the two, it does not work at all.
    Any help is appreciated.

    Since batch files lack the ability to do date arithmetic, you are pushing them well beyond their limits. Either VBScript or PowerShell could handle this type of task. Here is a VBScript solution. You must save the code in a .vbs file, then modify it to suit
    your exact needs.
    'Find the most recent file in a collection of folders
    '11 Sep 2014 FNL
    Dim dDate, dDate_, sFile, sFile_
    Set oFSO = CreateObject("Scripting.Filesystemobject")
    Set oList = ofso.OpenTextFile("C:\Users\sonkar\Desktop\path_123.txt", 1)
    aList = Split(olist.ReadAll, vbCrLf)
    oList.close
    dDate = cdate("01/01/2000")
    For Each sFolder In aList
        dDate_ = cdate("01/01/2000")
        ProcessFolder(sFolder)
        If dDate_ > dDate Then
            dDate = dDate_
            sFile = sFile_
        End If
    Next
    WScript.Echo "Most recent file: """ & sFile & """"
    WScript.Echo "File date: " &dDate
    Sub ProcessFolder(sFldr)
        If LTrim(sFldr) = "" Then Exit Sub
        WScript.Echo "Processing """ & sFldr & """"
        For Each oFile In oFSO.GetFolder(sFldr).Files
            If oFile.DateLastModified > dDate_ Then
                dDate_ = oFile.DateLastModified
                sFile_ = oFile.Path
            End if
        Next
        For Each oFolder In oFSO.GetFolder(sFldr).SubFolders
            ProcessFolder(oFolder.path)
        Next
    End sub
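    For comparison, a rough PowerShell equivalent of the VBScript above (a sketch that assumes PowerShell 3.0 or later for the -File switch; the list-file path is taken from the question):

    # Read the folder list, gather every file beneath each folder recursively,
    # then pick the single most recently modified file across the whole set.
    Get-Content 'C:\Users\sonkar\Desktop\path_123.txt' |
        Where-Object { $_.Trim() } |
        ForEach-Object { Get-ChildItem -LiteralPath $_ -Recurse -File -ErrorAction SilentlyContinue } |
        Sort-Object LastWriteTime -Descending |
        Select-Object -First 1 FullName, LastWriteTime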

  • Modifying cold fusion script not picked up by iis

    We are running ColdFusion 7 with IIS 6 on Windows Server 2003. I am not a ColdFusion developer, and this is the only ColdFusion script we have; it was created by a previous developer. I needed to make some simple modifications to the script, including a change in the title. After restarting IIS, the modified script is still not showing the new title. What do I need to do for IIS to show the changes in the .cfm file?

    The setting you need is in the CF Administrator, which is separate from the JRun Admin Console. Restarting IIS does not clear ColdFusion's template cache: if the trusted cache is enabled, ColdFusion keeps serving the previously compiled copy of the .cfm file. Clear or disable the trusted cache (Server Settings > Caching) or restart the ColdFusion service, and your changes should appear. You can often get to the CF Administrator with a URL like this:
    http://your_server/CFIDE/administrator/
    Dave Watts, CTO, Fig Leaf Software

  • I want to create xml file using photoshop script and also i can easily add, modify, delete based on file name

    Hi,
    Please help me with this.
    I need to create an XML file like the sample below. When I run the Photoshop script, I need it to record details for the active document: name, date, time and status.
    <?xml version="1.0" encoding="UTF-8"?>
    <sample>
    <filename>Cradboard_Boxes_Small.tif</filename>
    <date>today date</date>
    <starttime>now</starttime>
    <status>delivered</status>
    </sample>
    <sample>
    <filename>Cardboard_Boxes_Student_Vaue_Pack.jpg</filename>
    <date>today date</date>
    <starttime>now</starttime>
    <status>delivered</status>
    </sample>
    I need to read that XML after creating it and modify it based on the file name; I need to update the status once the file is finished.
    If the file name already exists, I want to be able to modify, delete, or add entries as needed.
    Kindly help me with a simple way to do this.

    You may want to look into getting XTools (ps-scripts - Browse Files at SourceForge.net). Most of its support is for ActionManager script code, where XML is used as an intermediate step, and there are quite a few Photoshop scripts in XTools. Ross Huitt is an expert JavaScript programmer; though he is fed up with Adobe's lack of support for Photoshop scripting, particularly the bugs in ScriptUI, he still maintains the tools he has created for us free of charge, such as Image Processor Pro. None of his scripts are saved as binary, so you can read all of his code; there is a wealth of knowledge in there.
    There is also a scripting forum: Photoshop Scripting
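    Purely as an illustration of the add-or-update-by-filename bookkeeping the question describes (the Photoshop side itself needs ExtendScript), here is a sketch of the pattern in PowerShell. Note that well-formed XML needs a single root element, so the sketch wraps the <sample> entries in an assumed <samples> root; the path, file name and status values are placeholders taken from the sample above:

    # Create the file with a <samples> root on first use, then add or update
    # a <sample> entry keyed on <filename>.
    $xmlPath = 'C:\temp\status.xml'
    if (-not (Test-Path $xmlPath)) {
        Set-Content -Path $xmlPath -Value '<?xml version="1.0" encoding="UTF-8"?><samples/>'
    }
    [xml]$doc = Get-Content $xmlPath
    $name = 'Cardboard_Boxes_Small.tif'
    $node = $doc.samples.sample | Where-Object { $_.filename -eq $name }
    if ($node) {
        # Entry exists: just update its status.
        $node.status = 'delivered'
    } else {
        # No entry yet: build a new <sample> element field by field.
        $new = $doc.CreateElement('sample')
        foreach ($pair in ([ordered]@{
                filename  = $name
                date      = (Get-Date -Format 'yyyy-MM-dd')
                starttime = (Get-Date -Format 'HH:mm')
                status    = 'inprogress' }).GetEnumerator()) {
            $child = $doc.CreateElement($pair.Key)
            $child.InnerText = $pair.Value
            $null = $new.AppendChild($child)
        }
        $null = $doc.DocumentElement.AppendChild($new)
    }
    $doc.Save($xmlPath)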

  • How to test the modified script (Global Dunning)

    Hi all,
    I have a requirement in SAPscript.
    I have modified the global dunning letter F150_DUNN_01; the print program is SAPF150D2.
    The transaction code is F150.
    How do I test the script, and what is the access path for checking the print preview of the modified form?
    Do I need to configure anything before testing?
    Thanks & Regards,
    Aarthi.

    Hi
    Your functional consultant can help you trigger the output of this dunning form, or you can search in the SPRO tcode or in the related tcode F150.
    Regards
    Anji

  • Need a powershell script that will modify the path of an existing Outlook 2010 OST file

    For new users we have a GPO in place that creates the OST file in the correct location.  The GPO also works on existing users who create a new Outlook Profile.
    The problem I am trying to solve is modifying an existing Outlook Profile and changing the location that the OST file is written to.
    I have to put the users' OST files on the network. Yes, I do know that this is not supported by Microsoft, but I have zero clients (WYSE units) on non-persistent VMware virtual machines. This means they cannot keep the OST file on the PC.
    Our email provider requires us to run in Cached-Mode so OST files are here to stay.
    Forcing 3000 users to create new Outlook profiles would not have a good outcome.
    I also have to relocate their PST files onto the network, which is also not supported. I have been able to connect PSTs with PowerShell, but disconnecting them has been challenging (I can't make it work even once), so if I could get some help with that too, that would be excellent.
    Thanks

    Hi,
    You may refer to the following article to create the registry keys and deploy it with logon script:
    http://www.slipstick.com/exchange/moving-outlook-ost-file/
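    For reference, a minimal sketch of a logon script that sets the registry value the article above describes; the value name ForceOSTPath applies to Outlook 2010 (Office 14.0), and the UNC path is a placeholder. Note that this takes effect for profiles and OST files created afterwards, which matches the GPO behaviour described in the question:

    # Point new Outlook 2010 profiles at a per-user network OST location.
    $key = 'HKCU:\Software\Microsoft\Office\14.0\Outlook'
    if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
    New-ItemProperty -Path $key -Name 'ForceOSTPath' -PropertyType ExpandString -Value '\\server\share\%username%' -Force | Out-Null

    As for detaching PSTs, a sketch using the Outlook COM object's RemoveStore method (the file-name filter is a placeholder):

    # Disconnect a PST from the current profile; RemoveStore takes the store's root folder.
    $outlook = New-Object -ComObject Outlook.Application
    $ns = $outlook.GetNamespace('MAPI')
    $store = $ns.Stores | Where-Object { $_.FilePath -like '*MyArchive.pst' }
    if ($store) { $ns.RemoveStore($store.GetRootFolder()) }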
    Best regards,
    Rex Zhang
    TechNet Community Support
