Export/import of a subsite using PowerShell and STSADM

Can anyone clarify the difference between exporting/importing a subsite with PowerShell versus STSADM?
I was able to export a subsite using PowerShell, but when I run the import with PowerShell it gives me an error that the subsite is not there.
When I used STSADM it worked even though the subsite is not there.
Appreciate the help, and thanks very much.

STSADM is deprecated in 2010; you should be using PowerShell wherever possible.
If the site isn't there, the export/import is bound to be tricky. It could also be a false error caused by permissions, though.
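
For reference, here is a minimal sketch of both routes (server URLs, paths and the template name are placeholders, not taken from the original post). One behavioural difference that I believe explains what you saw: stsadm -o import can create the target web from the package if it does not exist yet, whereas Import-SPWeb generally expects the destination web to exist already, which may be exactly why the PowerShell import complained that the subsite was missing.

# STSADM route (deprecated in 2010, but still works)
stsadm -o export -url http://server/sites/source/subweb -filename C:\backup\subweb.cmp
stsadm -o import -url http://server/sites/dest/subweb -filename C:\backup\subweb.cmp

# PowerShell route: create the destination web first, then import into it
Export-SPWeb http://server/sites/source/subweb -Path C:\backup\subweb.cmp
New-SPWeb http://server/sites/dest/subweb -Template "STS#0"   # template is an assumption; match the source web
Import-SPWeb http://server/sites/dest/subweb -Path C:\backup\subweb.cmp -UpdateVersions Overwrite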

Similar Messages

  • Possible to use stsadm export/import to copy a subsite to the same site with a different subsite name?

    Within the same site collection, is it possible to use stsadm export/import to copy a subsite to a new subsite with a different name?
    I do not want to use the Save as Template method; I need to do this from the command line.

    You should first create an empty site.
    Go to your site collection and click Site Actions --> New Site --> select a custom template, and click Create.
    Here you choose meeting2 as the Title/URL.
    Now execute your import command (Import-SPWeb http://vee:111/meeting2 -Path C:\temp\meeting_exp.cmp -UpdateVersions Overwrite).
    Wait until the import is finished, then check your site!
    That should do it.
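
    For completeness, a minimal end-to-end sketch of the same copy done entirely from the command line. It reuses the names from this thread, assumes the source web lives at /meeting, and the template passed to New-SPWeb is an assumption that must match the template of the source web:

    Export-SPWeb http://vee:111/meeting -Path C:\temp\meeting_exp.cmp
    New-SPWeb http://vee:111/meeting2 -Template "MPS#0"   # or create meeting2 through Site Actions as described above
    Import-SPWeb http://vee:111/meeting2 -Path C:\temp\meeting_exp.cmp -UpdateVersions Overwrite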

  • Export/import parameters disappear when using a user exit

    I am using some import/export parameters in a dynamic action when I create a new record (infotype). I am also using a user exit to prevent BEGDA and ENDDA from being modified when the record is changed (IPSYST = 'MOD'). With this user exit in place, the parameters disappear from memory, so the dynamic action does not execute correctly. What can I do so that I can still use the user exit? Anything to add?

    In the dynamic actions on create, I delimit records on the infotype with export/import parameters defined in the infotype module pool. On delete, I prevent deleting the record if it is not the last one. With the user exit, modifying BEGDA/ENDDA in the infotype is not allowed. If I use the user exit, the dynamic actions which use the export/import parameters don't work.
    I have tried to do in the module pool what I do in the user exit, but it is not easy because I don't have in PSAVE what I need.

  • How to add calendar entries to all users in the organization using PowerShell and EWS

    I am one of the Exchange admins for our organization.  Every year, we publish academic calendar data to all faculty and staff calendars.  We recently updated and migrated from Exchange 2003 to Exchange 2010 which, of course, desupported MAPI and ADO.  The processes we previously used had to be rewritten against the Exchange Web Services (EWS) Managed API.  Because I find PowerShell easy to work with, I wanted to implement the calendar dispersal in PowerShell.
    Having not found much help online on using the EWS .NET library from PowerShell for this purpose, I decided to share my code:
    # Bulk load calendar entries script
    # Description:
    # Script used to deploy Academic Calendar entries to all Exchange account calendars
    # Prerequisites:
    # Service account must have an ApplicationImpersonation ManagementRoleAssignment
    # New-ManagementRoleAssignment -Name:impersonationRole -Role:ApplicationImpersonation -User:<srv_account>
    # Usage:
    # .\academicCalendar.ps1 calEntries.csv
    # Where calEntries.csv = list of calendar entries to add
    Param ([string]$calInputFile = $(throw "Please provide calendar input file parameter..."))
    $startTime = Get-Date
    $strFileName = "<path to log file>"
    if(Test-Path $strFileName)
    {
        $logOutFile = Get-Item -path $strFileName
    }
    else
    {
        $logOutFile = New-Item -type file $strFileName
    }
    # Load EWS Managed API library
    Import-Module -Name "C:\Program Files\Microsoft\Exchange\Web Services\1.0\Microsoft.Exchange.WebServices.dll"
    # Load all Mailboxes
    $exchangeUsers = Get-Mailbox -ResultSize Unlimited | Select PrimarySmtpAddress
    # Load all calendar Entries
    # Input file is in the following format
    # StartDate,EndDate,Subject
    # 8/29/2011,8/30/2011,First Day of Fall Classes
    $calEntries = Import-Csv $calInputFile
    # Setup the service for connection
    $service = new-object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2010)
    $service.Url = new-object System.Uri("https://<CAS_server_URL>/ews/exchange.asmx")
    $service.Credentials = new-object Microsoft.Exchange.WebServices.Data.WebCredentials("<service_account>","<password>","<domain>")
    $totalCount = $exchangeUsers.Count
    $currentCount = 0
    Write-Output "Exchange Version: $service.RequestedServerVersion"
    Write-Output "Mailbox Count: $totalCount"
    # Add message to log file
    $timeStamp = Get-Date -Format "MM/dd/yyyy hh:mm:ss"
    $message = "$timeStamp -- Begin Calendar Deployment `n"
    $message += "Total Exchange Accounts: $totalCount"
    Add-Content $logOutFile $message
    # Perform for each Mailbox
    $error.clear()
    foreach($mailbox in $exchangeUsers)
    {
        $currentCount++
        if($mailbox.PrimarySmtpAddress -ne "")
        {
            # Output update to screen
            $percentComplete = $currentCount/$totalCount
            Write-Output $mailbox.PrimarySmtpAddress
            "{0:P0}" -f $percentComplete
            # Set up mailbox parameters for impersonation
            $MailboxName = $mailbox.PrimarySmtpAddress
            $iUserID = new-object Microsoft.Exchange.WebServices.Data.ImpersonatedUserId([Microsoft.Exchange.WebServices.Data.ConnectingIdType]::SmtpAddress,$MailboxName)
            $service.ImpersonatedUserId = $iUserID
            # Indicate which folder to work with
            $folderid = new-object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Calendar)
            $CalendarFolder = [Microsoft.Exchange.WebServices.Data.CalendarFolder]::Bind($service,$folderid)
            # For each entry in the input file
            $error.clear()
            foreach($entry in $calEntries)
            {
                # First check to make sure the entry is not already in the calendar:
                # use a CalendarView object to pull the entries for the given dates and make sure an entry with the same subject line doesn't already exist
                $cvCalendarview = new-object Microsoft.Exchange.WebServices.Data.CalendarView([System.DateTime]($entry.StartDate),[System.DateTime]($entry.EndDate))
                $cvCalendarview.PropertySet = new-object Microsoft.Exchange.WebServices.Data.PropertySet([Microsoft.Exchange.WebServices.Data.BasePropertySet]::FirstClassProperties)
                $frCalendarResult = $CalendarFolder.FindAppointments($cvCalendarview)
                $entryFound = $False
                foreach ($appointment in $frCalendarResult.Items)
                {
                    if($appointment.Subject -eq $entry.Subject)
                    {
                        $entryFound = $True
                    }
                }
                # If the entry was found, skip it
                if($entryFound)
                {
                    $entryFound = $False
                }
                else # Create the appointment object and save it to the user's calendar
                {
                    $appt = New-Object Microsoft.Exchange.WebServices.Data.Appointment($service)
                    $appt.Subject = $entry.Subject
                    $appt.Start = [System.DateTime]($entry.StartDate)
                    $appt.End = [System.DateTime]($entry.EndDate) # For an all-day event, the end date must be after the start date
                    $appt.IsAllDayEvent = $True # Set event as "All Day Event"
                    $appt.LegacyFreeBusyStatus = "Free" # Make sure free/busy info shows the user as "free" rather than "busy"
                    $appt.IsReminderSet = $False # Make sure no reminder is set for the event
                    $appt.Save([Microsoft.Exchange.WebServices.Data.SendInvitationsMode]::SendToNone)
                    if($error)
                    {
                        $timeStamp = Get-Date -Format "MM/dd/yyyy hh:mm:ss"
                        $message = $timeStamp + "...Exception Occurred while processing Save for: `n"
                        $message += " Account: " + $MailboxName + "`n"
                        $message += " Subject: " + $entry.Subject + "`n"
                        $message += " Exception: " + $error[0].Exception + "`n"
                        Add-Content $logOutFile $message
                        $error.clear()
                    }
                }
            }
            # Log per-mailbox result
            if($error)
            {
                $error.clear()
            }
            else
            {
                $message = "" + $MailboxName + "`t Success! `n"
                Add-Content $logOutFile $message
            }
            Write-Output $currentCount
        }
    }
    $endTime = Get-Date
    $duration = New-TimeSpan $startTime $endTime
    $totalMin = $duration.TotalMinutes
    # Build and send email notification upon completion
    $body = "The Calendar deployment has completed. `n `n "
    $body += "Start Timestamp: $startTime `n "
    $body += "End Timestamp: $endTime `n "
    $body += "Duration: $totalMin min `n "
    $body += "Exchange accounts affected: $currentCount `n"
    $smtpServer = "<mysmtpserver>"
    $smtp = new-object Net.Mail.SmtpClient($smtpServer)
    $msg = new-object Net.Mail.MailMessage
    $msg.From = "<from_email_address>"
    $msg.To.Add("<to_email_address>")
    $msg.Subject = "Calendar Deployment"
    $msg.Body = $body
    $smtp.Send($msg)
    # Add closing message to log file
    $timeStamp = Get-Date -Format "MM/dd/yyyy hh:mm:ss"
    $message = "Accounts affected: $currentCount"
    Add-Content $logOutFile $message
    $message = "$timeStamp -- Completed in $totalMin min."
    Add-Content $logOutFile $message
    Please let me know if you think I can make any performance modifications.
    Daniel
    --Edit-- I have updated the script for Exchange 2010 SP1 and also added logging, error checking and email notifications.  This new script also checks first to make sure the appointment doesn't already exist before adding it, to prevent multiple entries of the same event.  (Note: this check, although necessary in my opinion, is very time consuming.)

    Hi Daniel
    I am trying to add additional properties like TV, Copier, etc. to a Room Mailbox in Exchange 2010 using the following commands:
    [PS] C:\Windows\system32>$ResourceConfiguration = Get-ResourceConfig
    [PS] C:\Windows\system32>$ResourceConfiguration.ResourcePropertySchema+=("Room/Whiteboard")
    The first two commands run fine, but the following command gives an error:
    [PS] C:\Windows\system32>Set-ResourceConfig -ResourcePropertySchema $ResourceConfiguration.ResourcePropertySchema
    The term 'Set-ResourceConfig' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
    At line:1 char:19
    + Set-ResourceConfig <<<<  -ResourcePropertySchema $ResourceConfiguration.ResourcePropertySchema
        + CategoryInfo          : ObjectNotFound: (Set-ResourceConfig:String) [], CommandNotFoundException
        + FullyQualifiedErrorId : CommandNotFoundException
    I also tried with a space after Set, but I still get an error:
    [PS] C:\Windows\system32>Set -ResourceConfig -ResourcePropertySchema $ResourceConfiguration.ResourcePropertySchema
    Set-Variable : A parameter cannot be found that matches parameter name 'ResourceConfig'.
    At line:1 char:20
    + Set -ResourceConfig <<<<  -ResourcePropertySchema $ResourceConfiguration.ResourcePropertySchema
        + CategoryInfo          : InvalidArgument: (:) [Set-Variable], ParameterBindingException
        + FullyQualifiedErrorId : NamedParameterNotFound,Microsoft.PowerShell.Commands.SetVariableCommand
    Please advise a solution (I can be reached at [email protected]). I got this approach from
    http://zbutler.wordpress.com/2010/03/17/adding-additional-properties-to-resource-mailboxes-exchange-2010/
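    One thing worth ruling out first (an assumption on my part, not a confirmed cause): the cmdlet may simply not be available in the session you are running, either because the Exchange 2010 snap-in is not loaded or because your account lacks the RBAC role that exposes Set-ResourceConfig. A quick check:

    # Load the Exchange 2010 snap-in explicitly, then confirm the cmdlet is visible
    Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010 -ErrorAction SilentlyContinue
    Get-Command Set-ResourceConfig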

  • How to export list items in SharePoint 2010 using PowerShell

    I am able to export all items in a SharePoint list with the command below in PowerShell:
    Export-SPWeb -Identity "http://myurl.com" -Path "C:\Export\Lists\Announcements.cmp" -ItemUrl "/Lists/Announcements/AllItems.aspx" -Force
    My list holds a huge amount of data in production, and I want to export only 10 items from it using PowerShell. Is there any command to export only certain items?
    I want to filter the items and export only a certain set of them using the export command.
    Rajendra

    Hi all,
    To export a list you can try these articles (a small object-model sketch follows the links):
    Export data from SharePoint 2010 List using Management Shell: http://www.experts-exchange.com/OS/Microsoft_Operating_Systems/Server/MS-SharePoint/A_9182-Export-data-from-SharePoint-2010-List-using-Management-Shell.html
    Export SharePoint 2010 List to Excel with PowerShell: http://iwillsharemypoint.blogspot.com.es/2011/02/export-sharepoint-2010-list-to-excel.html
    Export SharePoint List Items to CSV using PowerShell: http://www.sharepointdiary.com/2013/04/export-sharepoint-list-items-to-csv-using-powershell.html
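    As far as I know, Export-SPWeb itself has no item-level filter, so for "only 10 items" you will end up going through the object model. A minimal sketch, not taken from the linked articles; the URL, list name, field names and item count are placeholders:

    $web  = Get-SPWeb "http://myurl.com"
    $list = $web.Lists["Announcements"]
    $list.Items |
        Select-Object -First 10 |
        ForEach-Object { New-Object PSObject -Property @{ Title = $_["Title"]; Created = $_["Created"] } } |
        Export-Csv "C:\Export\Lists\Announcements.csv" -NoTypeInformation
    $web.Dispose()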

  • Change content of default.aspx for subsites using Powershell

    Hi.
    I need to update the content of the default.aspx of 60+ subsites in a site Collection. I am able to loop all subsites and get the corresponding default.aspx files using a PowerShell snippet like:
    $site = Get-SPSite -WebApplication https://mysite.domain.no/ -Limit All
    $wc = $site.allwebs | where {$_.url.StartsWith("https://mysite.domain.no/sites/blahblah/blahblahblah") }
    foreach ($web in $wc){
    $file = $web.GetFile("default.aspx")
    The problem is I don't know whether to use the Set-Content cmdlet or an XMLDocument approach.
    I have the file with the correct content on the local machine. I tried $file.ParentFolder in order to use
    $f = $web.GetFolder($file.ParentFolder)
    $fc = $f.files
    Get-ChildItem "C:\temp\default.aspx" | foreach {
    $spFileCollection.add($($_.Name),$_.OpenRead(), $true)
    to replace the file, but this gives an error about an empty folder. Can I use this approach, or is there a way to change the actual content of the file?

    Hi Jorgen,
    According to your description, my understanding is that you want to update the page content using PowerShell.
    I suggest you read the content data like below:
    $data = $file.OpenBinary()
    $encode = New-Object System.Text.ASCIIEncoding
    $test = $encode.GetString($data)
    Then you can change the string as needed and save it back with the SaveBinary() method; a rough sketch follows.
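    Here is that read/modify/write cycle for every web as a sketch. The site URL and the text being replaced are placeholders; I have used UTF8 rather than ASCII to be safer with non-ASCII characters, and you may need $file.CheckOut()/$file.CheckIn() around the save if the library requires check-out:

    $site = Get-SPSite "https://mysite.domain.no/sites/blahblah"
    foreach ($web in $site.AllWebs) {
        $file  = $web.GetFile("default.aspx")
        $bytes = $file.OpenBinary()
        $text  = [System.Text.Encoding]::UTF8.GetString($bytes)
        $text  = $text.Replace("OLD CONTENT", "NEW CONTENT")   # placeholder strings
        $file.SaveBinary([System.Text.Encoding]::UTF8.GetBytes($text))
        $web.Dispose()
    }
    $site.Dispose()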
    Here are some detailed articles for your reference:
    Reading the contents of a SharePoint library file using PowerShell
    Using powershell to read/modify/rewrite sharepoint xml document
    Best Regards
    Zhengyu Guo
    TechNet Community Support

  • How do I bulk upload documents using PowerShell and extract metadata from file name?

    I have a requirement to upload a bunch of documents into a document library. Based on the content type, the rules for updating the metadata are different... the one giving me trouble is extracting the metadata from the file name. If I have a file name like "part1_part2_part3.pdf", how do I extract part1, part2 and part3 and tag each document being uploaded into SharePoint, using PowerShell? I have searched and have not been able to find anything to get me started.
    Has anyone done this before? Or is there a blog I can take a look at? Thanks
     

    You will have to write a PowerShell script encompassing this logic:
    Read the files from the folder using the Get-Item cmdlet.
    Determine the content type based on the path or file name.
    Split the file name to extract the tag values.
    If a metadata field in the content type is a managed metadata field, check whether the term exists and set it.
    Updating SharePoint Managed Metadata Columns with PowerShell
    A rough sketch of the upload-and-tag part is below.
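    Rough sketch of the upload-and-tag part. The web URL, library name and the Part1/Part2/Part3 field names are assumptions for illustration only; managed metadata fields need the taxonomy handling from the article above rather than a plain string assignment:

    $web    = Get-SPWeb "http://server/sites/docs"
    $folder = $web.GetFolder("Shared Documents")
    Get-ChildItem "C:\Upload\*.pdf" | ForEach-Object {
        $parts  = [System.IO.Path]::GetFileNameWithoutExtension($_.Name) -split "_"
        $stream = $_.OpenRead()
        $spFile = $folder.Files.Add($_.Name, $stream, $true)   # overwrite if it already exists
        $stream.Close()
        $item = $spFile.Item
        $item["Part1"] = $parts[0]
        $item["Part2"] = $parts[1]
        $item["Part3"] = $parts[2]
        $item.Update()
    }
    $web.Dispose()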
    This post is my own opinion and does not necessarily reflect the opinion or view of Slalom.

  • Export/Import the mapping value used in Value Mapping Function

    I used the value mapping function, so I defined some mapping data in the Integration Directory. Now I need to migrate from the development XI to the production XI. How do I export the data I entered on the development XI and later import it? Or could anyone suggest another solution to replace the value mapping function?
    Hoping for your help.
    Thanks
    Spring

    Hi Spring Tang,
      You can export your mapping object in .mte format. To do this, press Ctrl+Shift and right-click in the mapping editor, then go to Tools and export it. Similarly you can import the mapping object.
      Hope this will solve your problem.
      Thanks and Regards
      Vishal Kumar

  • How to move objects (users) from one OU to another using PowerShell and an XLSX

    Hi all,
    I have a spreadsheet that has headers. I need to move all of the objects on this exception report to the proper OU (all going to the same OU).
    The header that validates the need to move is called "Display Name".
    The process now is as follows.
    1) Copy displayname
    2) Open AD search
    3) enter display name in find box
    4) locate object
    5) right click object in results and click move.
    6) move to the OU "Home.test.com/uk Online/Users OU/Business Process"
    --- How can I use Import-Csv to automate this process?
    Thanks for any help. There are about 4K lines on this sheet, and it normally takes about 25 days of "busy work" to accomplish this; then 5 days later I have to re-run the report and start over.
    Josh
    Josh Borges

    Hi Josh,
    This assumes that you can save your file as an actual CSV file:
    $skippedUsers = @()
    Import-Csv .\userList.csv | ForEach {
        $displayName = $_."Display Name"
        $user = Get-ADUser -Filter "DisplayName -eq '$displayName'" -Properties DisplayName
        If ($user.Count) {
            $skippedUsers += $displayName
        } Else {
            $user | Move-ADObject -TargetPath 'OU=Business Process,OU=Users OU,OU=uk Online,DC=home,DC=test,DC=com' -WhatIf
        }
    }
    If ($skippedUsers) { Write-Host 'The following users could not be moved automatically:' -ForegroundColor Red ; $skippedUsers }
    Do you have to use the display name property? That's not guaranteed to be unique, so you might run into problems. The script above will not attempt to move the user if more than one is returned by the command.
    EDIT: I've also added -WhatIf to Move-ADObject. Now the command won't actually move your users, it will just tell you about it. Remove it if you're happy with the output.

  • Is it possible to get Exchange emails downloaded to a SharePoint document library using PowerShell and/or a custom workflow?

    I have been asked to see if it would be possible to get Exchange emails downloaded and/or sent to a document library.
    I know of the site mailbox app, but we are not running Exchange 2013.
    Setting up the lists and/or document library to receive emails using the built-in incoming email feature doesn't seem to be working (maybe it's not configured correctly; I would need to see what the prior admin/developer did).
    But is there a way to get the emails downloaded to a document library using a workflow and/or a PowerShell script that is triggered by a workflow?

    Hi,
    Since workflows can only work on items in SharePoint sites, they cannot get Exchange emails, let alone download emails to a SharePoint library. However, you could manually save an email locally and upload it to a SharePoint list/library.
    I'd suggest troubleshooting the incoming email settings in SharePoint. Please refer to the articles below and check your configuration:
    https://technet.microsoft.com/en-us/library/cc262947.aspx
    http://blogs.technet.com/b/harmeetw/archive/2012/12/29/sharepoint-2013-configure-incoming-emails-with-exchange-server-2013.aspx
    Regards,
    Rebecca Tu
    TechNet Community Support

  • Database Migration 10g, difference in export/import from 8.1.6 and DBUA

    I would like to know if there is any difference between migrating using the export/import method from 8.1.6 and migrating using DBUA from 8.1.6 -> 8.1.7 -> 10g, in terms of tablespaces, datafiles and performance.
    Will DBUA convert the tablespaces automatically to take advantage of 10g?
    Thanks .

    Hi,
    If we refer to the Oracle documentation, it depends on your 10g release.
    For the 10.2 upgrade paths, you first need to upgrade your 8i database to 8.1.7.4.
    For the 10.1 upgrade paths, you can migrate directly.
    Either way, for export/import you need to use the 8i exp utility and the 10g imp utility: see Using Different Releases of Export and Import (10.1) or Using Different Releases of Export and Import (10.2).
    Nicolas.

  • Importing/Parsing XML using SQL and/or PL/SQL

    What is the recomended way of importing/parsing XML data using SQL and/or PL/SQL?
    I have an XSD that defines the structure of the file, and an XML file that has the content in the appropriate structure. I have parsed (checked) the structure of the XML file using JDOM in my java application, and then passed it to a function in a package on the database as a CLOB.
    What I need to do is parse the contents of the XML file that is passed into the function and extract the values of each XML element so that I can then do some appropriate validation before inserting and committing the data to the database (hence completing the import function).
    A DBA colleague of mine has been experimenting with various ways of achieving this, but has encountered a number of problems along the way, one of which being that he thinks it is not possible to parse XML elements that are nested more than four levels deep - is this the case?
    The structure of the XSD/XML that the database function needs to manipulate and import data from is very complex (and recursive by its nature).
    I would appreciate any suggestions as to how I can achieve the above in the most efficient manner.
    Thanks in advance for your help
    David

    This is the forum for the SQLDeveloper tool. You will get better answers in the SQL and PL/SQL forum, and especially the XML DB forum.
    Oracle has comprehensive and varied support for XML, including a PL/SQL parser.

  • How to get Document Set property values in a SharePoint library into a CSV file using PowerShell

    Hi,
    How to get Document Set property values in a SharePoint library into a CSV file using PowerShell?
    Any help would be greatly appreciated.
    Thank you.
    AA.

    Hi,
    According to your description, my understanding is that you want to get the document set property values in a SharePoint library and then export them into a CSV file using PowerShell.
    I suggest you get the document set properties with a PowerShell command like the one below:
    [system.reflection.assembly]::loadwithpartialname("microsoft.sharepoint")
    $siteurl="http://sp2013sps/sites/test"
    $listname="Documents"
    $mysite=new-object microsoft.sharepoint.spsite($siteurl)
    $myweb=$mysite.openweb()
    $list=$myweb.lists[$listname]
    foreach($item in $list.items)
    {
        if($item.contenttype.name -eq "Document Set")
        {
            if($item.folder.itemcount -eq 0)
            {
                write-host $item.title
            }
        }
    }
    Then you can use the Export-Csv cmdlet to export to a CSV file; a rough sketch continuing the snippet above follows.
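    Continuing from the snippet above (it reuses $list), a sketch of collecting a few properties per document set and writing them out. The property names and output path are assumptions; substitute the columns you actually need:

    $results = foreach($item in $list.items)
    {
        if($item.contenttype.name -eq "Document Set")
        {
            New-Object PSObject -Property @{
                Title    = $item.Title
                Created  = $item["Created"]
                Modified = $item["Modified"]
            }
        }
    }
    $results | Export-Csv "C:\temp\docsets.csv" -NoTypeInformation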
    More information:
    Powershell for document sets
    How to export data to CSV in PowerShell?
    Using the Export-Csv Cmdlet
    Thanks
    Best Regards
    TechNet Community Support

  • How do you produce both Customs Export & Import documentation in an STO?

    Hello All -
    How do you produce both Customs Export & Import documentation in an STO scenario? For instance, if a company would like to create a STO in SAP to move materials from one plant to another and would like to produce both the Export & Import paperwork in GTS, what documents would be needed in R/3 to make this possible?
    Would the following be needed, or is there some other way?
    - STO
    - Outbound Delivery & F8 Invoice to generate the Customs Shipment in GTS for the Export documentation
    - Inbound Delivery to generate the Customs Declaration in GTS for the Import documentation

    Thanks Sameer -
    Yes, for the Import documentation, I think we will be declaring prior to Goods Receipt, so we will first use the Inbound delivery, and then the GR document later on once the materials have been received. (I believe you and I were talking about some of this in another thread actually.) Thank you for confirming this.
    As for the Note, I have also seen this - but thought that this was to more or less give visibility to the sending plant up-front at STO of what export licenses might be needed. (According to the note, currently they would only be able to check if licenses were needed at the outbound delivery, and not on the PO)
    Per the note: "Given a Stock Transport Purchase Order scenario: Plant A in Germany issues a STO PO against vendor plant US10 in the US. GTS looks at the PO in terms of importing from the US (departure country) coming into Plant A in Germany and performs compliance checks. Meanwhile, exporting plant US10 only has visibility at outbound delivery creation to see if am Export License is needed. Exporting plant US10 may want to know if they need a license to the country of destination, in this case Germany, at STO PO creation."
    However with the proper country-level export/import activation of the legal regulations, I believe that you can still have the STO check export-related licenses even though it is mapped to GTS as IMPORD. I have done some testing on this and it appears to work (you just need to ensure that you are activating the relevant countries for export/import properly in the legal regulation, and you can have both export & import documents check the same legal regulation and deliver the correct license determination). I was just wondering if you had any experience with this that was contrary to what I have found, or if there is something you think I am missing..
    Thanks

  • Export/import in 904 versus 10.1.2

    We have two 10g AS 9.0.4 installs (dev and prod - same OS too). From time to time we move our production portal content to the dev server. We sometimes get corruption and have to use the Schema Validation Utility scripts to fix the import on the dev server. Not a show stopper, but this adds another manual step to the process.
    I see the 10.1.2 version has improved import/export checks. Can anyone give feedback on the improvements in the export/import process?
    Thanks
    Kevin

    Kevin,
    Careful with this approach. Passing from DEV to PROD is OK, but to go back to DEV again, what I suggest is that you start clean on DEV, i.e., clean the DEV machine and later do the same on PROD. Portal Export/Import works only one way, not both ways (this basically has to do with the checks we make and with possible conflicts with the objects on both sides). Also check the available documentation, which is all compiled in Metalink Note 263995.1 - Master Note for OracleAS Portal Export / Import Issues.
    As to the improvements in the process, please have a look at the New Features papers (search for "export" - it makes it easier to pick out the references):
    10.1.2 -- http://www.oracle.com/technology/products/ias/pdf/1012_nf_paper.pdf
    10.1.4 -- http://www.oracle.com/technology/products/ias/portal/pdf/portal_1014_new_features.pdf
    I hope it helps...
    Cheers,
    Pedro.

Maybe you are looking for

  • How to install the Netbeans?

    Hello... The present is so that they help me with the installation of the Netbeans, my problem is the following one: I want to program in Java with all that offers me the J2EE platform, but when I go to will install the Netbeans it requests me the Vi

  • My phone thinks I am abroad. How can I get it to realise I am in the UK?

    Whenever I go to Orange World I get 'Welcome to your international page' and the option 'Costs while abroad' is highlighted. I try to select some other page such as 'All today's sport'  but it just reverts to 'Cost while abroad' and I can't get . I h

  • Changing the color of sortable columns in reports

    Hi all, In my Report template (custom made) I have my headers as purple with white text, which is all good except when I make it sortable, it turns all black and hard to see against deep purple background. Is there a way to retain (or change the colu

  • PARAMETER_CONVERSION_ERROR in case of RFC

    Hi, I am calling rfc function module FM1 with tables parameter where i have two tables, in the interface i have added two tables by referring structures. FM1 interface table1 like struct1 ( structure with enahncement category can be enhanced characte

  • Change schedule line date

    Dear All, I have a requirement where required delivery date for one divisions material in sale order and STO should automatically propose todays date plus 10 days i.e. if any sale order or STO created today then required delivery date in STO and SO s