SPL Delta file upload purge process

Please indicate how GTS tables are purged prior to running the SPL delta file upload batch.

Hi, I am not sure I understand your question.
The tables do not need to be purged prior to uploading delta files, as changes will overwrite the existing data. If you want to delete entries, you need to follow the instructions as they were given to me by Chris Koniecny, which are the following:
If I understand the question correctly, you will have to archive the old SPL Master Data using the SPL_MD archiving object. This archiving object will allow you to create an archive of your current SPL Master Data (e.g. flat file) and then allow you to delete all data from the /SAPSLL/TSPL, /SAPSLL/TSPLA and /SAPSLL/TSPLN tables -- these tables are part of the SPL_MD archiving object. Once you have archived and deleted the prior SPL Master Data, you ought to be able to load the new SPL Master Data without any problems.
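To illustrate, here is a minimal ABAP sketch (my own illustration, not part of the instructions above; the report name is made up) that counts the rows in the three tables, so you can check them before archiving and again after the delete step:

REPORT zspl_md_check.
" Row-count check of the SPL master data tables covered by the SPL_MD
" archiving object. Run before archiving and after the delete step to
" confirm the tables were emptied.
DATA lv_count TYPE i.

SELECT COUNT(*) INTO lv_count FROM /sapsll/tspl.
WRITE: / '/SAPSLL/TSPL :', lv_count.
SELECT COUNT(*) INTO lv_count FROM /sapsll/tspla.
WRITE: / '/SAPSLL/TSPLA:', lv_count.
SELECT COUNT(*) INTO lv_count FROM /sapsll/tspln.
WRITE: / '/SAPSLL/TSPLN:', lv_count.
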
This archiving method of course has pitfalls. If you're upgrading your SPL Master Data from the non-extended XML data to the extended XML data of the SPL entities, your audit trails will lose some details (e.g. which SPL entity matched on a specific business partner). Another pitfall is the amount of time it takes to rescreen all of the business partners against the new SPL Master Data. You may also encounter a large number of blocked business partners as a result. My advice is to definitely test in a non-production environment.
Yet, as I said, you do not need to purge the data prior to every upload of delta files, but you must know the following: some providers deliver delta files that still contain expired sanctioned parties with a validity date in the past, so that you can keep uploading deltas. Others do not; in their delta files the expired parties are simply no longer included. With those providers you are obliged to load the full data set every time and cannot work with deltas.
Hope that helps.

Similar Messages

  • Flat file upload in Process Chain error!

    Dear Expert:
   We wrote a routine in the InfoPackage and use FTP to upload a flat file from the BW application server (AL11 --> can open the file). When I upload manually, the upload succeeds, but in the process chain the step keeps running and displays a yellow light. I must stop the job manually.
    Any suggestions for me? Thank you very much!
    Best Regards
    Bessie

    Dear all:
    I found the way to solve this problem: in the routine, I create a TCP/IP connection, which I then adjust manually in SM59.
    At the "Technical Settings" tab:
    Set Activation Type: Start on Application Server.
    Set Start Type of External Program: Remote Execution.
    That's OK!
    Thank you for your feedback!
    Bessie

  • Error on File upload. Error processing wwv_flow.accept.

    Hi,
    I'm developing an application to upload and download files using Oracle XE 10g and APEX 3.2.0.00.27 on OS Windows XP.
    The file upload page is very simple: one file browse item, a submit button and a report based on the wwv_flow_files table.... and an extra button to save the uploaded files in the DB.
    Sometimes (almost every time) when I press the submit button, the wwv_flow.accept process fails and I'm redirected to a page saying Internet Explorer cannot display the webpage.
    I'm forced to refresh the page and then I get the following message:
    Expecting p_company or wwv_flow_company cookie to contain security group id of application owner.
    Error ERR-7621 Could not determine workspace for application (:) on application accept.
    I recreated the same page in the apex.oracle.com environment and it works, but it fails in my local environment.
    I have read some posts in this forum, but I haven't found an answer yet.
    Please help!!!
    AUJ

    varad acharya wrote:
    Download the standalone OHS.
    http://www.oracle.com/technology/software/products/database/oracle11g/111060_win32soft.html
    varad

    Sorry about this...old post and off topic.
    Can anyone point me to a standalone version of OHS for Linux x86-64? I'm having trouble finding it; it doesn't appear to have been installed with 11g Enterprise Edition, and I cannot find it in the available packages when I re-run the installer. I'm running 11g EPG now and want to convert to Apache.
    Thanks!!!

  • How to Upload a File in Bpel Process using JSP

    I am trying to upload a file in a BPEL process, using a JSP as the front end.
    I created the JSP page and I am able to pass values from the JSP to the BPEL process.
    In the BPEL process, I don't know how to pass or assign the specified file name to the file adapter for reading the files.
    Please help me...
    Saravanan

    You don't assign the URL of the file to it.
    To get the data from the file into the BPEL process, you could either use the URL parameter together with the ora:readFile function, or you could let your web application upload the file to some location on the server and then use the file adapter with polling on that location to start your BPEL process.

  • Upload and Process large files

    We have a SharePoint 2013 on-premises installation and a business application that provides an option to copy local files into a UNC path, with some processing logic applied before copying them into a SharePoint library. The current implementation is:
    1. Users open the application and click the “Web Upload” link in the left navigation. This opens a \Layouts custom page to select the upload file and its properties.
    2. The user specifies the file details and chooses a web ZIP file from his local machine.
    3. The Web Upload page submit action will
         a. call a WCF service to copy the ZIP file from the local machine to a preconfigured UNC path
         b. create a list item to store its properties along with the UNC path details
    4. A timer job executes at a periodic interval to
         a. query the list for items that are NOT yet processed and find the path of the ZIP file folder
         b. unzip the selected file
         c. loop over the unzipped file content and push it into the SharePoint library
         d. update the list item in the “Manual Upload List”
    Can someone suggest a different design approach that can manage the large file outside of the SharePoint context? Something like
       1. Some option to initiate the file copy from the user's local machine to the UNC path when he submits the layouts page.
       2. Instead of timer jobs, external services that grab data from the UNC path at periodic intervals, process it, and push it into SharePoint.

    Hi,
    According to your post, my understanding is that you want to upload and process files for SharePoint 2013 server.
    The following suggestions are for your reference:
    1. We can create a service to process the uploaded file and copy the files to the UNC folder.
    2. Create a file upload visual web part and call the file processing service.
    Thanks,
    Dennis Guo
    TechNet Community Support
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
    [email protected]

  • Credit card file upload(Tcode: PRCC) in a batch process

    Hi all,
    Is it possible to run the credit card file upload (Tcode PRCC) as a batch process?
    When I tried doing so, I got the message "frontend function cannot be created in batch mode".
    I am aware that this is because the program uses the GUI_UPLOAD function module, which works on the front end only, not in background processing.
    But as this is a standard program, I cannot change its method of uploading the flat file.
    Please suggest a method to solve my requirement: I need to run the credit card file upload as a batch process.
    Thanks ,
    Snehal

    Check the parameter 'File is not local' so that SAP reads the file from the application server (the file is then read using OPEN DATASET instead of GUI_UPLOAD). This allows you to run this transaction in the background.
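
    For background, the server-side read that this checkbox enables looks roughly like the sketch below (the file path and the processing step are illustrative assumptions, not the actual PRCC logic):

    " OPEN DATASET reads from the application server file system, so it
    " also works in background jobs -- unlike GUI_UPLOAD, which needs a
    " frontend session.
    DATA: lv_file TYPE string VALUE '/usr/sap/trans/data/ccard.txt', " hypothetical path
          lv_line TYPE string.

    OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc = 0.
      DO.
        READ DATASET lv_file INTO lv_line.
        IF sy-subrc <> 0.
          EXIT.
        ENDIF.
        " ... process the credit card record ...
      ENDDO.
      CLOSE DATASET lv_file.
    ENDIF.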

  • ICR - FBICS3 file upload process

    Dear experts,
    In the ICR process we want to use the file upload option in FBICS3 (select documents) for reconciling the documents, and our file is placed on the application server. We have maintained the configuration in FBIC032 for the company codes (we are using multiple company codes): we specified the logical file path, a custom structure (as this file has some new fields), and 'File upload' in the data source field of this (FBIC032) configuration. But when I run the FBICS3 transaction, it does not select the documents from the file placed on the application server. Can someone help me with this; did I miss something in the config?
    Regards,
    Karthik.
    Moderator message: not related to ABAP development, please ask again in the appropriate functional or technical forum.
    Edited by: Thomas Zloch on Oct 22, 2010 11:23 PM
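
    One basic check worth doing is whether the logical file path maintained in FBIC032 actually resolves to a physical file on the application server. A minimal sketch (the logical file name Z_ICR_UPLOAD is a hypothetical placeholder):

    " FILE_GET_NAME resolves a logical file name (maintained in
    " transaction FILE) into the physical path the server will use.
    DATA lv_fname TYPE filename-fileextern.

    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'Z_ICR_UPLOAD' " hypothetical logical file name
      IMPORTING
        file_name        = lv_fname
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.
    IF sy-subrc = 0.
      WRITE: / 'Resolves to:', lv_fname.
    ELSE.
      WRITE: / 'Logical file name could not be resolved'.
    ENDIF.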

  • Process triggered by file upload - is GP the right tool?

    Hello All,
    I have a process I need to model that starts with a file upload from the user, followed by some manipulation of the file by some web services.
    Is GP the right tool for modeling that process?
    After completing the file upload action, how can I access the uploaded file on the server?
    Do I need to implement the file upload handling with Web Dynpro?
    Thanks!
    Yossi

    Yossi,
    Based on your requirement, the GP process would be to use the File Input CO followed by the Web Service CO (in that order). Consolidate the output param of the File Input CO (which is the file structure) to the input of the Web Service CO (which is also a structure). I don't think you need to save the file to the local drive. The web service should be able to consume the uploaded file (which resides in the process context).
    Thanx,
    Mahesh

  • File upload process

    I want to limit the size and type of files that users put in the database through the Oracle Portal.
    Has anyone implemented it?

  • ICR File Upload Process 002 G/L Accounts

    Dear SAP experts,
    For external companies, we want to upload their intercompany transactions in the intercompany reconciliation cockpit.
    But it seems that this is not possible for process 002 (G/L accounts): indeed, when I go to the customizing step 'Companies to be reconciled' (FBIC009), I cannot set the data source for a company code to 'File upload'.
    Do you have any idea how I can load data into the ICR process 002 tables by file upload?
    Thank you in advance for helping me,
    Bernard

    Hi Rafael,
    Thank you for your answer.
    But the file upload functionality is not activated by the transfer type config but by the data source config (= File upload), no?
    Unfortunately, the data source field cannot be modified for process 002 (it always equals 'Documents of current process').
    BR
    Bernard

  • BPC delta flat file upload possible?

    Hi experts,
    I've been trying to find out whether it is possible to retain member data on a manual flat file upload into a dimension. We have created a dimension (employee) with some dummy node member data, and we would like to determine whether it is possible to keep this manual member data while loading further member data through a flat file load. Is this sort of manual flat file delta load possible? If so, how? What is best practice here?
    Also, is it possible to load data into a dimension using two methods (i.e. a manual file upload and then another form, e.g. a BW load) without overwriting data? I'm assuming this is where the merge data option comes into play.
    Any guidance would be much appreciated.
    Regards,
    Danielle

    Hi Danielle,
    While uploading data, if you have added dimension members manually, they never get deleted unless you explicitly specify it.
    In your scenario, you have created dummy members and want to upload data through a flat file. The data will then be uploaded only for the members you are loading; the rest of the members will keep displaying the old data.
    As you rightly said, while running the data manager package you need to select the MERGE option instead of REPLACE & CLEAR data values. This keeps the old data intact and adds the new values as well.
    Hope this clarifies further.
    Rgds,
    Poonam

  • Multipart form (file upload) processing in providers

    Hello,
    Just want to find out if anyone has successfully implemented a file upload mechanism within a Portal channel.
    According to the Provider API (http://docs.sun.com/source/816-6428-10/com/sun/portal/providers/Provider.html), the wrapped request/response objects do not support several methods that are essential for processing file uploads, namely "getContentLength" and "getInputStream". I am currently trying to use the Apache commons-fileupload utility, which relies on those methods to process file uploads. The same applies to another popular file upload utility from servlets.com.
    Does anyone have any info/explanation regarding this limitation in Portal Server 6, and any workarounds to this issue? One workaround is to have a popup window that interacts directly with an external webapp.
    Any ideas/suggestions will be appreciated, thanks in advance.
    jeff

    Hi Jeff,
    The Sun ONE Portal Server DesktopServlet does not have the ability to process a request with the content encoding type multipart/form-data, and it does not pass the input stream for the request on to the Provider.
    To handle multipart/form-data requests, it is necessary to create a companion servlet or JSP that processes the multipart/form-data. This servlet can then pass control back to the Portal channel. The data from the file can be shared between the servlet and the provider by using static Java members, or by storing the data in a back-end database and passing a reference to the data over to the provider.
    Sanjeev

  • Php file upload processing

    Hi, I have been working with JavaScript and PHP to upload files, and I am having problems with
    the backend of the file upload.
    When the file is received in backend.php, it comes in like this:
    $fileupload=$_POST['upload_file'];
    The above $fileupload is the URL of the file.
    I am trying to extract the name, type and size using:
    $name=$_FILE['$fileupload']['name'];
    $tmp_name=$_FILE['$fileupload']['tmp_name'];
    $size=$_FILE['$fileupload']['size'];
    But this does not seem to work.
    echo $fileupload; //gives me the file url.
    but echoing $name, $tmp_name or $size
    outputs nothing.
    Can anyone help?

    Hi Rob,
    Thanks so much for the reply. $name=$_FILES['photofield']['name'];
    does not work in this circumstance because I am receiving the file through the AJAX code below:
    <script type="text/javascript">
       // JavaScript Document
    var phototitle;
    var photogenre;
    var photodesc;
    var photofield;
    function AjaxStuff(){
    phototitle = jQuery("#phototitle").attr("value");
    photogenre = jQuery("#photogenre").attr("value");
    photodesc = jQuery("#photodesc").attr("value");
    photofield = jQuery("#photofield").attr("value");
    jQuery.ajax({
      type: "POST",
      url: "Uploadfix.php",
      cache: false,
      dataType: "html",
      data: "phototitle=" + phototitle + "&photogenre=" + photogenre + "&photodesc=" + photodesc + "&photofield=" + photofield,
      success: function(response){
       // if successful, response will contain some stuff echoed from .php
       // alert("Here is your response: " + response);
       // Append this response to some <div> => let's append it to div with id "serverMsg"
       jQuery("#allresult").append(response);
       jQuery("#contentgrid").trigger("reloadGrid");
      }
    }); // close the jQuery.ajax() call
    } // END OF AjaxStuff()
    </script>
    photofield is the file. Uploadfix.php is where the data is posted:
    Uploadfix.php
    $phototitle=$_POST['photofield'];
    then for the file:
    $photo=$_FILES['photofield']['name'];
    echo $photo; //Nothing comes out.
    echo $photofield; //The url for the file appears.

  • Inconsistent in file upload process

    I have a file upload method for Azure blob storage. The code is below:
    CloudStorageAccount cloudStorageAccount;
    CloudBlobClient blobClient;
    CloudBlobContainer blobContainer;
    BlobContainerPermissions containerPermissions;
    CloudBlob blob;

    // Connect to the storage account and get a reference to the target container
    cloudStorageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["CLOUDSTORAGE_ACCOUNT"]);
    blobClient = cloudStorageAccount.CreateCloudBlobClient();
    blobContainer = blobClient.GetContainerReference(fileType);
    blobContainer.CreateIfNotExist();

    // Make blobs in the container publicly readable
    containerPermissions = new BlobContainerPermissions();
    containerPermissions.PublicAccess = BlobContainerPublicAccessType.Blob;
    blobContainer.SetPermissions(containerPermissions);

    // Upload the input stream to the named blob
    blob = blobContainer.GetBlobReference(FileName);
    blobClient.ParallelOperationThreadCount = 2;
    IAsyncResult result = blob.BeginUploadFromStream(InputStream, null, null, null);
    blob.EndUploadFromStream(result);
    This works fine locally, but once hosted as an Azure application, the file upload throws this error:
    The server encountered an unknown failure: The underlying connection was closed:
    Could not establish trust relationship for the SSL/TLS secure channel.
    After some time it works fine again. Can you help me find out exactly what may be wrong?

    Hi Srinivas,
    Have you checked whether there is a problem with the system time syncing to the time servers?
    If it is a sync issue, you can correct it as follows: right-click the clock in the task bar, select Adjust Date/Time, select the Internet Time tab, click Change Settings, and select Update Now.
    There might be an issue with the SSL certificate as well.
    Are you using a self-signed certificate? Do the host names in the certificate and on the server match?
    You could try overriding the client certificate validation (this is dangerous if you are calling a server outside of your direct control, since you can no longer be sure that you are talking to the server you think you're connected to). You could try the following code:
    // Trust all certificates
    System.Net.ServicePointManager.ServerCertificateValidationCallback =
        ((sender, certificate, chain, sslPolicyErrors) => true);
    Also, for details you could refer to the following links, where customers faced similar issues:
    https://social.msdn.microsoft.com/Forums/vstudio/en-US/bb0fc194-5bf3-4c24-94bb-c86f94c76bc2/could-not-establish-trust-relationship-for-the-ssltls-secure-channel-with-authority-pc1?forum=wcf
    http://stackoverflow.com/questions/703272/could-not-establish-trust-relationship-for-ssl-tls-secure-channel-soap
    Regards,
    Malar.

  • Processing file uploads

    How do I increase the maximum size of file uploads in ADF Rich Faces?
    my web.xml:
    <context-param>
      <param-name>javax.faces.STATE_SAVING_METHOD</param-name>
      <param-value>client</param-value>
    </context-param>
    <context-param>
      <param-name>locales</param-name>
      <param-value>pt_BR</param-value>
    </context-param>
    <context-param>
      <param-name>defaultLocale</param-name>
      <param-value>pt_BR</param-value>
    </context-param>
    <context-param>
      <param-name>contextConfigLocation</param-name>
      <param-value>/WEB-INF/spring-context.xml</param-value>
    </context-param>
    <listener>
      <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
    </listener>
    <context-param>
      <!-- Maximum memory per request (in bytes) -->
      <param-name>org.apache.myfaces.trinidad.UPLOAD_MAX_MEMORY</param-name>
      <!-- Use 500K -->
      <param-value>512000</param-value>
    </context-param>
    <context-param>
      <!-- Maximum disk space per request (in bytes) -->
      <param-name>org.apache.myfaces.trinidad.UPLOAD_MAX_DISK_SPACE</param-name>
      <!-- Use 5,000K -->
      <param-value>5120000</param-value>
    </context-param>
    <context-param>
      <!-- directory to store temporary files -->
      <param-name>org.apache.myfaces.trinidad.UPLOAD_TEMP_DIR</param-name>
      <!-- Use a TrinidadUploads subdirectory of /tmp -->
      <param-value>/tmp/TrinidadUploads/</param-value>
    </context-param>
    <!-- This filter is always required; one of its functions is file upload. -->
    <filter>
      <filter-name>trinidad</filter-name>
      <filter-class>org.apache.myfaces.trinidad.webapp.TrinidadFilter</filter-class>
    </filter>
    <filter-mapping>
      <filter-name>trinidad</filter-name>
      <servlet-name>Faces Servlet</servlet-name>
      <dispatcher>FORWARD</dispatcher>
      <dispatcher>REQUEST</dispatcher>
    </filter-mapping>
    <servlet>
      <servlet-name>Faces Servlet</servlet-name>
      <servlet-class>javax.faces.webapp.FacesServlet</servlet-class>
      <load-on-startup>1</load-on-startup>
    </servlet>
    <servlet>
      <servlet-name>resources</servlet-name>
      <servlet-class>org.apache.myfaces.trinidad.webapp.ResourceServlet</servlet-class>
    </servlet>
    Even with these parameters set, only the default value is being considered. I am not using any custom UploadedFileProcessor.

    Hi
    you want to change
    <param-name>org.apache.myfaces.trinidad.UPLOAD_MAX_DISK_SPACE</param-name>
    Frank
