Data Loader On Demand Proxy Usage for Resume operation

Hi,
My project required me to use the proxy feature available in the Data Loader R19 release.
I could use the proxy at the command line for insert/update operations.
However, the same doesn't work for the RESUME operation in Data Loader.
I tried the proxy settings from the command line as well as from the property file, but to no avail.
Any suggestions?
Regards,
Sumeet

It's a Java application, so it may run on your Linux/Unix system; you would have to test to see if it works. Last time I checked, Oracle only supports the application running on Windows.
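Since it is a Java application launched via a batch/shell script that calls java, one generic thing worth trying (this is an assumption, not documented Data Loader behaviour) is adding the standard JVM proxy system properties to that java command. The property names below are the standard JVM ones; the proxy host, port, and jar name are placeholders for your environment, and whether the RESUME operation honours them would have to be tested.

    rem Sketch: edit the java line inside the Data Loader launch script.
    rem https.proxyHost / https.proxyPort are standard JVM settings;
    rem proxy.example.com, 8080 and DataLoader.jar are placeholders.
    java -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080 ^
         -Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 ^
         -jar DataLoader.jar %*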

Similar Messages

  • Oracle Data Loader On Demand on EHA Pod

    Oracle Data Loader doesn't work correctly.
    I downloaded it from Staging (EHA Pod) and did the following:
    1. Went to the "config" folder and updated "OracleDataLoaderOnDemand.config":
    hosturl=https://secure-ausomxeha.crmondemand.com
    2. Went to the "sample" folder and changed Owner_Full_Name in "account-insert.csv".
    Then I ran the batch file at the command prompt.
    It runs successfully, but the records aren't inserted on the EHA Pod; they end up on the EGA Pod.
    This is the log.
    Is Data Loader only for the EGA Pod? Could you please give me some advice?
    [2012-09-19 14:49:55,281] DEBUG - [main] BulkOpsClient.main(): Execution begin.
    [2012-09-19 14:49:55,281] DEBUG - [main] BulkOpsClient.main(): List of all configurations loaded: {sessionkeepchkinterval=300, maxthreadfailure=1, testmode=production, logintimeoutms=180000, csvblocksize=1000, maxsoapsize=10240, impstatchkinterval=30, numofthreads=1, hosturl=https://secure-ausomxeha.crmondemand.com, maxloginattempts=1, routingurl=https://sso.crmondemand.com, manifestfiledir=.\Manifest\}
    [2012-09-19 14:49:55,281] DEBUG - [main] BulkOpsClient.main(): List of all options loaded: {datafilepath=sample/account-insert.csv, waitforcompletion=False, clientlogfiledir=., datetimeformat=usa, operation=insert, username=XXXX/XXXX, help=False, disableimportaudit=False, clientloglevel=detailed, mapfilepath=sample/account.map, duplicatecheckoption=externalid, csvdelimiter=,, importloglevel=errors, recordtype=account}
    [2012-09-19 14:49:55,296] DEBUG - [main] BulkOpsClientUtil.getPassword(): Entering.
    [2012-09-19 14:49:59,828] DEBUG - [main] BulkOpsClientUtil.getPassword(): Exiting.
    [2012-09-19 14:49:59,828] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Entering.
    [2012-09-19 14:49:59,937] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Sending Host lookup request to: https://sso.crmondemand.com/router/GetTarget
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Host lookup returned: <?xml version="1.0" encoding="UTF-8"?>
    <HostUrl>https://secure-ausomxega.crmondemand.com</HostUrl>
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Successfully extracted Host URL: https://secure-ausomxega.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Exiting.
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Entering.
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Host URL from the Routing app=https://secure-ausomxega.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Host URL from config file=https://secure-ausomxeha.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Successfully updated the config file: .\config\OracleDataLoaderOnDemand.config
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Host URL set to https://secure-ausomxega.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Exiting.
    [2012-09-19 14:50:03,953] INFO - [main] Attempting to log in...
    [2012-09-19 14:50:10,171] INFO - [main] Successfully logged in as: XXXX/XXXX
    [2012-09-19 14:50:10,171] DEBUG - [main] BulkOpsClient.doImport(): Execution begin.
    [2012-09-19 14:50:10,171] INFO - [main] Validating Oracle Data Loader On Demand Import request...
    [2012-09-19 14:50:10,171] DEBUG - [main] FieldMappingManager.parseMappings(): Execution begin.
    [2012-09-19 14:50:10,171] DEBUG - [main] FieldMappingManager.parseMappings(): Execution complete.
    [2012-09-19 14:50:11,328] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): Submitting BulkOpImportGetRequestDetail WS call
    [2012-09-19 14:50:11,328] INFO - [main] A SOAP request was sent to the server to create the import request.
    [2012-09-19 14:50:13,640] DEBUG - [Thread-3] SOAPImpRequestManager.sendImportGetRequestDetail(): SOAP request sent successfully and a response was received
    [2012-09-19 14:50:13,640] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): BulkOpImportGetRequestDetail WS call finished
    [2012-09-19 14:50:13,640] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): SOAP response status code=OK
    [2012-09-19 14:50:13,640] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): Going to sleep for 300 seconds.
    [2012-09-19 14:50:20,328] INFO - [main] A response to the SOAP request sent to create the import request on the server has been received.
    [2012-09-19 14:50:20,328] DEBUG - [main] SOAPImpRequestManager.sendImportCreateRequest(): SOAP request sent successfully and a response was received
    [2012-09-19 14:50:20,328] INFO - [main] Oracle Data Loader On Demand Import validation PASSED.
    [2012-09-19 14:50:20,328] DEBUG - [main] BulkOpsClient.sendValidationRequest(): Execution complete.
    [2012-09-19 14:50:20,343] DEBUG - [main] ManifestManager.initManifest(): Creating manifest directory: .\\Manifest\\
    [2012-09-19 14:50:20,343] DEBUG - [main] BulkOpsClient.submitImportRequest(): Execution begin.
    [2012-09-19 14:50:20,390] DEBUG - [main] BulkOpsClient.submitImportRequest(): Sending CSV Data Segments.
    [2012-09-19 14:50:20,390] DEBUG - [main] CSVDataSender.CSVDataSender(): CSVDataSender will use 1 threads.
    [2012-09-19 14:50:20,390] INFO - [main] Submitting Oracle Data Loader On Demand Import request with the following Request Id: AEGA-FX28VK...
    [2012-09-19 14:50:20,390] DEBUG - [main] CSVDataSender.sendCSVData(): Creating thread 0
    [2012-09-19 14:50:20,390] INFO - [main] Import Request Submission Status: Started
    [2012-09-19 14:50:20,390] DEBUG - [main] CSVDataSender.sendCSVData(): Starting thread 0
    [2012-09-19 14:50:20,390] DEBUG - [main] CSVDataSender.sendCSVData(): There are pending requests. Going to sleep.
    [2012-09-19 14:50:20,406] DEBUG - [Thread-5] CSVDataSenderThread.run(): Thread 0 submitting CSV Data Segment: 1 of 1
    [2012-09-19 14:50:24,328] INFO - [Thread-5] A response to the import data SOAP request sent to the server has been received.
    [2012-09-19 14:50:24,328] DEBUG - [Thread-5] SOAPImpRequestManager.sendImportDataRequest(): SOAP request sent successfully and a response was received
    [2012-09-19 14:50:24,328] INFO - [Thread-5] A SOAP request containing import data was sent to the server: 1 of 1
    [2012-09-19 14:50:24,328] DEBUG - [Thread-5] CSVDataSenderThread.run(): There is no more pending request to be picked up by Thread 0.
    [2012-09-19 14:50:24,328] DEBUG - [Thread-5] CSVDataSenderThread.run(): Thread 0 terminating now.
    [2012-09-19 14:50:25,546] INFO - [main] Import Request Submission Status: 100.00%
    [2012-09-19 14:50:26,546] INFO - [main] Oracle Data Loader On Demand Import submission completed succesfully.
    [2012-09-19 14:50:26,546] DEBUG - [main] BulkOpsClient.submitImportRequest(): Execution complete.
    [2012-09-19 14:50:26,546] DEBUG - [main] BulkOpsClient.doImport(): Execution complete.
    [2012-09-19 14:50:26,546] INFO - [main] Attempting to log out...
    [2012-09-19 14:50:31,390] INFO - [main] XXXX/XXXX is now logged out.
    [2012-09-19 14:50:31,390] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): Interrupted.
    [2012-09-19 14:50:31,390] DEBUG - [main] BulkOpsClient.main(): Execution complete.

    Hi,
    the Data Loader points to the production environment by default, regardless of whether you download it from Staging or Production.
    To change the pod, edit the config file and enter the following:
    hosturl=https://secure-ausomxeha.crmondemand.com
    routingurl=https://secure-ausomxeha.crmondemand.com
    testmode=debug

  • Oracle Data Loader On Demand: Account Owner Field Mapping

    I was trying to import account records using Data Loader. When the data file contains just the 'User ID' for the 'Account Owner' field, the import fails. When I use the "Company Sign In ID/User ID" value in the data file, it is successful.
    Is there any way to use the 'User ID' value in the data file for the 'Account Owner' field and still run Data Loader successfully?

    The answer is no. My understanding is that you need to map Account Owner to the User Sign In ID, which has the format:
    <Company Sign In ID>/<User ID>
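    For illustration only (the company sign-in ID and user ID below are made-up values), the Account Owner column in the import file would then contain something like:
    MYCOMPANY/JSMITH
    rather than just JSMITH.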

  • Bandwidth, Latency and Proxy Usage for Fusion Cloud

    Hello,
    Do we have any information on Fusion prerequisites from a performance standpoint?
    - Are there bandwidth requirements or recommendations? Are there any figures for per-user bandwidth consumption (min, average, max)?
    - Are there latency requirements or recommendations?
    - Are there recommendations on proxy usage? How does it affect performance, and are there any parameters to configure (no cache, for instance)?
    Many thanks in advance
    Nicolas

    Hi Helmut,
    valid question. And to cut a long story short: yes.
    Right now we have the packages which you have described on the HANA Marketplace. We start with 128 GB there.
    We also have 64 GB packages, which we have not yet published on the HANA Marketplace. And I guess you would still consider them expensive if you do not need the persistency layer in the cloud (if you need a HANA database, I consider them fairly priced).
    The infrastructure with Git, cloud proxy / dispatcher etc. which you refer to (Lightweight HTML5 apps and Git on SAP HANA Cloud Platform) is currently available on the trial landscape. While working on the necessary steps to productize these features, we are also thinking about corresponding packages that make them available for the scenarios you describe, at a price which makes them attractive for exactly these scenarios.
    If you are interested in a discussion, feel free to send me your contact details (via e-mail or DM).
    Best regards
    Thorsten Schneider
    SAP HANA Cloud Platform Product Management

  • How to make data loaded into cube NOT ready for reporting

    Hi Gurus: Is there a way by which data loaded into a cube can be made NOT available for reporting?
    Please suggest. <removed>
    Thanks

    See, by default a request that has been loaded to a cube is available for reporting. Now, if you have an aggregate, the system needs this new request to be rolled up into the aggregate as well before it becomes available for reporting. Why? Because queries are written against the cube, not against the aggregate, so you only know at runtime whether a query will hit a particular aggregate. This means that whether a query gets its data from the aggregate or from the cube, it should ultimately return the same data. If a request has been added to the cube but not to the aggregate, the two objects would contain different data, so the system takes the safer route of not making the 'unrolled-up' data visible at all rather than risking inconsistent data.
    Hope this helps...

  • Data Loader On Demand Inserting Causes Duplicates on Custom Objects

    Hi all,
    I am having a problem: I need to import around 250,00 records on a regular basis, so I have built a solution using Data Loader with two processes, one to insert and one to update. I was expecting that inserts with an existing External Unique Id (EUI) would fail, so that only new records would get inserted (as it says in the PDF), but it keeps creating duplicates even when all the data is exactly the same.
    does anyone have any ideas?
    Cheers
    Mark

    Yes, you have encountered an interesting problem. There is a field on every object labelled "External Unique Id", but it is inconsistent whether there is actually a unique index behind it. Some objects have keys that are unique and some seemingly have none. The best way to test this is with the command-line bulk loader (because the GUI import wizard can do both INSERT and UPDATE in one execution, you don't always see the problem there).
    I can run the same data over and over through the command-line loader with the INSERT option and never hit a unique key constraint, for example on ASSET, CONTACT, and CUSTOM OBJECTS. Once you have verified whether the bulk loader is creating duplicates or not, that might drive you to the decision of using a web service instead.
    The FINANCIAL TRANSACTION object, I believe, has a unique index on the "External Unique Id" field, and the FINANCIAL ACCOUNT object, I believe, has a unique key on the "Name" field.
    Hope this helps a bit.
    Mychal Manie ([email protected])
    Hitachi Consulting

  • Regarding Master Data Loading in BI 7.0 for SD Module

    Hi Guys,
    I am new to BI 7.0 implementation. Can anybody help me with loading the master data for cubes like 0SD_C01 and 0SD_C03? I have activated the master data datasources 0Plant_Attr, 0Plant_Text, 0Master_Attr, and 0Master_Text and replicated them into BI 7.0. What steps are required after that to activate the master data?
    thanks,
    Chinnu

    Hi Chinnu,
    1) By replication, I think you mean that you now have the datasources 0Plant_Attr, 0Plant_Text, 0Master_Attr, and 0Master_Text in BW.
    2) You then have to complete the mapping to the corresponding InfoSources.
    3) Activate the InfoObjects 0PLANT and 0MASTER.
    4) Load your master data.
    regards,
    Vinay

  • Mainframe data loaded into Oracle tables - Test for low values using PL/SQL

    Mainframe legacy data has been copied straight from the legacy tables into mirrored tables in Oracle. Some columns in the mainframe data had 'low values' in them, and these columns were defined on the Oracle tables as VARCHAR2. Looking at the data, some of these columns appear to contain little square boxes; I am not sure, but maybe that is how Oracle renders the 'low values' from the original data in a VARCHAR2 column. When I run a SELECT to find all rows where this column is not null, these rows are returned. In the results of the SELECT statement the columns appear blank, but when I look at the data in SQL Developer I can see the odd 'square boxes', so my guess is that the SELECT is detecting that something exists in the column. Long story short, I am going to have to test this legacy data in the Oracle tables using PL/SQL to check for 'low values'. Does anyone have any suggestions on how I could do this? Help! The mainframe data we are loading into these tables contains columns with low values.
    I am using Oracle 11i.
    Thanks
    Edited by: ncsthbell on Nov 2, 2009 8:38 AM

    ncsthbell wrote: "Mainframe legacy data has been copied straight from the legacy tables into mirrored tables in Oracle."
    Not a wise thing to do. Mainframe operating systems typically use EBCDIC, while Unix and Windows servers use ASCII. The endianness is also different (big endian vs. little endian).
    ncsthbell wrote: "Does anyone have any suggestions on how I could do this?"
    As suggested, use the SQL function DUMP() to see the actual contents (in hex) of these columns.
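    To make the DUMP() suggestion concrete, here is a minimal sketch; the table and column names are placeholders, and it assumes the mainframe 'low values' arrived as binary zero bytes (CHR(0)), which is worth verifying with DUMP() first:
    -- Placeholder names: replace legacy_table / legacy_col with your own.
    -- Show the raw byte contents (in hex) of the suspicious column:
    SELECT legacy_col, DUMP(legacy_col, 16) AS raw_bytes
      FROM legacy_table
     WHERE legacy_col IS NOT NULL;
    -- Count rows whose column contains a low-value (binary zero) byte:
    SELECT COUNT(*)
      FROM legacy_table
     WHERE INSTR(legacy_col, CHR(0)) > 0;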

  • Data loading with routine displays zero for key fig values

    Hi all,
    my source field is Amount and my target field is Research Amount.
    If I restrict Amount (the source field) by cost element (coaeom7) and value type (010), it is equal to the target field Research Amount.
    My code for this is:
    IF COMM_STRUCTURE-VTYPE = '010' AND
       COMM_STRUCTURE-COSTELMNT = 'COAEOM12'.
       RESULT = COMM_STRUCTURE-AMOUNT.
    ENDIF.
    But when I load the data, it displays only zeros.
    Please suggest.
    Regards,
    Raj.

    Hi Raj,
    Do you need cost element values other than 'COAEOM12' in the target?
    Are you writing this routine as a start routine or an end routine (assuming this is BI 7.0)?
    If you are loading data from a source cube/DSO to the target, then in the transformation you need to write a start routine; before that, you need to map the Amount, Cost Element, and Value Type fields from the source to Research Amount in the target.
    Then you need to write code along these lines (see the sketch after this reply):
    If the source package's cost element = 'COAEOM12' and the source package's value type = '010',
    then Research Amount = Amount.
    End if.
    I hope this will solve your problem.
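    As a rough illustration of the reply above (not taken from the original thread): in a BI 7.0 transformation, one common place for this kind of logic is a field routine on the Research Amount key figure. The parameter names SOURCE_FIELDS and RESULT come from the generated routine frame; the source field names VTYPE, COSTELMNT, and AMOUNT are assumptions that must match your actual source structure, and the literal 'COAEOM12' may need the internal (e.g. ALPHA-converted) format of the cost element.
    " Sketch of a field routine on Research Amount (BI 7.0 transformation).
    " The source field names below are assumptions - adjust to your structure.
    IF source_fields-vtype     = '010' AND
       source_fields-costelmnt = 'COAEOM12'.
      result = source_fields-amount.
    ELSE.
      CLEAR result.
    ENDIF.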

  • Data Load Error: No SID Found for value.....

    Hi,
    I'm loading data for the 0REFX_C01 (Conditions) cube (Real Estate). The load fails and the error message is "Record 1: No SID found for value 'IB20010000777700000001 ' of characteristic 0REOBJECT". Master data for 0REOBJ_ATTR and 0REOBJ_TEXT has already been loaded.
    Description of the error:
    Characteristic value IB2001000077770000 does not exist in the master data table of characteristic 0REOBJECT. This value could not be transformed into the internal SID.
    Could someone help me resolve this issue?
    Thanks in advance. Points will be rewarded.

    The record is coming from your source system. The reason it is not in your master data is either that it is new master data (created since the last time you loaded) or a wrong input in the source system.
    Either way, the answer is in your source system, so make sure this value exists there and is valid.
    If it is, try to load and activate your master data again and then load the cube once more.
    Regards,
    Luis

  • Data Load process for 0FI_AR_4 failed

    Hi!
    I am about to implement the SAP Best Practices scenario "Accounts Receivable Analysis".
    When I schedule the data load process (in dialog, immediately) for the transaction data source 0FI_AR_4 and check it in the Monitor, the status is yellow.
    At the top I can see the following information:
    12:33:35  (194 from 0 records)
    Request still running
    Diagnosis
    No errors found. The current process has probably not finished yet.
    System Response
    The ALE inbox of BI is identical to the ALE outbox of the source system
    or
    the maximum wait time for this request has not yet been exceeded
    or
    the background job has not yet finished in the source system.
    Current status
    No Idocs arrived from the source system.
    Question:
    Which actions can I take to run the loading process successfully?

    Hi,
    The job still seems to be in progress.
    You could monitor the job that was created in R/3 (by copying the technical name from the monitor, adding "BI" as a prefix, and searching for it in SM37 in R/3).
    Keep an eye on ST22 as well if this job is taking too long, as you may already have gotten a short dump for it that has not yet been reported to the monitor.
    Regards,
    De Villiers

  • Data load process for FI module

    Dear all,
    We are using BI 7.00, and one of our FI datasources, 0EC_PCA_1, had a data load failure. The cause of the failure was analysed and we did the following:
    1) Deleted the data from the cube and the PSA.
    2) Reloaded the data (full load) without disturbing the init.
    This solved our problem. Now that the data reconciliation has been done, we find that there are doubled entries for some of the G/L codes.
    I have a doubt here.
    Since there is no setup table for FI transactions (correct me if I am wrong), the full load picked up data that was also present in the delta queue, and the subsequent delta load then loaded the same data again (for some G/L accounts that were available as delta).
    Kindly explain how FI data loads work. Should we go for downtime, and how do FI data loads work without setup tables?
    Can the experts provide a solution for addressing this problem? Can anyone provide a step-by-step process to solve this problem permanently?
    Regards,
    M.M

    Hi Magesh,
    The FI datasources do not involve setup tables for full loads, and they do not involve the outbound queue during delta loads.
    A full load happens directly from your datasource view to BI, and deltas are captured in the delta queue.
    Yes, you are right in saying that when you did a full load, some of the values pulled were also present in the delta queue; hence the double loads.
    You need to completely reinitialise, as the full load has disturbed the delta process. Whether to take downtime depends on how frequently transactions are happening.
    You need to:
    1. Completely delete the data in BW, including the initialisation.
    2. Take downtime if necessary.
    3. Reinitialise the whole datasource from scratch.
    Regards,
    Pramod

  • BPC NW 7.0: Data Load: rejected entries for ENTITY member

    Hi,
    when trying to load data from a BW InfoProvider into BPC (using UJD_TEST_PACKAGE and process chain scheduling), a number of records are rejected due to missing member entries for the ENTITY dimension in the application.
    However, the ENTITY members actually do exist in the application. Also, the dimension is processed with no errors, and the dimension members are visible in the Excel client navigation pane for selecting members.
    The error also appears when kicking off the data load from the BPC Excel client. Any ideas how to analyse this further or resolve it?
    Thanks,
    Claudia Elsner

    Jeffrey,
    this question is closely related to that issue, because there is also a short dump when trying to load the data into BPC. I am not sure whether both problems are directly related, though:
    Short dump with UJD_TEST_PACKAGE
    Problem description of the post:
    When running UJD_TEST_PACKAGE, I get a short dump:
    TSV_TNEW_PAGE_ALLOC_FAILED
    No more storage space available for extending an internal table.
    Other keywords are CL_SHM_AREA and ATTACHUPDATE70.
    When I looked at the notes, I found Note 928044 - "BI lock server". Looking at the note and debugging UJD_TEST_PACKAGE leaves me with some questions:
    1. Do I need a BI lock server?
    2. Should the enque/table_size setting on the central instance be increased from 10000 to 25000 or larger?
    Claudia

  • Data loading mechanism for flat file loads for hierarchy

    Hi all,
    We have a custom hierarchy which gets its data from a flat file stored on the central server; the file in turn gets its data from MDM through XI. Now, if we delete a few records in MDM, the data picked up in BI will no longer contain the deleted records. Does this mean that the hierarchy load deletes its existing data and does a full load each time, or does it mean that every time we load the data into BI we have to delete the records from the tables in BI and reload?
    We also have some web service text datasources (loaded from XI).
    Is the logic for updating the hierarchy records different compared to the existing web service interfaces?
    Can anyone please explain the mechanism behind these data loads and differentiate between the data loads mentioned above?

    Create the ODS with the correct keys and run full loads from the flat files. You can have a cube pulling data from the ODS:
    Load data into the ODS.
    Create the cube.
    Generate the export datasource (RSA1 > right-click the ODS > Generate Export DataSource).
    Replicate the export datasource (RSA1 > Source Systems > DataSource Overview > search for the datasource starting with 8 plus the ODS name).
    Press the '+' button, then activate the transfer rules and communication structure.
    Create the update rules for the cube with the above InfoSource (same as the '8ODSNAME' datasource).
    Create an InfoPackage with an initial load (in the Update tab).
    Now load data to the cube.
    Now load new full loads to the ODS.
    Create a new InfoPackage for delta (in the Update tab).
    Run the InfoPackage (any changed or new records will be loaded to the cube).
    Regards,
    BWer
    Assign points if helpful.

  • Data Loader: Import creating duplicate records?

    Hi all,
    has anyone else encountered the behaviour where Oracle Data Loader creates duplicate records (even with the option duplicatecheckoption=externalid set)? When I check the Import Request Queue view, the request parameters of the job look fine:
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    But Data Loader has created new records where the "External Unique ID" already exists.
    What is very strange is that when I create the import manually (using the Import Wizard), exactly the same import works correctly: the duplicate checking method works and the record is updated.
    I know the Data Loader has two methods, one for update and one for import; however, I would not expect the import to create duplicates when the record already exists, rather than doing nothing.
    Is anyone else experiencing the same? I hope this is not expected behaviour! By the way, the "Update" method works fine.
    thanks in advance, Juergen
    Edited by: 791265 on 27.08.2010 07:25
    Edited by: 791265 on 27.08.2010 07:26

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before the full load; that is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behaviour: Data Loader does not check for duplicates when inserting (importing). It only checks for duplicates when updating (overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and the Data Import Options Overview document.
    You should review all the documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support and search for "*data import resources*". This should bring you to the page.
    Pete
