Oracle Data Loader On Demand: Account Owner Field Mapping

I was trying to import account records using Data Loader. When the data file contains the 'User ID' value for the 'Account Owner' field, the import fails. When I use the 'Company Sign In ID/User ID' value instead, it succeeds.
Is there any way to use just the 'User ID' value in the data file for the 'Account Owner' field and still run Data Loader successfully?

The answer is no. It is my understanding that you need to map Account Owner to the User Sign In ID, which has the format:
<Company Sign In ID>/<User ID>
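
For illustration, a data file row might then look like this (the header names are hypothetical and must match your .map file; the sample account-insert.csv shipped with the client carries the owner in an Owner_Full_Name column):

External Unique Id,Account Name,Owner_Full_Name
ACCT-0001,Acme Corp,MYCOMPANY/JSMITH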

Similar Messages

  • Oracle Data Loader On Demand on EHA Pod

    Oracle Data Loader doesn't work correctly for me.
    I downloaded it from staging (the EHA pod) and did the following:
    1. Went to the "config" folder and updated "OracleDataLoaderOnDemand.config":
    hosturl=https://secure-ausomxeha.crmondemand.com
    2. Went to the "sample" folder and changed Owner_Full_Name in "account-insert.csv".
    Then I ran the batch file at the command prompt.
    It runs successfully, but the records aren't inserted on the EHA pod; they show up on the EGA pod instead.
    This is the log.
    Is Data Loader only for the EGA pod? Could you please give me some advice?
    [2012-09-19 14:49:55,281] DEBUG - [main] BulkOpsClient.main(): Execution begin.
    [2012-09-19 14:49:55,281] DEBUG - [main] BulkOpsClient.main(): List of all configurations loaded: {sessionkeepchkinterval=300, maxthreadfailure=1, testmode=production, logintimeoutms=180000, csvblocksize=1000, maxsoapsize=10240, impstatchkinterval=30, numofthreads=1, hosturl=https://secure-ausomxeha.crmondemand.com, maxloginattempts=1, routingurl=https://sso.crmondemand.com, manifestfiledir=.\Manifest\}
    [2012-09-19 14:49:55,281] DEBUG - [main] BulkOpsClient.main(): List of all options loaded: {datafilepath=sample/account-insert.csv, waitforcompletion=False, clientlogfiledir=., datetimeformat=usa, operation=insert, username=XXXX/XXXX, help=False, disableimportaudit=False, clientloglevel=detailed, mapfilepath=sample/account.map, duplicatecheckoption=externalid, csvdelimiter=,, importloglevel=errors, recordtype=account}
    [2012-09-19 14:49:55,296] DEBUG - [main] BulkOpsClientUtil.getPassword(): Entering.
    [2012-09-19 14:49:59,828] DEBUG - [main] BulkOpsClientUtil.getPassword(): Exiting.
    [2012-09-19 14:49:59,828] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Entering.
    [2012-09-19 14:49:59,937] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Sending Host lookup request to: https://sso.crmondemand.com/router/GetTarget
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Host lookup returned: <?xml version="1.0" encoding="UTF-8"?>
    <HostUrl>https://secure-ausomxega.crmondemand.com</HostUrl>
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Successfully extracted Host URL: https://secure-ausomxega.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Exiting.
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Entering.
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Host URL from the Routing app=https://secure-ausomxega.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Host URL from config file=https://secure-ausomxeha.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Successfully updated the config file: .\config\OracleDataLoaderOnDemand.config
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Host URL set to https://secure-ausomxega.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Exiting.
    [2012-09-19 14:50:03,953] INFO - [main] Attempting to log in...
    [2012-09-19 14:50:10,171] INFO - [main] Successfully logged in as: XXXX/XXXX
    [2012-09-19 14:50:10,171] DEBUG - [main] BulkOpsClient.doImport(): Execution begin.
    [2012-09-19 14:50:10,171] INFO - [main] Validating Oracle Data Loader On Demand Import request...
    [2012-09-19 14:50:10,171] DEBUG - [main] FieldMappingManager.parseMappings(): Execution begin.
    [2012-09-19 14:50:10,171] DEBUG - [main] FieldMappingManager.parseMappings(): Execution complete.
    [2012-09-19 14:50:11,328] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): Submitting BulkOpImportGetRequestDetail WS call
    [2012-09-19 14:50:11,328] INFO - [main] A SOAP request was sent to the server to create the import request.
    [2012-09-19 14:50:13,640] DEBUG - [Thread-3] SOAPImpRequestManager.sendImportGetRequestDetail(): SOAP request sent successfully and a response was received
    [2012-09-19 14:50:13,640] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): BulkOpImportGetRequestDetail WS call finished
    [2012-09-19 14:50:13,640] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): SOAP response status code=OK
    [2012-09-19 14:50:13,640] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): Going to sleep for 300 seconds.
    [2012-09-19 14:50:20,328] INFO - [main] A response to the SOAP request sent to create the import request on the server has been received.
    [2012-09-19 14:50:20,328] DEBUG - [main] SOAPImpRequestManager.sendImportCreateRequest(): SOAP request sent successfully and a response was received
    [2012-09-19 14:50:20,328] INFO - [main] Oracle Data Loader On Demand Import validation PASSED.
    [2012-09-19 14:50:20,328] DEBUG - [main] BulkOpsClient.sendValidationRequest(): Execution complete.
    [2012-09-19 14:50:20,343] DEBUG - [main] ManifestManager.initManifest(): Creating manifest directory: .\\Manifest\\
    [2012-09-19 14:50:20,343] DEBUG - [main] BulkOpsClient.submitImportRequest(): Execution begin.
    [2012-09-19 14:50:20,390] DEBUG - [main] BulkOpsClient.submitImportRequest(): Sending CSV Data Segments.
    [2012-09-19 14:50:20,390] DEBUG - [main] CSVDataSender.CSVDataSender(): CSVDataSender will use 1 threads.
    [2012-09-19 14:50:20,390] INFO - [main] Submitting Oracle Data Loader On Demand Import request with the following Request Id: AEGA-FX28VK...
    [2012-09-19 14:50:20,390] DEBUG - [main] CSVDataSender.sendCSVData(): Creating thread 0
    [2012-09-19 14:50:20,390] INFO - [main] Import Request Submission Status: Started
    [2012-09-19 14:50:20,390] DEBUG - [main] CSVDataSender.sendCSVData(): Starting thread 0
    [2012-09-19 14:50:20,390] DEBUG - [main] CSVDataSender.sendCSVData(): There are pending requests. Going to sleep.
    [2012-09-19 14:50:20,406] DEBUG - [Thread-5] CSVDataSenderThread.run(): Thread 0 submitting CSV Data Segment: 1 of 1
    [2012-09-19 14:50:24,328] INFO - [Thread-5] A response to the import data SOAP request sent to the server has been received.
    [2012-09-19 14:50:24,328] DEBUG - [Thread-5] SOAPImpRequestManager.sendImportDataRequest(): SOAP request sent successfully and a response was received
    [2012-09-19 14:50:24,328] INFO - [Thread-5] A SOAP request containing import data was sent to the server: 1 of 1
    [2012-09-19 14:50:24,328] DEBUG - [Thread-5] CSVDataSenderThread.run(): There is no more pending request to be picked up by Thread 0.
    [2012-09-19 14:50:24,328] DEBUG - [Thread-5] CSVDataSenderThread.run(): Thread 0 terminating now.
    [2012-09-19 14:50:25,546] INFO - [main] Import Request Submission Status: 100.00%
    [2012-09-19 14:50:26,546] INFO - [main] Oracle Data Loader On Demand Import submission completed succesfully.
    [2012-09-19 14:50:26,546] DEBUG - [main] BulkOpsClient.submitImportRequest(): Execution complete.
    [2012-09-19 14:50:26,546] DEBUG - [main] BulkOpsClient.doImport(): Execution complete.
    [2012-09-19 14:50:26,546] INFO - [main] Attempting to log out...
    [2012-09-19 14:50:31,390] INFO - [main] XXXX/XXXX is now logged out.
    [2012-09-19 14:50:31,390] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): Interrupted.
    [2012-09-19 14:50:31,390] DEBUG - [main] BulkOpsClient.main(): Execution complete.

    Hi,
    by default the Data Loader points to the production environment, regardless of whether you downloaded it from staging or production. Notice in your log that the routing service (https://sso.crmondemand.com) looked up the host and overwrote your configured hosturl with the EGA pod URL. To pin the client to a specific pod, edit the config file and set the following:
    hosturl=https://secure-ausomxeha.crmondemand.com
    routingurl=https://secure-ausomxeha.crmondemand.com
    testmode=debug

  • Trying to add Contact information through Oracle Data Loader

    Hi,
    I have checked the Oracle On Demand guide PDF and I am able to insert valid account data into Oracle On Demand via the client batch program. Can you point me to a valid contact map and contact*.csv file that can insert contact information into Oracle On Demand? If I could get dealer, vehicle, or any other record types, that would also help. Where can I check the map details for all these record types? That is the biggest problem I am facing.
    Thanks in advance for your help !!!
    JD.

    I am able to insert a basic contact into Oracle On Demand through Data Loader, but the request completes only partially, with errors. First Name and Last Name are inserted, but columns like Title and Address are not. Can you tell me why it is behaving this way? The map looks okay to me.
    Appreciate your reply...
    Thanks...
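
    As a rough sketch only (the header names here are hypothetical and must line up with the entries in your contact .map file; a column that is absent from the map, or whose name doesn't match, is a likely suspect when rows load only partially), a minimal contact CSV might look like:

    External Unique Id,First Name,Last Name,Job Title
    CON-0001,Jane,Doe,Purchasing Manager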

  • Oracle Data Loader

    Hi guys!
    I'm planning to import a file with about 400k records using Data Loader (insert function).
    I have done this operation with web services and it took about 7 hours; with web services I import about 20k records per batch.
    Does anyone know whether the time would improve if I used Data Loader?
    Another question: do you know how Data Loader processes a file (does it split the records, how many records per batch, parallel import, etc.)?
    Thanks in advance,
    Rafael Feldberg

    Rafael, I would recommend clicking the Training and Support link in the upper right of your CRM On Demand application, then Browse Training, then Training Resources by Job Role, then Administrator, and looking for the following:
    Data Loader FAQ
    Data Loader Overview for R17
    Data Loader User Guide
    If you are already successful using web services, I would stick with that method.

  • Data Loader On Demand Proxy Usage for Resume operation

    Hi,
    My project requires me to use the proxy feature available in the Data Loader R19 release.
    I can use the proxy at the command line for insert/update operations.
    However, the same doesn't work for the RESUME operation in Data Loader.
    I tried passing the proxy settings from the command line as well as from the properties file, but to no avail.
    Any suggestions?
    Regards,
    Sumeet

    It's a Java application, so it may run on your Linux/Unix system; you would have to test whether it works. Last time I checked, Oracle only supported running the application on Windows.
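
    Since the client is plain Java, one thing worth trying for the proxy question above is passing the standard JVM proxy properties on the java command inside the loader's batch/shell script (the jar name below is a placeholder for your install; the -D flags are standard JVM options, not documented Data Loader options):

    java -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080 -jar <dataloader-client>.jar ...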

  • Data Loader On Demand Inserting Causes Duplicates on Custom Objects

    Hi all,
    I need to import around 250,000 records on a regular basis, so I have built a solution using Data Loader with two processes: one to insert and one to update. I was expecting that inserts with an existing EUI would fail, so that only new records would get inserted (as it says in the PDF), but it keeps creating duplicates even when all the data is exactly the same.
    Does anyone have any ideas?
    Cheers
    Mark

    Yes, you have encountered an interesting problem. Every object has a field labelled "External Unique Id", but it is inconsistent whether there is actually a unique index behind it: some objects have unique keys and some seemingly have none. The best way to test this is with the command line bulk loader (because the GUI Import Wizard can do both INSERT and UPDATE in one execution, you don't always see the problem there).
    I can run the same data over and over through the command line loader with the INSERT option and never hit a unique key constraint, for example on ASSET, CONTACT, and CUSTOM OBJECTS. Once you have verified whether the bulk loader is creating duplicates, that may drive you to the decision of using a web service instead.
    The FINANCIAL TRANSACTION object, I believe, has a unique index on the "External Unique Id" field, and the FINANCIAL ACCOUNT object has a unique key on the "Name" field.
    Hope this helps a bit.
    Mychal Manie ([email protected])
    Hitachi Consulting

  • Data load for GL Account hierarchy is failing at PSA

    Hi,
    I am loading data for GL Account hierarchy using process chains. It is failing at PSA. Status tab of the monitor is given below.
    Error message during processing in BI
    Diagnosis
    An error occurred in BI while processing the data. The error is documented in an error message.
    System Response
    A caller 01, 02 or equal to or greater than 20 contains an error meesage.
    Further analysis:
    The error message(s) was (were) sent by:
    Update
    Update
    Procedure
    Check the error message (pushbutton below the text).
    Select the message in the message dialog box, and look at the long text for further information.
    Follow the instructions in the message.
    Can someone please advice what the problem is and how to resolve it?

    Hi Sneh Saboo,
    Please check the hierarchy in the ECC system; it is possible that some nodes are duplicated at the same level.
    This error mostly occurs when you try to load hierarchies that have ranges in the same node. Please add these as distinct nodes below the same parent.
    Hope this helps.
    With Regards,
    Avenai

  • Announcing 3 new Data Loader resources

    There are three new Data Loader resources available to customers and partners.
    •     Command Line Basics for Oracle Data Loader On Demand (for Windows) - This two-page guide (PDF) shows command line functions specific to Data Loader.
    •     Writing a Properties File to Import Accounts - This 6-minute Webinar shows you how to write a properties file to import accounts using the Data Loader client. You'll also learn how to use the properties file to store parameters and how to reference it from the command line, thereby creating a reusable library of files for importing or overwriting numerous record types. A minimal sketch of such a file appears at the end of this post.
    •     Writing a Batch File to Schedule a Contact Import - This 7-minute Webinar shows you how to write a batch file to schedule a contact import using the Data Loader client. You'll also learn how to reference the properties file.
    You can find these on the Data Import Resources page, on the Training and Support Center.
    •     Click the Learn More tab > Popular Resources > What's New > Data Import Resources
    or
    •     Simply search for "data import resources".
    You can also find the Data Import Resources page on My Oracle Support (ID 1085694.1).
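
    To make the properties-file idea concrete, here is a minimal sketch. The key names are taken from the configuration echoed in the Data Loader client log earlier on this page; the values are placeholders, so treat the User Guide and the webinar above as authoritative:

    operation=insert
    recordtype=contact
    datafilepath=sample/contact-insert.csv
    mapfilepath=sample/contact.map
    username=COMPANY/USERID
    duplicatecheckoption=externalid
    csvdelimiter=,
    datetimeformat=usa
    waitforcompletion=False
    importloglevel=errors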

    Unfortunately, I don't believe that approach will work.
    We use a similar mechanism for some loads (the bulk loader instead of web services) for the objects that have a large quantity of daily records.
    There is a technique (though messy) that works fine. Since Oracle does not allow queueing up loads of the same record type (you have to wait for one "account" load to finish before you submit the next "account" file), you can monitor the .LOG file for the SBL-0363 error, which means you cannot submit another file yet (typically because one is already being processed).
    By watching for this error code in the log, you can put your process to sleep and then try again after a preset amount of time; see the sketch after this post.
    We use this to allow an UPDATE followed by an INSERT on the account, and a similar technique so "dependent" objects wait for the prime object to finish processing.
    PS: Normal Windows .BAT scripts aren't sophisticated enough to handle this. I would recommend either Windows PowerShell or C/Korn/Bourne shell scripts on Unix.
    I hope that helps some.
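
    A minimal PowerShell sketch of that retry loop, under stated assumptions: the batch file name, its argument, and the log file path are placeholders for your own install, and a real script would scope the SBL-0363 search to the lines written by the latest run rather than the whole log:

    # Resubmit the load until the client stops reporting SBL-0363,
    # i.e. until no load of this record type is still being processed.
    $maxAttempts = 12
    $waitSeconds = 300
    $logFile     = '.\client.log'    # placeholder: your Data Loader log file

    for ($attempt = 1; $attempt -le $maxAttempts; $attempt++) {
        # Placeholder invocation: substitute your real batch file and arguments.
        & '.\dataloader.bat' 'account-insert.properties'

        if (-not (Select-String -Path $logFile -Pattern 'SBL-0363' -Quiet)) {
            Write-Host "Submission accepted on attempt $attempt."
            break
        }
        Write-Host "SBL-0363 found; a load is still running. Sleeping $waitSeconds seconds."
        Start-Sleep -Seconds $waitSeconds
    }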

  • Data Loader errors

    I have been using Data Loader for large imports and have run into two errors that are causing me issues.
    The first error, from the log file, is: [main] Oracle Data Loader On Demand Import validation FAILED: String index out of range: -1
    The second error appears in the cmd window while the import is in progress: WARNING: Unable to connect to URL: https://secure-ausomxxxx.crmondemand.com/Services/Integration;jsessionid=38dd67911f7fxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx due to java.security.PrivilegedActionException: javax.xml.soap.SOAPException: Error parsing envelope: (2, 7110257) Invalid char in text.
    What characters are considered invalid in a CSV file?
    What does the "String index out of range: -1" error message mean?
    Thanks in advance for your help or suggestions.

    Hi,
    Have you checked the length of the data against the field type?
    That might cause that error.
    Thanks,
    Mayank

  • Data Loader: import creating duplicate records?

    Hi all,
    Has anyone else encountered behaviour where Oracle Data Loader creates duplicate records, even with the option duplicatecheckoption=externalid set? When I check the job in the import request queue view, the request parameters look fine:
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    Yet Data Loader has created new records whose "External Unique ID" already exists.
    Very strangely, when I create exactly the same import manually (using the Import Wizard), it works correctly: the duplicate checking method is applied and the record is updated.
    I know Data Loader has two methods, one for update and one for insert; however, I did not expect the insert to create duplicates when the record already exists, rather than doing nothing!
    Is anyone else experiencing the same? I hope this is not expected behaviour! By the way, the "Update" method works fine.
    Thanks in advance, Juergen

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load before the full load, which is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behavior: Data Loader does not check for duplicates when inserting (importing). It only checks for duplicates when updating (overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and the Data Import Options Overview document.
    You should review all the documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
    Pete
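
    In properties-file terms, this insert/overwrite distinction maps to the operation key. The key name and the insert value appear in the client log earlier on this page; whether the overwrite operation's value is literally "update" should be verified against the User Guide:

    operation=insert    (never checks for duplicates; existing EUIs yield new records)
    operation=update    (matches on duplicatecheckoption, e.g. externalid, and overwrites)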

  • Rate data loader

    Hey, Gurus!
    Does anyone know how many records Oracle Data Loader On Demand sends per package?
    I couldn't find anything in the documentation (Data Loader FAQ, Data Loader Overview for R17, Data Loader User Guide).
    thanks in advance
    Rafael Feldberg

    Rafael, there is no upper limit on the number of records that Data Loader can import. However, after doing a test import using the Import Wizard, I would recommend keeping the number of records at a reasonable level.
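
    As a data point, the client configuration echoed in the log earlier on this page hints at how the client packages records; this is an inference from that log, not official documentation. csvblocksize appears to control how many records go into each CSV data segment, numofthreads how many segments are submitted in parallel, and maxsoapsize the size cap of each SOAP request:

    csvblocksize=1000
    numofthreads=1
    maxsoapsize=10240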

  • Account Owner: restricting owner field to current owner and direct manager

    Does anyone know a way to restrict the 'Account Owner' field to the current owner and the owner's direct manager?
    I am currently attempting this with workflow and join field validation, without success.
    Thanks,


  • Bulk 2.0 Deletion of Field mapping and Data

    I created an account export field mapping using the Bulk 2.0 API; after creation, I ran a SYNC and retrieved the data. This completed without issues.
    Then I tried deleting the export entity from Eloqua using an HTTP DELETE request.
    Using HTTP DELETE on "accounts/exports/24/data", I tried deleting the data.
    The request was successful; Eloqua returned a 204 response.
    But using a GET request, I was able to retrieve the data again. It wasn't deleted.
    Then I tried deleting the mapping itself and got a '204 No Content' status.
    This time the data was deleted, but when I queried '/accounts/exports', the mapping still existed there.
    Is anyone aware of this issue, and how can it be resolved?

    I finally found it. Its mappings are present in the FieldTypes class, even though it's not mentioned in the API documentation.
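
    For reference, the two DELETE calls described above would look roughly like this with curl. The base URL and credentials are placeholders (Eloqua's Bulk 2.0 endpoints are normally rooted at /api/bulk/2.0/), and the export id 24 comes from the post:

    # Delete only the staged data for export 24:
    curl -X DELETE -u 'SITE\user:password' https://secure.eloqua.com/api/bulk/2.0/accounts/exports/24/data
    # Delete the export definition itself:
    curl -X DELETE -u 'SITE\user:password' https://secure.eloqua.com/api/bulk/2.0/accounts/exports/24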

  • Start Routine to Populate Account Group Field from Master data of 0Customer

    Hello friends, please help me edit this ABAP code to make it work. I am putting this code in a start routine between two DSOs, where I am using the start routine to populate the Account Group field from the master data of 0CUSTOMER. I do not want to use the read-from-master-data functionality, since the field 0CUSTOMER is not in the DSO; the similar field 0DEBITOR is there instead. So I want to run this code during the load from the source DSO to the target DSO.
    The error is: Explicit length specifications are necessary with types C, P, X, N.
    " In a 7.x transformation start routine the generated source line type
    " is _ty_s_SC_1; the original TYPE DATA_PACKAGE_sTRUCTURE is not available here.
    DATA: l_s_dp_line TYPE _ty_s_SC_1.
    TYPES: BEGIN OF comp,
             customer  TYPE /bi0/oicustomer,
             accnt_grp TYPE /bi0/oiaccnt_grp,
           END OF comp.
    DATA: l_s_comp  TYPE comp.
    DATA: l_th_comp TYPE HASHED TABLE OF comp WITH UNIQUE KEY customer INITIAL SIZE 0.
    " Buffer the 0CUSTOMER master data once per package.
    IF l_th_comp[] IS INITIAL.
      SELECT customer accnt_grp FROM /bi0/pcustomer
        APPENDING CORRESPONDING FIELDS OF TABLE l_th_comp.
    ENDIF.
    LOOP AT source_package INTO l_s_dp_line.
      " The DSO carries 0DEBITOR rather than 0CUSTOMER, so read with DEBITOR.
      READ TABLE l_th_comp INTO l_s_comp
        WITH TABLE KEY customer = l_s_dp_line-debitor.
      IF sy-subrc = 0.
        " Use the /BIC/ component names instead if your DSO has custom fields.
        l_s_dp_line-accnt_grp = l_s_comp-accnt_grp.
        MODIFY source_package FROM l_s_dp_line.
      ENDIF.
    ENDLOOP.
    soniya kapoor
    Message was edited by:
            soniya kapoor

    Hello Wond, thanks for the good answer and the suggestion, but the client does not like that option and does not want to turn on any navigational attribute. More generally, we also have a requirement to read a third table while loading from one DSO into another,
    so please help me edit this ABAP code to make it work. I am putting this code in a start routine between two DSOs, using it to populate the Account Group field from the master data of 0CUSTOMER.
    There is no syntax error, but during the load it updates the source table rather than the target table. How do I address the target table?
    " Source DSO table
    TYPES: BEGIN OF typ_tgl1.
             INCLUDE TYPE /bic/azdafiar000.
    TYPES: END OF typ_tgl1.
    TYPES: BEGIN OF comp,
             customer  TYPE /bi0/oicustomer,
             accnt_grp TYPE /bi0/oiaccnt_grp,
           END OF comp.
    DATA: l_th_comp TYPE HASHED TABLE OF comp WITH UNIQUE KEY customer INITIAL SIZE 0.
    DATA: wa_itab TYPE comp.
    DATA: wa_zdtg TYPE typ_tgl1.
    IF l_th_comp[] IS INITIAL.
      " Master data table
      SELECT customer accnt_grp FROM /bi0/pcustomer
        APPENDING CORRESPONDING FIELDS OF TABLE l_th_comp.
      SORT l_th_comp BY customer.
    ENDIF.
    LOOP AT l_th_comp INTO wa_itab.
      SELECT * FROM /bic/azdafiar000 INTO wa_zdtg
        WHERE debitor EQ wa_itab-customer.        " reads the SOURCE DSO table
        IF sy-subrc = 0.
          wa_zdtg-accnt_grp = wa_itab-accnt_grp.
          MODIFY /bic/azdafiar000 FROM wa_zdtg.   " modifies the SOURCE DSO table, not the target
        ENDIF.
      ENDSELECT.
    ENDLOOP.
    soniya kapoor

  • Error while running bulk load utility for account data with CSV file

    Hi All,
    I'm trying to run the bulk load utility for account data using a CSV file, but I'm getting the following error:
    ERROR ==> The number of CSV files provided as input does not match with the number of account tables.
    Thanks in advance........

    Please check your child table.
    http://docs.oracle.com/cd/E28389_01/doc.1111/e14309/bulkload.htm#CHDCGGDA
    -kuldeep
