Lab Batch Data Load

I have a query related to Lab Batch Data Load.
I am testing a lab data load, but when I submit the PSUB job it fails with the fatal error "Cannot open the file".
I want to know what the full path for "Lab Data File Name" should be.
Can you please help me out?
regards,
Suniti

user11309982 wrote:
"I have a query related to Lab Batch Data Load."
Never heard of it. It's not an Oracle standard thing. Something your college/school/business has provided, perhaps?
"I am testing a lab data load, but when I submit the PSUB job it fails with the fatal error 'Cannot open the file'."
Never heard of a PSUB job; I guess it's part of the same non-Oracle thing you're doing. The fact that it "cannot open the file" suggests that the files are in the wrong place or that the access permissions are somehow wrong. Have you followed your instructions correctly?
"I want to know what the full path for 'Lab Data File Name' should be."
If it's Oracle, chances are you only need the full path when setting up the directory object; the job will then call some procedure or PL/SQL code that most likely uses the UTL_FILE package and references that directory object, and thus the file. However, without your code, how on Earth can we tell what you are doing?
"Can you please help me out?"
Post your code or more details.

Similar Messages

  • Data load of Batch Characteristics values

    Hi,
    We are using batch management. For the initial stock we have a data map for batch creation, and we need to enter the characteristics values for all of these batches.
    We can enter the batch characteristics values using MSC2N; however, I am not sure that is the correct way of doing it.
    If I use CL20N, we cannot enter the characteristics values at the batch level.
    Am I correct in my understanding that MSC2N is the right transaction, or does it have to be something else?
    Thank you for all your help.
    Thanks,
    -Shekhar

    I usually use LSMW for the batch classification load.
    Here I use the IDoc import method with message type CLFMAS.

  • Using data loading software rather than SM35 batch input

    Hi All,
    Can anyone tell me the pros and cons of using SM35 batch input in SAP compared with commercial data loading software like Winshuttle, Data Loader, etc.?
    Thanks in advance.

    Our data loading software runs in the background, with each particular run set up as a separate batch. It doesn't matter whether this data comes from a program generated on site or via a data feed interfaced from another system. With the transaction SM35, I can look at each run and know immediately if any entry within that run did not post correctly, as well as those that did. I can view each transaction and get a detailed log of all that happened. If an entry failed to post, I can see exactly where within the posting process the transaction failed and why. When we have a problem with a new or recently changed interface, SM35 is the tool I use to debug the problem fast. Another benefit of the batch input session is when a user tells me her/his interface failed. Often it didn't; someone just ran off with their interface report. In those cases, I don't have to restore and re-run anything. I just reprint the run, to the relief and gratitude of the user.

  • Data load problem - BW and Source System on the same AS

    Hi experts,
    I'm starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is SRM (Supplier Relationship Management) 5.0.
    BW is running on client 001 while SRM is on client 100, and I want to load data from SRM into BW.
    I've configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added an SAP source system (named SRMCLNT100), installed the SRM Business Content, replicated the DataSources from this source system, and everything worked fine.
    Now I want to load data from SRM (client 100) into BW (client 001) using standard DataSources and extractors. To do this, I've created an InfoPackage for one standard DataSource (which has data; checked through RSA3 on client 100, the source system). I've started the data load process, but the monitor says "no IDocs arrived from the source system" and keeps the status yellow forever.
    Additional information:
    <u><b>BW Monitor Status:</b></u>
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System Response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    No Idocs arrived from the source system.
    <b><u>BW Monitor Details:</u></b>
    0 from 0 records
    – but there are 2 records on RSA3 for this data source
    Overall status: Missing messages or warnings
    -     Requests (messages): Everything OK
    o     Data request arranged
    o     Confirmed with: OK
    -     Extraction (messages): Missing messages
    o     Missing message: Request received
    o     Missing message: Number of sent records
    o     Missing message: Selection completed
    -     Transfer (IDocs and TRFC): Missing messages or warnings
    o     Request IDoc: sent, not arrived ; Data passed to port OK
    -     Processing (data packet): No data
    <b><u>Transactional RFC (sm58):</u></b>
    Function Module: IDOC_INBOUND_ASYNCHRONOUS
    Target System: SRMCLNT100
    Date Time: 08.03.2006 14:55:56
    Status text: No service for system SAPSRM, client 001 in Integration Directory
    Transaction ID: C8C415C718DC440F1AAC064E
    Host: srm
    Program: SAPMSSY1
    Client: 001
    Rpts: 0000
    <b><u>System Log (sm21):</u></b>
    14:55:56 DIA  000 100 BWREMOTE  D0  1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
    Documentation for system log message D0 1 :
    The transaction has been terminated.  This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction.  The actual reason for the termination is indicated by the T100 message and the parameters.
    Additional documentation for message IDOC_ADAPTER        601 No service for system &1, client &2 in Integration Directory No documentation exists for message ID601
    <b><u>RFC Destinations (sm59):</u></b>
    Both RFC destinations look fine, with connection and authorization tests successful.
    <b><u>RFC Users (su01):</u></b>
    BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
    Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
    Could someone help?
    Thanks,
    Guilherme

    Guilherme
    I don't see any reason why it's not bringing the data over. Are you doing a full extraction or a delta? If delta, please check whether the extractor is delta-enabled; sometimes this causes problems.
    Also check this weblog on basic checks for data load errors; it may help:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Thanks
    Sat

  • Error while starting data loading on InfoPackage

    Hi everybody,
    I'm new to SAP BW and I'm working through the "Step-By-Step: From Data Model to the BI Application in the Web" document from SAP.
    I'm having a problem at Chapter 9, item c (Starting Data Load Immediately).
    If anyone can help me:
    Thanks,
    Thiago
    Below is a copy of the error from my SAP GUI.
    <><><><><><><><><><<><><><><><><><><><><><><><><><><><><><><><>
    Runtime Errors         MESSAGE_TYPE_X
    Date and Time          19.01.2009 14:41:22
    Short text
    The current application triggered a termination with a short dump.
    What happened?
    The current application program detected a situation which really
    should not occur. Therefore, a termination with a short dump was
    triggered on purpose by the key word MESSAGE (type X).
    What can you do?
    Note down which actions and inputs caused the error.
    To process the problem further, contact you SAP system
    administrator.
    Using Transaction ST22 for ABAP Dump Analysis, you can look
    at and manage termination messages, and you can also
    keep them for a long time.
    Error analysis
    Short text of error message:
    Batch - Manager for BW Processes ***********
    Long text of error message:
    Technical information about the message:
    Message class....... "RSBATCH"
    Number.............. 000
    Variable 1.......... " "
    Variable 2.......... " "
    Variable 3.......... " "
    Variable 4.......... " "
    How to correct the error
    Probably the only way to eliminate the error is to correct the program.
    If the error occures in a non-modified SAP program, you may be able to
    find an interim solution in an SAP Note.
    If you have access to SAP Notes, carry out a search with the following
    keywords:
    "MESSAGE_TYPE_X" " "
    "SAPLRSBATCH" or "LRSBATCHU01"
    "RSBATCH_START_PROCESS"
    If you cannot solve the problem yourself and want to send an error
    notification to SAP, include the following information:
    1. The description of the current problem (short dump)
    To save the description, choose "System->List->Save->Local File
    (Unconverted)".
    2. Corresponding system log
    Display the system log by calling transaction SM21.
    Restrict the time interval to 10 minutes before and five minutes
    after the short dump. Then choose "System->List->Save->Local File
    (Unconverted)".
    3. If the problem occurs in a problem of your own or a modified SAP
    program: The source code of the program
    In the editor, choose "Utilities->More
    Utilities->Upload/Download->Download".
    4. Details about the conditions under which the error occurred or which
    actions and input led to the error.
    System environment
    SAP-Release 701
    Application server... "sun"
    Network address...... "174.16.5.194"
    Operating system..... "Windows NT"
    Release.............. "5.1"
    Hardware type........ "2x Intel 801586"
    Character length.... 8 Bits
    Pointer length....... 32 Bits
    Work process number.. 2
    Shortdump setting.... "full"
    Database server... "localhost"
    Database type..... "ADABAS D"
    Database name..... "NSP"
    Database user ID.. "SAPNSP"
    Terminal.......... "sun"
    Char.set.... "English_United State"
    SAP kernel....... 701
    created (date)... "Jul 16 2008 23:09:09"
    create on........ "NT 5.2 3790 Service Pack 1 x86 MS VC++ 14.00"
    Database version. "SQLDBC 7.6.4.014 CL 188347 "
    Patch level. 7
    Patch text.. " "
    Database............. "MaxDB 7.6, MaxDB 7.7"
    SAP database version. 701
    Operating system..... "Windows NT 5.0, Windows NT 5.1, Windows NT 5.2, Windows
    NT 6.0"
    Memory consumption
    Roll.... 8112
    EM...... 11498256
    Heap.... 0
    Page.... 65536
    MM Used. 6229800
    MM Free. 1085264
    User and Transaction
    Client.............. 001
    User................ "THIAGO"
    Language key........ "E"
    Transaction......... "RSA1 "
    Transactions ID..... "CD47E6DDD55EF199B4E6001B782D539C"
    Program............. "SAPLRSBATCH"
    Screen.............. "SAPLRSS1 2500"
    Screen line......... 7
    Information on where terminated
    Termination occurred in the ABAP program "SAPLRSBATCH" - in
    "RSBATCH_START_PROCESS".
    The main program was "RSAWBN_START ".
    In the source code you have the termination point in line 340
    of the (Include) program "LRSBATCHU01".
    Source Code Extract
    Line  SourceCde
      310       endif.
      311       l_lnr_callstack = l_lnr_callstack - 1.
      312     endloop.      " at l_t_callstack
      313   endif.
      314
      315 *---- Eintrag für RSBATCHHEADER -
      316   l_s_rsbatchheader-batch_id    = i_batch_id.
      317   call function 'GET_JOB_RUNTIME_INFO'
      318     importing
      319       jobcount        = l_s_rsbatchheader-jobcount
      320       jobname         = l_s_rsbatchheader-jobname
      321     exceptions
      322       no_runtime_info = 1
      323       others          = 2.
      324   call function 'TH_SERVER_LIST'
      325     tables
      326       list           = l_t_server
      327     exceptions
      328       no_server_list = 1
      329       others         = 2.
      330   data: l_myname type msname2.
      331   call 'C_SAPGPARAM' id 'NAME'  field 'rdisp/myname'
      332                      id 'VALUE' field l_myname.
      333   read table l_t_server with key
      334        name = l_myname.
      335   if sy-subrc = 0.
      336     l_s_rsbatchheader-host   = l_t_server-host.
      337     l_s_rsbatchheader-server = l_myname.
      338     refresh l_t_server.
      339   else.
    >>>>> 340   message x000.
      341   endif.
      342   data: l_wp_index type i.
      343   call function 'TH_GET_OWN_WP_NO'
      344     importing
      345       subrc    = l_subrc
      346       wp_index = l_wp_index
      347       wp_pid   = l_s_rsbatchheader-wp_pid.
      348   if l_subrc <> 0.
      349     message x000.
      350   endif.
      351   l_s_rsbatchheader-wp_no           = l_wp_index.
      352   l_s_rsbatchheader-ts_start        = l_tstamps.
      353   l_s_rsbatchheader-uname           = sy-uname.
      354   l_s_rsbatchheader-module_name     = l_module_name.
      355   l_s_rsbatchheader-module_type     = l_module_type.
      356   l_s_rsbatchheader-pc_variant      = i_pc_variant.
      357   l_s_rsbatchheader-pc_instance     = i_pc_instance.
      358   l_s_rsbatchheader-pc_logid        = i_pc_logid.
      359   l_s_rsbatchheader-pc_callback     = i_pc_callback_at_end.

    Hi,
    I am also getting a related issue; kindly see the short dump description below.
    Short text
        The current application triggered a termination with a short dump.
    What happened?
        The current application program detected a situation which really
        should not occur. Therefore, a termination with a short dump was
        triggered on purpose by the key word MESSAGE (type X).
    What can you do?
        Note down which actions and inputs caused the error.
        To process the problem further, contact you SAP system
        administrator.
            Using Transaction ST22 for ABAP Dump Analysis, you can look
        at and manage termination messages, and you can also
        keep them for a long time.
    Error analysis
        Short text of error message:
        Variant RSPROCESS0000000000705 does not exist
        Long text of error message:
         Diagnosis
             You selected variant 00000000705 for program RSPROCESS.
             This variant does not exist.
         System Response
         Procedure
             Correct the entry.
        Technical information about the message:
        Message class....... "DB"
    Number.............. 612
        Variable 1.......... "&0000000000705"
        Variable 2.......... "RSPROCESS"
        Variable 3.......... " "
        Variable 4.......... " "
    How to correct the error
        Probably the only way to eliminate the error is to correct the program.
        If the error occures in a non-modified SAP program, you may be able to
        find an interim solution in an SAP Note.
        If you have access to SAP Notes, carry out a search with the following
        keywords:
    "MESSAGE_TYPE_X" " "
    "SAPLRSPC_BACKEND" or "LRSPC_BACKENDU05"
    "RSPC_PROCESS_FINISH"
        If you cannot solve the problem yourself and want to send an error
        notification to SAP, include the following information:
        1. The description of the current problem (short dump)
           To save the description, choose "System->List->Save->Local File
    (Unconverted)".
       2. Corresponding system log
          Display the system log by calling transaction SM21.
          Restrict the time interval to 10 minutes before and five minutes
       after the short dump. Then choose "System->List->Save->Local File
    (Unconverted)".
       3. If the problem occurs in a problem of your own or a modified SAP
       program: The source code of the program
          In the editor, choose "Utilities->More
    Utilities->Upload/Download->Download".
       4. Details about the conditions under which the error occurred or which
       actions and input led to the error.
    System environment
        SAP-Release 701
        Application server... "CMCBIPRD"
        Network address...... "192.168.50.12"
        Operating system..... "Windows NT"
    Release.............. "6.1"
        Hardware type........ "16x AMD64 Level"
        Character length.... 16 Bits
        Pointer length....... 64 Bits
        Work process number.. 0
        Shortdump setting.... "full"
        Database server... "CMCBIPRD"
        Database type..... "MSSQL"
        Database name..... "BIP"
        Database user ID.. "bip"
        Terminal.......... "CMCBIPRD"
      Char.set.... "C"
      SAP kernel....... 701
      created (date)... "Sep 9 2012 23:43:54"
      create on........ "NT 5.2 3790 Service Pack 2 x86 MS VC++ 14.00"
      Database version. "SQL_Server_8.00 "
      Patch level. 196
      Patch text.. " "
    Database............. "MSSQL 9.00.2047 or higher"
      SAP database version. 701
      Operating system..... "Windows NT 5.0, Windows NT 5.1, Windows NT 5.2, Windows
       NT 6.0, Windows NT 6.1, Windows NT 6.2"
      Memory consumption
      Roll.... 16192
      EM...... 4189840
      Heap.... 0
      Page.... 16384
      MM Used. 2143680
      MM Free. 2043536
    User and Transaction
    Client.............. 001
    User................ "BWREMOTE"
        Language Key........ "E"
    Transaction......... " "
        Transactions ID..... "9C109BE2C9FBF18BBD4BE61F13CE9693"
    Program............. "SAPLRSPC_BACKEND"
    Screen.............. "SAPMSSY1 3004"
        Screen Line......... 2
        Information on caller of Remote Function Call (RFC):
    System.............. "BIP"
        Database Release.... 701
        Kernel Release...... 701
        Connection Type..... 3 (2=R/2, 3=ABAP System, E=Ext., R=Reg. Ext.)
        Call Type........... "asynchron without reply and transactional (emode 0, imode
         0)"
        Inbound TID.........." "
        Inbound Queue Name..." "
        Outbound TID........." "
        Outbound Queue Name.." "
    Information on where terminated
        Termination occurred in the ABAP program "SAPLRSPC_BACKEND" - in
    "RSPC_PROCESS_FINISH".
        The main program was "SAPMSSY1 ".
        In the source code you have the termination point in line 75
        of the (Include) program "LRSPC_BACKENDU05".
    Source Code Extract
    Line  SourceCde
       45         l_t_info        TYPE rs_t_rscedst,
       46         l_s_info        TYPE rscedst,
       47         l_s_mon         TYPE rsmonpc,
       48         l_synchronous   TYPE rs_bool,
       49         l_sync_debug    TYPE rs_bool,
       50         l_eventp        TYPE btcevtparm,
       51         l_eventno       TYPE rspc_eventno,
       52         l_t_recipients  TYPE rsra_t_recipient,
       53         l_s_recipients  TYPE rsra_s_recipient,
       54         l_sms           TYPE rs_bool,
       55         l_t_text        TYPE rspc_t_text.
       56
       57   IF i_dump_at_error = rs_c_true.
       58 * ==== Dump at error? => Recursive Call catching errors ====
       59     CALL FUNCTION 'RSPC_PROCESS_FINISH'
       60       EXPORTING
       61         i_logid       = i_logid
       62         i_chain       = i_chain
       63         i_type        = i_type
       64         i_variant     = i_variant
       65         i_instance    = i_instance
       66         i_state       = i_state
       67         i_eventno     = i_eventno
       68         i_hold        = i_hold
       69         i_job_count   = i_job_count
       70         i_batchdate   = i_batchdate
       71         i_batchtime   = i_batchtime
       72       EXCEPTIONS
       73         error_message = 1.
       74     IF sy-subrc <> 0.
    >>> 75       MESSAGE ID sy-msgid TYPE 'X' NUMBER sy-msgno
       76               WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
       77     ELSE.
       78       EXIT.
       79     ENDIF.
       80   ENDIF.
       81 * ==== Cleanup ====
       82   COMMIT WORK.
       83 * ==== Get Chain ====
       84   IF i_chain IS INITIAL.
       85     SELECT SINGLE chain_id FROM rspclogchain INTO l_chain
       86            WHERE log_id = i_logid.
       87   ELSE.
       88     l_chain = i_chain.
       89   ENDIF.
       90 * ==== Lock ====
       91 * ---- Lock process ----
       92   DO.
       93     CALL FUNCTION 'ENQUEUE_ERSPCPROCESS'
       94       EXPORTING
    If we do this: in table RSSDLINIT, find the entry with OLTPSOURCE = the name of the DataSource and LOGSYS = the name of the source system, and delete the entry for that InfoPackage. Will the process chain then run successfully, with no short dump? Kindly give a detailed explanation of this RSSDLINIT table.
    Regards,
    poluru

  • How to automate the data load process using a data load file & Task Scheduler

    Hi,
    I am automating the process of loading data into a Hyperion Planning application with the help of a Data_Load.bat file and Task Scheduler.
    I have created the Data_Load.bat file, but I am unable to complete the rest of the process.
    Could you help me automate the data load process using the Data_Load.bat file and Task Scheduler, and tell me what other files are required to achieve this?
    Thanks

    To follow up on your question: are you using MaxL scripts for the data load?
    If so, I have seen an issue where, if the batch (e.g. load_data.bat) does not use the full path to the MaxL script, then when run through Task Scheduler the task will "work" but the log and/or error file will not be created. In other words, the batch claims it ran from the Task Scheduler although it didn't do what you needed it to.
    If you are using MaxL, use this as the batch:
    "essmsh C:\data\DataLoad.mxl" Or you can use the full path to essmsh; either way works. The only reason the MaxL might then not work is if the batch is not updated to reflect all the MaxL PATH changes, or if you need to update your environment variables so that the essmsh command works in a command prompt.

  • MaxL data load error

    Hi All,
    I'm trying to load data from a csv file to an Aggregate Storage cube.
    I keep getting a weird error: "Syntax error near ['$']".
    There is no $ sign in either the script or the flat-file source from which I'm loading data.
    This same script worked earlier using the same rule file.
    I'm running the MaxL from the EAS console, not invoking it through a batch.
    I only had to change the cube name and the spool and error file names in the script.
    In the data file I had to change a few member names, and I remapped them through the rule file, where I ignored one particular column.
    I've validated the rule file more than a couple of times, but do not get any errors.
    I'm not sure why I get such an error.
    Can anyone tell me what I'm doing wrong? Or if anyone has seen such an error before, can you help me out?
    Thanks,
    Anindyo
    Edited by: Anindyo Dutta on Feb 4, 2011 7:39 PM
    Edited by: Anindyo Dutta on Feb 4, 2011 7:40 PM

    Hey,
    I'm running the MaxL script through EAS; it doesn't seem like any of the records are going through.
    The script is below:
    login 'usr' 'pwd!' on 'ec2-184-72-157-215.compute-1.amazonaws.com';
    spool on to 'E:/dump/MaxL Scripts/ASO Scripts/Spool files/full_dt_ld_visa_aso_spool.log';
    import database 'VISA_ASO'.'VISA' data from data_file 'E:/dump/Data Load Files/ASO Load files Old format/master_data_load.csv' using server rules_file 'fulldtld' on error write to 'E:/dump/MaxL Scripts/ASO Scripts/Spool files/full_dt_ld_visa_aso_spool.err';
    spool off;
    logout;
    I rechecked a couple of times; it doesn't seem like I'm missing any quotes.
    Robb and Jeanette, thanks for your responses!
    I'm going to try with a smaller file and update this thread.
    Cheers!
    Anindyo
    Edited by: Anindyo Dutta on Feb 4, 2011 9:14 PM
    Edited by: Anindyo Dutta on Feb 4, 2011 9:16 PM

  • Announcing 3 new Data Loader resources

    There are three new Data Loader resources available to customers and partners.
    •     Command Line Basics for Oracle Data Loader On Demand (for Windows) - This two-page guide (PDF) shows command line functions specific to Data Loader.
    •     Writing a Properties File to Import Accounts - This 6-minute Webinar shows you how to write a properties file to import accounts using the Data Loader client. You'll also learn how to use the properties file to store parameters, and to use the command line to reference the properties file, thereby creating a reusable library of files to import or overwrite numerous record types.
    •     Writing a Batch File to Schedule a Contact Import - This 7-minute Webinar shows you how to write a batch file to schedule a contact import using the Data Loader client. You'll also learn how to reference the properties file.
    You can find these on the Data Import Resources page, on the Training and Support Center.
    •     Click the Learn More tab > Popular Resources > What's New > Data Import Resources
    or
    •     Simply search for "data import resources".
    You can also find the Data Import Resources page on My Oracle Support (ID 1085694.1).

    Unfortunately, I don't believe that approach will work.
    We use a similar mechanism for some loads (using the bulk loader instead of web services) for the objects that have a large quantity of daily records.
    There is a technique (though messy) that works fine. Since Oracle does not allow the "queueing up" of objects of the same type (you have to wait for "account" to finish before you load the next "account" file), you can monitor the .LOG file for the SBL 0363 error, which means you cannot submit another file yet (typically because one is already being processed).
    By monitoring for this error code in the log, you can sleep your process, then try again after a preset amount of time.
    We use this to allow an UPDATE followed by an INSERT on the account, and then a similar technique so "dependent" objects have to wait for the prime object to finish processing.
    PS: normal Windows .BAT scripts aren't sophisticated enough to handle this. I would recommend either Windows PowerShell or C/Korn/Bourne shell scripts on Unix.
    I hope that helps some.
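    As a rough PowerShell sketch of that sleep-and-retry idea; the log path, the submit wrapper, and treating "SBL 0363" as the busy marker are all assumptions based on the description above, not a documented Data Loader interface:

        # Submit the next "account" file; if the log reports SBL 0363 (a file of
        # this record type is still processing), sleep and try again.
        $log    = "C:\bulkloader\logs\account_import.log"   # hypothetical path
        $submit = "C:\bulkloader\submit_account.bat"        # hypothetical wrapper
        do {
            & $submit
            Start-Sleep -Seconds 60      # give the loader time to write its log
            $busy = Select-String -Path $log -Pattern "SBL 0363" -Quiet
            if ($busy) { Start-Sleep -Seconds 300 }   # queue busy: wait, then retry
        } while ($busy)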

  • Oracle Data Loader On Demand on EHA Pod

    Oracle Data Loader doesn't work correctly.
    I downloaded it from staging (the EHA pod).
    And I did the following:
    1. Went to the "config" folder and updated "OracleDataLoaderOnDemand.config":
    hosturl=https://secure-ausomxeha.crmondemand.com
    2. Went to the "sample" folder and changed Owner_Full_Name in "account-insert.csv".
    Then, at the command prompt, I ran the batch file.
    It runs successfully, but the records aren't inserted on the EHA pod; they exist on the EGA pod.
    Is Data Loader only for the EGA pod? Would you please give me some advice?
    This is the log:
    [2012-09-19 14:49:55,281] DEBUG - [main] BulkOpsClient.main(): Execution begin.
    [2012-09-19 14:49:55,281] DEBUG - [main] BulkOpsClient.main(): List of all configurations loaded: {sessionkeepchkinterval=300, maxthreadfailure=1, testmode=production, logintimeoutms=180000, csvblocksize=1000, maxsoapsize=10240, impstatchkinterval=30, numofthreads=1, hosturl=https://secure-ausomxeha.crmondemand.com, maxloginattempts=1, routingurl=https://sso.crmondemand.com, manifestfiledir=.\Manifest\}
    [2012-09-19 14:49:55,281] DEBUG - [main] BulkOpsClient.main(): List of all options loaded: {datafilepath=sample/account-insert.csv, waitforcompletion=False, clientlogfiledir=., datetimeformat=usa, operation=insert, username=XXXX/XXXX, help=False, disableimportaudit=False, clientloglevel=detailed, mapfilepath=sample/account.map, duplicatecheckoption=externalid, csvdelimiter=,, importloglevel=errors, recordtype=account}
    [2012-09-19 14:49:55,296] DEBUG - [main] BulkOpsClientUtil.getPassword(): Entering.
    [2012-09-19 14:49:59,828] DEBUG - [main] BulkOpsClientUtil.getPassword(): Exiting.
    [2012-09-19 14:49:59,828] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Entering.
    [2012-09-19 14:49:59,937] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Sending Host lookup request to: https://sso.crmondemand.com/router/GetTarget
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Host lookup returned: <?xml version="1.0" encoding="UTF-8"?>
    <HostUrl>https://secure-ausomxega.crmondemand.com</HostUrl>
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Successfully extracted Host URL: https://secure-ausomxega.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.lookupHostURL(): Exiting.
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Entering.
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Host URL from the Routing app=https://secure-ausomxega.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Host URL from config file=https://secure-ausomxeha.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Successfully updated the config file: .\config\OracleDataLoaderOnDemand.config
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Host URL set to https://secure-ausomxega.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - [main] BulkOpsClientUtil.determineWSHostURL(): Exiting.
    [2012-09-19 14:50:03,953] INFO - [main] Attempting to log in...
    [2012-09-19 14:50:10,171] INFO - [main] Successfully logged in as: XXXX/XXXX
    [2012-09-19 14:50:10,171] DEBUG - [main] BulkOpsClient.doImport(): Execution begin.
    [2012-09-19 14:50:10,171] INFO - [main] Validating Oracle Data Loader On Demand Import request...
    [2012-09-19 14:50:10,171] DEBUG - [main] FieldMappingManager.parseMappings(): Execution begin.
    [2012-09-19 14:50:10,171] DEBUG - [main] FieldMappingManager.parseMappings(): Execution complete.
    [2012-09-19 14:50:11,328] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): Submitting BulkOpImportGetRequestDetail WS call
    [2012-09-19 14:50:11,328] INFO - [main] A SOAP request was sent to the server to create the import request.
    [2012-09-19 14:50:13,640] DEBUG - [Thread-3] SOAPImpRequestManager.sendImportGetRequestDetail(): SOAP request sent successfully and a response was received
    [2012-09-19 14:50:13,640] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): BulkOpImportGetRequestDetail WS call finished
    [2012-09-19 14:50:13,640] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): SOAP response status code=OK
    [2012-09-19 14:50:13,640] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): Going to sleep for 300 seconds.
    [2012-09-19 14:50:20,328] INFO - [main] A response to the SOAP request sent to create the import request on the server has been received.
    [2012-09-19 14:50:20,328] DEBUG - [main] SOAPImpRequestManager.sendImportCreateRequest(): SOAP request sent successfully and a response was received
    [2012-09-19 14:50:20,328] INFO - [main] Oracle Data Loader On Demand Import validation PASSED.
    [2012-09-19 14:50:20,328] DEBUG - [main] BulkOpsClient.sendValidationRequest(): Execution complete.
    [2012-09-19 14:50:20,343] DEBUG - [main] ManifestManager.initManifest(): Creating manifest directory: .\\Manifest\\
    [2012-09-19 14:50:20,343] DEBUG - [main] BulkOpsClient.submitImportRequest(): Execution begin.
    [2012-09-19 14:50:20,390] DEBUG - [main] BulkOpsClient.submitImportRequest(): Sending CSV Data Segments.
    [2012-09-19 14:50:20,390] DEBUG - [main] CSVDataSender.CSVDataSender(): CSVDataSender will use 1 threads.
    [2012-09-19 14:50:20,390] INFO - [main] Submitting Oracle Data Loader On Demand Import request with the following Request Id: AEGA-FX28VK...
    [2012-09-19 14:50:20,390] DEBUG - [main] CSVDataSender.sendCSVData(): Creating thread 0
    [2012-09-19 14:50:20,390] INFO - [main] Import Request Submission Status: Started
    [2012-09-19 14:50:20,390] DEBUG - [main] CSVDataSender.sendCSVData(): Starting thread 0
    [2012-09-19 14:50:20,390] DEBUG - [main] CSVDataSender.sendCSVData(): There are pending requests. Going to sleep.
    [2012-09-19 14:50:20,406] DEBUG - [Thread-5] CSVDataSenderThread.run(): Thread 0 submitting CSV Data Segment: 1 of 1
    [2012-09-19 14:50:24,328] INFO - [Thread-5] A response to the import data SOAP request sent to the server has been received.
    [2012-09-19 14:50:24,328] DEBUG - [Thread-5] SOAPImpRequestManager.sendImportDataRequest(): SOAP request sent successfully and a response was received
    [2012-09-19 14:50:24,328] INFO - [Thread-5] A SOAP request containing import data was sent to the server: 1 of 1
    [2012-09-19 14:50:24,328] DEBUG - [Thread-5] CSVDataSenderThread.run(): There is no more pending request to be picked up by Thread 0.
    [2012-09-19 14:50:24,328] DEBUG - [Thread-5] CSVDataSenderThread.run(): Thread 0 terminating now.
    [2012-09-19 14:50:25,546] INFO - [main] Import Request Submission Status: 100.00%
    [2012-09-19 14:50:26,546] INFO - [main] Oracle Data Loader On Demand Import submission completed succesfully.
    [2012-09-19 14:50:26,546] DEBUG - [main] BulkOpsClient.submitImportRequest(): Execution complete.
    [2012-09-19 14:50:26,546] DEBUG - [main] BulkOpsClient.doImport(): Execution complete.
    [2012-09-19 14:50:26,546] INFO - [main] Attempting to log out...
    [2012-09-19 14:50:31,390] INFO - [main] XXXX/XXXX is now logged out.
    [2012-09-19 14:50:31,390] DEBUG - [Thread-3] ODWSSessionKeeperThread.Run(): Interrupted.
    [2012-09-19 14:50:31,390] DEBUG - [main] BulkOpsClient.main(): Execution complete.

    Hi,
    the Data Loader points by default to the production environment, regardless of whether you download it from staging or production.
    To change the pod, edit the config file and enter the content below:
    hosturl=https://secure-ausomxeha.crmondemand.com
    routingurl=https://secure-ausomxeha.crmondemand.com
    testmode=debug

  • Auto-kick off MaxL script after Oracle GL data load?

    Hi guys, this question involves two different modules: Hyperion and Oracle GL.
    My client's accounting department updates Oracle GL on a daily basis. My end-user client would like a script that automatically kicks off the existing MaxL script for our daily data load in Hyperion. Currently, the MaxL script is executed manually.
    What's the best approach to build a connection so the two modules can communicate with each other? Can we use a timer to trigger the run? If so, how?

    #1 External scheduler.
    I've worked with Appworx, and it can build a chain of dependent tasks. There are many other external schedulers, like Tivoli.
    #2 As Daniel pointed out, you can use the Windows scheduler.
    For every successful GL load, add a file to a folder that is accessible to your Essbase task:
    COPY Nul C:\Hyperion\Scripts\Trigger\GL_Load_Finished.txt
    Then create another bat file that is scheduled to run every 5 or 10 minutes (it should start just after your scheduled GL load task).
    This is an example I have for a triggered Essbase job:
    IF EXIST %BASE_DIR%\Trigger\Full_Build_Started.txt (
        Echo "Full Build started"
    ) else (
        IF EXIST %BASE_DIR%\Trigger\Custom_Build_Started.txt (
            Echo "Custom Build started"
        ) else (
            IF EXIST %BASE_DIR%\Trigger\Post_Build_Batch_Started.txt (
                Echo "Post Build started"
            ) else (
                IF EXIST %BASE_DIR%\Trigger\Start_Full_Build.txt (
                    Echo "Trigger found, starting batch"
                    REM rename the trigger file so the build is not started twice
                    MOVE %BASE_DIR%\Trigger\Start_Full_Build.txt %BASE_DIR%\Trigger\Full_Build_Started.txt
                    call %BASE_DIR%\Scripts\Batch_Files\Monthly_Build_All_Cubes.bat
                ) else (
                    IF EXIST %BASE_DIR%\Trigger\Start_Custom_Build.txt (
                        Echo "Trigger found, starting Custom batch"
                        MOVE %BASE_DIR%\Trigger\Start_Custom_Build.txt %BASE_DIR%\Trigger\Custom_Build_Started.txt
                        call %BASE_DIR%\Scripts\Batch_Files\Monthly_Build_All_Cubes_Custom.bat
                    ) else (
                        IF EXIST %BASE_DIR%\Trigger\Start_Post_Build_Batch.txt (
                            Echo "Trigger found, starting Post Build batch"
                            MOVE %BASE_DIR%\Trigger\Start_Post_Build_Batch.txt %BASE_DIR%\Trigger\Post_Build_Batch_Started.txt
                            call %BASE_DIR%\Scripts\Batch_Files\Monthly_Post_Build_All_Cubes.bat
                        )
                    )
                )
            )
        )
    )
    So if this bat file finds Start_Full_Build.txt in the trigger location, it renames it to Full_Build_Started.txt and calls the full build (likewise for the custom and post builds).
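    For the "scheduled to run every 5 or 10 minutes" part, here is a small PowerShell sketch that registers such a polling job with Task Scheduler; it assumes the ScheduledTasks module (Windows 8 / Server 2012 or later) and a hypothetical Poll_Triggers.bat wrapping the IF/ELSE block above:

        # Register the trigger-polling batch to run every 5 minutes.
        # Task name and batch path are illustrative assumptions.
        $action  = New-ScheduledTaskAction -Execute "C:\Hyperion\Scripts\Poll_Triggers.bat"
        $trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 5) -RepetitionDuration (New-TimeSpan -Days 3650)
        Register-ScheduledTask -TaskName "EssbaseTriggerPoll" -Action $action -Trigger $trigger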
    Regards
    Celvin
    http://www.orahyplabs.com

  • Is there any setting in ODS to accelerate the data loads to ODS?

    Someone mentioned that there is a setting somewhere in the ODS that makes data loads to the ODS much faster. I don't know whether such a setting exists.
    Any idea?
    Thanks

    hi Kevin,
    I think you are looking for transaction RSCUSTA2 and
    Note 565725 - Optimizing the performance of ODS objects in BW 3.0B.
    Also check Note 670208 - Package size with delta extraction of ODS data, and Note 567747 - Composite note BW 3.x performance: Extraction & loading.
    Hope this helps.
    565725 - Optimizing the performance of ODS objects in BW 3.0B
    Solution
    To obtain a good load performance for ODS objects, we recommend that you note the following:
    1. Activating data in the ODS object
                   In the Implementation Guide in the BW Customizing, you can implement different settings under Business Information Warehouse -> General BW settings -> Settings for the ODS object that will improve performance when you activate data in the ODS object.
    2. Creating SIDs
                   The creation of SIDs is time-consuming and may be avoided in the following cases:
    a) You should not set the BEx Reporting indicator if you are only using the ODS object as a data store. Otherwise, SIDs are created for all new characteristic values when this indicator is set.
    b) If you are using line items (for example, document number, time stamp and so on) as characteristics in the ODS object, you should mark these as 'Attribute only' in the characteristics maintenance.
                   SIDs are created at the same time if parallel activation is activated (see above). They are then created using the same number of parallel processes as those set for the activation. However: if you specify a server group or a special server in the Customizing, these specifications only apply to the activation and not to the creation of SIDs. The creation of SIDs runs on the application server on which the batch job is also running.
    3. DB partitioning on the table for active data (technical name:
                   The process of deleting data from the ODS object may be accelerated by partitioning on the database level. Select the characteristic after which you want deletion to occur as a partitioning criterion. For more details on partitioning database tables, see the database documentation (DBMS CD). Partitioning is supported with the following databases: Oracle, DB2/390, Informix.
    4. Indexing
                   Selection criteria should be used for queries on ODS objects. The existing primary index is used if the key fields are specified. As a result, the characteristic that is accessed more frequently should be left-justified. If the key fields are only partially specified in the selection criteria (recognizable in the SQL trace), the query runtime may be optimized by creating additional indexes. You can create these secondary indexes in the ODS object maintenance.
    5. Loading unique data records
                   If you only load unique data records (that is, data records with a one-time key combination) into the ODS object, the load performance will improve if you set the 'Unique data record' indicator in the ODS object maintenance.

  • Data Load Speed

    Hi all.
    We are starting an SAP implementation at the company where I work, and I have been designated to prepare the data load from the legacy systems. I have already asked our consultants about data load speed, but they didn't really answer what I need.
    Does anyone have statistics on data load speed, per hour, using tools like LSMW, CATT, eCATT, etc.?
    I know that the speed depends on what data I'm loading and also on the CPU speed, but any information is good to me.
    Thank you and best regards.

    hi friedel,
    Again, here are the complete details regarding the data transfer techniques.
    <b>Call Transaction:</b>
    1.Synchronous Processing
    2.Synchronous and asynchronous database updates
    3.Transfer of data for an individual transaction each time the CALL TRANSACTION statement is executed
    4.No batch input log is generated
    5.No automatic error handling
    <b>Session Method:</b>
    1.Asynchronous Processing
    2.Synchronous database updates.
    3.Transfer of data for multiple transactions
    4.Batch input log gets generated
    5.Automatic error handling
    6.SAP's standard approach
    <b>Direct Input Method:</b>
    1.Best suited for transferring large amounts of data
    2.No screens are processed
    3.The database is updated directly using standard function modules, e.g. check the program RFBIBL00.
    <b>LSMW.</b>
    1.A code-free tool which helps you transfer data into SAP.
    2.Suited for one-time transfers only.
    <b>CALL DIALOG.</b>
    This approach is outdated; you should choose one of the above techniques.
    Also check the knowledge pool for more reference
    http://help.sap.com
    Cheers,
    Abdul Hakim

  • Regarding ERPI Data Loading

    Dear All,
    I have a few doubts about ERP Integrator.
    1) What is required from Oracle GL to Planning for data loading using ERP Integrator? (Is the Trial Balance enough, or do we require some other file from Oracle GL?)
    2) Are there any scheduling options available for data loading using ERP Integrator?
    3) What is the process for loading data to Planning using ERP Integrator?
    4) How do we load the data to Planning (i.e. monthly load, hourly load)?
    Can anyone please guide me in this situation?
    Thanks,
    PC

    1) What is required from Oracle GL to Planning for data loading using ERP Integrator? (Is the Trial Balance enough, or do we require some other file from Oracle GL?)
    Assuming you have the right version of Oracle EBS, ERP Integrator queries the tables within the Oracle EBS database to get the appropriate information. In my case, the trial balance file was enough. Within the trial balance file you will have the appropriate dimension intersection (account, entity, period, etc.), the type of account (asset vs. liability, etc.) and finally the dollar amount.
    2) Are there any scheduling options available for data loading using ERP Integrator?
    Yes. You can use FDQM to map and validate the data, then use the FDQM batch scheduler to load the data, either via the command line or through the batch scheduler itself.
    3) What is the process for loading data to Planning using ERP Integrator?
    I'll try to do my best to summarize (assuming you are using FDQM): create rules in ERPi -> configure the adapters in the Workbench Client for the ERPi rules -> configure the FDQM Web Client to call the adapters set in the Workbench Client -> import the data into FDQM. From there you can call your command-line automation for batching if you wish.
    4) How do we load the data to Planning (i.e. monthly load, hourly load)?
    This depends on your business. If you are loading the data for budgeting and planning purposes, your business may well be happy with a monthly load (most of the time this is the case). An hourly load might be helpful if you deal with users that need up-to-date actuals. Loading actuals hourly might be overkill for a budget or planning application, but I have run into situations where it is needed, and then found myself worrying about speeding up the calculations after the data is loaded. Long story short, you can load monthly or hourly.

  • Automate the data load process using Task Scheduler

    Hi,
    I am automating the process of loading data into a Hyperion Planning application with the help of Task Scheduler.
    I have created a Data_Load.bat file, but I am unable to complete the rest of the process.
    Could you help me automate the data load process using the Data_Load.bat file and Task Scheduler?
    Thanks

    Thanks for your help.
    I have done the data loading using the Data_Load.bat file; the .bat file calls a .msh file. I followed your steps, and automatic data load is now possible.
    If you need this, please let me know.
    Thanks again.
    Edited by: 949936 on Oct 26, 2012 5:46 AM

  • Trying to add Contact information through Oracle Data Loader

    Hi,
    I have checked the Oracle On Demand guide PDF and am able to insert valid account data into Oracle On Demand via the client batch program. Can you point me to a valid contact map and contact*.csv file that can insert contact information into Oracle On Demand? If I can get dealer, vehicle, or any other record types, that would also help. Where can I check the map details for all these record types? That is the biggest problem I am facing.
    Thanks in advance for your help!
    JD.

    I am able to insert a basic contact into Oracle On Demand through Data Loader, but the job completes only partially, with errors. The first name and last name get inserted, but columns like Title and Address do not. Can you tell me why it is behaving so weirdly? The map looks okay to me.
    I'd appreciate your reply.
    Thanks.
