Loading files in parallel

Hi,
I have a requirement to load multiple files in parallel using SQL*Loader through ODI.
Currently we are not using any ODI KMs; we use an ODI procedure which calls a shell script/SQL*Loader script.
If I want to load all the files at the same time into different tables, how can we do that?
Cheers

Hi,
You can either use a Load Plan and define parallel steps ( http://www.rittmanmead.com/2011/06/odi-11g-new-mapping-and-interface-features-part-2-load-plans/ ), or use asynchronous scenario steps in a package and add a final OdiWaitForChildSession step ( https://blogs.oracle.com/dataintegration/entry/parallel_processing_in_odi ).
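For illustration, here is a minimal shell sketch of the approach already in use (an ODI procedure calling a shell script), with one SQL*Loader session per file started in the background. The connect string, control files and data file names are placeholders, not values from this thread:

```sh
#!/bin/sh
# Sketch: one sqlldr session per file/table, all running concurrently.
# DB_CONNECT and every file name below are hypothetical placeholders.
DB_CONNECT="scott/tiger@ORCL"

sqlldr userid="$DB_CONNECT" control=load_emp.ctl  data=emp.dat  log=emp.log  &
sqlldr userid="$DB_CONNECT" control=load_cust.ctl data=cust.dat log=cust.log &
sqlldr userid="$DB_CONNECT" control=load_prod.ctl data=prod.dat log=prod.log &

wait   # block until every background sqlldr has finished
echo "All loads finished; check the .log files for rejected rows."
```

Because each file goes to a different table, the sessions do not contend for table locks; the `wait` (like OdiWaitForChildSession in a package) gives you a single synchronization point before any downstream step.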
Hope it helps.
Regards,
Jerome

Similar Messages

  • Few Queries in ODI

    Hi,
    I am very new to ODI, so these questions might sound silly.
    A few questions:
    1. Is there any way to hardcode a SQL query in ODI for the source?
    2. I have 4 interfaces in total (LoadEmp, LoadCustomer, LoadProd, LoadFact) and I am bundling them in a package. LoadEmp, LoadCustomer and LoadProd are dimension loads, and LoadFact is a fact load that depends on the dimensions being loaded first. I want the dimensions to load in parallel and, once they are loaded, the fact to be loaded. Is there a way in the package I can specify this?
    3. I tried to work with sequences a lot in ODI but was never able to make them work, so I finally ended up creating the sequence in the database. I read somewhere that defining the sequence in the database is good practice, but if I need to do it in ODI, can someone guide me through the method?
    Any input on these queries would be appreciated.

    Thanks a lot for your quick response.
    1. I would like to write a simple SELECT query or an override query instead of what ODI generates. Is it possible? E.g. SELECT ProductId, ProductDesc FROM Dim_Product
    2. Can you please let me know how I can have parallel sessions using scenarios? I created scenarios for the 4 interfaces (LoadEmp, LoadCustomer, LoadProd, LoadFact) and tried to embed them in a package, but I had to put "Next step on success" on every scenario, which means they run sequentially. How can I achieve parallelism without "Next step on success"? What I want is to load all the masters (LoadEmp, LoadCustomer, LoadProd) together and then load the fact (LoadFact) once the masters are loaded, so the masters run in parallel and the fact runs afterwards.
    3. I am using Oracle as the database. I know I can create sequences in Oracle and use them in ODI, but I wanted to try the sequence feature in ODI, hence the question. I created a sequence in ODI and, in the mapping, tried executing it on the target and on staging. ODI increments the sequence only once, so my load fails, since the sequence number is a surrogate key.
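    For question 2, the usual pattern inside a package is to call each dimension scenario with an OdiStartScen step set to asynchronous mode (SYNC_MODE=2) and then add an OdiWaitForChildSession step before the fact load. The same idea can be sketched at the OS level with the agent's startscen script; the scenario names, versions and the GLOBAL context below are assumptions:

    ```sh
    #!/bin/sh
    # Sketch: run the three dimension scenarios concurrently, then the fact.
    # Scenario names, versions and the GLOBAL context are hypothetical.
    cd "$ODI_HOME/bin" || exit 1

    ./startscen.sh LOADEMP      001 GLOBAL &
    ./startscen.sh LOADCUSTOMER 001 GLOBAL &
    ./startscen.sh LOADPROD     001 GLOBAL &

    wait                                  # all three dimension loads done
    ./startscen.sh LOADFACT 001 GLOBAL    # fact runs only after the masters
    ```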

  • SqlLoader

    Could anyone help me with the following question:
    How many SQL*Loader sessions can run in parallel, inserting into the same table but with different data files?
    Thanks,
    Praneet

    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_modes.htm#i1008225
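    The linked chapter covers this: concurrent direct-path loads into the same table are possible when every session runs with PARALLEL=TRUE (and the control files use APPEND), since PARALLEL=TRUE stops each session from taking the exclusive table lock a normal direct-path load takes. A hedged shell sketch, with placeholder credentials and file names:

    ```sh
    #!/bin/sh
    # Sketch: four concurrent direct-path loads into the SAME table.
    # Credentials, control file and data file names are hypothetical.
    for n in 1 2 3 4; do
      sqlldr userid=scott/tiger@ORCL \
             control=sales.ctl \
             data="sales_part${n}.dat" \
             log="sales_part${n}.log" \
             direct=true parallel=true &
    done
    wait   # per the linked chapter, indexes are not maintained during parallel loads
    ```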

  • I want to load data in parallel into two ODS objects

    I want to load data in parallel into two ODS objects.

    Hi,
    The question is not clear.
    You can do this by going to the Update tab in the Schedule menu.
    Regards
    Suman

  • Loading multiple files into a target table in parallel

    Hi
    We are building an interface to load from flat files into an Oracle table. The input can be more than one file, all of the same format. I have created a variable for the flat file datastore. I have created 2 packages.
    In package 1:
    I have put an OdiFileWait step to trigger whenever files arrive in the directory. The list of files is written to a metadata table. A procedure is triggered which picks a record from the metadata table and calls the scenario for package 2, passing the selected record as a parameter.
    In package 2:
    The variable value is set to the parameter passed from the package 1 procedure.
    The interface runs to load the flat file named by the variable value into the target table.
    Now, if the directory has more than one flat file, the procedure in package 1 calls the package 2 scenario more than once. This causes a sequential load from each flat file into the target table. Is there a way the multiple flat files can be loaded in parallel into the same target table? At this point, I am not sure if this is possible, as there is only one interface and one variable for the source file.

    As per your requirement, your process seems to run continuously, like a loop (the reason being, as you mentioned: you don't want to wait for all files to come in; as soon as a file comes in, it has to be loaded).
    You can solve the file-capture part using OdiFileWait to let you know when the files have arrived.
    Tell me this: what do you plan to do when the file loading is complete? Will the files stay in the same folder, and when new files come, will they arrive with the same name or a different name? Because irrespective of whether it is the same file or a different one, if the files are still present in the folder after loading into the target, we might load them repeatedly. Please tell me how you plan to handle this.
    Since you plan to use LKM File to SQL, what is the average number of source records in each sample file?
    Just to add to the above: what is the average number of parallel loads you are expecting, and what time interval do you expect between the parallel loads?
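    One way to get genuinely parallel file loads here is to stop the procedure from calling the package 2 scenario in a loop and instead launch one child session per file, passing the file name into the scenario variable. A hedged sketch; the scenario name LOAD_ONE_FILE, its version, the GLOBAL context and the variable PROJ.V_FILENAME are all assumptions:

    ```sh
    #!/bin/sh
    # Sketch: one ODI child session per incoming file, all started concurrently.
    INCOMING_DIR=/data/incoming   # hypothetical directory watched by OdiFileWait

    for f in "$INCOMING_DIR"/*.dat; do
      "$ODI_HOME/bin/startscen.sh" LOAD_ONE_FILE 001 GLOBAL \
          "-PROJ.V_FILENAME=$f" &   # each call is an independent session
    done
    wait   # optional: block until every per-file session has ended
    ```

    Note that parallel runs of one interface share its default C$/I$ work tables, so the sessions can collide in the staging area; giving each session distinct work-table names is part of making this safe.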

  • The DSO is included in two daemon processes; how can I load at the same time?

    Hi Gurus,
    I am working on BI 7.0. Our application is based on RDA (real-time data acquisition) and we push the data using web services. The problem is that we have a DSO which is included in 2 daemon processes, and it only allows us to load after the completion of the first one. I want to load them in parallel and execute both at the same time into the DSO. Please let me know how this can be achieved.
    Your help will be greatly appreciated.
    Thanks & Regards.
    Ashok

    Hi
    We managed this requirement by closing the DTP request. You have to ask your ABAP team to code this closing.
    Thanks and regards
    Kwong Tat

  • Taking a long time to load from the planning area to the cube since the CVC count is big?

    Hi all,
    We have a huge number of CVCs for various countries, and loading the data from the planning area to the cube using those CVCs consumes 40 hours.
    I checked, and the CVCs number more than 15,000, which is one reason. But I still need to improve the time taken to load the data from the planning area to the cube using the process chain.
    I split the process chain to load data by sales organisation, but it is still the same!
    Can anyone help me or recommend the SAP process to improve the data loading from planning area to cube in APO Demand Planning?
    Thanks
    Pooja

    Hi Pooja,
    15K is not huge at all. We have worked with 50K and still managed to do the extract into the cube in about an hour.
    Please help me understand a few things, so that we can help you better:
    1) Number of key figures?
    2) Number of periods?
    3) Key figure storage types: is any KF stored in the InfoCube rather than in time series? This can be found in the key figure details in the planning area.
    4) Please explain your data flow, e.g. PA --> DataSource --> communication structure --> update rules --> cube?
    5) Are you using parallel extraction in your DataSource? This can be checked in the data extraction tools on your planning area screen.
    A few general tips:
    1) Parallelize your DataSource.
    2) Load into the PSA and the InfoCube in parallel.
    3) Do not include KFs stored in the InfoCube in your backup; use only KFs stored in liveCache.
    Thanks
    Mani

  • Error while loading data into SAP BW using BO Data Services......

    Hello All,
    I have a job to load data from SQL Server to SAP BW. I have followed the steps from the SAP wiki to do this.
    1. I have created an RFC connection between the two servers (SAP and the BODS job server).
    When I schedule and start the job immediately from SAP BW, I get this error and it aborts the RFC connection:
    "Error while executing the following command:  -sJO return code:238"
    Error message during processing in BI
    Diagnosis
    An error occurred in BI while processing the data. The error is documented in an error message.
    System Response
    A caller 01, 02 or equal to or greater than 20 contains an error message.
    Further analysis:
    The error message(s) was (were) sent by:
    Scheduler
    Any help would be appreciated......
    Thanks
    Praveen...

    Hi Praveen,
    I want to know which version of BODS you are using for your ETL job development.
    If it's BODS 12.2.2.2, then you will get this type of problem frequently, as in version 12.2.2.2 only two connections
    can be created for the RFC between BW and BODS.
    So I suggest, if you are using BODS 12.2.2.2, that you upgrade to BODS 12.2.0.0 with Service Pack 3 and Fix Pack 3,
    as in BODS 12.2.3.3 we have the option of ten parallel connections at a time, which helps resolve this issue.
    Please let me know your BODS version and, if you have upgraded your BODS to SP3 with FP3, whether your problem is resolved.
    All the best..!!
    Thanks ,
    Shandilya Saurav

  • Multiple base load in same Hyperion Application

    Need some clarity over multiple base loads in the same Hyperion application. To give some background on the requirement: our current HFM application captures base-level data in the Indian GAAP reporting structure only, just as any other corporation would do in their statutory reporting format. Now we are looking at the prospects and possible impact of capturing base-level data both in Indian GAAP and IFRS formats, both being active but populated by different end users.
    We would like to receive some valuable input on how feasible it is to capture, at base level, financials of different GAAPs as part of the same application, substantiated by an example if any. In other words, what could be the possible challenges and adverse impacts of the multiple base loads? One impact I can foresee is the handling of business rules and validations for the different base loads in the same rule file.
    Please provide your valuable inputs on the same.

    To clarify the question:
    the query asks about multiple base loads, i.e. base/lowest-level data being loaded into the application (loaded through FDM, data forms, etc., and not journals/adjustments), both for Indian GAAP and IFRS, in parallel.
    The requirement is specifically to load both Indian GAAP and IFRS as base-level information, to enable both reporting journeys in the same application.
    i.e.:
    - 1 base load in Indian GAAP, with related IFRS adjustments to reach financials as per IFRS requirements
    - 1 base load in IFRS, with related Indian GAAP adjustments to reach financials as per Indian GAAP requirements
    Both hierarchies operate within the same application, but for two different sets of entity hierarchies, for two different end-user groups, with different reporting timelines.
    What are your inputs, from a best-practice approach: whether to develop such structures as part of the same application or to go for separate applications, from a design, development, testing and ongoing maintenance perspective?

  • Sales data load taking more time

    Hi ,
    I am loading 1 year of sales data from setup tables, but it is taking more than 3 hours.
    One more thing: there are no ABAP routines, only extraction from R/3 and loading into the ODS.
    Is there any way I can speed up loading the data into the ODS?
    Can anyone please help me?

    Hi,
    Make sure that you have deleted the indexes on the data target before you load. Also, put different selection criteria in the InfoPackages and load them in parallel.
    Make sure that no other major process is running in the system while the load is running. You can check this using transaction SM50.

  • BSEG table data load problem

    Hi friends,
    I have a problem: I need to load the BSEG table data, but after loading only one million records it gets stuck and gives an error. There are more than 5 million records in the table. Is there any way to load the data to the PSA? I suspect it is due to memory, but how do I increase the memory, or what enhancement is needed for loading the data?
    Thanks
    Ahmed

    Hi Ahmed,
    Don't load all the data in a single go; split the load with selections. If it is a transaction data DataSource, copy the same InfoPackage, give a selection in the data selection field of each copy, and execute them in parallel, since loads to the fact table will never lock each other.
    If it is a master data DataSource, don't run the loads in parallel, since master data tables lock each other; just give the selections one by one and load the data.
    Hope this helps.
    Regards,
    Debjani

  • Loading simultaneously several process chains

    Is it recommended to load fiscal year 1, fiscal year 2, fiscal year 3, fiscal year 4, etc., at the same time using process chains under ALEREMOTE?
    What consequences would it have for the system if several process chains are loaded simultaneously into one InfoCube?
    There are millions of records coming from the DSO for each fiscal year.

    Hi Yosemite,
    It is a good option to load data in parallel into the same target when the selection criteria are different. But please ensure that the number of parallel processes in each InfoPackage/DTP is restricted so that the system can sustain the parallel loads.
    For a DTP, this setting can be enabled from Goto -> Settings for Batch Manager.
    Regards,
    Vikram

  • Suppose the loads are running in BW, but the BW & R/3 system times are different

    The BW system time is greater than the R/3 system time; how can we ensure that the loads are uploaded into the BW system?

    Hi,
    Look: while loading data from R/3 to BW, extraction starts first, i.e. the extraction job. If you copy the request number from the header tab of the InfoPackage monitor, then go to the source system, transaction SM37, and enter the request number with BI as prefix, you can see the status of the extraction job.
    Now, on the BW side, you have different options:
    PSA and Data Targets/InfoObjects in Parallel: A process is started to write the data from each data packet into the PSA. If the data is successfully updated in the PSA, a second parallel process is started; in this process the transfer rules are applied to the package's data records, the data is adopted by the communication structure, and it is finally written to the data targets. Posting of the data occurs in parallel by packet. This method updates the PSA and the data targets with a high level of performance: the BW system receives the data from the source system, writes it to the PSA, and immediately starts the update into the corresponding data target in parallel.
    In this case extraction and loading to the target complete in parallel; when the technical status is green and both the extraction node and the processing node are green, you know the load is complete.
    PSA and Then into Data Target/InfoObject: A process that writes the package to the PSA table is started for each data package. When the data has been successfully updated to the PSA, the same process then writes the data to the data targets. Posting of the data occurs serially by packet.
    In this case the extraction job completes first, i.e. the extraction node turns green while the data packets are still yellow; after some time all data packets turn green, and the technical status turns green along with them, which means the load is complete.
    Only PSA: Using this method, data is written to the PSA and is not updated any further. Two separate background jobs are created: first the extraction job completes and the technical status turns green; then a new job starts to push the data from the PSA to the data target. After loading up to the PSA, go to the status tab, where you will find a "Process Manually" button. If you click it, the update to the target starts and the technical status turns yellow again; after the update to the target completes, the technical status turns green again.
    If you are on BI 7.0, the InfoPackage loads data only up to the PSA; you can monitor the extraction job. You then have to execute a DTP to load from the PSA to the target. In the DTP monitor you will find the DTP request, and you can monitor its job in SM37 on the BI side.
    Hope this helps.
    Regards,
    Debjani

  • How to update the left shell string in the Tree in parallel

    Hi Friends,
    I am using a Tree control and a subpanel on the same front panel. For example, the subpanel loads the authentication pane, where the user enters a name; at the same time, while the user types, I want to update that value in the Tree control, in parallel. How do I do that?

    parthabe wrote:
    Darren wrote:
    One of the right-click menu options on the string control is "Update Value While Typing".  If you select this option, then you'll get a Value Change event firing every time someone enters a character in the string.
    I never knew this before!
    I guess that means it would make a good nugget...I'll add it to my list.
    -D
    Darren Nattinger, CLA
    LabVIEW Artisan and Nugget Penman

  • Full Load" and "Full load with Repair full request"

    Hello Experts,
    Can anybody share with me the difference between a "Full Load" and a "Full Load with Repair Full Request"?
    Regards.

    Hi,
    What is the function of full repair? What does it do?
    How do I delete the init from the scheduler? I don't see any option like that in the InfoPackage.
    For both of your questions there is OSS Note 739863 - Repairing data in BW. Read the following:
    Symptom
    Some data is incorrect or missing in the PSA table or in the ODS object (Enterprise Data Warehouse layer).
    Other terms
    Restore data, repair data
    Reason and Prerequisites
    There may be a number of reasons for this problem: errors in the relevant application, errors in the user exit, errors in the delta queue, handling errors in the customer's posting procedure (for example, a change in the extract structure during production operation while the delta queue was not yet empty, postings before the delta init was completed, and so on), extractor errors, unplanned system terminations in BW and in R/3, and so on.
    Solution
    Read this note in full BEFORE you start actions that may repair your data in BW. Contact SAP Support for help with troubleshooting before you start to repair data.
    BW offers you the option of a full upload in the form of a repair request (as of BW 3.0B). If you want to use this function, we recommend that you use the ODS object layer.
    Note that you should only use this procedure if you have a small number of incorrect or missing records. Otherwise, we always recommend a reinitialization (possibly after a previous selective deletion, followed by a restriction of the Delta-Init selection to exclude areas that were not changed in the meantime).
    1. Repair request: Definition
    If you flag a request as a repair request with full update as the update mode, it can be updated to all data targets, even if these already contain data from delta initialization runs for this DataSource/source system combination. This means that a repair request can be updated into all ODS objects at any time without a check being performed. The system supports loading by repair request into an ODS object without a check being performed for overlapping data or for the sequence of the requests. This action may therefore result in duplicate data and must thus be prepared very carefully.
    The repair request (of the "Full Upload" type) can be loaded into the same ODS object in which the 'normal' delta requests run. You will find this request under the "Repair Request" option in the InfoPackage (Maintenance) menu.
    2. Prerequisites for using the "Repair Request" function
    2.1. Troubleshooting
    Before you start the repair action, you should carry out a thorough analysis of the possible cause of the error to make sure that the error cannot recur when you execute the repair action. For example, if a key figure has already been updated incorrectly in the OLTP system, it will not change after a reload into BW. Use transaction RSA3 (Extractor Checker) in the source system for help with troubleshooting. Another possible source of the problem may be your user exit. To ensure that the user exit is correct, first load data through the user exit with a probe full request into the PSA table and check whether the data is correct. If it is not correct, search for the error in the user exit. If you do not find it, we recommend that you deactivate the user exit for testing purposes and request a new full upload. If the data then arrives correctly, it is highly probable that the error is indeed in the user exit.
    We always recommend that you load the data into the PSA table in the first step and check the result there.
    2.2. Analyze the effects on the downstream targets
    Before you start the Repair request into the ODS object, make sure that the incorrect data records are selectively deleted from the ODS object. However, before you decide on selective deletion, you should read the Info Help for the "Selective Deletion" function, which you can access by pressing the extra button on the relevant dialog box. The activation queue and the ChangeLog remain unchanged during the selective deletion of the data from the ODS object, which means that the incorrect data is still in the change log afterwards. After the selective deletion, you therefore must not reconstruct the ODS object if it is reconstructed from the ChangeLog. (Reconstruction is usually from the PSA table but, if the data source is the ODS object itself, the ODS object is reconstructed from its ChangeLog). You MUST read the recommendations and warnings about this (press the "Info" button).
    You MUST also take into account the fact that the delta for the downstream data targets is created from the changelog. If you perform selective deletion and then reload data into the deleted area, this may result in data inconsistencies in the downstream data targets.
    If you only use MOVE and do not use ADD for updates in the ODS object, selective deletion may not be required in some cases (for example, if incorrect records only have to be changed, rather than deleted). In this case, the DataMart delta also remains intact.
    2.3. Analysis of the selections
    You must be very precise when you perform selective deletion: Some applications do not provide the option of selecting individual documents for the load process. Therefore, you must first ensure that you can load the same range of documents into BW as you would delete from the ODS object. This note provides some application-specific recommendations to help you "repair" the incorrect data records.
    If you updated the data from the ODS object into the InfoCube, you can also delete it there using the "Selective deletion" function. However, if it is compressed at document level there and deletion is no longer possible, you must delete the InfoCube content and fill the data in the ODS object again after repair.
    You can only perform this action after a thorough analysis of all effects of selective data deletion. We naturally recommend that you test this first in the test system.
    The procedure generally applies for all SAP applications/extractors. The application determines the selections. For example, if you cannot use the document number for selection but you can select documents for an entire period, then you are forced to delete and then update documents for the entire period in the data target. Therefore, it is important to look first at the selections in the InfoPackage exactly before you delete data from the data target.
    Some applications have additional special features:
    Logistics cockpit: As preparation for the repair request, delete the SetUp table (if you have not already done so) and fill it selectively with concrete document numbers (or other possible groups of documents determined by the selection). Execute the Repair request.
    Caution: You can currently use the transactions that fill SetUp tables with reconstruction data to select individual documents or entire ranges of documents (at present, it is not possible to select several individual documents if they are not numbered in sequence).
    FI: The Repair request for the Full Upload is not required here. The following efficient alternatives are provided: In the FI area, you can select documents that must be reloaded into BW again, make a small change to them (for example, insert a period into the assignment text) and save them -> as a result, the document is placed in the delta queue again and the previously loaded document under the same number in the BW ODS object is overwritten. FI also has an option for sending the documents selectively from the OLTP system to the BW system using correction programs (see note 616331).
    3. Repair request execution
    How do you proceed if you want to load a repair request into the data target? Go to the maintenance screen of the InfoPackage (Scheduler), set the type of data upload to "Full", and select the "Scheduler" option in the menu -> Full Request Repair -> Flag request as repair request -> Confirm. Update the data into the PSA and then check that it is correct. If the data is correct, continue to update into the data targets.
    Also search the forum; you will find discussions on this:
    Full repair loads
    Regarding Repair Full Request
    Instead of doing all these steps, can't I just reload the failed request?
    If something goes wrong with delta loads, it is always better to do a re-init: delete the init flag, do a full repair, all those steps. If it is an InfoCube, you can also go for a full update instead of a full repair.
    Full Upload:
    In a full upload all the data records are fetched; it is similar to a full repair. In the case of an InfoCube, we can run a full upload to recover missed delta records, but an ODS does not support full upload and delta upload in parallel, so in that case you have to go for a full repair; otherwise the delta mechanism will get corrupted.
    Suppose your ODS activation is failing because there is a full upload request in the target; you can then convert the full upload to a full repair using the program RSSM_SET_REPAIR_FULL_FLAG.
    Hope this helps.......
    Thanks==points as per SDN.
    Regards,
    Debjani.....
