Master and transactional data upload through a single process chain

Hi,
Is it possible to automate both master data and transactional data loading in a single process chain? In other words, can we design one process chain that automates the transaction and master data loads together?
If yes, can anyone send me the steps and process blocks?
In master data I have one text and one hierarchy, plus transactional data.
Thanks,
Ravi

Hi Ravi,
We can load both types of data through one process chain:
1. Create the start variant.
2. Drag and drop an Execute InfoPackage process from the left-side panel (Load Process and Post Processing) and select your text InfoPackage. Do the same for the hierarchy InfoPackage. Place all the files on the application server (AL11).
3. Drag and drop a collection process (AND).
4. Drag and drop another Execute InfoPackage process from the left-side panel (Load Process and Post Processing) and select your transaction data InfoPackage.
5. Establish a connection between the master data InfoPackages and the transaction data InfoPackage using the AND process. This specifies: load master data first, and only if that is successful, load transaction data.
6. Go to the start variant > Maintain Variant and schedule it as you need: Immediately / After Job / After Event.
Assign points if this is helpful.
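The AND-collector logic in the steps above can be sketched as a tiny Python model (all names here are hypothetical; real chains run inside RSPC, not Python):

```python
def load(infopackage):
    """Stand-in for an 'Execute InfoPackage' process; returns True on success."""
    print(f"loading {infopackage}")
    return True

def run_chain():
    # Master data loads (text + hierarchy) are the predecessors of the AND process.
    master = ["text_infopackage", "hierarchy_infopackage"]
    results = [load(ip) for ip in master]
    if all(results):                      # AND process: every predecessor must be green
        return load("transaction_infopackage")
    return False                          # chain stops; transaction load is skipped
```

The point of the AND collector is exactly the `all(results)` check: the transaction data load starts only after every master data load has finished successfully.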

Similar Messages

  • Short dump while loading data in to Bw using Process Chain.

    I've got a problem in the production system, and I won't get permission from my customer to debug the program in production.
    In BW I have to load data every day using a process chain. In the monitor I can't find any problem: everything looks fine, the status is green, and the data loads into the ODS successfully. But the job runs for more than 6 hours, and an e-mail is automatically sent to us saying that processing is taking a long time, even though the data does load successfully into the ODS.
    For the last 10 days we have received this mail every day, and my customer asked me to analyse the problem.
    I checked the short dump and found that the program gets interrupted.
    <b>The short dump:</b>
    *"  Local interface:
    *"    IMPORTING
    *"      VALUE(I_INFOCUBE) TYPE RSINFOCUBE
    *"    EXCEPTIONS
    *"      ILLEGAL_INPUT
    *"      REQUEST_NOT_CLOSED
    *"      INHERITED_ERROR
      DATA: l_s_cube        TYPE rsd_s_cube,
            l_t_dummy       TYPE rsdri_t_rfcdata,
            l_s_rsapoadm    TYPE rsapoadm,
            l_nothing_found TYPE rs_bool.
    * get type of cube
      CALL FUNCTION 'RSD_CUBE_GET_ONLY_DB'
        EXPORTING
          i_infocube     = i_infocube
          i_objvers      = rs_c_objvers-active
          i_with_atr_nav = rs_c_false
          i_with_message = rs_c_false
        IMPORTING
          e_s_cube       = l_s_cube
        EXCEPTIONS
          infocube_not_found = 1
          illegal_input      = 2
          OTHERS             = 3.
      CASE sy-subrc.
        WHEN 0.
        WHEN 1.
          RAISE illegal_input.
        WHEN OTHERS.
          RAISE inherited_error.
      ENDCASE.
    In the short dump I observed that the program gets interrupted at RAISE illegal_input, i.e. when sy-subrc = 1.
    I have to analyse why this is happening.
    Has anyone faced the same problem? If so, please share your valuable suggestions.
    Cheers
    sailekha

    It looks like the load reaches the watermark level; that's why you are getting the message.

  • Multiple DTP's in single process chain

    Hello BW guru's
    Can we have many DTPs in a single process chain? If so, how do we do that?

    Krishna,
    You can have multiple DTPs in a single process chain.
    Assume a scenario where you need to update from a DSO to multiple targets.
    If you want to run the DTPs one after another, use an AND process;
    otherwise just drag and drop each DTP and link it.
    Cheers,
    HVR.

  • Reading file and dump data into database using BPEL process

    I have to read CSV files and insert the data into a database. To achieve this, I created an asynchronous BPEL process, added a File Adapter and associated it with a Receive activity, then added a DB Adapter and associated it with an Invoke activity. There are two Receive activities in the process; when I test it through EM, only the first Receive activity completes, and it waits on the second Receive activity. Please suggest how to proceed.
    Thanks, Manoj.

    Deepak, thanks for your reply. As per your suggestion I created a BPEL composite with the
    template "Define Service Later". I followed the steps below; please correct me if I am wrong or missing anything. Your help is highly appreciated.
    Step 1 -
    Created the File Adapter and the corresponding Receive activity (checkbox "Create Instance" checked) with an input variable.
    Step 2 - In composite.xml, dragged the
    web service under "Exposed Services" and wired it to the BPEL process. The web service was created of type "Service" with an existing WSDL (the first option against the WSDL URL).
    Step 3 - Opened the .bpel file and added the DB Adapter with the corresponding Invoke activity, created its input variable,
    and added an Assign activity between the Receive and Invoke activities.
    Deployed the composite to the server. When I test it
    manually through EM, it prompts for input like "subElmArray Size"; I enter 1 with corresponding values for the two elements and click the Test Web Service button. The process completes in error. The error is:
    Error Message:
    Fault ID
    service:80020
    Fault Time
    Sep 20, 2013 11:09:49 AM
    Non Recoverable System Fault :
    Correlation definition not registered. The correlation set definition for operation Read, process default/FileUpload18!1.0*soa_3feb622a-f47e-4a53-8051-855f0bf93715/FileUpload18, is not registered with the server. The correlation set was not defined in the process. Redeploy the process to the container.

  • How to find which process chain fills a data target

    hi all.
    I have a list of cubes, and I want to know which cube is filled by which process chain.
    Can anyone suggest an appropriate solution?
    Put simply: during reconciliation I want to know which process chain affects which BEx report, so that if a process chain fails I will not generate that specific report for that day.
    I have the list of MultiProviders on which we generate the reports, and I also have the technical names of the reports.

    Ganesh ji,
    Choose the InfoPackage which loads the data targets; on the InfoPackage screen go to the Scheduler tab, where you will find the where-used list. This will tell you which process chain is responsible.
    thanks
    Prabhakaran
    09176665954

  • HOW TO LOAD R/3 DATA INTO SAP BI USING PROCESS CHAINS?

    Hi,
    Can we load R/3 data into BI using process chains? I loaded data from R/3 into an InfoCube using generic extraction via a view: I took the two tables EBKN and EBAN and created a view on them.
    In the PSA I can find all 2388 records, and in the data target's Transferred tab there are also 2388 records, but in the Added column I find only 2096.
    I deleted the request and now want to load through process chains. How do I do that, without a flat file? Can we load using process chains?
    I appreciate any inputs.......
    Regards,
    Prasanthi.

    did you even bother looking at the links in my previous posts???
    read the docs...try yourself...if you encounter specific issues, you can post them on the forum...
    if you're really expecting somebody to post a step by step for process chain, i think you can wait a long, long time...

  • Data Change in the Infoprovider-Process Chain

    Hello Friends,
    I want to broadcast queries and workbooks periodically and automate the process. I learned that for this I have to select the EXECUTION WITH DATA CHANGE IN THE INFOPROVIDER option, triggered by a process chain. Can you please guide me on how to do this successfully?
    Thanks
    Mohan Chand Reddy A

    Hi Mohan,
    You can achieve this using the corresponding process type in process chain creation.
    In the links below you will find a step-by-step procedure for setting this up in a process chain:
    http://help.sap.com/saphelp_nw70/helpdata/en/ec/0d0e405c538f5ce10000000a155106/frameset.htm
    http://www.tli-usa.com/download/Taking_Charge_of_Information_Broadcasting_in_BI_7%5B1%5D.0_for_improved_report_and_data_distribution.pdf
    Regards,
    Venkatesh

  • Selective deletion of Data in DSO in a process chain

    Dear all,
    Is there any standard functionality that allows selectively deleting data from a DSO (from all three tables, to avoid delta inconsistencies) within a process chain? The only option I could see in the standard functionality is deleting the entire content.
    Any help will be much appreciated.
    Thanks&Regards
    Begonia

    Hi.
    There is a standard FM for this purpose: RSDRD_SEL_DELETION.
    Write a report (SE38) that uses this FM, and then run that report within the process chain with a variant.
    Please follow the [example|http://wiki.sdn.sap.com/wiki/display/BI/Schedulingselectivedeletioninprocesschain-%28infocube..%29].
    Regards.

  • How to trigger a single Process Chain within a Metchain

    Hi All,
    I had a couple of failures in the overnight data loads. I need to trigger just one process chain within the daily meta chain, and I do not want the subsequent chains to be triggered.
    The start variant of this process chain is set to 'Start Using Meta Chain or API'. I am aware that I need to change this to 'Direct Scheduling' with 'Immediate' and then execute. I need to ensure, however, that the subsequent chains are NOT triggered.
    Many Thanks BI Experts!
    Michelle

    Hi,
    You can go ahead and change it to immediate scheduling, as you said.
    After your chain completes execution, change it back to 'Start Using Meta Chain or API'.
    I faced the same scenario, did exactly this, and the next daily load was successful.
    Proceeding this way will not trigger the subsequent chains.
    Thanks & Regards,
    Vishnu

  • Single Process chain triggers with two different users

    Hello all,
    I have a process chain which is triggered twice: at 7 PM with an R/3 user and at 9 PM with a BW user, although in RSPC I have specified only the BW user.
    How is the second (R/3) trigger happening?
    It is a daily R/3-to-PSA full load on the BI system.
    Thanks,
    Ranga

    Are you using an event to trigger this process chain?
    If yes, check in R/3 whether you are also raising that event there.

  • Help reqd on MM data uploading thru LSMW by view steps...

    Hi all....
    Please understand my requirement before replying; I don't need the basic LSMW learning steps for material master, vendor, etc.
    My problem is that I want to upload my material master data in stages: first the Basic Data views 1 & 2 for all material types in one shot,
    and then the other views (Purchasing, Sales, MRP, Plant, QM, Accounting, etc.) in a single shot, extending the material via MM01 or MM02.
    When I did this, I had no problem up to the MRP load, but the next views (Plant data, QM, Accounting) do not come up when running the recorded batch input session; it says "the material has already been extended and no input was given for this screen".
    I hope you understand my problem.
    If you have answers, please post them urgently.
    thanks...
    sankar

    Verify that all the materials actually have all the views you require.
    regards,
    srinivas
    <b>*reward for useful answers*</b>

  • Employees data uploaded thru lsmw not visible in ppome

    hi experts,
    I have uploaded some HR master data through LSMW using transaction PA40.
    I can see all the data in the master tables PA0000, PA0001 and PA0002.
    For example:
    100 is a person with the position Manager (e.g. 5000234).
    I get his details in all the tables, and in PA30, when I enter the PERNR, I also get all the information.
    But when I go into PPOME to see the details of the position in the organisational structure,
    his name is not assigned under that manager position number.
    What may be the problem?
    It's very urgent; I am already in production. Please help immediately.
    regards,
    mani

    Hi Mani,
    Sorry yaar, I don't have enough knowledge to discuss SPRO in depth, but go through this; I hope it helps.
    More or less, the SPRO transaction is just a collection of table maintenance programs. So in order to see changes to the tables, the tables must have the "Log data changes" flag set on the table itself. Some programs in SPRO may provide their own "changes" functionality.
    Here is the help text on "Log data changes":
    The logging flag defines whether changes to the data records of a table should be logged. If logging is activated, every change (with UPDATE, DELETE) to an existing data record by a user or an application program is recorded in a log table in the database.
    Note: activating logging slows down accesses that change the table. First, a record must be written to the log table for each change. Second, many users access this log table in parallel, which can cause lock situations even though the users are working with different application tables.
    Dependencies:
    Logging only takes place if the parameter rec/client in the system profile is set correctly. Setting the flag on its own does not cause the table changes to be logged.
    The existing logs can be displayed with transaction Table History (SCU3).
    As the help describes, use transaction SCU3 to view the change logs for customizing tables.
    Thanks
    Naveen Khan
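    As an aside, rec/client mentioned above is an instance profile parameter. A sketch of typical values (check your own system profile before changing anything):

```
rec/client = ALL        # log table changes in all clients
# or restrict logging to specific clients, e.g.:
# rec/client = 100,300
# rec/client = OFF      # table logging disabled
```

    Remember that the parameter and the table's "Log data changes" flag must both be set for logging to happen.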

  • Group By element ID and continuous date range as a single row - URGENT!!!!

    Hi All,
    I have a source table and a target table.
    Source Table:
    Element ID  Period_dt
    DD001       200901
    DD001       200902
    DD001       200903
    DD001       200906
    DD001       200907
    DD001       200908
    DD002       200801
    DD002       200802
    DD002       200803
    Target Table:
    Element ID  Effective date  End date
    DD001       200901          200903
    DD001       200906          200908
    DD002       200801          200803
    I want the result as in the target table: each continuous date range should be grouped and shown as a single row, per element_id.
    I have tried the LAG, LEAD and RANK functions as well, but was unsuccessful.
    Using the MIN and MAX functions I was only able to get this far:
    DD001 200901 200908
    DD002 200801 200803
    For DD001 you can see there is a break in the months: 200901 - 200903 and 200906 - 200908; the 4th and 5th months are missing. Months 1-3 and 6-8 should therefore be grouped and shown as separate rows in the target table for the same DD001 element_id.
    I will post the SQL query tomorrow. Please give your suggestions.
    Regards
    Balaji
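    The "gaps and islands" grouping the question asks for (collapsing consecutive months into one start/end row per element) can be sketched outside SQL, e.g. in Python; this is a hypothetical helper, not part of the thread's actual solution:

```python
def group_ranges(rows):
    """rows: list of (element_id, yyyymm) sorted by element and period.
    Returns one (element_id, first_period, last_period) per continuous run."""
    def next_month(p):
        y, m = divmod(p, 100)
        return y * 100 + m + 1 if m < 12 else (y + 1) * 100 + 1

    out = []
    for elem, period in rows:
        if out and out[-1][0] == elem and next_month(out[-1][2]) == period:
            out[-1] = (elem, out[-1][1], period)   # consecutive month: extend island
        else:
            out.append((elem, period, period))     # gap or new element: new island
    return out

# Example data from the question:
data = [("DD001", 200901), ("DD001", 200902), ("DD001", 200903),
        ("DD001", 200906), ("DD001", 200907), ("DD001", 200908),
        ("DD002", 200801), ("DD002", 200802), ("DD002", 200803)]
```

    The SQL answer below uses the same idea: a row starts a new "island" exactly when the previous month for that element is missing.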

    Thanks guys. It worked perfectly. I apologize for using the 'U' word. This is my first post here.
    select prod_element_cd,
           min(period_dt) effective_date,
           max(last_day(period_dt)) end_date,
           sum(fixed_factor),
           sum(var_factor),
           val1
    from (select prod_element_cd, period_dt, fixed_factor, var_factor, val,
                 last_value(val ignore nulls)
                   over (partition by prod_element_cd order by period_dt) val1
          from (select prod_element_cd,
                       period_dt,
                       nvl(fixed, 0) fixed_factor,
                       nvl(variable, 0) var_factor,
                       lag(period_dt) over (partition by prod_element_cd order by period_dt) dt,
                       case when add_months(period_dt, -1) =
                                 lag(period_dt) over (partition by prod_element_cd order by period_dt)
                            then null
                            else rownum
                       end val
                from pmax_land.TMP_COST_CASH_STD_INPUT))
    group by prod_element_cd, val1
    order by prod_element_cd
    The above query pulls the below result
    PROD_ELEMENT_CD EFFECTIVE_DATE END_DATE FIXED VARIABLE VAL1
    DDA001 01/01/2009 03/31/2009 4.20 7.62 1.00
    DDA001 06/01/2009 11/30/2009 4.80 0.72 10.00
    DDA001 01/01/2010 01/31/2010 0.75 0.50 13.00
    DDA002 07/01/2008 09/30/2008 2.40 0.36 11.00
    DDA002 02/01/2008 03/31/2008 1.50 1.00 14.00
    One more piece of logic was added to the requirement:
    for each element (e.g. DDA001), the end_date of its last row should be hardcoded to 12/31/9999.
    Here we have two cases.
    Last row for DDA001:
    DDA001 01/01/2010 01/31/2010 0.75 0.50 13.00
    Its end date is 01/31/2010; it should be hardcoded to 12/31/9999.
    Similarly, the last row for DDA002:
    DDA002 02/01/2008 03/31/2008 1.50 1.00 14.00
    Its end date is 03/31/2008; it should be hardcoded to 12/31/9999.
    The same applies for DDA003, DDA004, etc.
    Thanks for your previous replies. Please give your suggestions.
    Regards
    Balaji
    Edited by: user12119826 on Oct 27, 2009 11:49 PM
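    The extra requirement (capping each element's last end date at 12/31/9999) can be sketched like this; the function and sample rows are hypothetical illustrations, not the thread's final SQL:

```python
def cap_last_end_dates(rows):
    """rows: list of (element_id, effective_date, end_date), sorted by
    element and date. Sets the end_date of each element's LAST row
    to the 'forever' date 12/31/9999."""
    capped = []
    for i, (elem, eff, end) in enumerate(rows):
        # Last row for this element: either end of list, or next row's element differs.
        is_last = i + 1 == len(rows) or rows[i + 1][0] != elem
        capped.append((elem, eff, "12/31/9999" if is_last else end))
    return capped

# Sample rows shaped like the query output above:
rows = [("DDA001", "01/01/2009", "03/31/2009"),
        ("DDA001", "01/01/2010", "01/31/2010"),
        ("DDA002", "02/01/2008", "03/31/2008")]
```

    In SQL the equivalent check would be a LEAD over element_id: when the next row belongs to a different element (or there is no next row), substitute the hardcoded date.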

  • Qualifications and proficiency data upload

    Hi people,
    Can you please help me with one matter? I need to upload data from Excel to SAP regarding qualifications for jobs and qualifications for persons.
    I have read many solutions on the internet, but I still don't have the answer to these questions:
    1. What is the best way to upload qualifications with proficiency on objects C (jobs) and P (persons)?
    2. What should the Excel file look like, i.e. which columns?
    Thank you for your answers.
    Best regards,
    Romano

    As our friend said, use transaction PP01.
    Regarding the upload method: if you are familiar with LSMW, you can create it there.
    If not, ask your ABAPer to create a BDC program, which is very easy to use.
    Regarding the Excel template:
    place the mandatory columns as fields of the Excel file, as per your business requirement,
    e.g. Start Date, End Date, Object abbr., Object name, etc.
    Warm Regards
