Broadcasting through process chains

How can reports be broadcast to individual mailboxes in the company through process chains? For example, current data should be broadcast to individual e-mail accounts every time a new sales order or purchase order is created, or at the end of the day the mail with the report should be triggered automatically. Can anyone help with this?

Hi,
Please see the details below.
The goal of information broadcasting is to distribute the right information, in the appropriate format, to the right people, through different channels, at the right time.
With the BEx Broadcaster, you can precalculate queries, query views, Web templates, reports and workbooks and broadcast them by e-mail or to the portal. In addition to the precalculated documents in various formats (HTML, MHTML, ZIP and so on) that contain historical data, you can also generate online links.
Accessing the Broadcaster
The broadcaster can also be accessed via the Portal through the delivered BI Role.
You can determine the scheduling for the Information Broadcaster:
Based on a data change event triggered by a process chain.
Based on a predefined time point.
With freely definable scheduling.
Steps to Schedule Information Broadcaster based on a data change event triggered by a process chain
Create the Query in the Query Designer.
Execute the report in the BEx Web Analyzer.
Click on "Send" to create the settings for broadcasting.
The Broadcast Wizard takes you through a series of prompts where you supply the key information required to develop a broadcast. At any time you can leave the Wizard and use the standard settings dialogs which offer more settings.
Then schedule the broadcast. If you start the Broadcaster for a query (or template or workbook) that gets its data from an InfoProvider that will be used in the process chain, you can select that InfoProvider for scheduling.
Create the process chain and include the process type "Trigger Event Data Change (for Broadcaster)"; it is available under "Load Process and Post-Processing".
The process chain is created with the process types: (1) Start (2) Execute InfoPackage (3) Delta Data Transfer Process (4) Activate DSO (5) Trigger Event Data Change.
When you create the variant for the Event Data Change process, a checkbox indicates whether the broadcast should be triggered.
As soon as that InfoProvider is affected by a process chain, broadcasting is triggered.
After successful activation you can schedule your chain. Press the "Schedule" button or choose the menu "Execution -> Schedule". The chain is scheduled as a background job, which you can see in SM37 under the job name "BI_PROCESS_TRIGGER". Unfortunately, every process chain is scheduled with a job of this name; the job variant shows which process chain will be executed. During execution, the steps defined in table RSPCPROCESSCHAIN are executed one after another, with each next step triggered by the events defined in that table. You can watch SM37 for newly started jobs beginning with "BI_" or look at the log view of the chain.
You can monitor the Broadcaster from the SOST transaction.
Note:
Depending on authorizations, end-users can schedule their Broadcasting Settings.
Only those broadcast settings for which "Execution with Data Change in the InfoProvider" was checked during scheduling are triggered by the process chain event.
You may wish to refer to SAP Note 760775 (Settings for Information Broadcasting).
Hope this helps you.
Regards,
Rakesh

Similar Messages

  • Information broadcasting through process chain

    Hi,
    I've created an ABAP program in a process chain to broadcast a report using the precalculation server, and I can't understand where to put the receiver e-mail address.
    Thank you for your answers.

    Hi,
    Including the Trigger Event Data Change (for Broadcaster) process type in process chains that load data allows you to process broadcast settings when data is changed. The Trigger Event Data Change (for Broadcaster) process type is added to the data load processes. In this process type, you can choose the changed InfoProvider for which information broadcasting can be triggered.
    The end users that have created settings for precalculation and distribution in the BEx Broadcaster can specify when scheduling their broadcast settings that the precalculation and distribution be executed whenever there is a data change in the InfoProvider on which the scheduled BI object (query, query view, Web template, report, or workbook) is based.
    Thanks and regards

  • Broadcasting Workbooks through Process Chain

    Dear friends,
    I want to broadcast workbooks through process chain. I have tried two programs
    1) RSRD_BROADCAST_STARTER
    2) RSRD_BROADCAST_BATCH
    My requirement is that a log be mailed to me stating the success or failure of the broadcast of the workbook.
    Through the first program, I am able to send the log to my e-mail ID; however, I am not able to send the workbook.
    Through the second program, when I execute the process chain, it immediately says the process chain has executed. After some time the log mail arrives stating success, and only after that is the workbook broadcast.
    I want the workbook to be broadcast first, and then the log should come specifying the success or failure of the precalculation of the workbook on the precalculation server.
    Please let me know if somebody knows how to do it.
    Regards
    Himanshu

    Hi Sapna,
    Yes, you have to install the precalculation server for broadcasting. The download path is:
    http://service.sap.com/swdc > Download > Support Packages & Patches > Entry by Application Group > SAP NETWEAVER > SAP NETWEAVER 04 > BI Precalculation
    You can check its status in transaction RSPRECADMIN, or via:
    Transaction SPRO > SAP Reference IMG > SAP NetWeaver > Business Intelligence >
      Settings for Reporting and Analysis > Settings for Information Broadcasting > Administrate Precalculation Server.
    The precalculation itself is set up via Open Query > Publish > BEx Broadcaster; there you can create new settings (three or four tabs, one of which is Precalculation).
    Regards,
    Nisha Jagtap.
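    Coming back to the sequencing question above: one possible approach, only a sketch and not from this thread, is to wrap RSRD_BROADCAST_STARTER in a small custom program and use it as an ABAP process in the chain. SUBMIT ... AND RETURN runs the report synchronously, so a later log/mail step starts only after this step has finished (assuming the broadcast setting itself is processed within that call). The program name and the variant 'ZWB_BROADCAST' are placeholders; the variant would hold the technical name of the broadcast setting.
    REPORT zwb_broadcast_sync.
    " Sketch only: start the broadcast setting synchronously so that a
    " later log/mail step in the process chain runs only afterwards.
    " 'ZWB_BROADCAST' is a placeholder variant of RSRD_BROADCAST_STARTER
    " containing the technical name(s) of the broadcast setting(s).
    START-OF-SELECTION.
      SUBMIT rsrd_broadcast_starter
        USING SELECTION-SET 'ZWB_BROADCAST'
        AND RETURN.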

  • Loading through Process Chains 2 Delta Loads and 1 Full Load (ODS to Cube).

    Dear All,
    I am loading through process chains with 2 delta loads and 1 full load from ODS to cube in 3.5. I am in the development process.
    My loading process is:
    Start - 2 Delta Loads - 1 Full Load - ODS Activation - Delete Index - Further Update - Delete overlapping requests from infocube - Creating Index.
    My question is:
    When I am loading for the first time I get some data, and for the next load I should get zero records as there is no new data, but I am getting the same number of records for the next load. Maybe it is taking data from the full upload, I guess. Please guide me.
    Krishna.

    Hi,
    The reason you are getting the same number of records is, as you said, the full load: after running the deltas you get all the changed records, but after those two deltas you again have a full load step, which picks up all of the data over again.
    Other reasons you may get the same number of records:
    1> You are running the chain for the first time.
    2> You ran these delta InfoPackages for the first time. If you initialized the deltas with "Initialization without data transfer", then the first delta run picks up all of the data, and a full load run after that picks up the same number of records as well.
    If the two deltas you mention run one after the other, then the data you got comes from changes; since you are loading from a single ODS to a cube, both your delta and your full load will pick up the same data "for the first time" during data marting, because they have the same DataSource (the ODS).
    Hopefully this serves your purpose.
    Thax & Regards
    Vaibhave Sharma
    Edited by: Vaibhave Sharma on Sep 3, 2008 10:28 PM

  • Incorrect data after activating the request through Process chain.

    Dear SDN chaps.
    This morning I encountered a strange issue with a DSO.
    I have a DSO which is updated from a flat file on the application server (AL11).
    While loading to the PSA there were no issues, and after loading to the DSO there is no issue either: the data passes through the routine and is populated properly in the new data table. But after successful activation of the request through the process chain I am getting wrong records in the active data table.
    Then I deleted the request and reran it manually, i.e. triggered the DTP and ran the activation manually; surprisingly, accurate records come through with the manual process.
    I am just wondering why it is not working through the process chain, showing incorrect records there, while the manual upload process shows accurate records.
    Could someone please help me get out of this? By the way, my system is SAP BI 7 SP20 and SP05 for BW 7.01.
    Thanks
    K M R
      

    Hi Pra
    Thanks for your response..
    We are doing PSA deletion and then we are uploading the data to PSA as well as DSO.
    Now the issue is not in the loading part; we are facing the issue in the activation of the request. If I execute the activation through the process chain it is successful but the values are incorrect; if I do the manual activation it succeeds with correct data.
    I even tried with a new chain, but I am still facing the issue.
    The surprising thing is that in the new data table the data is perfect both ways, manual update and process chain update; only during activation do I get incorrect records in the active data table.
    Appreciate your help on this....
    Thanks
    K M R
    Edited by: K M R on Jul 9, 2010 11:09 AM

  • How to call program through process chain

    Hi Gurus,
    I need to execute an ABAP program through a process chain. I have used the ABAP program process type in the process chain (this is the first time I am using this process type). When I execute the process chain, the ABAP program is not executed. Eagerly anticipating your reply.
    Regards
    Shiva

    Hi
    I managed the execution of RSCRM jobs in a process chain as follows:
    1. Execute the query through transaction RSCRM_BAPI.
    2. Go to SM37 and copy the job name (the active one).
    3. Create the following program:
    *& Report /WST/RSCRM_START *
    REPORT /WST/RSCRM_START.
    " Job name of the RSCRM_BAPI run, copied from SM37 (step 2)
    PARAMETERS: l_bid TYPE sysuuid_c.
    " Start the RSCRM report execution in batch
    CALL METHOD cl_rscrmbw_bapi=>exec_rep_in_batch
      EXPORTING
        i_barepid = l_bid.
    4. Execute the program and fill the parameter with the job name.
    5. Save this as a program variant and use it in the process chain as a normal ABAP program.
    I hope that helps.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3507aa90-0201-0010-6891-d7df8c4722f7
    regards
    ashwin

  • Problem in Activating Hierarchy Attribute change run through process chain.

    Hi All,
    I have a problem with the hierarchy attribute change run process running through a process chain; the process chain completes successfully without any error.
    Even though the hierarchy change run completes successfully, the hierarchy objects are not activated,
    so daily I have to activate the hierarchy objects manually.
    What I did was include 10 hierarchy objects in one attribute change run process. Is that correct, or do I have to create 10 separate attribute change run processes in the process chain? Please throw some light on this issue.
    With regards,
    Hari

    Hi venkat,
    Yes, I have already included the Save Hierarchies process in the process chain.
    My main problem is that the hierarchy InfoObjects are not activated when using the process chain, and the process chain does not even give an error message. Also, in which log table are the hierarchy change runs recorded (run time, date, status, etc.)?
    Are there any other settings to be made in the process chain?
    with regards,
    hari

  • Error in the Source system while data loading through Process Chain

    Hi,
    I am facing an issue while loading data for certain extractors through a process chain. The loads for 0BPM_WIHEAD, 0BP_LOGHIST, 0BPM_OBJREL, 0BPM_DEADLINES (there are 2-3 more extractors) fail daily with the message "Error occurred in the source system"; if we repeat them, they execute successfully.
    Regards,
    Javed

    Hi..
    It means that the extraction job is failing in the source system.
    Please check the job log in the source system: copy the request number, go to SM37 in the source system, enter * followed by the request number, execute, and check the job log.
    There may be multiple reasons behind this:
    1) Connection problem: go to SM59 and test the connection. If you don't have access to SM59, then go to RSA1 --> Source Systems --> right-click --> Connection Parameters --> Connection Test.
    2) Work processes may not be available, due to which the jobs fail with a timeout.
    In both cases you can check with the Basis team whether they can suggest something, or you can change the process chain scheduling time if possible and schedule it at a time when fewer jobs are running.
    Regards,
    Debjani..

  • Archiving Infocube through Process Chain...

    Hi All,
    I need help in creating a process chain for archiving an InfoCube. I am able to archive the InfoCube manually but not through a process chain.
    Is it possible to archive an InfoCube through a process chain? If yes, please give the steps to create a process chain for archiving.
    Thanks in advance.
    Bandana.

    Hi,
    It is possible to archive data from an InfoCube via a process chain.
    Have a Start process followed by "Archive Data from an InfoProvider". The trick lies in the variants used for the archive steps in the chain.
    Create a process by dragging in the "Archive Data..." process and give a name for the variant; use this variant for writing the archive file. Choose your archiving process (the same archiving process you created to archive data from the InfoCube). As this is the write phase, do not check the "Continue Open Archiving Requests" checkbox, and choose the option "40 Write phase completed successfully" under "Continue Process Until Target Status". Now enter the required selection conditions. In case you want to reuse this chain, give a relative value under the 'Primary Time Restriction' tab and save this variant. This is your variant 1.
    Now drag in the same 'Archiving process' and create another variant; in this one you need to select "Continue Open Archiving Requests" and choose the option '70 Deletion phase confirmed and request completed' from the dropdown list. This variant deletes the data from the InfoCube. This is your variant 2.
    So you now have a process chain with Start > Archive process with variant 1 (to write the archive file) > Archive process with variant 2 (to delete the data from the cube).
    That's it!
    Regards
    Edited by: Ellora Dobbala on Apr 8, 2009 5:28 PM

  • PSA Deletion through Process Chain

    Hi Experts,
    Currently I am working on BW 3.5. I would like to delete old PSA requests through a process chain and need some clarification. Please share your suggestions. Thanks in advance.
    1) In SAP BW 3.5 there is only one process type, 'Deleting Requests from the PSA', available for this activity. Does this process type delete requests from the PSA only, or will it delete PSA and change log tables? In the SDN threads, some people said it deletes the PSA requests only and some said it deletes the PSA requests as well as the change log.
    2) Currently we have six process chains and each has master & transaction data loads (data load through the PSA).
    Similarly, I am planning to create six PSA deletion process chains which include master & transaction data deletion. Here I have a small complication in finding the object name (PSA table name). Please refer to the screenshot. Is there any shortcut to find the full list of object names (PSA tables) currently used in the daily process chain loads?
    3) For the request selection, I prefer "Only successfully booked/updated requests". I did not select "Only those requests with errors that are not booked in a data target". Please share your view on this selection preference.
    http://img818.imageshack.us/img818/3963/psa1.jpg
    Thanks,
    RR

    Hi Murali,
    Thanks for the response. I do understand identifying the DataSource and the PSA retention period (days). Let me elaborate my question a little more. Thanks.
    1) In SAP BW 3.5 there is only one process type, 'Deleting Requests from the PSA', available for this activity. Does this process type delete requests from the PSA only, or will it delete PSA and change log tables? In the SDN threads, some people said it deletes the PSA requests only and some said it deletes the PSA requests as well as the change log. If I am not mistaken, in BW 3.5 PSA deletion also affects the change log. Am I right?
    2) Currently we have six process chains and each has master & transaction data loads (data load through the PSA).
    Similarly, I am planning to create six PSA deletion process chains which include master & transaction data deletion. Here I have a small complication in finding the object name (PSA table name). Please refer to the screenshot. In the daily process chains we have many stages of master & transaction loads, so I can go through the InfoPackage level, find the DataSource and then identify the PSA table name. But is there any simpler way to find the PSA table names?
    http://img818.imageshack.us/img818/3963/psa1.jpg
    3) For the request selection, I prefer "Only successfully booked/updated requests". I did not select "Only those requests with errors that are not booked in a data target". Please share your view on this selection preference. May I know which option is optimal for a normal business process? If I do not select either option, will that delete all the requests in the PSA?
    Thanks.
    RR.

  • Updated no of records through process chain

    We are executing an InfoPackage and we want to send an e-mail message from the process chain indicating whether it succeeded or failed. Is it also possible to include the number of records updated by the InfoPackage in the same e-mail?
    We are using BW 3.5 version..

    Hi Prince,
    Kindly have a look at the links below:
    Process Chains - number of records loaded in various targets
    Message Alert through Process Chain
    Hope this helps.
    Regards,
    Mani

  • How to stop the data loads through process chains

    hi,
    I want to stop all the data loads to BI through process chains where the load happens periodically.
    Kindly suggest how I can proceed.

    Hi,
    Go to RSPC, find your process chain and double-click on the START process, then change the timing, i.e. set the start date to 01.01.9999. Save and activate the process chain; it won't start until 01.01.9999.
    Thanks
    Reddy

  • Compression through process chain

    Hi Gurus,
    I have 76 cubes in my production system.
    I want to compress all cubes through a process chain every weekend,
    adding all cubes to one process chain.
    Please suggest.
    Thanks, Alex

    When using the 'Compression of the InfoCube' process variant, you don't have the capability of giving a date range or year/month of requests to compress; it only supports requests older than n days.
    So you'd have to create two process variants, one with marker and one without, for each month of requests to be compressed. Since you haven't compressed for two years, the values to put in 'Collapse only those requests that were loaded XXX days ago' would be:
    730
    700
    670
    640
    610
    580
    550
    So in total you're going to end up with something like 50 process variants (25 with marker and 25 without marker). You'd then want to execute them in inverse order (730 first, then 700, then 670, and so on) so that each step does only 30 days' worth of work.
    Once you have finished compressing the InfoCubes, you can remove all of these process variants and keep only one with marker and one without, set to however many days old you wish (SAP recommended to us not to compress requests that are less than 21 days old, in case there's an issue with a request; once you compress a request you can no longer identify and delete data by request).
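    Just as a rough illustration (not from the original reply), the threshold values above simply step down from two years in 30-day blocks; a throwaway ABAP snippet like the one below would list them, stopping at the 21-day floor mentioned above. The report name is a placeholder.
    REPORT zlist_compress_thresholds.
    " Illustrative only: print the 'loaded XXX days ago' thresholds,
    " starting at 730 days (two years) and stepping down by 30 days
    " until the 21-day retention floor mentioned above is reached.
    DATA lv_days TYPE i VALUE 730.
    START-OF-SELECTION.
      WHILE lv_days > 21.
        WRITE: / lv_days.
        lv_days = lv_days - 30.
      ENDWHILE.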

  • Data Load Requirement through Process Chain

    Hello All,
    I have implemented an area through a custom DataSource, and the following is my data load requirement through a process chain. I request you to kindly help me with the same.
    1. The design is an InfoCube updated using a transformation from the custom DataSource.
    2. For the entire year 2008 I want a single request to be loaded, so it gets loaded into the PSA and then into the InfoCube through a (delta) DTP.
    3. Now I have created an InfoPackage (full update) with year 2009 in the selection. That is, I want the data load henceforth to be for the year 2009.
    4. Hence the InfoPackage will run to bring the 2009 data into the PSA, and I run the (delta) DTP to update the same in the cube.
    5. Now, what I require is that every day the InfoPackage (full update) for 2009 should run and bring data into the PSA, and the same should be updated in the InfoCube after deleting the 2009 request already present, keeping the previously loaded 2008 request intact.
    I hope the above is not confusing.
    Please let me know if I should elaborate further.
    Thank you.
    Regards,
    Kunal Gandhi

    Hi,
    Please go through the links.
    http://help.sap.com/saphelp_nw04/Helpdata/EN/21/15843b74f7be0fe10000000a114084/frameset.htm
    http://help.sap.com/saphelp_nw04/Helpdata/EN/f8/e5603801be792de10000009b38f842/frameset.htm
    http://help.sap.com/saphelp_nw04/Helpdata/EN/b0/078f3b0e8d4762e10000000a11402f/frameset.htm
    These may help you in designing the process chain as required.
    Regards,
    Sunil
    Edited by: sunil kumar on Apr 17, 2009 6:20 PM

  • How to delete master data in BI 7 through a process chain

    Hi all,
    I can see that millions of master data records are getting loaded daily. Can anyone please advise me how to delete the master data through a process chain?
    Thanks
    pooja

    Hi,
    For a cube/DSO we can choose the process type "Delete the Contents of the Data Target" in the process chain. But for master data, check whether a function module is available, keep that FM in a program, and insert that program in the process chain:
    RSDMDD_DELETE_MASTER_DATA
    RSDPW_MASTERDATA_DELETE
    Check the above two FMs in SE37; if one works, keep it in a simple program and then call that program in the process chain.
    Thanks
    Reddy
    Edited by: Surendra Reddy on Jun 30, 2010 7:19 AM
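    As a very rough sketch of what such a wrapper program might look like (not from the reply above; the deletion function module's interface must be verified in SE37 first, and the program name and parameter below are placeholders):
    REPORT zdel_masterdata_wrapper.
    " Hypothetical wrapper for an ABAP process-chain step.
    " The exact parameters of the deletion function module are not shown;
    " check its interface in SE37 and fill in the call accordingly.
    PARAMETERS: p_iobj(30) TYPE c LOWER CASE. "InfoObject name (placeholder)
    START-OF-SELECTION.
    " CALL FUNCTION 'RSDMDD_DELETE_MASTER_DATA'  "verify name and parameters in SE37
    "   EXPORTING
    "     ...                                    "e.g. the InfoObject from p_iobj
      WRITE: / 'Master data deletion requested for', p_iobj.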
