Master Data Save Through BPS - BI and BPS Related

Hi,
I have 2 questions regarding BI and BPS working together.
1. Do all InfoObjects used in a BPS layout have to be master-data enabled to avoid BPS-created master data? By this I mean: if an InfoObject doesn't have the 'with master data' checkbox enabled and has only 'with texts' enabled, do the new master data values created in BPS get saved to the InfoObject? Is this how it is expected to behave? Is there a workaround to avoid creating master data from BPS?
2. We have a MultiProvider for actuals and budget. Budget comes from BPS, which has only the calendar quarter, but the budget and actuals cubes have year and quarter. In the actuals we derive quarter and year from the transaction date in the update rule, but for budget only the quarter is entered by the user. Is there a way to derive the year and store it in the budget InfoCube so that reports built on the MultiProvider show the year correctly for both budget and actuals?
Thanks
Suja

>1. BPS cannot create master data at all and that setting is irrelevant for BPS. Users can only forecast on existing master data.
This is wrong. If your InfoObject doesn't have master data, then values entered in layouts will be written into the InfoObject.
>1. Do all InfoObjects used in a BPS layout have to be master-data enabled to avoid BPS-created master data?
By this I mean, if an InfoObject doesn't have the 'with master data' checkbox enabled and has only 'with texts' enabled, do the new master data values created in BPS get saved to the InfoObject?
Yes.
>Is this how it is expected to behave?
I think so, yes.
>Is there a workaround to avoid creating master data from BPS?
The only way, I suppose, is to create the InfoObject with master data.
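
Regarding the second question: if the quarter captured in the BPS layout is 0CALQUARTER, it already carries the year in YYYYQ format (e.g. 20074 = fourth quarter of 2007), so the year can be derived in the update rule of the budget cube. If only the quarter number (0CALQUART1) is captured, the year has to come from somewhere else (for example a version or a separate user entry). Below is a minimal sketch of such a routine; the communication structure field name is an assumption and depends on your InfoSource.

* Sketch of an update rule routine for 0CALYEAR in the budget cube.
* Assumption: the layout supplies 0CALQUARTER in YYYYQ format, so the
* year is simply the first four characters.
  IF NOT comm_structure-calquarter IS INITIAL.
    result = comm_structure-calquarter(4).  " first four characters = year
  ELSE.
    CLEAR result.
  ENDIF.
  returncode = 0.
  abort      = 0.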

Similar Messages

  • IDOC - Master data changes (through MM41 & MM42) to XI

    Hi Experts,
    My requirement is:
    Any updates/changes to the article master through SAP transaction MM42, or any article creation through SAP transaction MM41, will trigger an update for the interface, which will be picked up during the nightly batch job. For all updates/changes made throughout the day, only the latest change will be interfaced to the other non-SAP system during the nightly batch.
    An IDoc will be generated for the changes made throughout the day by picking the latest one. The IDoc type is WBBDLD03.
    Actually, I am working with IDocs for the first time. I think I need to track changes to the master data tables through ABAP statements and then send an internal table (with the required data) through an IDoc. Please suggest whether my approach is correct.
    Also, please tell me the method to be followed for sending the IDoc.
    Some examples implementing the similar scenario would be highly appreciated.
    Hoping for a useful response:-)
    Regards
    Gaurav K

    Hi,
    This is a change pointer scenario.
    IDoc type: WBBDLD03
    Message type: WBBDLD
    You do not need to write any code. You just have to make some ALE settings; that by itself will meet your requirement.
    First activate change pointers generally through BD61.
    Also activate change pointers for your message type using transaction BD50.
    Create logical systems for both sender and receiver (use transaction SALE).
    Assign the logical systems created to the respective clients.
    Create an RFC destination for your receiving system using SM59.
    Later create a distribution model using transaction BD64 (here you specify your receiver, sender and message type).
    Create a partner profile for your logical system with this message type using transaction WE20.
    Now you can trigger the sending of the IDocs through a scheduled job program (see the sketch after this reply).
    But I don't know the program name for your scenario.
    I hope it helps you.
    Reward points.
    Thanks,
    Prasanna
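
    For reference, the standard program that converts collected change pointers into IDocs is RBDMIDOC, normally scheduled as a periodic background job per message type. A minimal sketch of triggering it from ABAP follows; the selection parameter name is an assumption and should be verified on the report's selection screen.

    * Sketch only: RBDMIDOC reads the change pointers for the given
    * message type and creates the corresponding IDocs.
    SUBMIT rbdmidoc
      WITH mestyp = 'WBBDLD'   " parameter name assumed - check the selection screen
      AND RETURN.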

  • Updating Master data attributes through BPS

    Hi All,
    Can we modify master data attributes via BPS? For example, we have an attribute on 'vendor', say 'class (good/bad/ok)', and we want it to be updated via BPS (not from R/3). How can this be achieved?
    Please help by explaining this.

    Hi,
    Create two variables, one for the vendor and the other for the status (a variable of type attribute), and put both variables in the folder. The user will select the vendor and the attribute status value in the selections. Create an exit planning function to update the attribute.
    Import parameters:
      I_AREA     TYPE UPC_Y_AREA
      I_VARIABLE TYPE UPC_Y_VARIABLE
      I_CHANM    TYPE UPC_Y_CHANM
    Export parameters:
      ETO_CHARSEL TYPE UPC_YTO_CHARSEL
    Tables:
      I_T_ATTRIBUTES STRUCTURE RSD_S_IOBJNM  OPTIONAL
      I_T_DATA       STRUCTURE RSNDI_S_CHAVL OPTIONAL
    In the code, read the two variable values selected by the user. Then delete the existing master data entry by calling the function module RSNDI_MD_DELETE.
    Now update the master data with the new attribute value selected by the user in the variable by calling the function module
    RSNDI_MD_ATTRIBUTES_UPDATE. After this, activate the master data by calling RSDMD_MD_ACTIVATE. A skeleton of such a function is sketched after this reply.
    Hope this solves the issue.
    Bindu
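
    A skeleton of such an exit function, using the interface quoted above, might look as follows. The function name is hypothetical, and the individual steps are left as comments because the exact parameter interfaces of the RSNDI*/RSDMD* function modules should be checked in SE37 for your release.

    FUNCTION z_bps_update_vendor_attr.
    *"------------------------------------------------------------------
    *"  IMPORTING
    *"     VALUE(I_AREA)      TYPE  UPC_Y_AREA
    *"     VALUE(I_VARIABLE)  TYPE  UPC_Y_VARIABLE
    *"     VALUE(I_CHANM)     TYPE  UPC_Y_CHANM
    *"  EXPORTING
    *"     VALUE(ETO_CHARSEL) TYPE  UPC_YTO_CHARSEL
    *"  TABLES
    *"      I_T_ATTRIBUTES STRUCTURE  RSD_S_IOBJNM OPTIONAL
    *"      I_T_DATA STRUCTURE  RSNDI_S_CHAVL OPTIONAL
    *"------------------------------------------------------------------
    * 1) Read the vendor and status values the user entered in the two
    *    BPS variables.
    * 2) Delete the existing attribute record:  CALL FUNCTION 'RSNDI_MD_DELETE'.
    * 3) Write the new attribute value:         CALL FUNCTION 'RSNDI_MD_ATTRIBUTES_UPDATE'.
    * 4) Activate the changed master data:      CALL FUNCTION 'RSDMD_MD_ACTIVATE'.
    ENDFUNCTION.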

  • MATERIAL MASTER DATA UPLOADING THROUGH  LSMW

    Hi.
    I am new to LSMW.
    We have a requirement to upload the material master data through LSMW.
    The data is in an Excel sheet on the desktop of the system.
    Can anybody help me with the step-by-step procedure to upload the material master data?
    Thanks in advance,
    regards,
    Eswar.M

    Hi Venkat,
    Go through the following Steps
    Using Tcode MM01 -- maintain the source fields as follows:
    1) mara-matnr  char(18)
    2) mara-mbrsh  char(1)
    3) mara-mtart  char(4)
    4) makt-maktx  char(40)
    5) mara-meins  char(3)
    The flat file format is as follows:
    MAT991,C,COUP,Srinivas material01,Kg
    MAT992,C,COUP,Srinivas material02,Kg
    AMT993,C,COUP,Srinivas material03,Kg
    MAT994,C,COUP,Srinivas material04,Kg
    MAT995,C,COUP,Srinivas material05,Kg
    goto Tcode LSMW
    give Project Name
         Subproject Name
         object Name
    Press Enter -
    Press Execute Button
    It gives 13 radio-Button Options
    do the following 13 steps as follows
    1) select radio-Button 1 and execute
       Maintain Object Attributes
    select Standard Batch/Direct Input
       give Object -- 0020
           Method -- 0000
       save & Come Back
    2) select radio-Button 2 and execute
       Maintain Source Structures
       select the source structure and click on the Create button
       give source structure name & Description
       save & Come Back
    3) select radio-Button 3 and execute
       Maintain Source Fields
       select the source structure and click on create button
       give
       first field
            field name    matnr
            Field Label   material Number
            Field Length  18
            Field Type    C
       Second field
            field name    mbrsh
            Field Label   Industrial Sector
            Field Length  1
            Field Type    C
       Third field
            field name    mtart
            Field Label   material type
            Field Length  4
            Field Type    C
       fourth field
            field name    maktx
            Field Label   material description
            Field Length  40
            Field Type    C
       fifth field
            field name    meins
            Field Label   base unit of measurement
            Field Length  3
            Field Type    C
      save & come back
    4) select radio-Button 4 and execute
       Maintain Structure Relations
       go to blue lines 
          select first blue line and click on create relationship button
          select Second blue line and click on create relationship button
          select Third blue line and click on create relationship button
      save & come back
    5) select radio-Button 5 and execute
       Maintain Field Mapping and Conversion Rules
       Select the Tcode field and click on the Rule button; there you will select Constant
       and press the Continue button
       give Transaction Code : MM01 and press Enter
       after that
       1) select the MATNR field, click on Source Field (this is the field mapping), select MATNR and press Enter
       2) select the MBRSH field, click on Source Field (this is the field mapping), select MBRSH and press Enter
       3) select the MTART field, click on Source Field (this is the field mapping), select MTART and press Enter
       4) select the MAKTX field, click on Source Field (this is the field mapping), select MAKTX and press Enter
       5) select the MEINS field, click on Source Field (this is the field mapping), select MEINS and press Enter
      finally     
      save & come back
    6) select radio-Button 6 and execute
       Maintain Fixed Values, Translations, User-Defined Routines
       Create FIXED VALUE Name & Description as MM01
       Create Translations Name & Description as MM01
       Create User-Defined Routines Name & Description as MM01
       after that delete  all the above three just created in the 6th step
       FIXED VALUE --MM01
       Translations --MM01
       User-Defined Routines --MM01
       come back
    7) select radio-Button 7 and execute
       Specify Files
       select On the PC (Frontend) -- and click on Create button(f5)
                                      give the path of the file like "c:\material_data.xls"
                                      description : -
                                       for separators, select the Tab radio button
       and press enter   save & come back
    8) select radio-Button 8 and execute
       Assign Files
       Save & come back
    9) select radio-Button 9 and execute
       Read Files
       Execute
       come back
       come back
    10) select radio-Button 10 and execute
        Display Imported Data
        Execute and press enter
        come back
        Come back
    11) select radio-Button 11 and execute
        Convert Data
        Execute
        come back
        Come back
    12) select radio-Button 12 and execute
        Display Converted Data
        Execute & come back
    13) select radio-Button 13 and execute
        Start Direct Input Program
       select the Program
       select continue button
    go with 'via physical file'
    give the lock mode as 'E'
    and execute
    Regards
    Sreeni

  • Master Data coming from 1 backend and GOA distributed to many backends

    Hello,
    In our case (using SRM 5.0), one backend system is used to replicate all material master data, and the GOA will be distributed to multiple backends, including the one that is used for master data. As far as I know, the logical backend system is determined through materials/material groups when the GOA is distributed. In this case, how can we intervene in that logic so that we can distribute the GOA to whatever backend we want, even if the master data is not coming from it?
    Thank you,

    Hi,
    You can use the BADI BBP_DETERMINE_LOGSYS for this purpose.
    The method CONTRACT_LOGSYS_DETERMINE (Determine Target System of Contract) has been provided for exactly this purpose (see the outline after this reply).
    Sample BADI Implementation at SDN Wiki:
    https://www.sdn.sap.com/irj/sdn/wiki?path=/display/srm/bbp_determine_logsys-contract_logsys_determine&
    Regards
    Kathirvel
    Edited by: Kathirvel Balakrishnan on Feb 25, 2008 1:00 PM
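
    A rough outline of the corresponding implementation method (in the implementation class generated via SE19 for BAdI BBP_DETERMINE_LOGSYS) is sketched below. The method's parameters are deliberately not shown and the mapping logic is only indicated in comments; take the actual signature from the generated class.

    METHOD if_ex_bbp_determine_logsys~contract_logsys_determine.
    *   Sketch only: override the standard material-group-based
    *   determination and set the target logical system of the GOA
    *   explicitly, e.g. from a custom mapping table (hypothetical),
    *   returning it in the method's logical-system parameter.
    ENDMETHOD.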

  • Master Data Extraction through Open Hub services(InfoSpoke)

    Hello,
             I need your help on this issue,
    Here is my scenario...
    I want to extract master data from BW (an InfoObject) as a file and put it on a server, so that later it can be picked up by another department for their process. This process has to be automated to run on a weekly basis.
    I know this can be done through an InfoSpoke (either ATTR or TEXT) and a process chain, but I want both attribute and text data in a single file, whereas through an InfoSpoke we can get either ATTR or TEXT data in a file.
    1) Is there any way we can have both TEXT and ATTRIBUTE data in a single file?
    2) Also, what are the options to transfer the file from the BW application server to another non-BW server?
    Currently I have created a BEx query against the InfoObject and am running a report and sending them the file manually. I want to automate the process.
    I would appreciate it if you could give me input on the above.
    Thanks
    BI Consultant

    Hi,
    The simplest way to do this is to create an ODS with the required columns and map it to your TEXT and ATTR. Then you can use either a query or an InfoSpoke to extract the data however you want it. Since you are using master data, it will already have been validated, so you don't need a reportable ODS, which takes time to activate the data. So create an InfoSet on top of the ODS and build a simple query on it if you want to use it. If you want to automate it completely and deliver the output as a .CSV file to the destination server, you will have to do a bit more, and that will involve some serious ABAP work (see the sketch after this reply). The simplest option would be to have a non-reportable ODS and extract the data using an InfoSpoke.
    Let me know if you need more info.
    Thanks,
    Alex.
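
    The "serious ABAP work" mentioned above usually boils down to writing the merged rows to a file on the BW application server, which an OS job or middleware can then transfer. Below is a minimal sketch; the file path, the row structure and the field names are illustrative assumptions, not part of the thread.

    * Sketch only: write already-extracted ATTR+TEXT rows as a CSV file
    * on the application server.
    TYPES: BEGIN OF ty_row,
             matnr      TYPE c LENGTH 18,
             txtmd      TYPE c LENGTH 40,
             matl_group TYPE c LENGTH 9,
           END OF ty_row.
    DATA: lt_rows TYPE STANDARD TABLE OF ty_row,
          ls_row  TYPE ty_row,
          lv_file TYPE string,
          lv_line TYPE string.

    lv_file = '/usr/sap/trans/data/md_extract.csv'.   " illustrative path
    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc <> 0.
      MESSAGE 'Could not open target file' TYPE 'E'.
    ENDIF.
    LOOP AT lt_rows INTO ls_row.
      CONCATENATE ls_row-matnr ls_row-txtmd ls_row-matl_group
                  INTO lv_line SEPARATED BY ','.
      TRANSFER lv_line TO lv_file.
    ENDLOOP.
    CLOSE DATASET lv_file.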

  • HR Master Data Distribution through ALE

    Hi All,
    I am new to the HR module.
    I want to know the transactions used to create HR master data, such as employees.
    I want to send this HR master data from one SAP system to another SAP/non-SAP system through IDocs.
    Can anyone guide me, with an example, through creating the master data and sending it through an IDoc?
    Thanks a lot in advance

    For HR master data: PA30, PA40.
    For HR ALE data transfer: PFAL.
    Please search SCN before posting; you will find a lot of information.

  • BPC Master Data Deletion through ABAP code

    Hi All,
    I have a requirement to delete the orphan nodes for one of the dimensions in BPC using ABAP code.
    Please let me know of any standard program or classes for deleting master data from a dimension.
    Regards
    Pratibha Biradar

    Hi Pratibha,
    Here is the code to add master data; you can change the flag to 'D' to delete. I have not checked it for deletion, but it is working for adding.
    DATA: ls_message TYPE uj0_s_message,
    lt_messages TYPE uj0_t_message,
    l_success TYPE uj_flg,
    l_appset_id TYPE uj_appset_id,
    l_dimension_id TYPE uj_dim_name,
    lo_member_mgr TYPE REF TO if_uja_member_manager,
    lo_dimension TYPE REF TO if_uja_dimension_manager,
    lo_master_data_store TYPE REF TO if_ujam_master_data_store,
    lo_context TYPE REF TO if_uj_context,
    ls_dimension TYPE uja_s_dimension,
    lt_errors TYPE uja_t_members_error,
    lr_members TYPE REF TO data,
    lr_data TYPE REF TO data.
    FIELD-SYMBOLS:
    <lt_member_data> TYPE STANDARD TABLE,
    <ls_member_data> TYPE any,
    <lv_field> TYPE any.
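    * Note: l_appset_id and l_dimension_id must be filled with your
    * environment and dimension names before the calls below (the values
    * were omitted in the original post).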
    TRY.
        lo_context = cl_uj_context=>get_cur_context( ).
        cl_uj_context=>set_cur_context(
        i_appset_id = l_appset_id
        i_module_name = lo_context->d_calling_module
        is_user = lo_context->ds_user ).
        lo_dimension = cl_uja_bpc_admin_factory=>get_dimension_manager(
        i_appset_id = l_appset_id
        i_dimension_id = l_dimension_id ).
        lo_dimension->get(
        EXPORTING
        if_with_hier_maxlevel = abap_false
        IMPORTING
        es_dimension = ls_dimension ).
        CREATE OBJECT lo_master_data_store TYPE cl_ujam_master_data_store.
    * creating masterdata table
        lr_members = lo_master_data_store->get_table_buffer( ls_dimension ).
        ASSIGN lr_members->* TO <lt_member_data>.
    * Add members to <LT_MEMBER_DATA>; these are the members that will be saved.
        CREATE DATA lr_data LIKE LINE OF <lt_member_data>.
        ASSIGN lr_data->* TO <ls_member_data>.
    * Fill each field, such as ID, and EVDESCRIPTION, update other fields here as well as
    * any properties that need to be updated.
        ASSIGN COMPONENT 'ID' OF STRUCTURE <ls_member_data> TO <lv_field>.
        IF sy-subrc = 0.
          <lv_field> = 'ProductD'.
        ENDIF.
        ASSIGN COMPONENT 'MBR_NAME' OF STRUCTURE <ls_member_data> TO <lv_field>.
        IF sy-subrc = 0.
          <lv_field> = 'ProductD'.
        ENDIF.
        ASSIGN COMPONENT 'EVDESCRIPTION' OF STRUCTURE <ls_member_data> TO <lv_field>.
        IF sy-subrc = 0.
          <lv_field> = 'Product D Update'.
        ENDIF.
        ASSIGN COMPONENT 'PARENTH1' OF STRUCTURE <ls_member_data> TO <lv_field>.
        IF sy-subrc = 0.
          <lv_field> = 'TotalProduct'.
        ENDIF.
        ASSIGN COMPONENT 'OBJVERS' OF STRUCTURE <ls_member_data> TO <lv_field>.
        IF sy-subrc = 0.
          <lv_field> = 'A'. "Version flag, should be "A" for Active
        ENDIF.
        ASSIGN COMPONENT 'ROWFLAG' OF STRUCTURE <ls_member_data> TO <lv_field>.
        IF sy-subrc = 0.
    """"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""Here you can change it to 'D' for Deleting """""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
          <lv_field> = 'I'. " This is an action flag, I=Insert, M=Modify """""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
        ENDIF.
        APPEND <ls_member_data> TO <lt_member_data>. " Add to the table.
    * Create member manager
        lo_member_mgr = cl_uja_bpc_admin_factory=>get_member_manager(
        i_appset_id = l_appset_id
        i_dimension_id = l_dimension_id ).
    * Save the members UNCOMMENT ONLY when you want to write the data!!!!!
    * lo_member_mgr->save(
    * EXPORTING
    * ir_members = lr_members " List of members to save
    * IMPORTING
    * et_errors = lt_errors ).
    * NOW PROCESS THE DIMENSION
        DATA: ls_dimensions TYPE uja_s_dim_name,
        lt_dimensions TYPE uja_t_dim_name.
        CLEAR ls_dimensions. REFRESH lt_dimensions.
        ls_dimensions-dimension = l_dimension_id. " Add dimensions to the list
        APPEND ls_dimensions TO lt_dimensions.
        lo_member_mgr->process(
        EXPORTING
        it_dim_list = lt_dimensions
        if_set_offline = abap_false
        if_validate = abap_true
        IMPORTING
        ef_success = l_success
        et_message_lines = lt_messages ).
      CATCH cx_uj_no_auth .
    ENDTRY.
    hope this will help,
    thanks,
    Rishi

  • Data Sources for the Master data of FI-CO,MM and SD from R3

    Hi Gurus ,
    Could you please tell me the main DataSources for FI-CO, MM and SD needed from R/3 for the master data to be loaded in a new BI implementation?
    Could you please guide me step by step .
    thanks in advance ,
    Pratham

    Hi
    It varies by client and project, but the most common SD & MM DataSources come from the LO Cockpit.
    help.sap.com is the best source for this
    Hope it helps

  • Mass Master data Upload from MS Excel and JDE into SAP ( ECC 6.0)

    Hi
    We are deciding on the best method for uploading 2 million fixed asset master records from Excel and JDE.
    We are following the batch input (RAALTD01) and direct input (RAALTD11) methods.
    I am looking for a more efficient alternative for this upload.
    Looking forward to hearing from you experts!
    Thanks
    Milind

    Hi,
    Both methods are not possible simultaneously.
    If you want to do it as two separate tasks, you can use the GUI_UPLOAD function module to fulfill your requirement (see the sketch after this reply).
    thanks
    mrutyun^
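
    A minimal sketch of calling GUI_UPLOAD, assuming the Excel sheet has first been saved as a tab-delimited text file; the structure, field lengths and the file name are illustrative assumptions only.

    TYPES: BEGIN OF ty_asset,
             anlkl TYPE c LENGTH 8,    " asset class
             txt50 TYPE c LENGTH 50,   " description
             kostl TYPE c LENGTH 10,   " cost center
           END OF ty_asset.
    DATA lt_assets TYPE STANDARD TABLE OF ty_asset.

    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename            = 'C:\assets.txt'
        filetype            = 'ASC'
        has_field_separator = 'X'
      TABLES
        data_tab            = lt_assets
      EXCEPTIONS
        file_open_error     = 1
        file_read_error     = 2
        OTHERS              = 3.
    IF sy-subrc <> 0.
      MESSAGE 'Upload of the flat file failed' TYPE 'E'.
    ENDIF.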

  • To enhance 0EMPLOYEE master data attributes, need to delete and load again?

    Dear Support,
    In order to meet business intelligence demands, I added three more attributes (region, time rate and worksite) to the 0EMPLOYEE time-dependent characteristic. The DataSource used to load 0EMPLOYEE attributes, and its structure, were enhanced to fill the new fields.
    After the first load, the 0EMPLOYEE master data could not be activated. So I deleted the 0EMPLOYEE master data, which I couldn't do without deleting its data from the transactional HR InfoCubes. I deleted the transactional data and afterwards the master data. I loaded it again and activated it, with no errors this time.
    This occured in development system.
    The changes are about to go to the production environment, and deleting almost all transactional HR InfoCubes and the 0EMPLOYEE master data is something that we all want to avoid.
    I am asking if there is a workaround to solve this without doing the stated time- and resource-consuming task.
    Thanks in advance for your help,
    Best Regards,
    André Oliveira

    Hi Andre,
    You should be able to do a full load to achieve this. There is no need to delete the data. If there is corrupt data in BI with overlapping time periods, fix it by maintaining the master data. If the corrupt data comes from the source, fix it in the PSA.
    Also you can specify to load the data for a time range only which might be easier than to load all history.
    good luck,
    vijay

  • Transporting Master data - New GLs,Cost Center and profit Center.

    Dear Friends,
    1. I have to create new G/L accounts in DEV, and now I have to transport the new G/L accounts to Quality and Production. As far as I know, master data has to be created in DEV, Quality and Production separately.
    Please suggest whether there is any transaction where I can transport G/L accounts from DEV to Quality and Production.
    2. When creating a site, the new cost center, profit center, and changes to the standard hierarchy (both cost center and profit center hierarchy) have to be done in DEV, Quality and Production manually.
    Any suggestions on how I can do this only in DEV and transport it to Quality and Production?
    3. After creating a new cost center and profit center, I have to go to OKB9, select the company code and cost element, and insert the new cost center and profit center for every cost element. This takes almost one hour.
    Please suggest how to reduce this manual process.
    All the help will be greatly appreciated.
    Regards
    Sridhar Reddy

    Hi ,
    Master data has to be created in the different clients, as it is a non-transportable object.
    So in the case of G/L accounts and cost centers, you will have to re-run the LSMW which you had created for DEV.
    Secondly, OKB9 is a transportable object, so I don't think there should be an issue in adding it to a transport request.
    In the case of adding table values to a request, what you can do is create a request, go to SE01, open the request in change mode, enter TABU in the objects, and enter the relevant table entries that you want to transport.
    But again, one has to be very clear about which table entries are to be transported (such as SKA1, SKB1, SKAT, etc.). Looking at the complexity, and to stay with the standard approach, it is advisable to re-run the LSMW in QUA and PRD.
    Hope this helps .
    Regards ,
    Dewang T.

  • ALE distribution of HR master data to CRM - subtype CELL and MAIL

    Hello,
    Overview/Introduction:
    we are using the ALE distribution of HR master data to CRM. The distribution has already been set up and it is running. We know that the infotype communication (0105) is especially distributing the subtypes 0010 (e-Mail) and 0020 (telephone number). This e-Mail address and the telephone number are stored in the business partner in the CRM system.
    For this infotype 0105 there are also the subtypes CELL (cell phone) and MAIL (e-Mail address). Unfortunately this cell phone number and the mail address (from subtype MAIL) are not stored at the business partner in CRM.
    Our request is on the one hand to store the data from subtype CELL into the field mobile number of the business partner. And on the other hand we want to store the e-mail address of subtype MAIL at the business partner instead of subtype 0010.
    For some reason it seems that these requests can't be set up within the SAP standard.
    So we found SAP Note 758426, which suggests using the BAdI HRALEX_INBOUND and the report RH_ENHANCE_BP_TEMPLATE.
    Problem:
    Now the problem is that the BAdI does not get called! The BAdI is active! We tried to set breakpoints -> unfortunately it did not stop! And we also implemented an endless loop in the BAdI -> even with this the system did not 'stop'!?
    Any ideas?
    Thank you,
    Roman

    Hi,
    First you need to create a distribution model using transaction BD64 : Edit > Model View > create.
    Then you will need to maintain fields for description and technical name. In partner box you need to enter the source system (HR) and the target system (FI) in the second field. Then you need to assign a message type and filter settings if you need it. You need to create this distribution model in source system (HR).
    After, still in BD64, go to environment > change partner profile or transaction WE20
    Select your target system and enter the corresponding message type (HRMD_A or HRMD_ABA, depending on your scenario) with the correct customizing (receiver port, packet size, IDoc type, syntax check and output mode).
    Then please do the same in your source system using WE20 and define your inbound parameters (process code HRMD, processing by function module; use 'Trigger immediately' if it is not a production system, otherwise 'Trigger by background program', in which case you will need to run report RBDAPP01 to process the IDocs).
    Once done, then go back to your model view in your source system (BD64) and go to Environment > generate partner profiles. Then please go to Edit > Model View > distribute
    After all these settings done, it should work.
    Please also check out documentation :
    http://help.sap.com/erp2005_ehp_04/helpdata/EN/82/933b90aa9e11d6b28800508b5d5211/frameset.htm
    Best Regards
    Christine

  • Master Data in IO as InfoProvider and as Not InfoProvider

    Hi all
    I found that I can directly input master data into an InfoObject (IO) via the Maintain Master Data menu. Before, I thought I had to insert an IO into an InfoProvider to do the ETL work.
    So what is the difference between having this IO inside an InfoProvider and not having it inside an InfoProvider?
    Thanks
    ps. If this question is quite confusing, please let me know.
    Regards,
    Halomoan

    The main difference between having and not having a master data InfoObject in the InfoProvider is best shown with an example.
    Say you have
    customer master data which has an attribute called Location:
    Customer         Location             Year
    ABC                USA                   2007
    ABC                France                2008
    From the above example: if the master data is not time-dependent, then we would have only one value for the customer's location, France. But the user would like to see the location of customer ABC in 2007 as USA; since the master data for the customer contains only the latest information (France), the history is lost. So, at the time of designing the cube, we look at these scenarios to decide which InfoObjects to include in the cube to avoid such situations (slowly changing dimensions).
    Let's say that we included Location in the cube. Now the users have the flexibility to report the current location for all years using the master data attribute, or to use the Location InfoObject from the InfoCube to get the snapshot of 2007 as well. Below are the results from these two scenarios:
    1.Scenario 1 Using Customer MD to get Location information
    Customer           Location        Year            Amt
    ABC                  France           2007           1000
    ABC                  France           2008            500
    2. Scenario 2 using the Location information stored in Cube
    Customer          Location          Year           Amt
    ABC                 USA                2007          1000
    ABC                 France             2008           500
    Also, there would be some performance improvement if you use the InfoObject from the cube, as you are reducing a join.
    Hope this is helpful.

  • Item master Data upload through DTW

    Hi Experts
    I want to upload item master data through DTW. Which fields are mandatory?
    Regards
    Vinoth             

    Hi,
    Please refer to the sample template to get an idea of the mandatory fields:
    C:\Program Files (x86)\SAP\Data Transfer Workbench\Templates\Samples\1. Add New Data\Inventory\Item Master Data
    For item master data, only the item code is mandatory when you are adding a new item.
    Thanks & Regards,
    Nagarajan
