Mass Re-Read of PP master data

Hi
Is there any transaction or way to perform a mass re-read of PP master data in production orders?

Dear Mayuresh,
It is not possible in standard SAP, as there is no transaction code to perform this.
Check this link:
Read PP Master - Mass processing
Regards
Mangalraj.S
Edited by: Mangalraj.S on Jun 5, 2009 3:31 PM

Similar Messages

  • Routine to read time-dependent master data

    Hi Experts,
    I have a requirement where I need to read time-dependent master data.
    I need to write a field-level routine on "DEPTID" that reads from the EMPLOYEE master data with Dateto = 31.12.9999, so that I get the last department ID for each employee.
    Below are the source fields mapped to target InfoObjects in my transformation.
    EMPID            mapped to  0EMPLOYEE
    STARTDATE mapped to  0DATEFROM
    ENDDATE      mapped to  0DATETO
    EMPID            mapped to  DEPTID
    Could anyone please help with the ABAP code to fulfil the above requirement?
    I'd appreciate a quick response.

    Cris, the approach given above will also give you only the latest department ID. I would suggest trying this in the development system with some test data to see whether it works.
    If you still feel that code has to be written, then map the end date and the department ID to DEPTID.
    Code should be something like this:
    IF source_fields-enddate = '99991231'.
      result = source_fields-deptid.
    ENDIF.
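    A minimal sketch of a selection alternative, assuming EMPID is also mapped into the routine's source fields and that the time-dependent attributes of EMPLOYEE sit in the Q table /BI0/QEMPLOYEE with a DEPTID field (table and field names are assumptions):
    *   Read the department valid until 31.12.9999, i.e. the latest one
        SELECT SINGLE deptid
          FROM /BI0/QEMPLOYEE
          INTO result
          WHERE employee = source_fields-empid
            AND objvers  = 'A'
            AND dateto   = '99991231'.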
    Regards,
    AL

  • Force read/refresh of master data attributes on query

    Hi there,
    We're having trouble with an input-ready query that changes an attribute value (a key figure attribute) of one characteristic. That part works fine and we can save the changed data to the InfoProvider via DTP. The problem is that after the data is saved we need to refresh the query, because we have both values on screen (the original value as a characteristic attribute in the rows, and the new value as the input-ready key figure), and after saving we would like to see that both values are the same.
    Is there any way to force the query not to read from the cache while we are changing master data attributes? I read something about the IF_RSMD_RS_ACCESS class that can be implemented for master data access to force this there, but it sounds really hard, so if this is the way, could some of you give us some help?
    I hope the explanation is clear...
    Thanks in advance,
    Regards
    Carlos

    Dear All,
    In recent days I tried changing master data attributes through BPS, and it didn't work out. The primary reason was that some of the attributes I needed to change were not present in the transaction or planning cubes; characteristics that are not part of the cube on which the planning area is based cannot be changed this way.
    This is my understanding. Please correct me if I am wrong.
    Besides, I was also wondering whether we can do the same through the portal, i.e. retrieve a master data InfoObject (based on the value selected for that InfoObject by the user) and its attributes in the portal, edit them, and save them back, so that the updated values go back to the BW master data InfoObject database tables.
    E.g. I have a Natural Account master data InfoObject in BW with the attributes functional area and expense center. If the user selects a value for Natural Account, say 01110, the portal should display the corresponding attribute values for functional area and expense center; let's say these are 10 and 20 respectively. What I want to do now is change these attribute values to 30 and 40 and save them back, so that natural account 01110 ends up with the new attribute values 30 and 40 for functional area and expense center.
    Is this possible through the portal and BW?
    Any ideas on this would be appreciated.
    Regards,
    Ankit
    Edited by: Ankit Bhandari on Nov 21, 2008 12:21 PM
    Edited by: Ankit Bhandari on Nov 21, 2008 12:32 PM

  • Read dimension members (master data) into an ABAP internal table in BPC 10.0 NW

    Hi all,
    I managed to read transaction data using the example "replacement for IF_UJ_MODEL~GET_APPL_DATA".
    I am now trying to read members (master data) from a dimension into an ABAP internal table, but I have no idea how.
    Can anyone advise me on how to read members (master data) from a dimension into an ABAP internal table?
    Some sample code would be really appreciated.
    Regards
    Edited by: HK Kang on Jan 3, 2012 4:26 AM

    Hi Chanaveer,
    UJD_ADMIN_RUN_OPTIMIZE can be used only for executing the FULL and LITE OPTIMIZER packages.
    Looking at the code of UJD_RUN_OPTIMIZE_PACKAGE, it seems this FM can be used to trigger a process chain from BW.
    Please refer to the link below on SDN, which shows how to load master data on the fly in SAP BPC 10.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/2020b522-cdb9-2e10-a1b1-873309454fce?QuickLink=index&overridelayout=true
    Thanks,
    Rohit

  • Mass Download/Extract of BW Master Data Documents

    Hello
    I am looking for a solution that enables a mass extraction of master data documents on a nightly basis.
    Basically, we store documents against materials in R/3 using CV03N; we then use the mass upload program to load these into BW on a nightly basis and store them against the master data as documents.
    The documents will be used for reporting internally, but there is also a requirement to extract them nightly to external sources as a CSV file to be used as an internet feed.
    Does anybody know of a program or FM which allows this, or any other way of extracting the documents?
    Any help would be greatly appreciated.
    Thanks
    Daniel

    Hi Manfred,
    Unfortunately, the documents are stored independently of the InfoObject, so you can't use an InfoSpoke to extract the data.
    Also, it looks as though the documents are not physically stored in a table in BW; only links to the document name and properties are.
    Thanks
    Daniel

  • How does an InfoSpoke read time-dependent master data?

    Hello Experts!
    How does an InfoSpoke read time-dependent master data?
    What key date does it refer to?
    Can you please explain? I want to use this concept for writing a master data lookup on the time-dependent attributes of 0MATERIAL.
    Thanks a lot!

    You can either specify the time period in the filtering area of the InfoSpoke, or implement the transformation BAdI OPENHUB_TRANSFORM to manipulate the data in whichever way suits your requirement. All time-dependent InfoObjects have DATEFROM and DATETO fields, which you can use to choose your date range accordingly.
    Hope this helps.
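    For the 0MATERIAL lookup mentioned above, a minimal sketch of a key-date read follows. It assumes the time-dependent attributes live in the Q table /BI0/QMATERIAL and uses MATL_GROUP purely as an example attribute; the names are assumptions, not a confirmed model:
        DATA: lv_keydate    TYPE sy-datum VALUE '20090601',
              lv_material   TYPE c LENGTH 18 VALUE 'TEST_MAT',
              lv_matl_group TYPE c LENGTH 9.
    *   Pick the attribute version that is valid on the key date
        SELECT SINGLE matl_group
          FROM /BI0/QMATERIAL
          INTO lv_matl_group
          WHERE material  = lv_material
            AND objvers   = 'A'
            AND datefrom <= lv_keydate
            AND dateto   >= lv_keydate.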

  • How to read multiple materials' master data in a single IDoc

    I am working on an inbound material master custom IDoc. In a single IDoc I receive data for multiple materials.
    I want to read each material's data in the IDoc and pass it to the BAPI (PERFORM rec_creation_mm, inside which the material posting BAPI is called).
    Please provide any suggestions <removed by moderator>. The logic below does not work if the IDoc contains a single material, and the last record of a multi-material IDoc is not posted.
    LOOP AT idoc_data WHERE docnum = idoc_contrl-docnum.
      CASE idoc_data-segnam.
        WHEN 'E1MARAM'.
    *     A new material starts: post the material collected so far first.
    *     Checking the collected data instead of sy-tabix also covers the
    *     case where this IDoc's first segment is not table line 1.
    *     (REC_CREATION_MM is expected to clear the collected tables.)
          IF NOT it_BAPIMARA[] IS INITIAL.
            PERFORM REC_CREATION_MM.
          ENDIF.
          wa_E1MARAM = idoc_data-sdata.
          APPEND wa_E1MARAM TO it_E1MARAM.
          MOVE-CORRESPONDING wa_E1MARAM TO it_BAPIMARA.
          APPEND it_BAPIMARA.
          MOVE wa_E1MARAM-MATNR TO it_TABFILD-MATNR.
          MOVE wa_E1MARAM-MEINS TO it_TABFILD-MEINH.
          APPEND it_TABFILD.
        WHEN 'ZMM_C_CLFMAS01'.
          wa_ZMM_C_CLFMAS01 = idoc_data-sdata.
          APPEND wa_ZMM_C_CLFMAS01 TO it_ZMM_C_CLFMAS01.
        WHEN 'E1MAKTM'.
          wa_E1MAKTM = idoc_data-sdata.
          APPEND wa_E1MAKTM TO it_E1MAKTM.
          MOVE-CORRESPONDING wa_E1MAKTM TO it_BAPIMAKT.
          APPEND it_BAPIMAKT.
        WHEN 'E1MARCM'.
          wa_E1MARCM = idoc_data-sdata.
          APPEND wa_E1MARCM TO it_E1MARCM.
          MOVE-CORRESPONDING wa_E1MARCM TO it_BAPIMARC.
          APPEND it_BAPIMARC.
        WHEN 'E1MVKEM'.
          wa_E1MVKEM = idoc_data-sdata.
          APPEND wa_E1MVKEM TO it_E1MVKEM.
          MOVE-CORRESPONDING wa_E1MVKEM TO it_BAPIMVKE.
          APPEND it_BAPIMVKE.
        WHEN 'E1MLANM'.
          IF NOT idoc_data-sdata IS INITIAL.
            wa_E1MLANM = idoc_data-sdata.
            IF NOT wa_E1MLANM IS INITIAL.
              APPEND wa_E1MLANM TO it_E1MLANM.
              MOVE-CORRESPONDING wa_E1MLANM TO it_BAPIMLAN.
              APPEND it_BAPIMLAN.
            ENDIF.
          ENDIF.
      ENDCASE.
    ENDLOOP.
    *   Post the last (or only) material in the IDoc: without this, a
    *   single-material IDoc and the final material of a multi-material
    *   IDoc are never posted.
    IF NOT it_BAPIMARA[] IS INITIAL.
      PERFORM REC_CREATION_MM.
    ENDIF.
    Thanks
    VIJAY
    Edited by: Thomas Zloch on May 4, 2010 3:09 PM - priority normalized

    Check whether the material is available in table MARC (via SE11) for your plant. If it is not available, please maintain it through transaction MM01.

  • Read PP master data in order

    Hi...
    I have a query regarding "Read PP master data" in orders.
    Can we read PP master data in orders at plant level in mass processing? Or is there any other alternative?
    Regards
    Tushar

    Dear Tushar, there is no transaction code in standard SAP to read the PP master data in mass.
    You need to develop an LSMW or BDC solution.
    Please refer to the thread below for more information.
    MASS read PP master data
    Cheers
    KK

  • Performance: reading a huge amount of master data in an end routine

    In our 7.0 system, a full load runs each day from DSO X to DSO Y, in which master data from six characteristics of DSO X is read into about 15 fields of DSO Y. DSO Y contains about 2 million records, all of which are transferred each day. The master data tables each contain between 2 and 4 million records. Before the load starts, DSO Y is emptied. DSO Y is write-optimized.
    At first, we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups. We redesigned it and now fill all master data attributes in the end routine, after filling internal tables with the master data values corresponding to the data package:
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE ucpremise EQ RESULT_PACKAGE-ucpremise.
    *   FOR ALL ENTRIES guarantees no particular order, so sort the table
    *   explicitly for the BINARY SEARCH reads below
        SORT lt_0ucpremise BY ucpremise.
    And when we loop over the data package, we write something like:
        LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
          READ TABLE lt_0ucpremise INTO ls_0ucpremise
            WITH KEY ucpremise = <fs_rp>-ucpremise
            BINARY SEARCH.
          IF sy-subrc EQ 0.
            <fs_rp>-ucpremisty = ls_0ucpremise-ucpremisty.
            <fs_rp>-ucdele_ind = ls_0ucpremise-ucdele_ind.
          ENDIF.
    *     all other MD reads
        ENDLOOP.
    The above statement is repeated for all the master data we need to read. This method is quite a bit faster (1.5 hours), but we want to make it faster still. We noticed that reading the master data into the internal tables still takes a long time, and it has to be repeated for each data package. We want to change this. We have now tried a similar method, but load all master data into the internal tables without filtering on the data package, and do this only once.
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise.
    So when the first data package starts, it fills all master data values, about 95% of which we would need anyway. So that the following data packages can use the same tables and don't need to fill them again, we placed the definitions of the internal tables in the global part of the end routine. In the global part we also write:
    DATA: lv_data_loaded TYPE C LENGTH 1.
    And in the method we write:
    IF lv_data_loaded IS INITIAL.
      lv_data_loaded = 'X'.
    * load all internal tables
      lv_data_loaded = 'Y'.
    ENDIF.
    WHILE lv_data_loaded NE 'Y'.
      CALL FUNCTION 'ENQUEUE_SLEEP'
        EXPORTING
          seconds = 1.
    ENDWHILE.
    LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
    * assign all data
    ENDLOOP.
    This makes sure that another data package that has already started "sleeps" until the first data package is done filling the internal tables.
    This all seems to work: it now takes 10 minutes to load everything to DSO Y. But I'm wondering if I'm missing anything. The system seems to handle loading all these records into internal tables just fine, but any improvements or critical remarks are very welcome.

    This is a great question, and you've clearly done a good job of investigating this, but there are some additional things you should look at and perhaps a few things you have missed.
    Zephania Wilder wrote:
    At first, we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups.
    This is not accurate. After SP14, BW does a prefetch and buffers the master data values used in the lookup. Note 1092539 (https://service.sap.com/sap/support/notes/1092539) discusses this in detail. The important thing, and most likely the reason you are seeing individual master data lookups on the DB, is that you must manually maintain the MD_LOOKUP_MAX_BUFFER_SIZE parameter to be larger than the number of lines of master data (from all characteristics used in lookups) that will be read. If you are seeing one SELECT statement per line, then something is going wrong.
    You might want to go back and test the master data lookups with this setting and see how fast they go. If memory serves, the BW master data lookup uses an approach very similar to your second example (1.5 hrs), though I think it first loops through the source package and extracts the list of required master data keys, which is probably faster than your "FOR ALL ENTRIES IN RESULT_PACKAGE" statement if RESULT_PACKAGE contains very many duplicate keys.
    I'm guessing you'll get down to at least the 1.5 hrs that you saw in your second example, but it is possible that it will get down quite a bit further.
    Zephania Wilder wrote:
    This makes sure that another data package that has already started "sleeps" until the first data package is done filling the internal tables.
    This sleeping approach is not necessary, as only one data package will be running at a time in any given process. I believe that the "global" internal table is not shared between parallel processes, so if your DTP is running with three parallel processes, this table will just get filled three times. Within a process, all data packages are processed serially, so all you need to do is check whether or not it has already been filled. Or are you doing something additional to export the filled lookup table into a shared memory location?
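    A minimal sketch of that fill-once check (the variable and table names are assumptions; whether the declaration persists between packages is exactly the point discussed next):
        IF gv_md_loaded IS INITIAL.
          SELECT ucpremise ucpremisty ucdele_ind
            FROM /BI0/PUCPREMISE
            INTO CORRESPONDING FIELDS OF TABLE gt_0ucpremise.
          SORT gt_0ucpremise BY ucpremise.
          gv_md_loaded = 'X'.
        ENDIF.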
    Actually, you have your global data defined with the statement "DATA: lv_data_loaded TYPE C LENGTH 1.". I'm not completely sure, but I don't think this data will persist from one data package to the next. Data defined in the global section using "DATA" is global to the package start, end, and field routines, but I believe it is discarded between packages. I think you need to use "CLASS-DATA: lv_data_loaded TYPE C LENGTH 1." to get the variable to persist between packages. Have you checked in the debugger that you are really only filling the table once per request and not once per package in your current setup? << This is incorrect - see next posting for correction.
    Otherwise the third approach is fine as long as you are comfortable managing your process memory allocations and you know the maximum size that your master data tables can have. On the other hand, if your master data tables grow regularly, then you are eventually going to run out of memory and start seeing dumps.
    Hopefully that helps out a little bit. This was a great question. If I'm off-base with my assumptions above and you can provide more information, I would be really interested in looking at it further.
    Edited by: Ethan Jewett on Feb 13, 2011 1:47 PM

  • Read from master data attribute

    Hi BI experts,
    Here is a challenging question.
    One of the FIGL InfoProviders does not have the InfoObject 0COMPANY. I added 0COMPANY and set the update type to 'overwrite' with update method 'master data attribute of', so 0COMPANY should be read from the master data attribute of 0COMP_CODE.
    The master data attributes are loaded for 0COMP_CODE, and I ran the transaction data load for the FIGL InfoProvider.
    But 0COMPANY is blank in the FIGL ODS.
    Any thoughts..
    Thanks in advance

    It should ideally pull the 0COMPANY value. Otherwise, try doing it through a start routine:
    Create an internal table and pull company code and company from the master data table of 0COMP_CODE into it.
    Read from this table in your update routine, and for each company code read the company and populate it in your ODS, as sketched below.
    It will also perform better than a master data lookup!
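    A minimal sketch of this, assuming BW 3.x update rules and that the attribute sits in the 0COMP_CODE master data table /BI0/PCOMP_CODE as field COMPANY (gt_compcode and gs_compcode would be declared in the global part):
    *   Start routine: buffer the lookup once per data package
        SELECT comp_code company
          FROM /BI0/PCOMP_CODE
          INTO TABLE gt_compcode
          FOR ALL ENTRIES IN DATA_PACKAGE
          WHERE comp_code = DATA_PACKAGE-comp_code
            AND objvers   = 'A'.
        SORT gt_compcode BY comp_code.
    *   Update routine for 0COMPANY: read from the buffered table
        READ TABLE gt_compcode INTO gs_compcode
          WITH KEY comp_code = COMM_STRUCTURE-comp_code
          BINARY SEARCH.
        IF sy-subrc = 0.
          RESULT = gs_compcode-company.
        ENDIF.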
    Regards.

  • Mass Material Read

    Hi,
    I have a requirement to read all material master data (basic data, plant data, and the other views' data) in a mass scenario. Is there a specific function module already available for this?
    I have checked the existing BAPIs, and most of them support only a single material.
    Thanks in advance for your help.
    Best Regards,
    Pramod

    Dear Pramod,
    You can create a query using SQVI, or through SQ02 and SQ01, to fetch the data from tables such as
    MARA, MARC, MBEW, and MARD.
    Some of the available function modules are:
    MATERIAL_PRE_READ_MA00         Mass access to table MAKT
    MATERIAL_READ_MA00             Read material
    MATERIAL_READ_MA03             Read material
    MATERIAL_READ_MABW             Read material
    MATERIAL_READ_ALL_SINGLE       Read all material master data in SELECT SINGLE mode
    Check and revert back.
    Regards
    Mangalraj.S

  • Not able to Retrieve Transaction Data based on the property of master data

    Hi,
    I am trying to retrieve transaction data based on a property of the ACCOUNT master data (property ACCTYPE = 'EXP') in BPC 10 for NetWeaver.
    The transaction data is present in the backend, but I am not getting any data in the internal table after running the RSDRI query.
    I am using this code:
    I am using this code.
    DATA: lt_sel TYPE uj0_t_sel,
    ls_sel TYPE uj0_s_sel.
    ls_sel-dimension = 'ACCOUNT'.
    ls_sel-attribute = 'ACCTYPE'.
    ls_sel-sign = 'I'.
    ls_sel-option = 'EQ'.
    ls_sel-low = 'EXP'.
    APPEND ls_sel TO lt_sel.
    lo_query = cl_ujo_query_factory=>get_query_adapter(
      i_appset_id = lv_environment_id
      i_appl_id   = lv_application_id ).
    lo_query->run_rsdri_query(
      EXPORTING
        it_dim_name       = lt_dim_list    " BPC: Dimension List
        it_range          = lt_sel         " BPC: Selection condition
        if_check_security = abap_false     " BPC: Generic indicator
      IMPORTING
        et_data           = <lt_query_result>
        et_message        = lt_message ).
    Data does come back if I use the ID of ACCOUNT directly, e.g.:
    ls_sel-dimension = 'ACCOUNT'.
    ls_sel-attribute = 'ID'.
    ls_sel-sign = 'I'.
    ls_sel-option = 'EQ'.
    ls_sel-low = 'PL110'.
    APPEND ls_sel TO lt_sel.
    So in this case data comes back, but it does not for the property.
    Please can you help me with this?
    Thanks,
    Rishi

    Hi Rishi,
    There are 2 steps you need to do:
    1. Read all the master data with the required property into an internal table; in your case use ACCTYPE = 'EXP'.
    2. Read the transaction data with the master data you just selected (see the sketch below).
    Then you will get all your results.
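    A rough sketch of step 2, reusing the uj0_t_sel / run_rsdri_query pattern from your code. The helper read_members_by_property and the member table type are assumptions: fill lt_members with the ACCOUNT IDs found in step 1 (e.g. via the BPC master data API), then query by ID, which you confirmed works:
        DATA: lt_members TYPE TABLE OF uj_dim_member,
              lv_member  TYPE uj_dim_member.
    *   Step 1 (assumed helper): collect members where ACCTYPE = 'EXP'
        PERFORM read_members_by_property TABLES lt_members.
    *   Step 2: add one ID selection row per member found
        LOOP AT lt_members INTO lv_member.
          CLEAR ls_sel.
          ls_sel-dimension = 'ACCOUNT'.
          ls_sel-attribute = 'ID'.
          ls_sel-sign      = 'I'.
          ls_sel-option    = 'EQ'.
          ls_sel-low       = lv_member.
          APPEND ls_sel TO lt_sel.
        ENDLOOP.
    *   ...then call lo_query->run_rsdri_query( ) exactly as in your code.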
    Andy

  • Can a routine replace the "master data attribute of" update rule for performance?

    Hi all,
    We are working on CRM-BW data modeling. We have to look up agent master data (agent level and position) for each transaction record, so we are currently using the "master data attribute of" update rule. Can we use a routine instead of "master data attribute of"? Will it improve the loading performance? We have to load 100,000 transaction records, while the agent master data holds 20,000 agents. My understanding is that for each record in the data package the system has to go to the master data table, read the agent details, and store them in the cube. Say one agent created 10 transactions; the "master data attribute of" option will then read the agent master data 10 times, even though we pull the same details for all 10 transactions. If we use a routine, we can pull the agent details into an internal table, removing all duplicates, and read that internal table in the update routine.
    Will this approach improve performance?
    Let me know if you need further info.
    Thanks in advance.
    Arun Thangaraj

    Hi,
    Your thinking is absolutely right!
    I don't recommend using the standard attribute derivation, since it performs a SELECT against the database for EACH record.
    Better to fill a sorted internal table in your start routine with SELECT <fields> FROM <master_data_table> FOR ALL ENTRIES IN DATA_PACKAGE WHERE OBJVERS = 'A' etc.
    In your routine, perform a READ TABLE itab ... BINARY SEARCH. I believe you won't be able to go faster...
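    A minimal sketch of that pattern; /BIC/PZAGENT, /BIC/ZAGENT, AGENTLEVEL and the work areas are assumptions standing in for the real agent InfoObject, its P table, and its attributes:
    *   Start routine: buffer only the agents referenced by this package
        SELECT /bic/zagent agentlevel position
          FROM /BIC/PZAGENT
          INTO TABLE gt_agent
          FOR ALL ENTRIES IN DATA_PACKAGE
          WHERE /bic/zagent = DATA_PACKAGE-/bic/zagent
            AND objvers     = 'A'.
        SORT gt_agent BY /bic/zagent.
        DELETE ADJACENT DUPLICATES FROM gt_agent COMPARING /bic/zagent.
    *   Update routine: cheap sorted read per record
        READ TABLE gt_agent INTO gs_agent
          WITH KEY /bic/zagent = COMM_STRUCTURE-/bic/zagent
          BINARY SEARCH.
        IF sy-subrc = 0.
          RESULT = gs_agent-agentlevel.
        ENDIF.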
    hope this helps...
    Olivier.

  • Multiple Analysis Periods in Cost Center Master Data

    Hello  Experts
    My client did a mass change of cost center master data through transaction KS12. A few of the cost centers had different analysis (validity) periods: for example, cost center X had an analysis period of 01.01.2008 to 31.12.9999, and cost center Y had 01.01.2011 to 31.12.9999.
    Since he made the change (through KS12) effective 01.01.2011 to 31.12.9999, some of the cost centers ended up with multiple analysis periods at cost center level (I understand this is standard SAP behaviour).
    Now, if he displays cost center X through KS03, he gets two analysis periods:
        (a) 01.01.2008 to 31.12.2010
        (b) 01.01.2011 to 31.12.9999
    However, my client does not want to see two analysis periods in the cost center master data.
    Kindly advise whether there is any standard SAP procedure to revert this, or an SAP note which can correct it.
    Regards
    Anil Kumar

    If there are no postings during that analysis period, you can delete it if it is not required, to avoid the message.
    I will test whether any other solution is possible.
    Regards,
    Divraj

  • Info-object Maintenance -- Master Data/Texts Tab options

    Hello BI Gurus,
    In BI 7.0:
    Path:
    InfoObject Maintenance --> Master Data/Texts tab --> Master Data InfoSource / Data Target / InfoProvider / Master Data Read Access.
    Scenario 1:
    1. Master Data Access - Own Implementation
       Name of Master Data Access:
    CL_RSMD_RS_SPEC               Class for accessing generic BW-specific InfoObjects
    CL_RSR_XLS_TABF4              Table master data read class
    CL_RSROA_LOCAL_MASTERDATA     Master data for virtual characteristics
    CL_RSMD_RS_BW_SPEC            Base class for accessing BW-specific InfoObjects
    CL_RSMD_RS_SPEC_TXT           Class for accessing generic BW-specific InfoObjects
    Under what scenario do we go for this option?
    Has anybody used this option?
    Scenario 2:
    1. Master Data Access - Direct Access
       Name of Master Data Access - CL_RSR_REMOTE_MASTERDATA
    Under what scenario do we go for this option?
    Has anybody used this option?

    Hi,
    With Master Data:
    If you set this indicator, the characteristic may have attributes. In this case, the system generates a P table for this characteristic. This table contains the key of the characteristic and any attributes that might exist. It is used as a check table for the SID table. When you load transaction data, there is a check on whether the characteristic value exists in the P table, if referential integrity is used.
    With Maintain Master Data you can go from the main menu to the maintenance dialog for processing attributes.
    The master data table can have a time-dependent and a time-independent part.
    In attribute maintenance, you determine whether an attribute is time-dependent or time-independent.
    With Texts:
    Here, you determine whether the characteristic has texts.
    If you want to use texts with a characteristic, you have to select at least one text type. The short text (20 characters) option is set by default, but you can also choose medium-length texts (40 characters) or long texts (60 characters).
    Helpful link:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/71/f470375fbf307ee10000009b38f8cf/frameset.htm
    Regards,
    Suman
