Master Data lookup in Update Rule problem

Hi all,
I am currently having a problem loading data to an InfoCube using flat files.
The architecture is as follows:
1) The source of the data is a flat file.
2) The data is loaded through an update rule as a full update.
3) The update rule determines the Profit Center from the master data of the WBS element.
4) The data is written to an InfoCube.
This solution, however, does not always work as planned. A problem occurs in the following situation:
1) The flat file contains WBS element RD.00753.02.01, which has a Profit Center attribute value of 8060.
2) When I load the flat file, the PC value 8060 is written into the row in the InfoCube, which is correct.
3) Then I change the master data of the WBS element by setting the Profit Center attribute value to 8068.
4) I run the attribute change run.
5) Then I load a flat file again, which also contains WBS element RD.00753.02.01.
6) The master data lookup should now write the value 8068 into the InfoCube. HOWEVER, this is when the evil occurs: BW does not write a PC value of 8068, but writes 8060 instead, which is wrong.
Why does BW not use the newest version of the master data to perform the attribute value lookup? Or why doesn't BW write the correct Profit Center into the cube?
Thanks,
Onno

Hi Ricardo,
The debug via PSA simulation of the update indicates that the CORRECT Profit Center value is to be written into the InfoCube.
However, if I check the contents of the cube (after the load has finished) using the request ID, the WRONG Profit Center value is shown. This indicates that the correct master data is used, but the update of the cube is wrong. Why does this happen? The load is a full update, so it should add a new row to the cube using the value coming out of the update rule.
Onno
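As a point of reference, the "Master data attribute of" rule reads the attribute from the active version of the WBS element's master data, so after a successful attribute change run the new value should normally be picked up. Below is a hedged sketch of an explicit lookup routine that forces a read of the active version; the names (0WBS_ELEMT as the WBS element InfoObject, /BI0/PWBS_ELEMT as its attribute table, 0PROFIT_CTR as the attribute) are assumptions based on standard Business Content and may differ from the actual model, so this is an illustration rather than the poster's rule.
* Hedged sketch only - names are assumptions, not the poster's actual objects.
* Reads the Profit Center from the active (OBJVERS = 'A') master data version.
  DATA: l_profit_ctr TYPE /bi0/oiprofit_ctr.

  SELECT SINGLE profit_ctr
    FROM  /bi0/pwbs_elemt
    INTO  l_profit_ctr
    WHERE wbs_elemt = COMM_STRUCTURE-wbs_elemt
      AND objvers   = 'A'.                     "active master data only
  IF sy-subrc = 0.
    RESULT = l_profit_ctr.
  ENDIF.
  RETURNCODE = 0.
  ABORT      = 0.
The sketch only shows where the active version comes into play; it is not presented as a fix for the load behaviour described above.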

Similar Messages

  • Can routine replace "master data attribute of" update rule for performance?

    Hi all,
    We are working on CRM-BW data modeling. For each transaction record we have to look up the agent's level and position from the agent master data, so we are currently using the "Master data attribute of" update rule. Can we use a routine instead of "Master data attribute of", and will it improve the loading performance? We have to load about 1 lakh (100,000) transaction records, and we have 20,000 agent records in the agent master data. My understanding is that for each record in the data package the system has to go to the master data table, fetch the agent details, and store them in the cube. Say one agent created 10 transactions; then "Master data attribute of" will read the agent master data 10 times, even though we are pulling the same details for all 10 transactions. If we use a routine, we can pull the agent details into an internal table, remove all duplicates, and read the internal table in the update routine.
    Will this approach improve performance?
    Let me know if you need further info.
    Thanks in advance.
    Arun Thangaraj

    Hi,
    your thinking is absolutely right!
    I don't recommend using the standard attribute derivation, since it performs a SELECT on the database for EACH record.
    Better to implement a sorted table in your start routine; fill it with SELECT <fields> FROM <master_data_table> FOR ALL ENTRIES IN DATA_PACKAGE WHERE OBJVERS = 'A', etc.
    In your update routine, perform a READ itab ... BINARY SEARCH (see the sketch below). I don't believe you can get much faster than that.
    hope this helps...
    Olivier.
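    A minimal sketch of the pattern Olivier describes, with assumed names (ZAGENT as the agent InfoObject with attributes ZLEVEL and ZPOSITN - replace these with the real objects). The declaration belongs in the global part of the update rules, the SELECT in the start routine, and the READ in the update routine:
    * Global part: buffer table for the agent attributes (names are assumptions).
      DATA: BEGIN OF gt_agent OCCURS 0,
              agent  LIKE /bic/pzagent-/bic/zagent,
              level  LIKE /bic/pzagent-/bic/zlevel,
              positn LIKE /bic/pzagent-/bic/zpositn,
            END OF gt_agent.

    * Start routine: one database access per data package instead of per record.
      REFRESH gt_agent.
      IF NOT DATA_PACKAGE[] IS INITIAL.
        SELECT /bic/zagent /bic/zlevel /bic/zpositn
          FROM /bic/pzagent
          INTO TABLE gt_agent
          FOR ALL ENTRIES IN DATA_PACKAGE
          WHERE /bic/zagent = DATA_PACKAGE-/bic/zagent
            AND objvers     = 'A'.
        SORT gt_agent BY agent.
        DELETE ADJACENT DUPLICATES FROM gt_agent COMPARING agent.
      ENDIF.

    * Update routine (e.g. for the agent level): fast read on the sorted table.
      READ TABLE gt_agent WITH KEY agent = COMM_STRUCTURE-/bic/zagent
           BINARY SEARCH.
      IF sy-subrc = 0.
        RESULT = gt_agent-level.
      ENDIF.
      RETURNCODE = 0.
    With roughly 20,000 agents the whole lookup table fits comfortably in memory, and the 100,000 single SELECTs are replaced by one array fetch per data package.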

  • Master data attribute of (update rule)

    Hi SDN,
    I just want to know the procedure for creating the update method "Master data attribute of". Could anyone tell me step by step?
    Regards
    sujan

    3. Creation of data targets
    • In the left panel, select InfoProvider.
    • Select the created InfoArea and right-click to choose "Insert Characteristic as InfoProvider".
    • Select the required InfoObject (e.g. Employee ID).
    • Under that InfoObject, select the attributes.
    • Right-click on the attributes and select "Create Transformation".
    • As the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
    • Activate the created transformation.
    • Create a data transfer process (DTP) by right-clicking the master data attributes.
    • In the extraction tab, specify the extraction mode (full).
    • In the update tab, specify the error handling (request green).
    • Activate the DTP and, in the execute tab, click the execute button to load the data into the data target.
    cheers
    John

  • ABAP for Master Data lookup

    I am trying an ABAP master data lookup in an update rule.
    Here is the scenario:
    There is a master data object MDABC with attributes A1 and A2. I need to map IO1 to A1 and IO2 to A2.
    What should the start routine and update routine be? Please help with working code.
    Thanks

    >
    sap_newbee wrote:
    > Thanks Aashish,
    > Here is the code I am using, but it's not populating any result. Maybe you could help me out in debugging.
    >
    >
    > Start Routine -
    >
    > DATA: BEGIN OF ITAB_MDABC OCCURS 0,
    > MDABC LIKE /BIC/PMDABC-/BIC/MDABC,
    >  A1 LIKE /BIC/PMDABC-/BIC/A1,
    > A2 LIKE /BIC/PMDABC-/BIC/A2,
    > END OF ITAB_NMDABC.
    >
    >  SELECT
    > /BIC/MDABC
    > /BIC/A1
    > /BIC/A2
    >  FROM /BIC/PMDABC INTO TABLE ITAB_MDABC
    >  FOR ALL ENTRIES IN DATA_PACKAGE
    > WHERE /BIC/MDABC = DATA_PACKAGE-/BIC/MDABC.
    > ENDSELECT.
    >
    >
    > In Update Routine for Infoobject IO1 The code Iam using is
    >
    >
    > READ TABLE ITAB_MDABC   WITH KEY
    >  MDABC   =  COMM_STRUCTURE-/BIC/MDABC
    >  BINARY SEARCH.
    > IF sy-subrc = 0.
    >  RESULT = ITAB_MDABC-A1.
    > ENDIF.
    >   RETURNCODE = 0.
    >
    >   ABORT = 0.
    >
    > Please help.
    > Thanks
    Please use a table in the SELECT statement (i.e. an array fetch with SELECT ... INTO TABLE). The modifications were marked in bold in the original post; since the highlighting is lost in this copy, a corrected sketch follows below.
    Edited by: Ashish Gour on Oct 17, 2008 2:57 PM
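    Since the bold highlighting did not survive the copy, here is a hedged sketch of what the corrected routine would roughly look like, keeping the poster's placeholder names: the ENDSELECT is dropped (SELECT ... INTO TABLE is a single array fetch), the name in END OF is aligned with the declaration, and the table is sorted before the BINARY SEARCH.
    * Start routine (sketch only - names are the poster's placeholders):
      DATA: BEGIN OF ITAB_MDABC OCCURS 0,
              MDABC LIKE /BIC/PMDABC-/BIC/MDABC,
              A1    LIKE /BIC/PMDABC-/BIC/A1,
              A2    LIKE /BIC/PMDABC-/BIC/A2,
            END OF ITAB_MDABC.                    "was ITAB_NMDABC

      IF NOT DATA_PACKAGE[] IS INITIAL.
        SELECT /BIC/MDABC /BIC/A1 /BIC/A2
          FROM /BIC/PMDABC
          INTO TABLE ITAB_MDABC                   "array fetch - no ENDSELECT
          FOR ALL ENTRIES IN DATA_PACKAGE
          WHERE /BIC/MDABC = DATA_PACKAGE-/BIC/MDABC
            AND OBJVERS    = 'A'.
      ENDIF.
      SORT ITAB_MDABC BY MDABC.                   "required for BINARY SEARCH

    * Update routine for IO1 (sketch):
      READ TABLE ITAB_MDABC WITH KEY MDABC = COMM_STRUCTURE-/BIC/MDABC
           BINARY SEARCH.
      IF SY-SUBRC = 0.
        RESULT = ITAB_MDABC-A1.
      ENDIF.
      RETURNCODE = 0.
      ABORT = 0.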

  • Change communication structure for master data with direct update

    Hi All,
    I am having a problem with a change I want to make to some master data. I have added the attribute to the characteristic, but when I go to change the communication structure it is not possible (the add line button is greyed out).
    I can see the new InfoObject in the DataSource/transfer structure, but not in the communication structure (and yes, it is in change mode).
    The master data uses direct update, and I have read that this causes some hassles when changing the communication structure.
    Can someone please give me the steps for doing this?
    Thanks
    Ryan

    Hi Ryan,
    The reason you have nothing to map is that the DataSource transfer structure coming from the source system contains nothing new.
    1) On your InfoObject you can only write a routine, assign a formula, or use the other transfer rule options you get when you click on the InfoObject icon.
    2) Enhance the structure of the DataSource that comes from the R/3 source system.
    3) Replicate it.
    Only then will you find the extra field in the transfer structure, which you can then map to the InfoObject you added.
    hope it helps
    regards
    AK

  • 0CALDAY for time dependent master data lookup unknown when migrated to 7.0

    I am in the process of migrating a number of InfoProviders from the 3.x Business Content to the new methodology for BI 2004s.
    When I try to create a transformation from the update rules for 0PA_C01, all of the rules that use a master data lookup into 0EMPLOYEE give me an error such as "Rule 41 (target field: 0PERSON group: Standard Group): Time char. 0CALDAY in rule unknown".
    How do I fix the transformation rules that are generated from the update rules for these time-dependent master data attributes?

    Hi Mark,
    have a look at http://www.service.sap.com/. I guess you need to implement some corrections or a newer support package.
    kind regards
    Siggi
    PS: take a look: https://websmp104.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=941525&_NLANG=EN
    Message was edited by:
            Siegfried Szameitat

  • Error handling for master data with direct update

    Hi guys,
    For master data with flexible update, error handling can be defined in the InfoPackage, and if the load is performed via the PSA there are several options - that much is clear. But what about direct update?
    My specific question is: if an erroneous record (e.g. invalid characters) occurs in a master data load using direct update, the request is set to red. But what does this mean for the other (correct) records of the request? Are they written to the master data tables, so that they are present once the master data is activated, or is nothing written to the master data tables if a single record is erroneous?
    Many thanks,
    / Christian

    Hi Christian,
    The difference between flexible update and direct update is that direct update has no update rules; direct update still uses the PSA as usual, and you can do your checks in the PSA.
    On the second part: when you load master data and an error occurs, all records of that request get error status, so the activation has no effect on them, i.e. no new records from the failed load will be available.
    hope it helps
    regards
    Vikash

  • Poor MDX performance on F4 master data lookup

    Hi,
    I've posted this to this forum as it didn't get much help in the BW 7.0 forum. I'm thinking it was too MDX oriented to get any help there. Hopefully someone has some ideas.
    We have upgraded our BW system from BW 3.5 to 7.0 EHP1 SP6. There is substantial use of SAP BusinessObjects Enterprise XI 3.1 (BOXI) and also significant use of navigational attributes. Everything works fine in 3.5, and we have worked through a number of performance problems in BW 7.0. We are using BOXI 3.1 SP1 but have tested with SP2, and it generates the same MDX. We do, however, have all the latest MDX-related notes, including the composite note 1142664.
    We have a number of "fat" queries that act as universes for BOXI and it is when BOXI sends a MDX statement that includes certain crossjoins with navigational attributes that things fall apart. This is an example of one that runs in about a minute in 3.5:
    SELECT { [Measures].[494GFZKQ2EHOMQEPILFPU9QMV], [Measures].[494GFZSELD3E5CY5OFI24BPCN], [Measures].[494GG07RNAAT6M1203MQOFMS7], [Measures].[494GG0N4P7I87V3YBRRF8JK7R] } ON COLUMNS , NON EMPTY CROSSJOIN( CROSSJOIN( CROSSJOIN( CROSSJOIN( CROSSJOIN( [0MAT_SALES__ZPRODCAT].[LEVEL01].MEMBERS, EXCEPT( { [0MAT_SALES__ZASS_GRP].[LEVEL01].MEMBERS } , { { [0MAT_SALES__ZASS_GRP].[M5], [0MAT_SALES__ZASS_GRP].[M6] } } ) ), EXCEPT( { [0SALES_OFF].[LEVEL01].MEMBERS } , { { [0SALES_OFF].[#] } } ) ), [0SALES_OFF__ZPLNTAREA].[LEVEL01].MEMBERS ), [0SALES_OFF__ZPLNTREGN].[LEVEL01].MEMBERS ), [ZMFIFWEEK].[LEVEL01].MEMBERS ) DIMENSION PROPERTIES MEMBER_UNIQUE_NAME, MEMBER_NAME, MEMBER_CAPTION ON ROWS FROM [ZMSD01/ZMSD01_QBO_Q0010]
    However, in 7.0 there appear to be some master data lookups that kill performance before we even get to the BW queries. Note that in RSRT terms this is prior to even getting the popup screen with the "display aggregate" option.
    They were taking 700 seconds but now take about 150 seconds after an index was created on the ODS /BIC/AZOSDOR0300. From what I can see, the navigational attributes require BW to ask "what are the valid SIDs for SALES_OFF in this multiprovider". The odd thing is that BW 3.5 does no such query. It just hits the fact tables directly.
    SELECT "SID" , "SALES_OFF" FROM ( SELECT "S0000"."SID","P0000"."SALES_OFF" FROM "/BI0/PSALES_OFF" "P0000" JOIN "/BI0/SSALES_OFF" "S0000" ON "P0000"."SALES_OFF" = "S0000"."SALES_OFF" WHERE "P0000"."OBJVERS" = 'A' AND "S0000"."SID" IN ( SELECT "D"."SID_0SALES_OFF" AS "SID" FROM "/BI0/D0PCA_C021" "D" ) UNION SELECT "S0000"."SID" ,"P0000"."SALES_OFF" FROM "/BI0/PSALES_OFF" "P0000" JOIN "/BI0/SSALES_OFF" "S0000" ON "P0000" . "SALES_OFF" = "S0000" . "SALES_OFF" WHERE "P0000"."OBJVERS" = 'A' AND "S0000"."SID" IN ( SELECT "D"."SID_0SALES_OFF" AS "SID" FROM "/BIC/DZBSDBL018" "D" ) UNION SELECT "S0000"."SID" ,"P0000"."SALES_OFF" FROM "/BI0/PSALES_OFF" "P0000" JOIN "/BI0/SSALES_OFF" "S0000" ON "P0000" . "SALES_OFF" = "S0000" . "SALES_OFF" WHERE "P0000"."OBJVERS" = 'A' AND "S0000"."SID" IN ( SELECT "D"."SID_0SALES_OFF" AS "SID" FROM "/BIC/DZBSDOR028" "D" ) UNION SELECT "S0000"."SID" ,"P0000"."SALES_OFF" FROM "/BI0/PSALES_OFF" "P0000" JOIN "/BI0/SSALES_OFF" "S0000" ON "P0000" . "SALES_OFF" = "S0000" . "SALES_OFF" WHERE "P0000"."OBJVERS" = 'A' AND "S0000"."SID" IN ( SELECT "D"."SID_0SALES_OFF" AS "SID" FROM "/BIC/DZBSDOR038" "D" ) UNION SELECT "S0000"."SID" ,"P0000"."SALES_OFF" FROM "/BI0/PSALES_OFF" "P0000" JOIN "/BI0/SSALES_OFF" "S0000" ON "P0000" . "SALES_OFF" = "S0000" . "SALES_OFF" WHERE "P0000"."OBJVERS" = 'A' AND "S0000"."SID" IN ( SELECT "D"."SID_0SALES_OFF" AS "SID" FROM "/BIC/DZBSDOR058" "D" ) UNION SELECT "S0000"."SID" ,"P0000"."SALES_OFF" FROM "/BI0/PSALES_OFF" "P0000" JOIN "/BI0/SSALES_OFF" "S0000" ON "P0000" . "SALES_OFF" = "S0000" . "SALES_OFF" WHERE "P0000"."OBJVERS" = 'A' AND "S0000"."SID" IN ( SELECT "D"."SID_0SALES_OFF" AS "SID" FROM "/BIC/DZBSDOR081" "D" ) UNION SELECT "S0000"."SID" ,"P0000"."SALES_OFF" FROM "/BI0/PSALES_OFF" "P0000" JOIN "/BI0/SSALES_OFF" "S0000" ON "P0000" . "SALES_OFF" = "S0000" . "SALES_OFF" WHERE "P0000"."OBJVERS" = 'A' AND "S0000"."SID" IN ( SELECT "D"."SID_0SALES_OFF" AS "SID" FROM "/BIC/DZBSDPAY016" "D" ) UNION SELECT "S0000"."SID" ,"P0000"."SALES_OFF" FROM "/BI0/PSALES_OFF" "P0000" JOIN "/BI0/SSALES_OFF" "S0000" ON "P0000" . "SALES_OFF" = "S0000" . "SALES_OFF" WHERE "P0000"."OBJVERS" = 'A' AND "P0000"."SALES_OFF" IN ( SELECT "O"."SALES_OFF" AS "KEY" FROM "/BIC/AZOSDOR0300" "O" ) ) ORDER BY "SALES_OFF" ASC
    I had assumed this had something to do with BOXI, but I don't think this is an MDX-specific problem, even though it's hard to test in RSRT as it's a query navigation. I also assumed it might be something to do with the F4 master data lookup, but that's not the case, because this "fat" query doesn't have a selection screen, just a small initial view and a large number of free characteristics. Still, I changed the characteristic setting so that value lookups are not done against the master data values, and that made no difference. Nonetheless you can see event 6001 "F4: Read Data" in the MDXTEST trace; curiously, it is an extra event that sits between event 40011 "MDX Initialization" and event 40010 "MDX Execution".
    I've tuned this query as much as I can from the Oracle perspective and checked the indexes and statistics. I've also checked that Oracle is properly tuned and parameterized for 10.2.0.4 with the May 2010 patch set for AIX. But this query returns an estimated 56 million rows and runs an expensive UNION over them, so it's no surprise that it's slow. As a point of interest, changing it from UNION to UNION ALL cuts the time to 30 seconds. I don't think that helps me, though, other than confirming that it is the sort which is expensive on 56 million records.
    Thinking that an UNORDER MDX statement might make a difference, I changed the MDX to the following, but that didn't make any difference either.
    SELECT { [Measures].[494GFZKQ2EHOMQEPILFPU9QMV], [Measures].[494GFZSELD3E5CY5OFI24BPCN], [Measures].[494GG07RNAAT6M1203MQOFMS7], [Measures].[494GG0N4P7I87V3YBRRF8JK7R] } ON COLUMNS ,
    NON EMPTY UNORDER( CROSSJOIN(
      UNORDER( CROSSJOIN(
        UNORDER( CROSSJOIN(
          UNORDER( CROSSJOIN(
            UNORDER( CROSSJOIN(
              [0MAT_SALES__ZPRODCAT].[LEVEL01].MEMBERS, EXCEPT(
                { [0MAT_SALES__ZASS_GRP].[LEVEL01].MEMBERS } , { { [0MAT_SALES__ZASS_GRP].[M5], [0MAT_SALES__ZASS_GRP].[M6] } }
              )
            ) ), EXCEPT(
              { [0SALES_OFF].[LEVEL01].MEMBERS } , { { [0SALES_OFF].[#] } }
            )
          ) ), [0SALES_OFF__ZPLNTAREA].[LEVEL01].MEMBERS
        ) ), [0SALES_OFF__ZPLNTREGN].[LEVEL01].MEMBERS
      ) ), [ZMFIFWEEK].[LEVEL01].MEMBERS
    ) )
    DIMENSION PROPERTIES MEMBER_UNIQUE_NAME, MEMBER_NAME, MEMBER_CAPTION ON ROWS FROM [ZMSD01/ZMSD01_QBO_Q0010]
    Does anyone know why BW 7.0 behaves differently in this respect and what I can do to resolve the problem? It is very difficult to make any changes to the universe or BEx query because there are thousands of Webi queries written over the top and the regression test would be very expensive.
    Regards,
    John

    Hi John,
    a couple of comments:
    - first of all, you posted this in the wrong forum; it belongs in the BW forum
    - the MDX enhancements with regard to BusinessObjects are part of BW 7.01 SP05, not just BW 7.0
    I would suggest you post it in the BW forum.
    Ingo

  • Update rule problem - validation of "sales/cost w/ tax" keyfigure

    BW Gurus,
    Hi to all, I have an update rule problem with the "sales/cost w/ tax" key figure. Here is the scenario.
    Our government has made an additional 2% tax mandatory on top of the original 10%. This will affect our sales reports and also the BW "sales/cost w/ tax" key figure.
    My question is: how can I validate the effectivity of the new tax? I have 10% tax on previous sales and 12% on current sales. Which date field can I use to validate this? I am using the /BIC/CS2LIS_13_VDITM structure to get the data I need. (A rough sketch of a date-based derivation follows after the reply below.)
    Thanks in Advance
    Joven

    Hi,
    Until now, how have you been extracting the data for the tax (the original 10%)? Is it available directly in 2LIS_13_VDITM?
    Usually all taxes (condition types) can be extracted with the DataSource 2LIS_13_VDKON. Discuss with the SD team; they can give you the condition types used for the different taxes.
    With rgds,
    Anil Kumar sharma .P
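    If the tax-inclusive key figure is derived in the update rule itself, a date-based derivation could be sketched roughly as below. This is illustrative only: the cut-over date and the field names (0BILL_DATE and 0NETVAL_INV from 2LIS_13_VDITM) are assumptions that must be confirmed functionally, e.g. with the SD team or against 2LIS_13_VDKON as suggested above.
    * Illustrative sketch only - all names and the cut-over date are assumptions.
      CONSTANTS: c_new_tax_from TYPE d VALUE '20060201'.   "assumed effectivity date
      DATA: l_rate TYPE p DECIMALS 2.

      IF COMM_STRUCTURE-bill_date >= c_new_tax_from.
        l_rate = '0.12'.                                   "new 12% rate
      ELSE.
        l_rate = '0.10'.                                   "old 10% rate
      ENDIF.
      RESULT = COMM_STRUCTURE-netval_inv * ( 1 + l_rate ).
      RETURNCODE = 0.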

  • Master data Lookup missing for sold to party

    The master data lookup is missing for the sold-to party.
    The lookup flow is from E_BOM to E_BOM1.
    regards

    Hi,
    Check whether SIDs have been generated for the InfoObject; if not, try running the attribute change run and then load the data to the cube.
    Hope this helps...
    Rgs,
    Ravikanth.

  • Master Data InfoSource (Flexible Update)

    Hello Everyone,
    I have a small doubt regarding a master data InfoSource (flexible update). I see that there are two data targets for this InfoSource: the attribute data target and the text data target.
    My question is: do I have to create two separate InfoPackages to load the two data targets (one each for attributes and texts), or is a single InfoPackage enough for both?
    I created an InfoPackage and chose the attribute data target. When I checked the contents, I saw that even the text data target was populated with data, although I didn't choose it in the InfoPackage. Also, I get multiple entries for each record (a time period 1000-1899 record and a time period 1900-9999 record). How can I get only a single entry?
    Could someone help.
    Thanks.

    Hi Sachin,
    This post should help to clarify why different records exist with different to and from dates:
    Re: Master data
    Hope this helps...

  • Update rule problem for date in Prod

    My scenario is like this:-
    ODS 2 is loaded from ODS 1. In ODS 1 there is a data field calendar day (DATS, time characteristic), and in ODS 2 there is a data field posting date (DATS, characteristic). In the update rule, the posting date is filled from the calendar day by a formula.
    The problem is that the posting date field is updated correctly into ODS 2 in the development box, but it is not updated (blank) in the production box. I can't figure out the cause; hopefully someone can give me some help. Thanks.
    Cheers!
    Cecil

    The only difference is that the development system has this piece of code in the update rules program, while the production system does not. Do I need to compile the update rule formula manually?
    *This ABAP Code was generated automatically          *
    *Formula Calculator                                  *
    *Generated :2008:09:12-10:47
    *User: XXX
    *Calculation:
    result = COMM_STRUCTURE-CALDAY.
      ENDCATCH.
      if sy-subrc <> 0.
        perform error_message using 'RSAU' 'E' '507'
                'ROUTINE_0004' g_s_is-recno
                rs_c_false rs_c_false g_s_is-recno
                changing c_abort.
      endif.
    Cheers!

  • Problem with master data lookup in transformations

    Hi,
    We're experiencing a strange problem in the transformations for an error reporting scenario.
    We have the SAP standard customer (0CUSTOMER) in DSO 2, which derives its value from the navigational attribute of YCUSTNR (0CUSTOMER is a navigational attribute of YCUSTNR) in DSO 1. The problem is that even when there is no entry for YCUSTNR, 0CUSTOMER in DSO 2 still returns a value.
    A referential integrity check is not employed in this scenario. Is it because of this that the missing master data returns some value held in memory? I'm not sure whether it's an issue with the cache. Is there any workaround for solving this problem other than the standard master data lookup?
    Any help in this issue would be appreciated and rewarded.
    Thanks & Regards
    Hari

    Note 1031553 - Reading master data returns value that does not exist
    will solve this problem.
    Regards,
    József.

  • Update rule problem - while data load

    Hi friends,
    I got the following error while doing the initialization for 2LIS_02_SGR:
    "ABORT was set in the customer routine 9998
    Error 1 in the update"
    I searched the forum for this error, and it seems to be related to the start routine in my update rule.
    But I don't know what's wrong with my routine.
    I'm pasting the start routine below; please go through it and give me your suggestions.
    PROGRAM UPDATE_ROUTINE.
    *$*$ begin of global - insert your declaration only below this line  *-*
    * TABLES: ...
    TABLES /bic/AZMM_PUR100 .
    DATA:  T_PUR1 LIKE /bic/AZMM_PUR100 OCCURS 0 WITH HEADER LINE.
    *$*$ end of global - insert your declaration only before this line   *-*
    * The follow definition is new in the BW3.x
    TYPES:
      BEGIN OF DATA_PACKAGE_STRUCTURE.
         INCLUDE STRUCTURE /BIC/CS2LIS_02_SGR.
    TYPES:
         RECNO   LIKE sy-tabix,
      END OF DATA_PACKAGE_STRUCTURE.
    DATA:
      DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
           WITH HEADER LINE
           WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    FORM startup
      TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
               MONITOR_RECNO STRUCTURE RSMONITORS " monitoring with record n
               DATA_PACKAGE STRUCTURE DATA_PACKAGE
      USING    RECORD_ALL LIKE SY-TABIX
               SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
      CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
    *$*$ begin of routine - insert your code only below this line        *-*
    * fill the internal tables "MONITOR" and/or "MONITOR_RECNO",
    * to make monitor entries
    * if abort is not equal zero, the update process will be canceled
      CLEAR: T_PUR1[] ,
             T_PUR1,
             ABORT.
      SELECT * INTO TABLE T_PUR1 FROM /bic/AZMM_PUR100.
      IF SY-SUBRC EQ 0.
        SORT T_PUR1 BY DOC_DATE
                       DOC_ITEM
                        DOC_NUM.
      ELSE.
        MONITOR-msgid = sy-msgid.
        MONITOR-msgty = sy-msgty.
        MONITOR-msgno = sy-msgno.
        MONITOR-msgv1 = sy-msgv1.
        MONITOR-msgv2 = sy-msgv2.
        MONITOR-msgv3 = sy-msgv3.
        MONITOR-msgv4 = sy-msgv4.
        append MONITOR.
    * if abort is not equal zero, the update process will be canceled
             ABORT = 1.
      ENDIF.
       ABORT = 0.
    *$*$ end of routine - insert your code only before this line         *-*
    ENDFORM.
    Thanks & Regards
    Ragu

    thanks gimmo and a.h.p,
    I have made the corrections as you suggested; please verify them below.
    Could you also kindly explain the purpose of this start routine - what exactly does it do? (See the note after the code below.)
    CLEAR: T_PUR1[] ,
             T_PUR1,
             ABORT.
      SELECT * INTO TABLE T_PUR1 FROM /bic/AZMM_PUR100.
      IF SY-SUBRC EQ 0.
        SORT T_PUR1 BY DOC_DATE
                       DOC_ITEM
                        DOC_NUM.
    abort = 0.    " added abort = 0 as per your suggestion
      ELSE.
        MONITOR-msgid = sy-msgid.
        MONITOR-msgty = sy-msgty.
        MONITOR-msgno = sy-msgno.
        MONITOR-msgv1 = sy-msgv1.
        MONITOR-msgv2 = sy-msgv2.
        MONITOR-msgv3 = sy-msgv3.
        MONITOR-msgv4 = sy-msgv4.
        append MONITOR.
    * if abort is not equal zero, the update process will be canceled
             ABORT = 1.
    exit.    " added exit as per your suggestion
      ENDIF.
       ABORT = 0.
    *$*$ end of routine - insert your code only before this line         *-*
    ENDFORM.
    Thanks & Regards
    ragu
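    Regarding what the start routine does: the thread does not show the corresponding update routines, but a start routine like this typically buffers the active table of the DSO (/bic/AZMM_PUR100) in the global table T_PUR1 once per data package, so that the individual update routines can read values from memory instead of selecting from the database for every record. A hedged sketch of how an update routine might then use it (the key fields DOC_NUM/DOC_ITEM and the derived field are assumptions, not taken from the thread):
    * Illustrative only - field names are assumptions based on the SORT above.
      READ TABLE T_PUR1 WITH KEY DOC_NUM  = COMM_STRUCTURE-doc_num
                                 DOC_ITEM = COMM_STRUCTURE-doc_item.
      IF sy-subrc = 0.
        RESULT = T_PUR1-DOC_DATE.       "e.g. derive the document date
      ENDIF.
      RETURNCODE = 0.
    The EXIT after ABORT = 1 matters because otherwise the unconditional ABORT = 0 after the ENDIF would immediately override the abort flag again.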

  • Update Rule problem

    Hi frndz,
    I have a query regarding update rules.
    While defining a routine on a particular key figure, is it required to maintain the sequence of the fields as it is in the cube?
    I am facing a strange problem. I have just upgraded the system to 3.5. I have a master data object Material for which I have written a few update rules on specific fields. Now, when I load the data into the material master data, some of the materials get the value and others don't.
    I think some problem lies in the update rules, but I have checked everything and the update rules look fine.
    Can anyone suggest why it behaves so strangely?

    I have tried that, but I cannot find where the problem is.
    After debugging, the value is correct up to the PSA, but it does not arrive correctly in the data target.
