Manipulation done in the end routine

Hi,
I have a huge volume of records coming from the source, with many duplicate incoming records.
My task is to raise an error message for particular records in the end routine, which I achieved with monitor_rec (RSTMONITOR).
The problem is that the error messages have to be unique, but duplicate records arrive in different data packages.
For example, during the first load the error record (1234) comes in data package 1 along with 50,000 records, and the error message is set for that record. Then the same error record (1234) comes again in data package 2 along with another 50,000 records, and the error message gets duplicated. This is not what the client wants.
To solve the problem, I would have to create a global internal table that exists until all the data packages are processed.
How can this be achieved, and where should I declare it? If I declare it in the start routine, won't I get one internal table per data package rather than a single table for all the records? The second challenge is to identify how many data packages have been generated, so that at the end of the last data package I can sort the internal table, remove duplicate records, and publish the messages in the job log. Any ideas, gurus? Thank you.
Lakshminarasimhan.N

Please make use of the semantic groups setting in the DTP, which ensures that records with the same key combination land in the same data package.
From your example, if you set the field that contains 1234 as a semantic key in your DTP, the DTP load will make sure that all records with value 1234 come in one data package.
Then create an internal table in your end routine and append all error records to it.
At the end of the end routine, sort the internal table, delete adjacent duplicate entries, and then create the error messages.
You can't create a global internal table spanning all data packages: the start and end routines are executed separately for each data package.
--- Thanks..
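For illustration, here is a minimal sketch of such an end routine. The field /BIC/ZERRKEY, the error condition, and the message class ZBW are placeholders, not objects from the original post:
* Collect the keys of error records, de-duplicate them within this
* data package, then raise one monitor message per distinct key.
TYPES ty_errkey TYPE c LENGTH 18.
DATA: lt_errkeys TYPE STANDARD TABLE OF ty_errkey,
      lv_errkey  TYPE ty_errkey,
      ls_monitor TYPE rstmonitor.
FIELD-SYMBOLS <fs_rp> LIKE LINE OF RESULT_PACKAGE.

LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
  IF <fs_rp>-/bic/zerrkey = '1234'.          " your error condition here
    APPEND <fs_rp>-/bic/zerrkey TO lt_errkeys.
  ENDIF.
ENDLOOP.

SORT lt_errkeys.
DELETE ADJACENT DUPLICATES FROM lt_errkeys.

LOOP AT lt_errkeys INTO lv_errkey.
  CLEAR ls_monitor.
  ls_monitor-msgid = 'ZBW'.                  " placeholder message class
  ls_monitor-msgty = 'E'.
  ls_monitor-msgno = '001'.
  ls_monitor-msgv1 = lv_errkey.
  APPEND ls_monitor TO MONITOR.
ENDLOOP.
With the semantic group set on the key field, all duplicates of a key arrive in the same data package, so this per-package de-duplication is enough to keep the messages unique across the whole request.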

Similar Messages

  • Delivery is to be done at the end user location not at the storage location

Hi Gurus,
I have a problem. I am working in the construction industry. I have a storage location, say "X". In case of an emergency need for material by an end user at a different location, the delivery has to be made at the end user's location, not at the storage location. Where do I enter the end user's address? The delivery address in the PO item details will be the plant address. Please help.
regards
chandrasekhar

As I understand from your description of the issue, you don't want to receive the material at the storage location of your main plant, but want the goods delivered directly to the end user from the vendor. This is a typical third-party PO scenario, so kindly raise a third-party PO to the vendor by changing the item category. The delivery address populated in your PO will then come directly from the ship-to address of your sales order, rather than the storage location you would have specified in your material master.

  • Updating Cube in the End Routine based on the incoming Valid From Date

    Hello Experts
    Here is my scenario.
We are using BW 7.x. There is a DSO, and from the DSO the promotions information goes to the cube. These data targets have Material (Article), Plant (Site), Promotion Number, Valid From and Valid To dates. The Valid To date always comes in as 12/31/9999.
If I receive a new record from ECC with the same Material, Plant and Promotion combination, I need to take the Valid From date of the new record and update the existing record's Valid To date in the DSO/cube to the new record's Valid From date minus 1.
So: Valid To date of the existing record = new record's Valid From date minus 1.
    I would like to do the update in "End Routine" of the Cube.
    Would you please suggest if this can be done and if so would you please provide the sample code.
    THANK YOU in Advance.
    Nag.

Yes, you can do this.
Just use RESULT_PACKAGE in the end routine: fetch the existing records from the DSO, apply your desired changes, and append them to RESULT_PACKAGE.
Try to write the code using this logic; if you can't, I will provide the code.
    Regards,
    Ashish
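Purely as a sketch of that suggestion: close the open validity interval of the matching DSO record inside the end routine. The active-table name /BIC/AZPROMO00 and all field names below are assumptions for illustration, not the poster's actual objects:
* Valid To (existing record) = Valid From (new record) - 1.
DATA: lt_closed LIKE RESULT_PACKAGE,
      ls_new    LIKE LINE OF RESULT_PACKAGE,
      ls_old    LIKE LINE OF RESULT_PACKAGE,
      ls_active TYPE /bic/azpromo00.          " assumed DSO active table

LOOP AT RESULT_PACKAGE INTO ls_new.
* Find the record whose interval is still open (Valid To = 12/31/9999).
  SELECT SINGLE * FROM /bic/azpromo00 INTO ls_active
    WHERE material = ls_new-material
      AND plant    = ls_new-plant
      AND promo_no = ls_new-promo_no
      AND dateto   = '99991231'.
  IF sy-subrc = 0 AND ls_active-datefrom <> ls_new-datefrom.
    MOVE-CORRESPONDING ls_active TO ls_old.
    ls_old-dateto = ls_new-datefrom - 1.      " date arithmetic on TYPE d
    APPEND ls_old TO lt_closed.
  ENDIF.
ENDLOOP.

* The closed-out records share their DSO key with the existing entries,
* so activation overwrites the old Valid To; a cube target would need a
* reversal record instead.
APPEND LINES OF lt_closed TO RESULT_PACKAGE.
For large packages, a single FOR ALL ENTRIES pre-read (as in the performance thread further down) is preferable to the per-record SELECT SINGLE.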

  • Using the end routine to populate the Cubes

    Hi BI Gurus,
    I am having following requirement:
DSO ZODS1 --> This DSO gets all the raw data from the source system.
DSO ZODS4 --> updates ZCUBE4.
DSO ZODS5 --> updates ZCUBE5.
There is no data flow between ZODS1 and ZODS4, nor between ZODS1 and ZODS5.
I have added the same new fields to ZODS1, ZCUBE4 and ZCUBE5.
So I want to populate those fields of ZCUBE4 and ZCUBE5. What is the best possible way to do that without changing the ZODS4 and ZODS5 structures?
I am thinking of writing an end routine. Any idea if that is possible?
Also, if possible, could somebody share sample code?
    Please help.

Hi,
You can populate your new fields easily through an end routine in the transformation from ZODS4 to ZCUBE4, provided ZODS4 has a sufficient key, and that the same key exists in ZODS1, so you can look up the newly added fields for the cube.
Please give your fields and some sample data; then it will be clearer.
But it can be achieved if you have the necessary key fields in both DSOs.
Regards
vamsi
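A minimal sketch of such a lookup in the ZODS4-to-ZCUBE4 end routine. The active-table name /BIC/AZODS100, the shared key DOC_NUMBER, and the new field /BIC/ZNEW1 are assumptions for illustration:
* Pre-read the new field from ZODS1's active table for the whole
* package, then fill it into the records headed for ZCUBE4.
TYPES: BEGIN OF ty_lookup,
         doc_number TYPE /bic/azods100-doc_number,
         znew1      TYPE /bic/azods100-/bic/znew1,
       END OF ty_lookup.
DATA: lt_lookup TYPE STANDARD TABLE OF ty_lookup,
      ls_lookup TYPE ty_lookup.
FIELD-SYMBOLS <fs_rp> LIKE LINE OF RESULT_PACKAGE.

IF RESULT_PACKAGE IS NOT INITIAL.   " FOR ALL ENTRIES needs a non-empty driver
  SELECT doc_number /bic/znew1
    FROM /bic/azods100
    INTO TABLE lt_lookup
    FOR ALL ENTRIES IN RESULT_PACKAGE
    WHERE doc_number = RESULT_PACKAGE-doc_number.
  SORT lt_lookup BY doc_number.
ENDIF.

LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
  READ TABLE lt_lookup INTO ls_lookup
    WITH KEY doc_number = <fs_rp>-doc_number
    BINARY SEARCH.
  IF sy-subrc = 0.
    <fs_rp>-/bic/znew1 = ls_lookup-znew1.
  ENDIF.
ENDLOOP.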

• iDVD doesn't respond after it says "Done" at the end of burning

Firstly, I have burned many, many, MANY DVDs before in iDVD (for school camps, special occasions, etc.) and have met this problem before, but in a different way. Those times I was burning about 20 copies, and after about 7 or 8 it would show this problem. My solution was to press cancel, then start burning again IMMEDIATELY, and it would work again.
The problem occurs when burning a DVD (it's a project on Apple as a company; kind of ironic, because it's all about how fantastic they are, and then this happens). What happens is that iDVD processes the movies, menus, slideshows, etc. and then gets to the "Burn" part of the burning. Here the progress bar slowly moves forward, and after about 5 minutes of burning (very normal for my Mac) the disc ejects; BUT the menu in iDVD stays on the "Done" screen forever (at the end of the burning, the remaining time reads Done). I checked on my other computer and again on my Mac, and the disc is positively blank (the drive also didn't make as much noise as it usually does when burning). This may have something to do with the fact that this is the first time I have exported a Keynote to a QuickTime movie and then put it into iDVD (I also put some "Get a Mac" ads off the Apple website on it). I asked a good friend of mine about this, and he suggested that I put the ads into iMovie '09 and then export to iDVD, just so that they are all definitely the same format, but it didn't work AGAIN! So I put the ads into Keynote, deleted them from iDVD, re-exported them, and failed to burn AGAIN X 2! HELP!!

Update: The problem still occurs, but I had the project due today, so I put the QuickTime of the Keynote on a flash drive and ran it off my school computer (not a Mac... unfortunately).

• Update takes long and it's never done at the end of each day.

I also downloaded Adobe Reader on several occasions thinking this would help, but the installation never completes. It keeps saying there's a problem that needs to be reported to Microsoft. Please help.

    What version of Reader are you trying to update?
    If Reader XI, download the update from http://www.adobe.com/support/downloads/detail.jsp?ftpID=5715

  • End Routine - Modify a record in the cube

    Hello Guys,
In the end routine I have to update a field. The transformation is from the cube to the same cube. I want the record to be modified, i.e. a value added to one field in the end routine. Since the target is a cube, this creates a new record instead of overwriting the existing one. Is there any way I can modify the record? I know I can zero out the existing record and then create a new record with the new value; is there any other solution?
For example, this is the existing record in the cube:
Sales Order     Item No     Backlog Amount     Indicator
1000            10          1000.00
After applying the end routine it has 2 records (I modify the record by setting the indicator value):
Sales Order     Item No     Backlog Amount     Indicator
1000            10          1000.00
1000            10          1000.00            REV
After applying the end routine I need the record to be overwritten (similar to a DSO):
Sales Order     Item No     Backlog Amount     Indicator
1000            10          1000.00            REV
How can I achieve this result in the end routine?
    Thanks
    Senthil

Hi there,
Since you create new records in the end routine of the InfoCube transformation, why not delete the old ones?
You can use
DELETE RESULT_PACKAGE WHERE ...
to remove the old records after inserting the new ones.
    Diogo.
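Two minimal sketches of this idea, assuming the indicator lives in a field /BIC/ZINDICAT (a placeholder name):
* Option 1: after appending the modified (REV) copies, drop the
* unmodified originals so only one record per key reaches the cube.
DELETE RESULT_PACKAGE WHERE /bic/zindicat IS INITIAL.

* Option 2: modify the records in place instead of appending copies,
* so the duplicate never arises in the first place.
FIELD-SYMBOLS <fs_rp> LIKE LINE OF RESULT_PACKAGE.
LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>
    WHERE /bic/zindicat IS INITIAL.
  <fs_rp>-/bic/zindicat = 'REV'.
ENDLOOP.
Option 1 only works if the DELETE runs after the new records have been appended, as described above.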

  • What is the last question at the end of auto-updates? "Allow" what? (I clicked on "Done" too fast.)

Hi, I installed an auto-update today and lazily clicked "Done" at the end without taking the time to read the last pop-up dialog box. I saw options about "allowing" something, but I did not get a chance to read the question before it disappeared. What was that last question?

Replying to JS1111:
Now I only get one HKEY_LOCAL_MACHINE\Software\
QuicktimePlayerLib.QuicktimePlayerApp\CLSID.
Some folks have been having some success with pgfpdwife's technique in the following post:
pgfpdwife: Re: Could not open key HKEY_LOCAL_MACHINE\Software\Classic\Quicktime.Quicktime\
Note carefully that the technique involves a registry edit. Be sure to make a backup of any keys you edit. If you're unfamiliar with the registry or registry editing, go to your XP Help and Support, do a search on "registry", and read through the articles that come up.
There are also some instructions on how to back up registry keys in the following document:
Error 1406 or 1402 appears when you install iTunes or QuickTime for Windows

  • END ROUTINE clarification

Hi,
I have seen the following text in the help:
"Only fields that have a rule in the transformation are transferred from the end routine"
http://help.sap.com/saphelp_nw70/helpdata/en/20/a894ed07e75648ba5cf7c876430589/frameset.htm
What does it mean? For example, if I populate an InfoObject in the target structure that has no source field at all, but is instead read from master data tables in the end routine, which source field should I map, since there is no specific one?
I have actually written the end routine, and its syntax is fine in the development system, but the transport fails in quality with a syntax error in the end routine. Currently I have not mapped any source field to the target field that I populate in the end routine, so I suspect this is the cause, or perhaps something else.
If anyone has an idea on this, please reply as soon as possible.
    thanks in advance
    BRK
    Edited by: BRK on Jul 22, 2008 10:40 AM

Hi Banu,
Do all the fields contain special characters? Include only those fields which may contain special characters.
Structure of the end routine: take one reference field, abc, containing all the special characters, like & * ^ %, and compare each candidate field against it:
loop at RESULT_PACKAGE assigning <result_fields>.
* compare each field which might contain special characters
* with the reference field abc defined above
  if not <result_fields>-/BIC/ZLOCATION co abc.
*   apply your logic, then check the next field in the same way
  endif.
endloop.
Include all such fields in the check above.
    Hope this is helpful.
    Thanks,
    Saveen

  • End routine field not populated

    Hi,
I have written the following end routine in order to populate the field YNEGOCIO with two characters.
I'm uploading data from DSO 0FIAR_O03 to a customized DSO. When I activate this DSO I don't see the field YNEGOCIO populated, but the strange thing is that when I debug the end routine, RESULT_PACKAGE-YNEGOCIO at the end of the routine IS POPULATED with the correct values.
    Can anybody help me with this?
LOOP AT RESULT_PACKAGE INTO e_s_result.
* Recover characters 14,15 for YNEGOCIO.
  CLEAR lv_negocio.
  SELECT SINGLE /BIC/YYKEY
    FROM /BIC/AYSDLASPV00
    INTO lv_key
    WHERE /BIC/YYVALUE EQ e_s_result-GL_ACCOUNT.
  IF sy-subrc EQ 0.
    MOVE lv_key+13(2) TO lv_negocio.
  ENDIF.
  LOOP AT gt_inv_gl9_doc INTO gs_inv_gl8_doc
      WHERE ac_doc_no = e_s_result-ac_doc_no.
* Calculate Importe Aplicado
    MOVE e_s_result TO aux_s_result.
    aux_s_result-record = v_count + 1.
    IF aux_s_result-DEB_CRE_DC IS NOT INITIAL.
      aux_s_result-PROFIT_CTR    = gs_inv_gl8_doc-profit_ctr.
      aux_s_result-USERNAME      = gs_inv_gl8_doc-USERNAME.
      aux_s_result-deb_cre_dc    = gs_inv_gl8_doc-DEB_CRE_DC.
      aux_s_result-/BIC/YI_WRBTR = gs_inv_gl8_doc-DEB_CRE_DC *
        aux_s_result-/BIC/YI_WRBTR / aux_s_result-DEB_CRE_DC.
      aux_s_result-/BIC/YNEGOCIO = lv_negocio.
      APPEND aux_s_result TO e_t_result.
    ENDIF.
  ENDLOOP.
ENDLOOP.
REFRESH RESULT_PACKAGE.
MOVE e_t_result[] TO RESULT_PACKAGE[].
    Regards,
    Diego

Hi,
Check your changelog table and look at the data there. This situation generally happens with a delta load: two images may be created that cancel each other out when the request gets activated in the DSO. Also check the 0RECORDMODE value.
Hope it helps.
regards
laksh

  • End Routine Issue - It does not move data from E_T_RESULT to RESULT_PACKAGE

    Hi,
I am facing an issue with an end routine. I have gone through the previous posts on how to write an end routine, and wrote mine accordingly.
Here is my scenario:
I have 0CUST_SALES master data, which has the Sales Org, Distribution Channel, Division, Sold-to Party, Sales Group and Sales District.
I get Sold-to Party and Distribution Channel from a field routine.
Using Sold-to Party, Distribution Channel and Division = '01' (whatever I populated using the field routine), I am trying to get the Sales Org, Sales Group and Sales District in the end routine.
All the code I wrote seems correct, but it does not populate any values into RESULT_PACKAGE.
Here is the code I wrote in the end routine. I am not sure what's wrong with it. I used this link to write the routine:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/203eb778-461d-2c10-60b3-8a94ee91cbfc&overridelayout=true
Global declaration:
  DATA: BEGIN OF IT_CUST_SALES,
          DIV       TYPE /bi0/pcust_sales-DIVISION,
          DIST_CH   TYPE /bi0/pcust_sales-DISTR_CHAN,
          SALES_ORG TYPE /bi0/pcust_sales-SALESORG,
          CUST_SAL  TYPE /bi0/pcust_sales-CUST_SALES,
          SALESDIST TYPE /bi0/pcust_sales-SALES_DIST,
          SALESGRP  TYPE /bi0/pcust_sales-SALES_GRP,
        END OF IT_CUST_SALES.
  DATA: T_CUST_SALES LIKE TABLE OF IT_CUST_SALES.
Start of end routine:
  SELECT DIVISION DISTR_CHAN SALESORG CUST_SALES SALES_DIST SALES_GRP
    FROM /bi0/pcust_sales INTO TABLE T_CUST_SALES
    FOR ALL ENTRIES IN RESULT_PACKAGE
    WHERE CUST_SALES = RESULT_PACKAGE-SOLD_TO
      AND DISTR_CHAN = RESULT_PACKAGE-DISTR_CHAN
      AND DIVISION   = '01'.
  LOOP AT RESULT_PACKAGE INTO e_s_result.
    READ TABLE T_CUST_SALES INTO IT_CUST_SALES
      WITH KEY CUST_SAL = e_s_result-SOLD_TO
               DIST_CH  = e_s_result-DISTR_CHAN
               DIV      = '01'.
    IF SY-SUBRC EQ 0.
      MOVE IT_CUST_SALES-SALES_ORG TO E_S_RESULT-SALESORG.
      MOVE IT_CUST_SALES-SALESDIST TO E_S_RESULT-SALES_DIST.
      MOVE IT_CUST_SALES-SALESGRP  TO E_S_RESULT-SALES_GRP.
      APPEND E_S_RESULT TO E_T_RESULT.
    ENDIF.
  ENDLOOP.
  REFRESH RESULT_PACKAGE.
  MOVE E_T_RESULT[] TO RESULT_PACKAGE[].
End of end routine.
    Data comes into E_T_RESULT but it does not move to RESULT_PACKAGE. Any inputs will be helpful.
    Regards,
    Kumar

Hi Hegde,
The declaration is the same; it's like this:
  DATA: e_s_result TYPE tys_TG_1.
  DATA: e_t_result TYPE tyt_TG_1.
I don't know why: when I inserted the code in this post it initially looked OK, but once posted it wasn't that readable.
FYI, I am pasting the code again; let's see if it works.
  SELECT DIVISION DISTR_CHAN SALESORG CUST_SALES SALES_DIST SALES_GRP
    FROM /bi0/pcust_sales INTO TABLE T_CUST_SALES
    FOR ALL ENTRIES IN RESULT_PACKAGE
    WHERE CUST_SALES = RESULT_PACKAGE-SOLD_TO
      AND DISTR_CHAN = RESULT_PACKAGE-DISTR_CHAN
      AND DIVISION   = '01'.
  LOOP AT RESULT_PACKAGE INTO e_s_result.
    READ TABLE T_CUST_SALES INTO IT_CUST_SALES
      WITH KEY CUST_SAL = e_s_result-SOLD_TO
               DIST_CH  = e_s_result-DISTR_CHAN
               DIV      = '01'.
    IF SY-SUBRC EQ 0.
      MOVE IT_CUST_SALES-SALES_ORG TO E_S_RESULT-SALESORG.
      MOVE IT_CUST_SALES-SALESDIST TO E_S_RESULT-SALES_DIST.
      MOVE IT_CUST_SALES-SALESGRP  TO E_S_RESULT-SALES_GRP.
      APPEND E_S_RESULT TO E_T_RESULT.
    ENDIF.
  ENDLOOP.
  REFRESH RESULT_PACKAGE.
  MOVE E_T_RESULT[] TO RESULT_PACKAGE[].
    Regards,
    Kumar
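As an aside, the copy through E_S_RESULT and E_T_RESULT can be avoided entirely by changing RESULT_PACKAGE in place through a field symbol. A sketch of that alternative, reusing the declarations from the post above:
FIELD-SYMBOLS <fs_rp> LIKE LINE OF RESULT_PACKAGE.

LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
  READ TABLE T_CUST_SALES INTO IT_CUST_SALES
    WITH KEY CUST_SAL = <fs_rp>-SOLD_TO
             DIST_CH  = <fs_rp>-DISTR_CHAN
             DIV      = '01'.
  IF sy-subrc = 0.
*   Changes through the field symbol take effect directly in
*   RESULT_PACKAGE, so no REFRESH/MOVE step is needed.
    <fs_rp>-SALESORG   = IT_CUST_SALES-SALES_ORG.
    <fs_rp>-SALES_DIST = IT_CUST_SALES-SALESDIST.
    <fs_rp>-SALES_GRP  = IT_CUST_SALES-SALESGRP.
  ENDIF.
ENDLOOP.
Note one behavioral difference: unlike the original code, this keeps records that have no master data match instead of dropping them.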

  • SOURCE_PACKAGE and DATA_PACKAGE are incompatible in the start routine.

    Hi All,
I have seen similar posts on the forums, but none relate to my issue. Here is what I am seeing.
I have installed a cube, a DSO and the transformations between the cube and the DSO from Business Content. The transformations cannot be activated, as there is the following error in the end routine:
E: In PERFORM or CALL FUNCTION "ROUTINE_9998", the actual parameter "SOURCE_PACKAGE" is incompatible with the formal parameter "DATA_PACKAGE".
This is a fresh installation and I have not made any changes. The delivered version code (only the relevant portions) is as follows.
Note: I have installed the transformations (TRFN) from Business Content and have not migrated the update rules (UPDR) to transformations, so this is an out-of-the-box issue.
METHODS start_routine
  IMPORTING
    request        TYPE rsrequest
    datapackid     TYPE rsdatapid
  EXPORTING
    monitor        TYPE rstr_ty_t_monitors
  CHANGING
    SOURCE_PACKAGE TYPE tyt_SC_1
  RAISING
    cx_rsrout_abort.

FORM routine_9998
  TABLES DATA_PACKAGE TYPE tyt_SC_1_full
  TABLES DATA_PACKAGE TYPE tyt_SC_1
  CHANGING
    ABORT LIKE sy-subrc
  RAISING
    cx_sy_arithmetic_error
    cx_sy_conversion_error.

* Migrated update rule call
PERFORM routine_9998
  TABLES
    SOURCE_PACKAGE
  CHANGING
    l_abort.
    Any help will be appreciated.
    Thanks.

Hi,
Just try this code. The only change is in the FORM: the second TABLES parameter is renamed from DATA_PACKAGE to SOURCE_PACKAGE, so the FORM no longer declares two parameters with the same name and matches the actual parameter used in the PERFORM call:
METHODS start_routine
  IMPORTING
    request        TYPE rsrequest
    datapackid     TYPE rsdatapid
  EXPORTING
    monitor        TYPE rstr_ty_t_monitors
  CHANGING
    SOURCE_PACKAGE TYPE tyt_SC_1
  RAISING
    cx_rsrout_abort.

FORM routine_9998
  TABLES DATA_PACKAGE TYPE tyt_SC_1_full
  TABLES SOURCE_PACKAGE TYPE tyt_SC_1
  CHANGING
    ABORT LIKE sy-subrc
  RAISING
    cx_sy_arithmetic_error
    cx_sy_conversion_error.

* Migrated update rule call
PERFORM routine_9998
  TABLES
    SOURCE_PACKAGE
  CHANGING
    l_abort.

  • Performance: reading huge amount of master data in end routine

In our 7.0 system, a full load runs each day from DSO X to DSO Y, in which master data from six characteristics of DSO X is read into about 15 fields of DSO Y. DSO Y contains about 2 mln. records, which are all transferred each day. The master data tables each contain between 2 mln. and 4 mln. records. Before this load starts, DSO Y is emptied. DSO Y is write-optimized.
At first we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups. We redesigned and now fill all master data attributes in the end routine, after filling internal tables with the master data values corresponding to the data package:
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE ucpremise EQ RESULT_PACKAGE-ucpremise.
And when we loop over the data package, we write something like:
        LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
          READ TABLE lt_0ucpremise INTO ls_0ucpremise
            WITH KEY ucpremise = <fs_rp>-ucpremise
            BINARY SEARCH.
          IF sy-subrc EQ 0.
            <fs_rp>-ucpremisty = ls_0ucpremise-ucpremisty.
            <fs_rp>-ucdele_ind = ls_0ucpremise-ucdele_ind.
          ENDIF.
    *all other MD reads
    ENDLOOP.
So the above statement is repeated for all the master data we need to read. This method is quite a bit faster (1,5 hr), but we want to make it faster still. We noticed that reading the master data into the internal tables still takes a long time, and this has to be repeated for each data package. We want to change this. We have now tried a similar method, but load all master data into the internal tables without filtering on the data package, and do this only once.
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise.
So when the first data package starts, it fills all the master data values, 95% of which we would need anyway. So that the following data packages can use the same tables and don't need to fill them again, we placed the definition of the internal tables in the global part of the end routine. In the global part we also write:
    DATA: lv_data_loaded TYPE C LENGTH 1.
    And in the method we write:
IF lv_data_loaded IS INITIAL.
  lv_data_loaded = 'X'.     " fill in progress
* load all internal tables
  lv_data_loaded = 'Y'.     " fill complete
ENDIF.
WHILE lv_data_loaded NE 'Y'.
* wait until the first data package has filled the tables
  CALL FUNCTION 'ENQUEUE_SLEEP'
    EXPORTING
      seconds = 1.
ENDWHILE.
LOOP AT RESULT_PACKAGE.
* assign all data
ENDLOOP.
This makes sure that another data package that has already started "sleeps" until the first data package is done filling the internal tables.
Well, this all seems to work: it now takes 10 minutes to load everything into DSO Y. But I'm wondering if I'm missing anything. The system seems to handle loading all these records into internal tables just fine, but any improvements or critical remarks are very welcome.

    This is a great question, and you've clearly done a good job of investigating this, but there are some additional things you should look at and perhaps a few things you have missed.
    Zephania Wilder wrote:
    At first, we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups.
This is not accurate. After SP14, BW does a prefetch and buffers the master data values used in the lookup. Note [1092539|https://service.sap.com/sap/support/notes/1092539] discusses this in detail. The important thing, and most likely the reason you are seeing individual master data lookups on the DB, is that you must manually maintain the MD_LOOKUP_MAX_BUFFER_SIZE parameter to be larger than the number of lines of master data (from all characteristics used in lookups) that will be read. If you are seeing one select statement per line, then something is going wrong.
    You might want to go back and test with master data lookups using this setting and see how fast it goes. If memory serves, the BW master data lookup uses an approach very similar to your second example (1,5 hrs), though I think that it first loops through the source package and extracts the lists of required master data keys, which is probably faster than your statement "FOR ALL ENTRIES IN RESULT_PACKAGE" if RESULT_PACKAGE contains very many duplicate keys.
    I'm guessing you'll get down to at least the 1,5 hrs that you saw in your second example, but it is possible that it will get down quite a bit further.
    Zephania Wilder wrote:
    This makes sure that another data package that already started, "sleeps" until the first data package is done with filling the internal tables.
This sleeping approach is not necessary as only one data package will be running at a time in any given process. I believe that the "global" internal table is not shared between parallel processes, so if your DTP is running with three parallel processes, then this table will just get filled three times. Within a process, all data packages are processed serially, so all you need to do is check whether or not it has already been filled. Or are you doing something additional to export the filled lookup table into a shared memory location?
    Actually, you have your global data defined with the statement "DATA: lv_data_loaded TYPE C LENGTH 1.". I'm not completely sure, but I don't think that this data will persist from one data package to the next. Data defined in the global section using "DATA" is global to the package start, end, and field routines, but I believe it is discarded between packages. I think you need to use "CLASS-DATA: lv_data_loaded TYPE C LENGTH 1." to get the variables to persist between packages. Have you checked in the debugger that you are really only filling the table once per request and not once per package in your current setup? << This is incorrect - see next posting for correction.
    Otherwise the third approach is fine as long as you are comfortable managing your process memory allocations and you know the maximum size that your master data tables can have. On the other hand, if your master data tables grow regularly, then you are eventually going to run out of memory and start seeing dumps.
    Hopefully that helps out a little bit. This was a great question. If I'm off-base with my assumptions above and you can provide more information, I would be really interested in looking at it further.
    Edited by: Ethan Jewett on Feb 13, 2011 1:47 PM
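For reference, a sketch of the fill-once pattern discussed here, using the 0UCPREMISE example from the question. Whether a plain DATA flag really persists from one package to the next is exactly the point corrected in the follow-up posting, so treat the guard as illustrative:
* Global part of the end routine:
DATA: gt_0ucpremise TYPE STANDARD TABLE OF /bi0/pucpremise,
      gv_md_loaded  TYPE c LENGTH 1.

* Inside the end routine method:
IF gv_md_loaded IS INITIAL.
* Fill the buffer once; later packages in the same process reuse it.
  SELECT * FROM /bi0/pucpremise
    INTO TABLE gt_0ucpremise
    WHERE objvers = 'A'.             " active master data only
  SORT gt_0ucpremise BY ucpremise.
  gv_md_loaded = 'X'.
ENDIF.

DATA ls_0ucpremise TYPE /bi0/pucpremise.
FIELD-SYMBOLS <fs_rp> LIKE LINE OF RESULT_PACKAGE.
LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
  READ TABLE gt_0ucpremise INTO ls_0ucpremise
    WITH KEY ucpremise = <fs_rp>-ucpremise
    BINARY SEARCH.
  IF sy-subrc = 0.
    <fs_rp>-ucpremisty = ls_0ucpremise-ucpremisty.
    <fs_rp>-ucdele_ind = ls_0ucpremise-ucdele_ind.
  ENDIF.
ENDLOOP.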

  • End Routine in virtual infoprovider based on DTP

    Hi gurus!!
I'm facing the following problem. I need a VirtualProvider based on a DTP, based on a DataSource over a view on a table, but I also want an end routine in the transformation rules. When I use the end routine, my filters don't work (I always get all the information). If I don't use the end routine (or start routine), the filters work perfectly.
    Any suggestions for this problems.
    Thanks a lot!
    Gorka Ibor.

Hi,
First, try to debug the routines:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b0038ad7-a0c7-2c10-cdbc-dd674682c8e7?QuickLink=index&overridelayout=true
If you haven't found any bug there, check these links, which should help:
Can I call values entered in DTP filter in transformation End Routine??
Routines in DTP filters

  • Start And End Routine !

Hi All,
In BI 7 we have a start routine and an end routine. In the end routine we delete all the transformed records that need not be updated to the InfoCube, which we could also do in the start routine. In that case, why do we need an end routine?
Regards.

Hi,
1. Start routine: whenever you create a start routine, the system automatically gives you a predefined data declaration, a structure of the type of your source. In the start routine you can add local data declarations as well as global ones. An example:
Suppose you are creating a start routine for a transformation from a DataSource to a DSO. When you create the start routine, the system provides you with a structure of your DataSource. Using this structure it also defines an internal table, which is used as the changing parameter of a method of a class. It is this changing parameter where you do all your manipulation; it acts as both an importing and an exporting parameter, and it contains the data package by package. A skeleton of the generated routine is sketched below.
In the start routine you can also read a database table into an internal table, and that internal table can then be read in the field-level routines. In this case it acts as a substitute for master data lookups in the field-level transformation.
2. End routine: the end routine is the same as the start routine, the only difference being that the system provides you with a structure of the target, not the source. The rest is the same as the start routine.
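For orientation, the generated start routine skeleton in BW 7.x looks roughly like this (type names such as tyt_SC_1 are generated per transformation; compare the delivered code quoted in the SOURCE_PACKAGE thread above):
METHODS start_routine
  IMPORTING
    request        TYPE rsrequest            " load request ID
    datapackid     TYPE rsdatapid            " number of the current package
  EXPORTING
    monitor        TYPE rstr_ty_t_monitors   " messages for the DTP monitor
  CHANGING
    SOURCE_PACKAGE TYPE tyt_SC_1             " this package's source records
  RAISING
    cx_rsrout_abort.

* Typical start routine use: drop records early so they are never
* transformed at all (ZSTATUS is a placeholder field):
*   DELETE SOURCE_PACKAGE WHERE /bic/zstatus = 'D'.
The end routine has the same shape, but its changing parameter is RESULT_PACKAGE, typed on the target structure. That is why deletions or lookups that depend on the fully transformed record belong in the end routine rather than the start routine.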
    assign points if it helps
    Thanks & Regards
    santo
