DTP Query

Hello, I have a question regarding the DTP option "Only retrieve last request" in a 7.3 system.
I have this checked and it used to work fine, but all of a sudden it stopped working. I had to re-init my DataSource, and the re-init brought back 20,000 records. Then I kicked off this DTP, which loaded the 20,000 records into my DSO, which is fine. I put my delta package in a process chain, and it has been bringing 0 records into the PSA. But when this DTP gets kicked off, it keeps loading the same 20,000 records again and again. Does anybody know what I may have done to disturb this? It used to work fine and only extract the last request into my DSO, but no more.
Please help.
Thanks.

Hi,
Is your DTP a delta DTP?
If yes, then after re-initializing your DataSource/InfoPackage, you need to re-initialize your DTP as well;
otherwise it won't work.
Re-init your DTP and check it.
Thanks

Similar Messages

  • SAP BI - OLAP variable in DTP unable to read from input ready variable in query

    Hello,
    Here's the problem:
    We have a process chain which is launched in a WAD.
    We would like to filter the DTPs in the process chain with the values entered on the variable screen of the WAD.
    I created a customer exit variable X which reads the input-ready variable Y in CMOD.
    Customer exit variable X was placed in the filter of the DTP.
    The input-ready variable Y is in the query contained in the WAD.
    I am getting an error that customer exit variable X is getting no value.
    Is it possible to read an input-ready variable in a customer exit variable placed in a DTP?
    Or am I missing a step here?
    Please help.
    Thanks in advance

    Hi Anjalee,
    Maybe it would be good to share a bit more information on your scenario and the chosen solution design. I am currently missing the point of triggering a process chain from Web Application Designer.
    Anyway, I recommend being careful with using OLAP variables in DTP filters. Not all functionality is present there, e.g. because the variable pop-up is absent. Your constellation of an input-ready variable and a customer-exit variable will only work in the context of a BEx query, and probably not when used in a DTP filter.
    Best regards,
    Sander

  • Query on DTP error

    Hi BW Experts,
    I am loading data to a master data InfoObject from ECC 6.0. The data has come into the PSA correctly. While running the DTP to move the data from the PSA to the master data InfoObject (ZZEMPLOYEE), some records show an error.
    The error is: Data record 2 ('00050008'): version 'HYD' is not valid.
    The same error comes up for another 10 records.
    What do I have to do now? Could someone please help me out?
    Thanks

    Hi Vishwaanand,
    The data types are the same for both (ECC 6.0 field and InfoObject in BI 7.0), i.e. CHAR.
    In this scenario, 10 fields were added as an enhancement to 0EMPLOYEE_ATTR in ECC 6.0. Of these 10 fields, two come from table T500P (NAME1, Transfer from Circle), two from table T001P (BTEXT, Transfer from Location), two from table PA0000 (BEGDA), two from PA0023 (ARBGB, the difference between BEGDA and ENDDA), etc.
    I checked the data types for those fields in ECC 6.0 and created corresponding InfoObjects prefixed with Z. I mapped these InfoObjects from 0EMPLOYEE_ATTR to YEMPLOYEE (the target master data InfoObject).
    The data flow in 7.0 is 0EMPLOYEE_ATTR (3.x) -> InfoSource -> YEMPLOYEE.
    I replicated and ran the InfoPackage, and the data arrived correctly in the PSA (362 records).
    While running the DTP, 29 of the 362 records are in error, with messages such as:
    YTRTOLOC (Transfer to Location): Data record 6 ("00050008"): Version 'chennai' is not valid
    YTRTOCIR (Transfer to Circle): Data record 6 ("00050008"): Version 'Corporate' is not valid
    YTRFRLOC (Transfer from Location): Data record 6 ("00050008"): Version 'Chennai' is not valid
    YTRFRCIR (Transfer from Circle): Data record 6 ("00050008"): Version 'Corporate' is not valid
    How can I correct those records?
    Could you please help me out?
    Could you please help me out?

  • A keyfigure is not getting displayed in the DSO and query

    Hi friends,
    I have newly developed a DSO with 11 key figures and some 10 characteristics. I created the DTP, transformations, etc., loaded data into it, and activated it successfully.
    Now, when I choose to display the data of this DSO, one of my key figures is not displayed.
    The same key figure does not appear in the query either.
    But when I check the active table of this DSO in SE11, that key figure is displayed with values.
    Could anyone help me through this issue?

    Hi
    Even I faced such an issue earlier. I resolved it simply by readjusting the DSO, i.e. deleting the key figure and adding it to the structure once again; before this, you have to delete the data in the DSO. Also, if you have a MultiProvider on the DSO, make sure that the key figure concerned is identified.
    Let us know if this works for you. Thanks.

  • Force read/refresh of master data attributes on query

    Hi there,
    We're having trouble with an input-ready query that changes an attribute value (KYF) of one characteristic. That works fine, and we can save the changed data to the InfoProvider via DTP. The problem is that once the data is saved, we need to refresh the query, as we have both values on screen (the original value as a characteristic attribute on the rows, and the new value as the input-ready KYF), so after saving we would like to see that both values are the same.
    Is there any way of forcing the query not to read from the cache, since we are changing master data attributes? I read something about the IF_RSMD_RS_ACCESS class that can be implemented for master data access to force it there, but that sounds really hard, so if this is the way, could some of you give us some help?
    I hope I have made myself clear in the explanation...
    Thanks in advance,
    Regards
    Carlos

    Dear All,
    My recent attempts at changing master data attributes through BPS didn't work out. The primary reason was that some of the attributes I needed to change were not present in the transaction or planning cubes; characteristics that are not part of the cube on which the planning area is based cannot be changed this way.
    This is my understanding. Please correct me if I am wrong.
    Besides this, I was also wondering whether we can do the same through the portal, i.e. retrieving the master data InfoObject (based on the value selected for that InfoObject by the user) and its attributes in the portal, editing them, and saving them back, so that the updated values go back to the BW master data InfoObject database tables.
    E.g. I have a Natural Account master data InfoObject in BW with the attributes functional area and expense center. Based on the user's selection of a value for Natural Account, say 01110, the portal should display the corresponding attribute values of functional area and expense center. Let's take these values as 10 and 20 respectively. What I want to do is change these attribute values to 30 and 40 and save them back as the changed attribute values for natural account 01110.
    Is this possible through the portal and BW?
    Any idea on this would be appreciated.
    Regards,
    Ankit
    Edited by: Ankit Bhandari on Nov 21, 2008 12:21 PM

  • Not able to access data in Query

    Hi,
    I have loaded data using DTPs from two different source systems.
    One is a 3.x emulated DataSource and the other is a new 7.0 DataSource. I have loaded the data successfully into the DSO and built a query on it.
    But when I execute the query with material as the selection criterion, I cannot find any data in the query from one of the source systems. The same data is available in the DSO. Please help in this regard.

    Hi Venkat,
    After extracting the data into the DSO, check whether the request is active or not.
    Check the data in the DSO contents.
    Check whether there are any restrictions on the InfoProviders in the queries.
    Let us know the status clearly.
    Reg
    Pra

  • Receiving SQL Error: INTERNAL_ERROR  while executing the query

    Dear All,
    I am receiving the below error while executing a query.
    SQL Error: INTERNAL_ERROR
    Diagnosis
    The database system registered an SQL error. As available, the error number and a description are included in the short text. Possible causes for SQL errors include:
    Overflow of database objects such as buffers, temporary tablespaces, rollback segments or data containers/tablespaces.
    ->These problems can generally be eliminated by system administrators.
    Missing generated database objects such as tables or views based on inconsistent or inactive InfoCubes or InfoObjects. Examples of this include the view of the fact table for an InfoCube or the attribute SID table (X/Y table) of a characteristic.
    -> These problems can generally be eliminated by a BW administrator.
    SQL requests with incorrect content.
    -> Problems of this type are generally programming errors.
    System Response
    Procedure
    Contact your system administrator.
    Procedure for System Administration
    If there is no error description, look for one in the reference from the database producer.
    Decide on the correct category for the SQL error and check if it can be eliminated. Generally the error can also be found in the syslog (transaction sm21). From there, transactions sm51 and sm50, the developer's trace log of the work process can be determined and the erroneous statement can be viewed in the log. This procedure is described in SAP Note 568768.
    Notification Number DBMAN 257 
    I verified the tablespaces and also ran the RSRV repair of InfoObjects and InfoCubes, and didn't find any error. But I am receiving this error for one particular month; the remaining months execute fine. Any ideas why I am receiving this error?
    Regards
    Ravi Y

    OSS Note 1305568:
    Symptom
    A data mart query that
    you want to execute within a data transfer process (DTP), for example, terminates with the
    SQL error -1013 "Too many order columns".
    The following is displayed in the monitor log of the data transfer process:
    Error while extracting from source xxxxxx (type InfoProvider)
    Message number RSBK 242
    Exception CX_SQL_EXCEPTION occurred (program:
    CL_SQL_STATEMENT==============CP, include:
    CL_SQL_STATEMENT==============CM004, line: 32).
    Message number RS_EXCEPTION 000
    SQL error: POS(3306) Too many order columns
    Message number DBMAN 257
    Error reading the data of InfoProvider xxxxxx
    Message number DBMAN 305
    You have already implemented Note 1065380.
    Other terms
    CX_SQL_EXCEPTION, message number, RS_EXCEPTION 000, DBMAN 257,  RSBK 242,
    RS_EXCEPTION 000
    Reason and Prerequisites
    The MaxDB internal limit of 4016 bytes or 128 fields for GROUP BY columns was exceeded.
    Solution
    When you implement these corrections, no aggregation is performed for the data (GROUP BY) if the limits of the MaxDB database have been exceeded.
    The limit values for the aggregation behavior can also be manually reduced if there are problems with the default values.
    Two RSADMIN parameters are provided for this.
    MAXDB_MAX_GROUP_BY_FLDS is the maximum number of GROUP BY fields. The default value is 128.
    MAXDB_MAX_GROUP_BY_LEN is the maximum total length of the GROUP BY fields. The default value is 4000.
    SAP NetWeaver BI 7.00
    Import Support Package 21 for SAP NetWeaver BI 7.00 (SAPKW70021) into your BI system. The Support Package is available once Note 1270629 "SAPBINews NW 7.00 BI Support Package 21", which describes this Support Package in more detail, is released for customers.

  • Need your help !!! -- Authorization error for Real-Time Bex query

    Dear experts:
    I have a BEx query built on a MultiProvider; this MultiProvider consists of one standard cube and one real-time cube (VirtualProvider). When I run this query, I can retrieve the data and no error occurs, but when others execute this query, they get the error messages below:
    Operation Generate a Request could not be carried out for DTP DTP_D7JVT8DGBQWPWL13VIUITNODG
    You do not have authorization for the data transfer process
    Errors occurred during parallel processing of query 2, RC: 3
    Error while reading data; navigation is possible
    Row: 54 Inc: WRITE_MESSAGES Prog: CL_RSDR_AT_QUERY
    I used transaction RSECADMIN to trace the authorization. When I execute as user XXX, I get the same error as above, but when I go to "Display Log", no error is shown there.
    Has anybody met the same error before? How can it be resolved?
    Any post would be appreciated, and thank you all in advance!
    Tim

    I think there is a missing authorization here:
    the user has no authorization to execute the DTP.
    You can analyze it in RSECADMIN, or check
    which object is being checked via SU53.
    Thanks,
    Saveen

  • Unable to find my InfoArea in Query Designer

    Hi all,
    I've created an InfoObjectCatalog, InfoArea, DSO, and an InfoCube. The DataSource, transformation, and DTPs are all active, and the flat file is loaded into the cube; the data is in the cube as well. I want to create a query on this cube, but when I open Query Designer and look for my InfoArea, I don't find it. I searched for the InfoArea via Find, but it says 'No objects found'. I can't find any of the objects, i.e. neither the cube nor the InfoArea.
    My question is: why can't I find any of the objects I created, in spite of all of them being active? How do I locate the objects I created in the BI backend in Query Designer?
    Please advise.
    Thank you.
    TR.

    Hi,
    You need to start creating a query first in order to find the InfoArea (relevant to that query).
    You can also browse for the InfoArea manually (even if you don't have a query yet), provided you choose New in the Query Designer standard menu.
    Hope it helps

  • Error Executing query in web template

    Hi,
    We recently upgraded to NW2004s. When I execute my query, I get the error message below:
    Technical Information for Message:
    Activate the ZCRM_IS01 InfoCube/InfoProvider Again 
    Notification Number BRAIN 056 
    When I activated the InfoProvider, we started getting
    "InfoProvider is locked by user..."
    Any idea what is happening?
    Thanks
    Sachin K

    You bet...
    Even after that you might get some errors, but most of the DataStore and DTP related issues are solved with SP09. Good luck.
    I know you must be aware of this, but still, a heads up:
    the thread below is worth reading; it explains exactly how SP09 tackles most of these issues in different grey areas.
    <a href="https://www.sdn.sap.com/irj/sdn/thread?threadID=210736&tstart=0">https://www.sdn.sap.com/irj/sdn/thread?threadID=210736&tstart=0</a>
    Hope it Helps
    Chetan
    @CP..

  • Technical Content Cube 0TCT_MC21, Query 0TCT_MC21_Q0111

    Hi SAP experts
    We have a problem:
    Executing the query 0TCT_MC21_Q0111 for runtime processes seems to ignore some relevant InfoPackages with high runtime.
    Have you ever experienced such a symptom with the content statistics in 7.0?
    Yet the subsequent, i.e. the following, DTP is listed in the DTP statistics. Isn't this very strange?
    Any ideas?
    Regards
    Chäsitzer

    Dear Lokesh
    Thanks for your rapid reply.
    This note solves our problem.
    Thanks
    Chäsitzer

  • Which is better or 'standard' option for a Delta DTP request

    Hi folks,
    I am creating a delta DTP for loading HR payroll data. While creating the DTP, I see two options:
    a) Only Get Delta Once
    b) Get All New Data Request by Request
    Which is the better option if I have a lot of records to upload in every delta load?
    Points waiting to be given to you...
    Thanks
    Sunil

    You should go with Get All New Data Request by Request.
    i. With Only Get Delta Once, you define whether the source requests should be transferred only once.
    Setting this flag ensures that the content of the InfoProvider is an exact representation of the source data.
    A scenario of this type may be required if you always want an InfoProvider to contain the most recent data for a query, but technical reasons prevent the DataSource on which it is based from delivering a delta (new, changed or deleted data records). For this type of DataSource, the current data set for the required selection can only be transferred using a full update.
    In this case, a DataStore object cannot normally be used to determine the missing delta information (overwrite and create delta). If this is not logically possible because, for example, data is deleted in the source without delivering reverse records, you can set this flag and perform a snapshot scenario. Only the most recent request for this DataSource is retained in the InfoProvider. Earlier requests for the DataSource are deleted from the (target) InfoProvider before a new one is requested (this is done by a process in a process chain, for example). They are not transferred again by the DTP delta process. When the system determines the delta when a new DTP request is generated, these earlier (source) requests are considered to have been retrieved
    ii. Define if you want to Get All New Data in Source Request by Request.
    Since a DTP bundles all transfer-relevant requests from the source, it sometimes generates large requests. If you do not want to use a single DTP request to transfer the dataset from the source because the dataset is too large, you can set the Get All New Data in Source Request by Request flag. This specifies that you want the DTP to read only one request from the source at a time. Once processing is completed, the DTP request checks for further new requests in the source. If it finds any, it automatically creates an additional DTP request.
    "Only Get Delta Once"
    /people/community.user/blog/2007/06/21/sap-netweaver-70-bi-data-transfer-process-with-147only-get-delta-once148
    "Get Data by Request"
    /people/community.user/blog/2007/06/21/sap-netweaver-70-bi-data-transfer-process-with-147get-data-by-request148
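    The difference between the two options can be sketched in a hedged, language-neutral way (a Python illustration, not SAP code; all names and structures here are invented): "Only Get Delta Once" keeps the target as a snapshot by deleting earlier requests before loading the newest one, while "Get All New Data Request by Request" creates one DTP request per new source request.

```python
# Illustrative sketch only: models the two DTP delta modes described above.
# The "request" names and containers are invented for this example.

def load_delta_once(target_requests, new_source_request):
    """Snapshot behavior: keep only the most recent request in the target."""
    target_requests.clear()                # earlier requests are deleted first
    target_requests.append(new_source_request)
    return target_requests

def load_request_by_request(source_requests, already_loaded):
    """One DTP request per new source request, processed in order."""
    dtp_requests = []
    for req in source_requests:
        if req not in already_loaded:      # only transfer requests not yet loaded
            dtp_requests.append([req])     # each source request -> its own DTP request
            already_loaded.add(req)
    return dtp_requests

target = ["REQ_1"]
load_delta_once(target, "REQ_2")
print(target)                              # ['REQ_2'] - only the newest request survives

dtps = load_request_by_request(["REQ_1", "REQ_2", "REQ_3"], {"REQ_1"})
print(dtps)                                # [['REQ_2'], ['REQ_3']]
```

    For a large payroll delta, the second mode simply splits the transfer into smaller requests; the first mode is only appropriate for full-snapshot DataSources as described above.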

  • SELECT Query performance tunning

    Hi All,
    Our objective is to read values from three DSO tables; for that we have written three SELECT queries.
    For this we have used three internal tables.
    The code is written in the end routine.
    A model SELECT statement for reading the values in the DSO, together with the MOVE statements, is given below.
    For 175,000 records the DTP takes about 8 hours to run.
    Normally it should take just 20 minutes.
    Can anybody help with this, please?
    SELECT logsys
           doc_num
           doc_item
           comp_code
           /bic/gpusiteid
           /bic/gpumtgrid
           /bic/gpuspntyp
           /bic/gpuspndid
           /bic/gpuprocmt
           /bic/gpubufunc
           co_area
           order_quan
           po_unit
           entry_date
           /bic/gpuitmddt
           /bic/gpuovpoc
           currency
           /bic/gpudel_in
           costcenter
           /bic/gpuordnum
           /bic/gpupostxt
      FROM (c_poadm_det)
      INTO TABLE t_podetails
      FOR ALL ENTRIES IN result_package
      WHERE logsys   EQ result_package-logsys
        AND doc_num  EQ result_package-doc_num
        AND doc_item EQ result_package-doc_item.

    LOOP AT result_package ASSIGNING <result_fields>.
      UNASSIGN <fs_podetails>.
      READ TABLE t_podetails
           ASSIGNING <fs_podetails>
           WITH KEY logsys   = <result_fields>-logsys
                    doc_num  = <result_fields>-doc_num
                    doc_item = <result_fields>-doc_item.
      IF sy-subrc EQ 0.
        MOVE <fs_podetails>-/bic/gpusiteid TO <result_fields>-/bic/gpusiteid.
        MOVE <fs_podetails>-/bic/gpumtgrid TO <result_fields>-/bic/gpumtgrid.
        MOVE <fs_podetails>-/bic/gpuspntyp TO <result_fields>-/bic/gpuspntyp.
        IF <result_fields>-order_quan NE ' '.
          MOVE c_true TO <result_fields>-/bic/gpucount.
        ENDIF.
      ENDIF.
    ENDLOOP.

    Hi,
    In the READ statement, just use BINARY SEARCH; it will improve the performance. Before using BINARY SEARCH, the internal table must first be sorted by the same fields you use in the WITH KEY condition of the READ statement:
    SORT t_podetails BY logsys doc_num doc_item. "add this line
    LOOP AT result_package ASSIGNING <result_fields>.
      READ TABLE t_podetails
           ASSIGNING <fs_podetails>
           WITH KEY logsys   = <result_fields>-logsys
                    doc_num  = <result_fields>-doc_num
                    doc_item = <result_fields>-doc_item
           BINARY SEARCH. "requires the SORT above
      IF sy-subrc EQ 0.
        MOVE <fs_podetails>-/bic/gpusiteid TO <result_fields>-/bic/gpusiteid.
        MOVE <fs_podetails>-/bic/gpumtgrid TO <result_fields>-/bic/gpumtgrid.
        MOVE <fs_podetails>-/bic/gpuspntyp TO <result_fields>-/bic/gpuspntyp.
        IF <result_fields>-order_quan NE ' '.
          MOVE c_true TO <result_fields>-/bic/gpucount.
        ENDIF.
      ENDIF.
    ENDLOOP.
    By the way, the UNASSIGN <fs_podetails> before the READ is not needed: the field symbol is only used after sy-subrc confirms that the READ was successful.
    Regards,
    Dhina..
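    To see why the SORT plus BINARY SEARCH advice above matters at this data volume, here is a language-neutral sketch (Python, not ABAP; the table contents are invented): a linear READ inside a loop scans the whole table for every row, roughly O(n*m), while sorting once and binary-searching per row is about O((n+m) log n).

```python
import bisect

# Illustrative sketch (not ABAP): the lookup pattern behind SORT + BINARY SEARCH.
details = [  # plays the role of t_podetails: (logsys, doc_num, doc_item) -> site
    ("SYS1", "4500000002", "0010", "PLANT_B"),
    ("SYS1", "4500000001", "0010", "PLANT_A"),
    ("SYS1", "4500000003", "0020", "PLANT_C"),
]

details.sort(key=lambda row: row[:3])          # SORT t_podetails BY logsys doc_num doc_item.
keys = [row[:3] for row in details]            # sorted key column for the binary search

def lookup(logsys, doc_num, doc_item):
    """Binary search: the equivalent of READ TABLE ... BINARY SEARCH."""
    key = (logsys, doc_num, doc_item)
    i = bisect.bisect_left(keys, key)
    if i < len(keys) and keys[i] == key:       # sy-subrc = 0: key found
        return details[i][3]
    return None                                # sy-subrc <> 0: key not found

print(lookup("SYS1", "4500000002", "0010"))    # PLANT_B
print(lookup("SYS1", "9999999999", "0010"))    # None
```

    With 175,000 rows in result_package, replacing a full scan per row with a logarithmic lookup is usually the difference between hours and minutes, which matches the symptom in the question.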

  • Show last data 0 in an Input-ready query - Bex

    Hi experts,
    I have an Input-ready query working ok with IP (Integrated Planning).
    The InfoCube contains information about date/material/stock quantity (date and material come via DTP; the stock quantity is entered in real time); the user (via BEx) only enters the stock quantity.
    This is an example of the real-time cube data:
    Date Material Quantity
    02/03/2011    M1     1000  (input via Input-ready-query)
    03/03/2011    M1     0   (planned via DTP)
    04/03/2011    M1     0    (planned via DTP)
    05/03/2011    M1     200  (input via Input-ready-query)
    06/03/2011    M1     50    (input via Input-ready-query)
    07/03/2011    M1     0    (planned via DTP)
    I want to create a query workbook that displays: if stock_quantity = 0, show the last value <> 0 for this material; if stock_quantity <> 0, show it as is:
    Example workbook desired,
    02/03/2011    M1     1000
    03/03/2011    M1     1000
    04/03/2011    M1     1000   
    05/03/2011    M1     200
    06/03/2011    M1     50 
    07/03/2011    M1     50   
    Do you know how I could show this via a BEx workbook, or via another loading approach?
    Thanks!!

    Hi Marcel,
    An idea could be to create a virtual key figure and then use it in the desired BEx query. You will also need a way to create an MDX (pre-)query on the fly in order to obtain the quantity history of the material(s). This is a better approach than reading the InfoCube directly.
    Take a look at the how-to guide "How to ... Realize summarized display of stock values on storage location level". That specific scenario is quite similar to your own requirements. You can find the document at http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/eb9faa90-0201-0010-cc9c-cd2e6c0a549a
    Kind Regards,
    Theodoros
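    Wherever the logic ends up (virtual key figure, transformation routine, or workbook macro), the requested display rule is a simple carry-forward of the last non-zero quantity per material. A hedged Python sketch of that rule (illustrative only, not BEx/ABAP code), using the example data from the question:

```python
# Illustrative sketch: carry forward the last non-zero quantity per material.
# Rows are (date, material, quantity), assumed already sorted by material and date.

def fill_last_nonzero(rows):
    """Replace quantity 0 with the most recent non-zero quantity of the same material."""
    last_seen = {}                       # material -> last non-zero quantity seen
    result = []
    for date, material, qty in rows:
        if qty != 0:
            last_seen[material] = qty    # remember the latest real value
        filled = qty if qty != 0 else last_seen.get(material, 0)
        result.append((date, material, filled))
    return result

data = [
    ("02/03/2011", "M1", 1000),
    ("03/03/2011", "M1", 0),
    ("04/03/2011", "M1", 0),
    ("05/03/2011", "M1", 200),
    ("06/03/2011", "M1", 50),
    ("07/03/2011", "M1", 0),
]

for row in fill_last_nonzero(data):
    print(row)   # the 0-quantities are replaced, matching the desired workbook output
```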

  • DTP load in Yellow status forever

    All,
    The DTP load to this particular target remains in yellow status forever, even though only 5,000 records are transferred. Observing the dump shows the problem is in a SELECT query on a table in the end routine of the transformation.
    The SELECT is on a big master data table (material), but when I entered the same selection myself in SE16, there were not many records, just 3,000. So why would the trace say that all the time is being spent querying this database table?
    This was all working in D but not in Q.
    Another important note: there was no such change to the material table. However, there were four attributes for which we had enabled 'conversion' in the transfer structure.
    Could this be the issue, if not the code itself?
    Thanks

    Any updates, please?
    More information:
    The DTP loads from one DSO to another. I checked a similar scenario in the same system; things look fine there.
    I have an end routine in the transformation. When I look at the DTP monitor, the status says:
    Extract. from DataStore xxxx: 4709 Data Records ---status GREEN
    Filter Out New Records with the Same Key : 5000 -> 5000 Data Records------status GREEN
    ODSO xxxxx-> ODSO yyyyy-----status YELLOW
    -Transformation Start---status GREEN
    -Rules---status GREEN
    -End Routine---status GREEN
    No Message: Transformation End
    No Message: Update to DataStore Object yyyyyy
    No Message: Set Technical Status to Green
    No Message: Set Overall Status to Green
    Thanks!
