Data Packet issue for DTP

Hi,
    I have a data packet issue with a DTP when I am loading to the General Ledger Account - 2009
   (cube). The data load is from DataSource 0FI_GL_4. The data is first loaded to a DSO
   and then to the cube. Up to this level the data is fine, but when the data is loaded to the 2009
   cube using the DTP as a full load, I have an issue in the report, where the net balance is not "0".
   However, when I do a manual load with selective company codes as single-value selections in the
   DTP filter condition, for all the company codes my data matches in the report, and
   the net balance is "0".
   With this, I think there is an issue with the data packets for the DTP.
   Please suggest in this regard.
   Regards,
   Prasad.

Hi Nagendra,
Yes, there can sometimes be problems with data loads with a DTP.
This can be resolved by setting a semantic grouping at the DTP level; make sure that the semantic field you are defining is unique.
I am not sure about the functional side, but going by your post, I assume your problem will be resolved by setting company code as the semantic field, or by setting some other unique field as semantic while loading the data.
Hope this helps you.
Best Regards,
Maruthi

Similar Messages

  • Issue for DTP from DSO to open hub destination

    Hello Gurus,
            I have an issue with a DTP from a DSO to an open hub destination. The long text for the error in the monitor is as follows:
              "Could not open file \\SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr_2 on application server"
              "Error while updating to target ZFIGLH03 (type Open Hub Destination)"
    For the open hub destination, I checked the configuration for the logical file name, which is "\\SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr".
    I am wondering where the file "\\SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr_2" in the error message comes from.
    Many thanks,

    Hi
    You do not need to create the file on the application server. It will be created automatically.
    But if you have defined a logical file name in transaction FILE and used that in the OHD, and it is not correct, then it will show a conflict. Check this out.
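    To see what a logical file name actually resolves to at runtime, you can call the standard function module FILE_GET_NAME. A minimal sketch (the logical file name ZGENLGR_FILE is just a placeholder for whatever you defined in transaction FILE):
    DATA lv_fname TYPE filename-fileextern.   "resolved physical file name
    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ZGENLGR_FILE'     "placeholder - use your own logical name
      IMPORTING
        file_name        = lv_fname
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.
    IF sy-subrc = 0.
      WRITE: / lv_fname.                      "compare this with the path in the error message
    ENDIF.
    Comparing this output with the path in the monitor error should show whether the "_2" suffix comes from the logical name definition or is appended by the open hub itself.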

  • Z data source issue for creating packets

    Hi, I have created a Z DataSource (function module).
    My issue is that I am not able to create data record packets; all the data is coming in one packet only.
    The code is shown below. Can someone please assist me with how I can change the code so that it creates multiple packets for the options given in TCode RSA3?
    FUNCTION ZBW_MATERIAL_GROUP_HIE.
    ""Local Interface:
    *" IMPORTING
    *" VALUE(I_REQUNR) TYPE SRSC_S_IF_SIMPLE-REQUNR
    *" VALUE(I_DSOURCE) TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *" VALUE(I_MAXSIZE) TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *" VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *" VALUE(I_READ_ONLY) TYPE SRSC_S_IF_SIMPLE-READONLY OPTIONAL
    *" VALUE(I_REMOTE_CALL) TYPE SBIWA_FLAG DEFAULT SBIWA_C_FLAG_OFF
    *" TABLES
    *" I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *" I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *" E_T_DATA STRUCTURE ZBW_MAT_GRP_HIER OPTIONAL
    *" EXCEPTIONS
    *" NO_MORE_DATA
    *" ERROR_PASSED_TO_MESS_HANDLER
    TABLES : /BI0/HMATL_GROUP.
    DATA : BEGIN OF t_hmat OCCURS 0,
    hieid LIKE /BI0/HMATL_GROUP-hieid,
    objvers LIKE /BI0/HMATL_GROUP-objvers,
    iobjnm LIKE /BI0/HMATL_GROUP-iobjnm,
    nodeid LIKE /BI0/HMATL_GROUP-nodeid,
    nodename LIKE /BI0/HMATL_GROUP-nodename,
    tlevel LIKE /BI0/HMATL_GROUP-tlevel,
    parentid LIKE /BI0/HMATL_GROUP-parentid,
    END OF t_hmat.
    DATA : BEGIN OF t_flathier,
    hieid LIKE /BI0/HMATL_GROUP-hieid,
    lv2_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv2_name LIKE /BI0/HMATL_GROUP-nodename,
    lv3_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv3_name LIKE /BI0/HMATL_GROUP-nodename,
    lv4_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv4_name LIKE /BI0/HMATL_GROUP-nodename,
    lv5_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv5_name LIKE /BI0/HMATL_GROUP-nodename,
    lv6_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv6_name LIKE /BI0/HMATL_GROUP-nodename,
    lv7_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv7_name LIKE /BI0/HMATL_GROUP-nodename,
    lv8_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv8_name LIKE /BI0/HMATL_GROUP-nodename,
    lv9_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv9_name LIKE /BI0/HMATL_GROUP-nodename,
    lv10_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv10_name LIKE /BI0/HMATL_GROUP-nodename,
    lv11_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv11_name LIKE /BI0/HMATL_GROUP-nodename,
    material LIKE /BI0/HMATL_GROUP-nodename,
    END OF t_flathier.
    FIELD-SYMBOLS: <f> LIKE LINE OF t_hmat,
    <Level> TYPE ANY.
    data : count(2) type c,
    lv_level(20) type c.
    DATA : lv_count TYPE n.
    DATA : lv_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv_hieid LIKE /BI0/HMATL_GROUP-hieid.
    * Auxiliary Selection criteria structure
    DATA: l_s_select TYPE srsc_s_select.
    * Maximum number of lines for DB table
    STATICS: s_s_if TYPE srsc_s_if_simple,
    * counter
    s_counter_datapakid LIKE sy-tabix,
    * cursor
    s_cursor TYPE cursor.
    * Select ranges
    RANGES: l_r_nodename FOR /BI0/HMATL_GROUP-nodename,
    l_r_hieid FOR /BI0/HMATL_GROUP-hieid.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls)?
    IF i_initflag = sbiwa_c_flag_on.
    * Initialization: check input parameters
    *                 buffer input parameters
    *                 prepare data selection
    * Check DataSource validity
    CASE i_dsource.
    WHEN 'ZMATERIAL_GROUP_HIE'.
    WHEN OTHERS.
    IF 1 = 2. MESSAGE e009(r3). ENDIF.
    * this is a typical log call. Please write every error message like this
    *     log_write 'E'        "message type
    *               'R3'       "message class
    *               '009'      "message number
    *               i_dsource  "message variable 1
    *               ' '.       "message variable 2
    RAISE error_passed_to_mess_handler.
    ENDCASE.
    APPEND LINES OF i_t_select TO s_s_if-t_select.
    * Fill parameter buffer for data extraction calls
    s_s_if-requnr = i_requnr.
    s_s_if-dsource = i_dsource.
    s_s_if-maxsize = i_maxsize.
    * Fill field list table for an optimized select statement
    * (in case that there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from being trivial)
    APPEND LINES OF i_t_fields TO s_s_if-t_fields.
    ELSE. "Initialization mode or data extraction ?
    * Data transfer: First call      OPEN CURSOR + FETCH
    *                Following calls FETCH only
    * First data package -> OPEN CURSOR
    IF s_counter_datapakid = 0.
    * Fill range tables. BW will only pass down simple selection criteria
    * of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
    LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = '0MATERIAL'.
    MOVE-CORRESPONDING l_s_select TO l_r_nodename.
    APPEND l_r_nodename.
    ENDLOOP.
    LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'HIEID'.
    MOVE-CORRESPONDING l_s_select TO l_r_hieid.
    APPEND l_r_hieid.
    ENDLOOP.
    * Get the data from the hierarchy table
    SELECT * FROM /BI0/HMATL_GROUP INTO CORRESPONDING FIELDS OF
    TABLE t_hmat
    WHERE hieid IN l_r_hieid
    AND objvers = 'A' .
    ENDIF.
    * Loop through all the 0MATERIAL entries to get all the hierarchy levels.
    * Start of change: old loop commented out (there is only one ENDLOOP below).
    *    LOOP AT t_hmat ASSIGNING <f>
    *      WHERE iobjnm = '0MATL_GROUP'
    *      AND nodename IN l_r_nodename.
    LOOP AT t_hmat ASSIGNING <f>
      WHERE nodename IN l_r_nodename.
    * End of change
    lv_count = <f>-tlevel.
    "refresh t_flathier.
    CLEAR: t_flathier. ", lv_level, count.
    MOVE :
    <f>-hieid TO lv_hieid ,
    <f>-nodename TO t_flathier-material,
    <f>-parentid TO lv_id.
    if <f>-iobjnm <> '0MATL_GROUP' .
    move <f>-nodename+3 to t_flathier-material .
    else.
    move <f>-nodename to t_flathier-material .
    endif.
    * Added for last level.
    if lv_count = '1' .
    *t_flathier-lv1_name = t_flathier-material .
    elseif lv_count = '2' .
    t_flathier-lv2_name = t_flathier-material .
    elseif lv_count = '3' .
    t_flathier-lv3_name = t_flathier-material .
    elseif lv_count = '4' .
    t_flathier-lv4_name = t_flathier-material .
    elseif lv_count = '5' .
    t_flathier-lv5_name = t_flathier-material .
    elseif lv_count = '6' .
    t_flathier-lv6_name = t_flathier-material .
    elseif lv_count = '7' .
    t_flathier-lv7_name = t_flathier-material .
    elseif lv_count = '8' .
    t_flathier-lv8_name = t_flathier-material .
    elseif lv_count = '9' .
    t_flathier-lv9_name = t_flathier-material .
    elseif lv_count = '10' .
    t_flathier-lv10_name = t_flathier-material .
    endif.
    DO lv_count TIMES .
    lv_count = lv_count - 1.
    IF lv_count = 1.
    EXIT.
    ENDIF.
    READ TABLE t_hmat WITH KEY
    hieid = lv_hieid
    nodeid = lv_id.
    IF sy-subrc = 0.
    CLEAR lv_id.
    CASE lv_count.
    WHEN '11' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv11_name,
    t_hmat-parentid TO lv_id.
    WHEN '10' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv10_name,
    t_hmat-parentid TO lv_id.
    WHEN '9' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv9_name,
    t_hmat-parentid TO lv_id.
    WHEN '8' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv8_name,
    t_hmat-parentid TO lv_id.
    WHEN '7' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv7_name,
    t_hmat-parentid TO lv_id.
    WHEN '6' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv6_name,
    t_hmat-parentid TO lv_id.
    WHEN '5' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv5_name,
    t_hmat-parentid TO lv_id.
    WHEN '4' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv4_name,
    t_hmat-parentid TO lv_id.
    WHEN '3' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv3_name,
    t_hmat-parentid TO lv_id.
    WHEN '2' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv2_name.
    ENDCASE.
    ENDIF.
    ENDDO.
    * Populate data for level 1 (Class Type)
    READ TABLE t_hmat WITH KEY
    hieid = lv_hieid
    tlevel = 1.
    IF sy-subrc = 0.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_OUTPUT'
    EXPORTING
    input = t_hmat-nodename
    IMPORTING
    output = e_t_data-0class_type.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
    EXPORTING
    input = e_t_data-0class_type
    IMPORTING
    output = e_t_data-0class_type.
    ENDIF.
    * Populate data to extraction structure (removing prefix 'class type')
    MOVE : lv_hieid TO e_t_data-hieid,
    t_flathier-lv2_name TO e_t_data-xhier_lv1,
    t_flathier-lv3_name TO e_t_data-xhier_lv2,
    t_flathier-lv4_name TO e_t_data-xhier_lv3,
    t_flathier-lv5_name TO e_t_data-xhier_lv4,
    t_flathier-lv6_name TO e_t_data-xhier_lv5,
    t_flathier-lv7_name TO e_t_data-xhier_lv6,
    t_flathier-lv8_name TO e_t_data-xhier_lv7,
    t_flathier-lv9_name TO e_t_data-xhier_lv8,
    t_flathier-lv10_name TO e_t_data-xhier_lv9,
    t_flathier-lv11_name TO e_t_data-xhie_lv10,
    t_flathier-material TO e_t_data-0MATL_GROUP.
    APPEND e_t_data.
    CLEAR e_t_data.
    ENDLOOP.
    s_counter_datapakid = s_counter_datapakid + 1.
    IF s_counter_datapakid > 1 .
    RAISE no_more_data.
    ENDIF.
    ENDIF. "Initialization mode or data extraction ?
    ENDFUNCTION.
    As of now, when I run it in TCode RSA3, it gives only one data packet of some 5k to 6k records.
    Thanks in advance for your help.
    Pawan.

    Hi PS,
    Instead of
    SELECT * FROM /BI0/HMATL_GROUP INTO CORRESPONDING FIELDS OF
    TABLE t_hmat
    WHERE hieid IN l_r_hieid
    AND objvers = 'A' .
    the code should look like this (the OPEN CURSOR belongs in the first-call branch, where s_counter_datapakid = 0; the FETCH runs on every call):
          OPEN CURSOR WITH HOLD S_CURSOR FOR
            SELECT (S_S_IF-T_FIELDS) FROM /BI0/HMATL_GROUP
              WHERE HIEID IN L_R_HIEID
                AND OBJVERS = 'A'.
          FETCH NEXT CURSOR S_CURSOR
            APPENDING CORRESPONDING FIELDS
            OF TABLE E_T_DATA
            PACKAGE SIZE S_S_IF-MAXSIZE.
          IF SY-SUBRC <> 0.
            CLOSE CURSOR S_CURSOR.
            RAISE NO_MORE_DATA.
          ENDIF.
    For more information, refer to the sample code of function module "RSAX_BIW_GET_DATA_SIMPLE".
    Hope that helps.
    Regards
    Mr Kapadia
    ***Assigning points is the way to say thanks in SDN.***

  • Date difference issue for Purchase order in R/3 and APO

    Hi,
       I am sending purchase orders from APO to R/3. When I transfer the data through CIF and look at the corresponding entries in the stock requirements list in R/3, the purchase order is 1 day earlier than in SCM.
    However, when this purchase order is CIFed back to SCM, the date appears correct in SCM (as it was before). Is this difference in date between SCM and R/3 caused by time zones?
    Can someone who has come across date issues for transaction data transfer between SCM and R/3 elaborate in more detail, please?
    Thanks,
    Sanjay

    Hello Somnath,
    I am also facing the same issue which Sanjay was facing earlier with PRs.
    In APO I am not able to understand how the system calculates the opening, start and end dates. In my case the transportation time is 48 hrs, the planned delivery time is 1 day, all calendars are 5x24, and the GR processing time is 0.01 day. When I create a PR at the destination location on 06/06/2008, 24:00:00 hrs, the system calculates the dates as follows:
    Start date: 06/05/2008, 00:45:00; end date: 06/06/2008, 24:00:00; opening date: 05/20/2008, 23:25:00 hrs. I tried to understand the formulas given by you in your reply:
    Opening Date = Requirement Due Date - GR Time - Planned Delivery Time
    Start Date = Requirement Due Date - GR Time - Transportation Duration
    End Date = Requirement Date
    But I am not getting the desired results.
    Could you let me know why the system is behaving this way? I also want to know whether the system can create orders in such a way that the opening date lies in the past. If yes, can we restrict that?
    Regards ,
    Mukesh

  • Data Pump issue for Oracle 10g on Windows 2003

    Hi Experts,
    I tried to run Data Pump with Oracle 10g on a Windows 2003 server.
    I got an error:
    D:\>cd D:\oracle\product\10.2.0\SALE\BIN
    D:\oracle\product\10.2.0\SALE\BIN>expdp system/xxxxl@sale full=Y directory=dumpdir dumpfile=expdp_sale_20090302.dmp logfile=exp_sale_20090302.log
    Export: Release 10.2.0.4.0 - Production on Tuesday, 03 March, 2009 8:05:50
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-31626: job does not exist
    ORA-31650: timeout waiting for master process response
    However, I can run the old exp command and it works well.
    What is wrong with my Data Pump?
    Thanks
    JIM

    Hi Anand,
    I did not see any error log at that time. Actually, it did not work any more. I will test it again, based on your email, after the exp is done.
    Based on new testing, I got the errors below:
    ORA-39014: One or more workers have prematurely exited.
    ORA-39029: worker 1 with process name "DW01" prematurely terminated
    ORA-31671: Worker process DW01 had an unhandled exception.
    ORA-04030: out of process memory when trying to allocate 4108 bytes (PLS non-lib hp,pdzgM60_Make)
    ORA-06512: at "SYS.KUPC$QUEUE_INT", line 277
    ORA-06512: at "SYS.KUPW$WORKER", line 1366
    ORA-04030: out of process memory when trying to allocate 65036 bytes (callheap,KQL tmpbuf)
    ORA-06508: PL/SQL: could not find program unit being called: "SYS.KUPC$_WORKERERROR"
    ORA-06512: at "SYS.KUPW$WORKER", line 13360
    ORA-06512: at "SYS.KUPW$WORKER", line 15039
    ORA-06512: at "SYS.KUPW$WORKER", line 6372
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.DISPATCH_WORK_ITEMS while calling DBMS_METADATA.FETCH_XML_CLOB [PROCOBJ:"SALE"."SQLSCRIPT_2478179"]
    ORA-06512: at "SYS.KUPW$WORKER", line 7078
    ORA-04030: out of process memory when trying to allocate 4108 bytes (PLS non-lib hp,pdzgM60_Make)
    ORA-06500: PL/SQL: storage error
    ORA-04030: out of process memory when trying to allocate 16396 bytes (koh-kghu sessi,pmucpcon: tds)
    ORA-04030: out of process memory when trying to allocate 16396 bytes (koh-kghu sessi,pmucalm coll)
    Job "SYSTEM"."SYS_EXPORT_FULL_01" stopped due to fatal error at 14:41:36
    ORA-39014: One or more workers have prematurely exited.
    The trace file shows:
    *** 2009-03-03 14:20:41.500
    *** ACTION NAME:() 2009-03-03 14:20:41.328
    *** MODULE NAME:(oradim.exe) 2009-03-03 14:20:41.328
    *** SERVICE NAME:() 2009-03-03 14:20:41.328
    *** SESSION ID:(159.1) 2009-03-03 14:20:41.328
    Successfully allocated 7 recovery slaves
    Using 157 overflow buffers per recovery slave
    Thread 1 checkpoint: logseq 12911, block 2, scn 7355467494724
    cache-low rba: logseq 12911, block 251154
    on-disk rba: logseq 12912, block 221351, scn 7355467496281
    start recovery at logseq 12911, block 251154, scn 0
    ----- Redo read statistics for thread 1 -----
    Read rate (ASYNC): 185319Kb in 1.73s => 104.61 Mb/sec
    Total physical reads: 189333Kb
    Longest record: 5Kb, moves: 0/448987 (0%)
    Change moves: 1378/5737 (24%), moved: 0Mb
    Longest LWN: 1032Kb, moves: 45/269 (16%), moved: 41Mb
    Last redo scn: 0x06b0.9406fb58 (7355467496280)
    ----- Recovery Hash Table Statistics ---------
    Hash table buckets = 32768
    Longest hash chain = 3
    Average hash chain = 35384/25746 = 1.4
    Max compares per lookup = 3
    Avg compares per lookup = 847056/876618 = 1.0
    *** 2009-03-03 14:20:46.062
    KCRA: start recovery claims for 35384 data blocks
    *** 2009-03-03 14:21:02.171
    KCRA: blocks processed = 35384/35384, claimed = 35384, eliminated = 0
    *** 2009-03-03 14:21:02.531
    Recovery of Online Redo Log: Thread 1 Group 2 Seq 12911 Reading mem 0
    *** 2009-03-03 14:21:04.718
    Recovery of Online Redo Log: Thread 1 Group 1 Seq 12912 Reading mem 0
    *** 2009-03-03 14:21:16.296
    ----- Recovery Hash Table Statistics ---------
    Hash table buckets = 32768
    Longest hash chain = 3
    Average hash chain = 35384/25746 = 1.4
    Max compares per lookup = 3
    Avg compares per lookup = 849220/841000 = 1.0
    *** 2009-03-03 14:21:28.468
    tkcrrsarc: (WARN) Failed to find ARCH for message (message:0x1)
    tkcrrpa: (WARN) Failed initial attempt to send ARCH message (message:0x1)
    *** 2009-03-03 14:26:25.781
    kwqmnich: current time:: 14: 26: 25
    kwqmnich: instance no 0 check_only flag 1
    kwqmnich: initialized job cache structure
    ktsmgtur(): TUR was not tuned for 360 secs
    Windows Server 2003 Version V5.2 Service Pack 2
    CPU : 8 - type 586, 4 Physical Cores
    Process Affinity : 0x00000000
    Memory (Avail/Total): Ph:7447M/8185M, Ph+PgF:6833M/9984M, VA:385M/3071M
    Instance name: vmsdbsea
    Redo thread mounted by this instance: 0 <none>
    Oracle process number: 0
    Windows thread id: 2460, image: ORACLE.EXE (SHAD)
    Dynamic strand is set to TRUE
    Running with 2 shared and 18 private strand(s). Zero-copy redo is FALSE
    *** 2009-03-03 08:06:51.921
    *** ACTION NAME:() 2009-03-03 08:06:51.905
    *** MODULE NAME:(expdp.exe) 2009-03-03 08:06:51.905
    *** SERVICE NAME:(xxxxxxxxxx) 2009-03-03 08:06:51.905
    *** SESSION ID:(118.53238) 2009-03-03 08:06:51.905
    SHDW: Failure to establish initial communication with MCP
    SHDW: Deleting Data Pump job infrastructure
    Is it a system memory issue for Data Pump? My exp works well.
    How do I fix this issue?
    JIM

  • Data Extraction issue for 0SALARYLV_ATTR in BI 7.0

    Dear Experts,
    We have a data extraction situation for master data 0SALARYLV_ATTR. It seems that the standard extractor does not bring in attribute values like pay grade min, max, reference salary, etc. if the employees are hourly or on any frequency period other than yearly (frequency code value = 8).
    I am wondering if anybody out here has experienced this situation. I am not sure if this is the intended behavior of this extractor or a bug. If it is intended, I would appreciate knowing the business reason for it.
    Any help is appreciated.
    Regards
    Raj

    Thanks Alex! Any sample code and steps you might have to bring in the attribute values for all types of employees (not just exempt - yearly) would be great!
    With an ABAPer's help, we found that for non-yearly employees (frequency codes other than 8), the extractor tries to calculate a yearly amount and skips the record (I guess it drops the record read from table T710A or something). If you are already familiar with this DataSource's code, you might be able to explain better what is happening inside.
    Thanks for your help.
    Regards
    Raj

  • Data Replication issue for R/3 on i5

    QUESTION:
    1) Do we need the data in the following 2 files to be replicated when the production operation runs on
    the stand-by system?
    DBSTATIDB4 Index Sizes in the Database (statistical data)
    DBSTATTDB4 Table Sizes in the Database (statistical data)
    2) If we don't need the data in those 2 files when production operation is underway on the stand-by system,
    please let us know of any considerations you may have.
    BACKGROUND INFO:
    Although our customer has two R/3 systems on i5 configured as a production and a stand-by system
    and replicates R/3 production data to the stand-by system using a data replication tool called MIMIX,
    sometimes we encounter a problem where the data of the following files is out of synchronization.
    Files that cannot be replicated:
    DBSTATIDB4     F@Index Sizes in the Database (statistical data)
    DBSTATTDB4     F@Table Sizes in the Database (statistical data)
    The reason for this is the new DB function called fast-delete, available since V5R3.
    The problem happens because this function executes ENDJRN/STRJRN, which causes a
    problem in the MIMIX replication operation.
    In order to avoid this problem, we'd like to make sure that it's possible to exclude these files
    from synchronization, and we need more information about these files.

    Hello Yuhko,
    first of all, thank you very much for being the first user/customer using this new and great forum!!!
    We will move lots of further customers into this forum soon.
    Fortunately, yes, you can ignore these 2 tables if you like - they are not really important for the health of your SAP system. But are you really on the latest PTF levels of V5R3? I know this error, and it should be fixed in the meantime - perhaps you need a newer MIMIX version as well; you should at least check.
    You could also configure this "crazy" job to run not twice a day, but only once a week - that even makes your system faster... This can be done in ST03 in the collector configuration. The job is RSORATDB or RSDB_TDB in TCOLL, depending on your SAP release.
    But again:
    If you want to exclude these 2 tables from replication, you are fine as well - but I would also run this job more rarely for better performance.
    Regards

  • Data Package Issue in DTP

    Hi gurus,
    My data flow is DataSource -> InfoSource -> write-optimized DSO with a semantic key.
    In the source I have 10 records, of which 7 are duplicate records (same key).
    I reduced the DTP data package size from 50,000 to 5.
    When I executed the DTP, I got 2 data packages. In the first data package I got all 7 records for the same set of keys, and in the second data package I got the remaining records.
    My doubt is: I have defined the data package size as "5", so how can the first data package hold 7 records instead of 5?
    Thanks in advance !

    Hi,
    It is because of the semantic key setting that you have maintained. Data records that have the same key are combined in a single data package, so in your case the 7 records sharing one key all land in the first package even though the package size is 5. This setting is only relevant for DataStore objects with data fields that are overwritten.
    Semantic groups determine how the data packages read from the source (DataSource or InfoProvider) are built.
    This setting also defines the key fields for the error stack. By defining the key for the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected.
    Hope it helps.
    Thanks
    Kamal Mehta

  • ACI - Date Range issue for Sales Details report

    I am working on the ACI setup for one of my clients. I set everything up as per the documentation.
    This is regarding the 'Sales Details' (Public Folders > ATG > Commerce > Sales > All Sales) report.
    The report is generated if I select 'Date Range' under 'Time Period'; but if I select 'Predefined' I get the errors below:
    RQP-DEF-0177
    An error occurred while performing operation 'sqlPrepareWithOptions' status='-9'.
    UDA-SQL-0107 A general exception has occurred during the operation "prepare".ORA-32035: unreferenced query name defined in WITH clause RSV-SRV-0042 Trace back:RSReportService.cpp(758): QFException: CCL_CAUGHT: RSReportService::process()RSReportServiceMethod.cpp(239): QFException: CCL_RETHROW: RSReportServiceMethod::process(): promptPagingForward_RequestRSASyncExecutionThread.cpp(774): QFException: RSASyncExecutionThread::checkExceptionRSASyncExecutionThread.cpp(211): QFException: CCL_CAUGHT: RSASyncExecutionThread::run(): promptPagingForward_RequestRSASyncExecutionThread.cpp(824): QFException: CCL_RETHROW: RSASyncExecutionThread::processCommand(): promptPagingForward_RequestExecution/RSRenderExecution.cpp(593):
    Has anybody come across this issue?
    Any help in this regard will be highly appreciated.
    Thanks,
    Mukesh

    Contact Oracle support. I think we've seen this one before when using a particular version of Oracle (11.1?). There's a particular version of Oracle that doesn't support queries in a WITH clause that aren't referenced in the main query. Cognos seems to generate these types of queries, not knowing that the version of Oracle doesn't support them. According to Support Article ID 1063400.1, you can patch this particular problem in Oracle, or you can upgrade to Oracle 11.2. I also think there was a way to get Cognos to generate an alternative query that doesn't use the WITH clause at all - something about disabling the use of WITH in all queries by making a change to the report definition, or alternatively a global change to the metadata model.
    Good luck...
    Andrew

  • LSMW Master Data Load issue for UOM.

    Hello Friends,
    I hope that someone out there can provide an answer to my question.
    I am attempting to use LSMW to load alternate units of measure via MM02. The main data is already in the system.
    My flat file has 2 levels:
    Header --> contains the material number
    Line --> contains the data relevant to each UOM that I want to enter.
    When I use the Direct Input method, I get the following message:
    "The material cannot be maintained since no maintainable data transferred"
    Here is the format of the flat file.
    SOURCEHEAD HEADER
    IND C(001) INDICATOR
    Identifying Field Content: H
    MATNR C(018) Material
    SOURCELINE UOM DATA
    IND C(001) INDICATOR
    Identifying Field Content: L
    MAT C(018) MAT
    UMREN C(005) Denominator for conversion to base units of measure
    MEINH C(003) Alternate Unit of measure sku
    UMREZ C(005) numerator for conversion to base units of measure
    EAN11 C(018) International Article Number (EAN/UPC)
    NUMTP C(002) Category of International Article Number (EAN)
    LAENG C(013) Length
    BREIT C(013) Width
    HOEHE C(013) Height
    MEABM C(003) Unit of Dimension for Length/Width/Height
    VOLUM C(013) Volume
    VOLEH C(003) Volume unit
    BRGEW C(013) Gross weight
    NTGEW C(013) Net weight
    GEWEI C(003) Weight Unit
    When I process the data manually, I have no issues.
    I am thinking that I may just be missing some piece of information necessary to get this data processed, and I cannot see what the answer is.
    Any Help that can be provided would be great.
    Regards,
    Christopher

    Hello,
    You need to map BMMH1 along with BMMH6.
    Map TCODE & MATNR in BMMH1 with XEIK1 = 'X' (basic data view). Subsequently map the fields of structure BMMH6, as in the sketch below.
    I just made a test now and it works fine for me.
    Hope this helps.
    Best Regards, Murugesh AS

  • Data control issue for content repository when running apps in WebCenter

    Hi All,
    I have created a content repository connection in my local JDeveloper and
    exposed it as a data control.
    From the data control I am displaying some paths and names based on a search criterion.
    Whenever I run this application I get the following exceptions and no data is displayed.
    Since I have defined the connection for the content server locally in my JDeveloper, do I need to create some JNDI entry
    on the server side?
    TestContentServer is the content repository connection I have created in JDeveloper.
    If yes, tell me how I can do that and how it will carry over to my data control.
    [2010-10-05T09:34:39.245-07:00] [wc_custom] [WARNING] [] [oracle.adf.controller.faces.lifecycle.Utils] [tid: [ACTIVE].ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: anonymous] [ecid: 0000IhxFdvi4ulWpTwp2ic1CemrZ0000fT,0:1] [WEBSERVICE_PORT.name: WSRP_v2_Markup_Service] [APP: application1] [J2EE_MODULE.name: TestContentService-ViewController-context-root] [WEBSERVICE.name: WSRP_v2_Service] [J2EE_APP.name: application1] ADF: Adding the following JSF error message: TestContentServer[[
    javax.naming.NameNotFoundException: TestContentServer; remaining name 'TestContentServer'
    Thanks,
    Arun

    Hi Yanic,
    1. Code for my JSPX page:
    <?xml version='1.0' encoding='UTF-8'?>
    <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1"
    xmlns:f="http://java.sun.com/jsf/core"
    xmlns:h="http://java.sun.com/jsf/html"
    xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
    <jsp:directive.page contentType="text/html;charset=UTF-8"/>
    <f:view>
    <af:document id="d1">
    <af:messages id="m1"/>
    <af:form id="f1">
    <af:inputText value="#{bindings.path.inputValue}" simple="true"
    required="#{bindings.path.hints.mandatory}"
    columns="#{bindings.path.hints.displayWidth}"
    maximumLength="#{bindings.path.hints.precision}"
    shortDesc="#{bindings.path.hints.tooltip}" id="it1">
    <f:validator binding="#{bindings.path.validator}"/>
    </af:inputText>
    <af:inputText value="#{bindings.type.inputValue}" simple="true"
    required="#{bindings.type.hints.mandatory}"
    columns="#{bindings.type.hints.displayWidth}"
    maximumLength="#{bindings.type.hints.precision}"
    shortDesc="#{bindings.type.hints.tooltip}" id="it2">
    <f:validator binding="#{bindings.type.validator}"/>
    </af:inputText>
    <af:commandButton actionListener="#{bindings.getItems.execute}"
    text="getItems"
    disabled="#{!bindings.getItems.enabled}" id="cb1"
    partialSubmit="true"/>
    <af:table value="#{bindings.Items.collectionModel}" var="row"
    rows="#{bindings.Items.rangeSize}"
    emptyText="#{bindings.Items.viewable ? 'No data to display.' : 'Access Denied.'}"
    fetchSize="#{bindings.Items.rangeSize}"
    rowBandingInterval="0" id="t1" partialTriggers="::cb1">
    <af:column sortProperty="name" sortable="false"
    headerText="#{bindings.Items.hints.name.label}" id="c2">
    <af:inputText value="#{row.bindings.name.inputValue}"
    label="#{bindings.Items.hints.name.label}"
    required="#{bindings.Items.hints.name.mandatory}"
    columns="#{bindings.Items.hints.name.displayWidth}"
    maximumLength="#{bindings.Items.hints.name.precision}"
    shortDesc="#{bindings.Items.hints.name.tooltip}"
    id="it3">
    <f:validator binding="#{row.bindings.name.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="path" sortable="false"
    headerText="#{bindings.Items.hints.path.label}" id="c3">
    <af:inputText value="#{row.bindings.path.inputValue}"
    label="#{bindings.Items.hints.path.label}"
    required="#{bindings.Items.hints.path.mandatory}"
    columns="#{bindings.Items.hints.path.displayWidth}"
    maximumLength="#{bindings.Items.hints.path.precision}"
    shortDesc="#{bindings.Items.hints.path.tooltip}"
    id="it8">
    <f:validator binding="#{row.bindings.path.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="URI" sortable="false"
    headerText="#{bindings.Items.hints.URI.label}" id="c5">
    <af:inputText value="#{row.bindings.URI.inputValue}"
    label="#{bindings.Items.hints.URI.label}"
    required="#{bindings.Items.hints.URI.mandatory}"
    columns="#{bindings.Items.hints.URI.displayWidth}"
    maximumLength="#{bindings.Items.hints.URI.precision}"
    shortDesc="#{bindings.Items.hints.URI.tooltip}"
    id="it7">
    <f:validator binding="#{row.bindings.URI.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="primaryType" sortable="false"
    headerText="#{bindings.Items.hints.primaryType.label}"
    id="c7">
    <af:inputText value="#{row.bindings.primaryType.inputValue}"
    label="#{bindings.Items.hints.primaryType.label}"
    required="#{bindings.Items.hints.primaryType.mandatory}"
    columns="#{bindings.Items.hints.primaryType.displayWidth}"
    maximumLength="#{bindings.Items.hints.primaryType.precision}"
    shortDesc="#{bindings.Items.hints.primaryType.tooltip}"
    id="it6">
    <f:validator binding="#{row.bindings.primaryType.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="title" sortable="false"
    headerText="#{bindings.Items.hints.title.label}" id="c1">
    <af:inputText value="#{row.bindings.title.inputValue}"
    label="#{bindings.Items.hints.title.label}"
    required="#{bindings.Items.hints.title.mandatory}"
    columns="#{bindings.Items.hints.title.displayWidth}"
    maximumLength="#{bindings.Items.hints.title.precision}"
    shortDesc="#{bindings.Items.hints.title.tooltip}"
    id="it9">
    <f:validator binding="#{row.bindings.title.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="docType" sortable="false"
    headerText="#{bindings.Items.hints.docType.label}" id="c6">
    <af:inputText value="#{row.bindings.docType.inputValue}"
    label="#{bindings.Items.hints.docType.label}"
    required="#{bindings.Items.hints.docType.mandatory}"
    columns="#{bindings.Items.hints.docType.displayWidth}"
    maximumLength="#{bindings.Items.hints.docType.precision}"
    shortDesc="#{bindings.Items.hints.docType.tooltip}"
    id="it5">
    <f:validator binding="#{row.bindings.docType.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="displayName" sortable="false"
    headerText="#{bindings.Items.hints.displayName.label}"
    id="c4">
    <af:inputText value="#{row.bindings.displayName.inputValue}"
    label="#{bindings.Items.hints.displayName.label}"
    required="#{bindings.Items.hints.displayName.mandatory}"
    columns="#{bindings.Items.hints.displayName.displayWidth}"
    maximumLength="#{bindings.Items.hints.displayName.precision}"
    shortDesc="#{bindings.Items.hints.displayName.tooltip}"
    id="it4">
    <f:validator binding="#{row.bindings.displayName.validator}"/>
    </af:inputText>
    </af:column>
    </af:table>
    </af:form>
    </af:document>
    </f:view>
    </jsp:root>
    2. Code for the page definition (bindings):
    <?xml version="1.0" encoding="UTF-8" ?>
    <pageDefinition xmlns="http://xmlns.oracle.com/adfm/uimodel"
    version="11.1.1.56.60" id="Test1PageDef"
    Package="com.heiwip.cs.view.pageDefs">
    <parameters/>
    <executables>
    <variableIterator id="variables">
    <variable Type="java.lang.String" Name="getItems_path"
    IsQueriable="false"/>
    <variable Type="java.lang.String" Name="getItems_type"
    IsQueriable="false"/>
    </variableIterator>
    <methodIterator Binds="getItems.result" DataControl="TestDC1" RangeSize="25"
    BeanClass="com.heiwip.cs.view.TestDC1.getItems_return"
    id="getItemsIterator"/>
    </executables>
    <bindings>
    <methodAction id="getItems" RequiresUpdateModel="true" Action="invokeMethod"
    MethodName="getItems" IsViewObjectMethod="false"
    DataControl="TestDC1" InstanceName="TestDC1"
    ReturnName="TestDC1.methodResults.getItems_TestDC1_getItems_result">
    <NamedData NDName="path" NDType="java.lang.String"
    NDValue="${bindings.getItems_path}"/>
    <NamedData NDName="type" NDType="java.lang.String"
    NDValue="${bindings.getItems_type}"/>
    </methodAction>
    <attributeValues IterBinding="variables" id="path">
    <AttrNames>
    <Item Value="getItems_path"/>
    </AttrNames>
    </attributeValues>
    <attributeValues IterBinding="variables" id="type">
    <AttrNames>
    <Item Value="getItems_type"/>
    </AttrNames>
    </attributeValues>
    <attributeValues IterBinding="getItemsIterator" id="displayName">
    <AttrNames>
    <Item Value="displayName"/>
    </AttrNames>
    </attributeValues>
    <attributeValues IterBinding="getItemsIterator" id="name">
    <AttrNames>
    <Item Value="name"/>
    </AttrNames>
    </attributeValues>
    <attributeValues IterBinding="getItemsIterator" id="path1">
    <AttrNames>
    <Item Value="path"/>
    </AttrNames>
    </attributeValues>
    </bindings>
    </pageDefinition>
    3. The table is displayed with no data.
    Whenever I try to pass something in the search criteria to get the result, it
    throws a connection error.
    4. I am using Oracle UCM (Oracle Content Server).
    For this I am using identity propagation.
    Thanks,
    Arun.

  • Data package and data packet

    Hi,
    I want to know the difference between a data package and a data packet, and where each comes up in SAP BW.
    With regards,
    Tushar

    Hello,
    The term "data package" is related to DTPs, which load data from the PSA to further data targets.
    Start and end routines work at the package level, so the routine runs for each package, one by one. By default, packages contain data sorted by keys (non-unique keys (characteristics) of the source or target), and by setting semantic keys you can change this grouping. A package holding more data will take more time to process than a package holding less data.
    The term "data packet" is related to InfoPackages, which load data from the source system into BI (PSA).
    As per the SAP standard, we prefer to have 50,000 records per data packet.
    A commit & save is done for every data packet --- so fewer data packets are preferable.
    If you have 1 lakh records per data packet and there is an error in the last record, the entire packet fails.
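    For illustration, a minimal end-routine skeleton as generated in a BI 7.x transformation; it is called once per data package, never for the whole request (the structure type _ty_s_TG_1 is generated per transformation, so the names will differ in your system):
    METHOD end_routine.
      " RESULT_PACKAGE holds only the current data package, never the full request
      FIELD-SYMBOLS <result_fields> TYPE _ty_s_TG_1.
      LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
        " package-level logic goes here, e.g. lookups buffered once per package
      ENDLOOP.
    ENDMETHOD.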
    Hope it helps!

  • How to set a data packet to red status.

    I had a big load which ended with mostly green data packets, except for two that were red because of server/resource issues. I tried to update them manually and they turned yellow, but all processes have ended and they are still yellow. I want to try a manual update again, but I can't, because they have to be in red status. How can I change the status of a single packet? This was a pretty hard load, and I can't afford to lose what so far made it okay into the InfoCube.

    Hi,
      You need to force the request to red; then your data packets will turn red. Then update the two packets manually and force the request to green.
    Regards,
    Malar

  • Error "Data packet not complete; for example, 000013" when reading from PSA

    Dear all,
    I have an issue with my data loading. When I execute my InfoPackage, I specify that the data should be loaded into the PSA before going to the data target. In my InfoPackage request I notice that I have missing data packages,
    e.g.
    Data package 1 : everything OK
    Data package 2 : everything OK
    Data package 3 : everything OK
    Data package 8 : everything OK
    Data package 10 : everything OK
    Data package 12 : everything OK
    Data package 13 : everything OK
    What happened to the data packages in between, e.g. DP4, DP5, etc.?
    In my "step by step analysis" under the status tab everything is green and nothing seems to have gone wrong: no short dump, no cancelled job, nothing.
    Since everything "seems" to be alright, when I subsequently try to update to the data target, I get the following errors:
    Data packet not complete; for example, 000013
    Request has errors / is incomplete
    I have 2 questions here...
    1. Why do I have missing data packages while the indicator still shows green?
    2. How can I solve this, and is this something I should be worried about?
    Thank you very very much!

    Hi SCHT,
    You encounter these types of errors rarely...
    But in the RSMO screen, check for any stuck tRFCs.
    Also go to the job in the source system and analyze the job log;
    there you can find some information about what that particular job did.
    After that, check for the same in the BW system.
    Perhaps some records were missed while transferring/extracting them through the data packages.
    Force the request red and repeat the InfoPackage again.
    You don't need to worry about anything... simply force the request red and reload. Just let us know if the problem still persists after repeating the load.
    Regards
    Prince

  • QM action not allowed for DTP requests in master data and Text tables

    Hi,
    I have a master data object load request which failed, returned an error in ST22, and the job finished.
    BUT the request has not turned red in the monitor - it's still yellow. That means that I can neither delete the request nor start a new load to the InfoProvider, because the system believes a request is still running. But it's not; I've checked SM37, SM50, etc.
    When trying to manually change the QM status to 'red', it says 'QM action not allowed for DTP requests in master data and Text tables'.
    How can I force the QM status to red so that I can move on?
    Running NW2004s BI (7.0) Patch 15.
    I searched for this question but there is no answer.
    Thank you

    Folks,
    I know how frustrating this problem is, even in a NetWeaver 7.0 environment. I found a solution for this problem at last, and it works, though it is not a direct way of resolving the issue.
    When the request is in yellow status and you are not able to change or delete it, it is actually in a pseudo status, and this is a bug in the program. The request sits in table RSBKREQUEST with processing type 5 in the data elements USTATE and TSTATE, and this 5 is actually "Active" status, which is obviously wrong.
    All you need is to ask your Basis person to change the status to 3 in both these data elements; then the system allows you to reload the delta. Once the delta is successfully loaded, you can delete the previous bad request even though it is still yellow. Once the request is deleted, the request status gets updated to "4" in table RSBKREQUEST.
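    For reference, a quick way to find such stuck requests before making the correction (a rough sketch in classic ABAP; the table and field names are as described above - verify them in SE11 on your release):
    DATA lt_req TYPE STANDARD TABLE OF rsbkrequest.
    " requests whose overall/technical status is still 5 ("active")
    SELECT * FROM rsbkrequest INTO TABLE lt_req
      WHERE ustate = '5'
         OR tstate = '5'.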
    Hope this helps.
    Thanks
