Data Extraction issue for 0SALARYLV_ATTR in BI 7.0

Dear Experts,
We have a data extraction situation for master data 0SALARYLV_ATTR. It seems that the standard extractor does not bring in attribute values like Pay Grade Min, Max, Ref Salary, etc. if the employees are hourly or on any frequency period other than yearly (frequency code value = 8).
I am wondering if anybody out here has experienced this situation. I am not sure whether this is the intended behaviour of this extractor or a bug. If it is intended, I would appreciate knowing the business reason behind it.
Any help is appreciated.
Regards
Raj

Thanks Alex! Any sample code and steps you might have to bring in the attribute values for all types of employees (not just exempt/yearly) would be great!
With the ABAPer's help we found that for non-yearly frequencies (frequency codes other than 8), the extractor tries to calculate a yearly amount and then skips the record (I guess it drops the record read from table T710A, or something similar). If you are already familiar with this DataSource's code, you might be able to explain better what is happening inside.
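To make what we saw in the debugger concrete, the behaviour looks roughly like the sketch below. This is only an illustration of the suspected logic, not the actual extractor code, and the table and field names in it are assumptions on my side.
* Illustration only - NOT the actual 0SALARYLV_ATTR extractor code.
* The internal table LT_PAYGRADE and its field names are assumed.
TYPES: BEGIN OF ty_paygrade,
         frequency  TYPE c LENGTH 2,      " pay period indicator from T710A
         min_salary TYPE p DECIMALS 2,
         max_salary TYPE p DECIMALS 2,
       END OF ty_paygrade.
DATA: lt_paygrade TYPE STANDARD TABLE OF ty_paygrade,
      ls_paygrade TYPE ty_paygrade.
LOOP AT lt_paygrade INTO ls_paygrade.
* The extractor seems to try to derive a yearly amount and, when the
* period is anything other than yearly ('8'), it drops the row, so
* hourly/monthly pay grades never reach the BW attribute tables.
  IF ls_paygrade-frequency <> '8'.
    DELETE lt_paygrade.                   " current row is skipped
  ENDIF.
ENDLOOP.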
Thanks for your help.
Regards
Raj

Similar Messages

  • Date difference issue for Purchase order in R/3 and APO

    Hi,
    I am sending a purchase order from APO to R/3. When I transfer the data through CIF and look at the corresponding entries in the stock requirements list in R/3, the purchase order is 1 day earlier than in SCM.
    However, when this purchase order is CIFed back to SCM, the date appears correct in SCM (as it was before). Is it because of a timezone difference that the date differs between SCM and R/3?
    Can someone who has come across date issues for transaction data transfer between SCM and R/3 please elaborate in more detail?
    Thanks,
    Sanjay

    Hello Somnath ,
    I am also facing the same issue that Sanju was facing earlier with PRs.
    In APO I am not able to understand how the system calculates the opening, start and end dates. In my case the transportation time is 48 hrs, the planned delivery time is 1 day, all calendars are 5*24, and the GR processing time is 0.01 day. When I create a PR at the destination location on 06/06/2008, 24:00:00 hrs, the system calculates the dates as follows:
    Start date 06/05/2008, 00:45:00; end date 06/06/2008, 24:00:00; opening date 05/20/2008, 23:25:00 hrs. I tried to understand the formula given by you in your reply:
    Opening Date = Requirement Due Date - GR Time - Planned Delivery Time
    Start Date = Requirement Due Date - GR Time - Transportation Duration
    End Date = Requirement Date.
    But I am not getting the expected results.
    Could you let me know why the system is behaving this way? I also want to know whether the system can create orders such that the opening date lies in the past. If yes, can we restrict that?
    Regards ,
    Mukesh
    Edited by: Mukesh  Kumar on Mar 9, 2008 1:59 PM

  • Data Pump issue for Oracle 10g on Windows 2003

    Hi Experts,
    I am trying to run Data Pump on Oracle 10g on a Windows 2003 server.
    I got an error:
    D:\>cd D:\oracle\product\10.2.0\SALE\BIN
    D:\oracle\product\10.2.0\SALE\BIN>expdp system/xxxxl@sale full=Y directory=du
    mpdir dumpfile=expdp_sale_20090302.dmp logfile=exp_sale_20090302.log
    Export: Release 10.2.0.4.0 - Production on Tuesday, 03 March, 2009 8:05:50
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Produc
    tion
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-31626: job does not exist
    ORA-31650: timeout waiting for master process response
    However, I can run the old exp command and it works well.
    What is wrong with my Data Pump?
    Thanks
    JIM

    Hi Anand,
    I did not see any error log at that time. Actually, it did not work any more. I will test it again based on your email after the exp finishes.
    Based on the new test, I got the errors below:
    ORA-39014: One or more workers have prematurely exited.
    ORA-39029: worker 1 with process name "DW01" prematurely terminated
    ORA-31671: Worker process DW01 had an unhandled exception.
    ORA-04030: out of process memory when trying to allocate 4108 bytes (PLS non-lib hp,pdzgM60_Make)
    ORA-06512: at "SYS.KUPC$QUEUE_INT", line 277
    ORA-06512: at "SYS.KUPW$WORKER", line 1366
    ORA-04030: out of process memory when trying to allocate 65036 bytes (callheap,KQL tmpbuf)
    ORA-06508: PL/SQL: could not find program unit being called: "SYS.KUPC$_WORKERERROR"
    ORA-06512: at "SYS.KUPW$WORKER", line 13360
    ORA-06512: at "SYS.KUPW$WORKER", line 15039
    ORA-06512: at "SYS.KUPW$WORKER", line 6372
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.DISPATCH_WORK_ITEMS while calling DBMS_METADATA.FETCH_XML_CLOB [PROCOBJ:"SALE"."SQLSCRIPT_2478179"]
    ORA-06512: at "SYS.KUPW$WORKER", line 7078
    ORA-04030: out of process memory when trying to allocate 4108 bytes (PLS non-lib hp,pdzgM60_Make)
    ORA-06500: PL/SQL: storage error
    ORA-04030: out of process memory when trying to allocate 16396 bytes (koh-kghu sessi,pmucpcon: tds)
    ORA-04030: out of process memory when trying to allocate 16396 bytes (koh-kghu sessi,pmucalm coll)
    Job "SYSTEM"."SYS_EXPORT_FULL_01" stopped due to fatal error at 14:41:36
    ORA-39014: One or more workers have prematurely exited.
    The trace file shows:
    *** 2009-03-03 14:20:41.500
    *** ACTION NAME:() 2009-03-03 14:20:41.328
    *** MODULE NAME:(oradim.exe) 2009-03-03 14:20:41.328
    *** SERVICE NAME:() 2009-03-03 14:20:41.328
    *** SESSION ID:(159.1) 2009-03-03 14:20:41.328
    Successfully allocated 7 recovery slaves
    Using 157 overflow buffers per recovery slave
    Thread 1 checkpoint: logseq 12911, block 2, scn 7355467494724
    cache-low rba: logseq 12911, block 251154
    on-disk rba: logseq 12912, block 221351, scn 7355467496281
    start recovery at logseq 12911, block 251154, scn 0
    ----- Redo read statistics for thread 1 -----
    Read rate (ASYNC): 185319Kb in 1.73s => 104.61 Mb/sec
    Total physical reads: 189333Kb
    Longest record: 5Kb, moves: 0/448987 (0%)
    Change moves: 1378/5737 (24%), moved: 0Mb
    Longest LWN: 1032Kb, moves: 45/269 (16%), moved: 41Mb
    Last redo scn: 0x06b0.9406fb58 (7355467496280)
    ----- Recovery Hash Table Statistics ---------
    Hash table buckets = 32768
    Longest hash chain = 3
    Average hash chain = 35384/25746 = 1.4
    Max compares per lookup = 3
    Avg compares per lookup = 847056/876618 = 1.0
    *** 2009-03-03 14:20:46.062
    KCRA: start recovery claims for 35384 data blocks
    *** 2009-03-03 14:21:02.171
    KCRA: blocks processed = 35384/35384, claimed = 35384, eliminated = 0
    *** 2009-03-03 14:21:02.531
    Recovery of Online Redo Log: Thread 1 Group 2 Seq 12911 Reading mem 0
    *** 2009-03-03 14:21:04.718
    Recovery of Online Redo Log: Thread 1 Group 1 Seq 12912 Reading mem 0
    *** 2009-03-03 14:21:16.296
    ----- Recovery Hash Table Statistics ---------
    Hash table buckets = 32768
    Longest hash chain = 3
    Average hash chain = 35384/25746 = 1.4
    Max compares per lookup = 3
    Avg compares per lookup = 849220/841000 = 1.0
    *** 2009-03-03 14:21:28.468
    tkcrrsarc: (WARN) Failed to find ARCH for message (message:0x1)
    tkcrrpa: (WARN) Failed initial attempt to send ARCH message (message:0x1)
    *** 2009-03-03 14:26:25.781
    kwqmnich: current time:: 14: 26: 25
    kwqmnich: instance no 0 check_only flag 1
    kwqmnich: initialized job cache structure
    ktsmgtur(): TUR was not tuned for 360 secs
    Windows Server 2003 Version V5.2 Service Pack 2
    CPU : 8 - type 586, 4 Physical Cores
    Process Affinity : 0x00000000
    Memory (Avail/Total): Ph:7447M/8185M, Ph+PgF:6833M/9984M, VA:385M/3071M
    Instance name: vmsdbsea
    Redo thread mounted by this instance: 0 <none>
    Oracle process number: 0
    Windows thread id: 2460, image: ORACLE.EXE (SHAD)
    Dynamic strand is set to TRUE
    Running with 2 shared and 18 private strand(s). Zero-copy redo is FALSE
    *** 2009-03-03 08:06:51.921
    *** ACTION NAME:() 2009-03-03 08:06:51.905
    *** MODULE NAME:(expdp.exe) 2009-03-03 08:06:51.905
    *** SERVICE NAME:(xxxxxxxxxx) 2009-03-03 08:06:51.905
    *** SESSION ID:(118.53238) 2009-03-03 08:06:51.905
    SHDW: Failure to establish initial communication with MCP
    SHDW: Deleting Data Pump job infrastructure
    Is it a system memory issue for Data Pump? My exp works well.
    How do I fix this issue?
    JIM
    Edited by: user589812 on Mar 3, 2009 5:07 PM
    Edited by: user589812 on Mar 3, 2009 5:22 PM

  • Data Packet issue for DTP

    Hi,
    I have a data packet issue with DTP when loading the General Ledger Account 2009 cube. The data load comes from DataSource 0FI_GL_4; the data is first loaded to a DSO and then to the cube. Up to that level the data is fine. When the data is loaded to the 2009 cube using a DTP full load, I have an issue in the report: the net balance is not "0".
    But when I do a manual load with selective company codes as single-value selections in the DTP filter condition, the data for all the company codes matches in the report and the net balance is "0".
    So I think there is an issue with the data packets of the DTP.
    Please suggest in this regard.
       Regards,
       prasad.

    Hi Ngendra,
    Yes, there can sometimes be problems with data loads through a DTP.
    This can be resolved by setting semantic grouping at the DTP level; make sure that the semantic field you define is unique.
    I am not sure about the functional side, but based on your post I assume your problem will be resolved by setting the company code as a semantic field, or some other unique field, while loading the data.
    Hope this helps you.
    Best Regards,
    Maruthi

  • Data Replication issue for R/3 on i5

    QUESTION:
    1) Do we need the data in the following 2 files to be replicated when production operation runs on the
    stand-by system?
    DBSTATIDB4 Index Sizes in the Database (statistical data)
    DBSTATTDB4 Table Sizes in the Database (statistical data)
    2) If we don't need the data in those 2 files while production operation is underway on the stand-by system,
    please let us know of any considerations you may have.
    BACKGROUND INFO:
    Our customer has two R/3 systems on i5 configured as a production and a stand-by system
    and replicates R/3 production data to the stand-by system using a data replication tool called MIMIX.
    Sometimes we encounter a problem where the data of the following files is out of synchronization.
    Files cannot be replicated:
    DBSTATIDB4     F@Index Sizes in the Database (statistical data)
    DBSTATTDB4     F@Table Sizes in the Database (statistical data)
    The reason is the new DB function called fast-delete introduced with V5R3.
    The problem happens because this function executes ENDJRN/STRJRN, which causes
    problems in the MIMIX replication operation.
    In order to avoid this problem, we'd like to make sure that it's possible to exclude these files
    from synchronization, and we need more information about these files.

    Hello Yuhko,
    first of all, thank you very much for being the first user/customer to use this great new forum!!!
    We will move lots of further customers to this forum soon.
    Fortunately, yes, you can ignore these 2 tables if you like - they are not really relevant to the health of your SAP system. But are you really on the latest PTF levels of V5R3? I know this error and it should be fixed in the meantime - perhaps you need a newer MIMIX version as well; you should at least check.
    Then you could configure this "crazy" job to run not twice a day but only once a week - that even makes your system faster... This can be done in ST03 in the collector configuration. The job is RSORATDB or RSDB_TDB in TCOLL, depending on your SAP release.
    But again:
    If you want to exclude these 2 tables from replication, you are fine as well - but I would also run this job less frequently for better performance.
    Regards

  • ACI  - Date Range issue for Sales Details report

    I am working on the ACI setup for one of my clients. I have set everything up as per the documentation.
    This is regarding the ‘Sales Details’ (Public Folders > ATG > Commerce > Sales > All Sales) report.
    The report is generated if I select ‘Date Range’ under ‘Time Period’, but if I select ‘Predefined’ I get the errors below:
    RQP-DEF-0177
    An error occurred while performing operation 'sqlPrepareWithOptions' status='-9'.
    UDA-SQL-0107 A general exception has occurred during the operation "prepare".ORA-32035: unreferenced query name defined in WITH clause RSV-SRV-0042 Trace back:RSReportService.cpp(758): QFException: CCL_CAUGHT: RSReportService::process()RSReportServiceMethod.cpp(239): QFException: CCL_RETHROW: RSReportServiceMethod::process(): promptPagingForward_RequestRSASyncExecutionThread.cpp(774): QFException: RSASyncExecutionThread::checkExceptionRSASyncExecutionThread.cpp(211): QFException: CCL_CAUGHT: RSASyncExecutionThread::run(): promptPagingForward_RequestRSASyncExecutionThread.cpp(824): QFException: CCL_RETHROW: RSASyncExecutionThread::processCommand(): promptPagingForward_RequestExecution/RSRenderExecution.cpp(593):
    Has anybody come across this issue?
    Any help in this regard will be highly appreciated.
    Thanks,
    Mukesh

    Contact Oracle support. I think we've seen this one before when using a particular version of Oracle (11.1?). There's a particular version of Oracle that doesn't support queries in a WITH clause that aren't referenced in the main query, and Cognos seems to generate these types of queries without knowing that that version of Oracle doesn't support them. According to Support Article ID 1063400.1 you can patch this particular problem in Oracle, or you can upgrade to Oracle 11.2. I also think there was a way to get Cognos to generate an alternative query that doesn't use the WITH clause at all - something about disabling the use of WITH in all queries by making a change to the report definition, or alternatively a global change to the metadata model.
    Good luck...
    Andrew

  • Z data source issue for creating packets

    Hi, I have created a Z data source (function module).
    My issue is that I am not able to create data record packets; all the data comes in one packet only.
    The code is shown below. Can someone please assist me with how to change the code so that it creates multiple packets for the selection given in transaction RSA3?
    FUNCTION ZBW_MATERIAL_GROUP_HIE.
    ""Local Interface:
    *" IMPORTING
    *" VALUE(I_REQUNR) TYPE SRSC_S_IF_SIMPLE-REQUNR
    *" VALUE(I_DSOURCE) TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *" VALUE(I_MAXSIZE) TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *" VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *" VALUE(I_READ_ONLY) TYPE SRSC_S_IF_SIMPLE-READONLY OPTIONAL
    *" VALUE(I_REMOTE_CALL) TYPE SBIWA_FLAG DEFAULT SBIWA_C_FLAG_OFF
    *" TABLES
    *" I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *" I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *" E_T_DATA STRUCTURE ZBW_MAT_GRP_HIER OPTIONAL
    *" EXCEPTIONS
    *" NO_MORE_DATA
    *" ERROR_PASSED_TO_MESS_HANDLER
    TABLES : /BI0/HMATL_GROUP.
    DATA : BEGIN OF t_hmat OCCURS 0,
    hieid LIKE /BI0/HMATL_GROUP-hieid,
    objvers LIKE /BI0/HMATL_GROUP-objvers,
    iobjnm LIKE /BI0/HMATL_GROUP-iobjnm,
    nodeid LIKE /BI0/HMATL_GROUP-nodeid,
    nodename LIKE /BI0/HMATL_GROUP-nodename,
    tlevel LIKE /BI0/HMATL_GROUP-tlevel,
    parentid LIKE /BI0/HMATL_GROUP-parentid,
    END OF t_hmat.
    DATA : BEGIN OF t_flathier,
    hieid LIKE /BI0/HMATL_GROUP-hieid,
    lv2_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv2_name LIKE /BI0/HMATL_GROUP-nodename,
    lv3_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv3_name LIKE /BI0/HMATL_GROUP-nodename,
    lv4_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv4_name LIKE /BI0/HMATL_GROUP-nodename,
    lv5_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv5_name LIKE /BI0/HMATL_GROUP-nodename,
    lv6_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv6_name LIKE /BI0/HMATL_GROUP-nodename,
    lv7_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv7_name LIKE /BI0/HMATL_GROUP-nodename,
    lv8_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv8_name LIKE /BI0/HMATL_GROUP-nodename,
    lv9_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv9_name LIKE /BI0/HMATL_GROUP-nodename,
    lv10_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv10_name LIKE /BI0/HMATL_GROUP-nodename,
    lv11_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv11_name LIKE /BI0/HMATL_GROUP-nodename,
    material LIKE /BI0/HMATL_GROUP-nodename,
    END OF t_flathier.
    FIELD-SYMBOLS: <f> LIKE LINE OF t_hmat,
    <Level> TYPE ANY.
    data : count(2) type c,
    lv_level(20) type c.
    DATA : lv_count TYPE n.
    DATA : lv_id LIKE /BI0/HMATL_GROUP-nodeid,
    lv_hieid LIKE /BI0/HMATL_GROUP-hieid.
    * Auxiliary Selection criteria structure
    DATA: l_s_select TYPE srsc_s_select.
    * Maximum number of lines for DB table
    STATICS: s_s_if TYPE srsc_s_if_simple,
    * counter
    s_counter_datapakid LIKE sy-tabix,
    * cursor
    s_cursor TYPE cursor.
    * Select ranges
    RANGES: l_r_nodename FOR /BI0/HMATL_GROUP-nodename,
    l_r_hieid FOR /BI0/HMATL_GROUP-hieid.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls)?
    IF i_initflag = sbiwa_c_flag_on.
    * Initialization: check input parameters
    *                 buffer input parameters
    *                 prepare data selection
    * Check DataSource validity
    CASE i_dsource.
    WHEN 'ZMATERIAL_GROUP_HIE'.
    WHEN OTHERS.
    IF 1 = 2. MESSAGE e009(r3). ENDIF.
    * this is a typical log call. Please write every error message like this
    *  log_write 'E'        "message type
    *            'R3'       "message class
    *            '009'      "message number
    *            i_dsource  "message variable 1
    *            ' '.       "message variable 2
    RAISE error_passed_to_mess_handler.
    ENDCASE.
    APPEND LINES OF i_t_select TO s_s_if-t_select.
    * Fill parameter buffer for data extraction calls
    s_s_if-requnr = i_requnr.
    s_s_if-dsource = i_dsource.
    s_s_if-maxsize = i_maxsize.
    * Fill field list table for an optimized select statement
    * (in case there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from being trivial)
    APPEND LINES OF i_t_fields TO s_s_if-t_fields.
    ELSE. "Initialization mode or data extraction ?
    * Data transfer: First call:      OPEN CURSOR + FETCH
    *                 Following calls: FETCH only
    * First data package -> OPEN CURSOR
    IF s_counter_datapakid = 0.
    * Fill range tables. BW will only pass down simple selection criteria
    * of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
    LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = '0MATERIAL'.
    MOVE-CORRESPONDING l_s_select TO l_r_nodename.
    APPEND l_r_nodename.
    ENDLOOP.
    LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'HIEID'.
    MOVE-CORRESPONDING l_s_select TO l_r_hieid.
    APPEND l_r_hieid.
    ENDLOOP.
    * Get the data from the hierarchy table
    SELECT * FROM /BI0/HMATL_GROUP INTO CORRESPONDING FIELDS OF
    TABLE t_hmat
    WHERE hieid IN l_r_hieid
    AND objvers = 'A' .
    ENDIF.
    * Loop through all the 0MATERIAL entries to get all the hierarchy levels.
    * Start of change.
    *    LOOP AT t_hmat ASSIGNING <f>
    *    WHERE iobjnm = '0MATL_GROUP'
    *    AND nodename IN l_r_nodename.
    LOOP AT t_hmat ASSIGNING <f>
    WHERE nodename IN l_r_nodename.
    * End of change
    lv_count = <f>-tlevel.
    "refresh t_flathier.
    CLEAR: t_flathier. ", lv_level, count.
    MOVE :
    <f>-hieid TO lv_hieid ,
    <f>-nodename TO t_flathier-material,
    <f>-parentid TO lv_id.
    if <f>-iobjnm <> '0MATL_GROUP' .
    move <f>-nodename+3 to t_flathier-material .
    else.
    move <f>-nodename to t_flathier-material .
    endif.
    * Added for last level.
    if lv_count = '1' .
    *t_flathier-lv1_name = t_flathier-material .
    elseif lv_count = '2' .
    t_flathier-lv2_name = t_flathier-material .
    elseif lv_count = '3' .
    t_flathier-lv3_name = t_flathier-material .
    elseif lv_count = '4' .
    t_flathier-lv4_name = t_flathier-material .
    elseif lv_count = '5' .
    t_flathier-lv5_name = t_flathier-material .
    elseif lv_count = '6' .
    t_flathier-lv6_name = t_flathier-material .
    elseif lv_count = '7' .
    t_flathier-lv7_name = t_flathier-material .
    elseif lv_count = '8' .
    t_flathier-lv8_name = t_flathier-material .
    elseif lv_count = '9' .
    t_flathier-lv9_name = t_flathier-material .
    elseif lv_count = '10' .
    t_flathier-lv10_name = t_flathier-material .
    endif.
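    * Walk up the hierarchy from the current node: read each ancestor via
    * HIEID/NODEID, strip the 3-character class-type prefix from its name
    * and fill the corresponding level-name field.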
    DO lv_count TIMES .
    lv_count = lv_count - 1.
    IF lv_count = 1.
    EXIT.
    ENDIF.
    READ TABLE t_hmat WITH KEY
    hieid = lv_hieid
    nodeid = lv_id.
    IF sy-subrc = 0.
    CLEAR lv_id.
    CASE lv_count.
    WHEN '11' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv11_name,
    t_hmat-parentid TO lv_id.
    WHEN '10' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv10_name,
    t_hmat-parentid TO lv_id.
    WHEN '9' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv9_name,
    t_hmat-parentid TO lv_id.
    WHEN '8' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv8_name,
    t_hmat-parentid TO lv_id.
    WHEN '7' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv7_name,
    t_hmat-parentid TO lv_id.
    WHEN '6' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv6_name,
    t_hmat-parentid TO lv_id.
    WHEN '5' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv5_name,
    t_hmat-parentid TO lv_id.
    WHEN '4' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv4_name,
    t_hmat-parentid TO lv_id.
    WHEN '3' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv3_name,
    t_hmat-parentid TO lv_id.
    WHEN '2' .
    MOVE : t_hmat-nodename+3 TO t_flathier-lv2_name.
    ENDCASE.
    ENDIF.
    ENDDO.
    * Populate data for level 1 (Class Type)
    READ TABLE t_hmat WITH KEY
    hieid = lv_hieid
    tlevel = 1.
    IF sy-subrc = 0.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_OUTPUT'
    EXPORTING
    input = t_hmat-nodename
    IMPORTING
    output = e_t_data-0class_type.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
    EXPORTING
    input = e_t_data-0class_type
    IMPORTING
    output = e_t_data-0class_type.
    ENDIF.
    * Populate data into the extraction structure (removing the 'class type' prefix)
    MOVE : lv_hieid TO e_t_data-hieid,
    t_flathier-lv2_name TO e_t_data-xhier_lv1,
    t_flathier-lv3_name TO e_t_data-xhier_lv2,
    t_flathier-lv4_name TO e_t_data-xhier_lv3,
    t_flathier-lv5_name TO e_t_data-xhier_lv4,
    t_flathier-lv6_name TO e_t_data-xhier_lv5,
    t_flathier-lv7_name TO e_t_data-xhier_lv6,
    t_flathier-lv8_name TO e_t_data-xhier_lv7,
    t_flathier-lv9_name TO e_t_data-xhier_lv8,
    t_flathier-lv10_name TO e_t_data-xhier_lv9,
    t_flathier-lv11_name TO e_t_data-xhie_lv10,
    t_flathier-material TO e_t_data-0MATL_GROUP.
    APPEND e_t_data.
    CLEAR e_t_data.
    ENDLOOP.
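    * Note: the loop above appends every selected row in this first call, and
    * the counter logic below then raises NO_MORE_DATA, so RSA3 receives all
    * records as a single packet.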
    s_counter_datapakid = s_counter_datapakid + 1.
    IF s_counter_datapakid > 1 .
    RAISE no_more_data.
    ENDIF.
    ENDIF. "Initialization mode or data extraction ?
    ENDFUNCTION.
    As it stands, when I run it in transaction RSA3 it gives only one data packet of some 5k to 6k records.
    Thanks in advance for your help.
    Pawan.

    Hi PS,
    Instead of
    SELECT * FROM /BI0/HMATL_GROUP INTO CORRESPONDING FIELDS OF
    TABLE t_hmat
    WHERE hieid IN l_r_hieid
    AND objvers = 'A' .
    the code should look like this:
          IF S_COUNTER_DATAPAKID = 0.
            OPEN CURSOR WITH HOLD S_CURSOR FOR
              SELECT (S_S_IF-T_FIELDS) FROM /BI0/HMATL_GROUP
                WHERE HIEID IN L_R_HIEID AND OBJVERS = 'A'.
          ENDIF.
          FETCH NEXT CURSOR S_CURSOR
            APPENDING CORRESPONDING FIELDS OF TABLE E_T_DATA
            PACKAGE SIZE S_S_IF-MAXSIZE.
          IF SY-SUBRC <> 0.
            CLOSE CURSOR S_CURSOR.
            RAISE NO_MORE_DATA.
          ENDIF.
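    With this pattern each FETCH returns at most I_MAXSIZE rows, so every call delivers one packet. The counter-based RAISE NO_MORE_DATA at the end of your function should then be removed, because end of data is signalled by the FETCH returning SY-SUBRC <> 0.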
    For more information refer to sample code of fm "RSAX_BIW_GET_DATA_SIMPLE"
    Hope that helps.
    Regards
    Mr Kapadia
    ***Assigning points is the way to say thanks in SDN.***

  • LSMW Master Data Load issue for UOM.

    Hello Friends,
    I hope that someone out there can provide an answer to my question.
    I am attempting to use LSMW to load alternate units of measure via MM02. The main data is already in the system.
    My Flat file has 2 levels.
    Header--> contains Material Number
    Line --> contains data relevant to each UOM that I want to enter.
    When I do the Direct Input Method, I get the following message:
    "The material cannot be maintained since no maintainable data transferred"
    Here is the format of the flat file.
    SOURCEHEAD HEADER
    IND C(001) INDICATOR
    Identifying Field Content: H
    MATNR C(018) Material
    SOURCELINE UOM DATA
    IND C(001) INDICATOR
    Identifying Field Content: L
    MAT C(018) MAT
    UMREN C(005) Denominator for conversion to base units of measure
    MEINH C(003) Alternate Unit of measure sku
    UMREZ C(005) numerator for conversion to base units of measure
    EAN11 C(018) International Article Number (EAN/UPC)
    NUMTP C(002) Category of International Article Number (EAN)
    LAENG C(013) Length
    BREIT C(013) Width
    HOEHE C(013) Height
    MEABM C(003) Unit of Dimension for Length/Width/Height
    VOLUM C(013) Volume
    VOLEH C(003) Volume unit
    BRGEW C(013) Gross weight
    NTGEW C(013) Net weight
    GEWEI C(003) Weight Unit
    When I manually process the data, I have no issues.
    I am thinking that I may just be missing some piece of information necessary to get this data processed, but I cannot see what the answer is.
    Any Help that can be provided would be great.
    Regards,
    Christopher

    Hello,
    You need to map BMMH1 along with BMMH6.
    Map TCODE & MATNR in BMMH1 with XEIK1 = 'X' (Basic Data). Subsequently, map the fields of structure BMMH6.
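    As a rough, untested illustration of that mapping (the structure/field placement follows the suggestion above, and the source field names come from the flat file definition in the question), the LSMW field-mapping code could look something like this:
    * Untested sketch - placement of TCODE/MATNR/XEIK1 in BMMH1 follows the
    * suggestion above; SOURCEHEAD/SOURCELINE are the source structures.
    BMMH1-TCODE = 'MM02'.
    BMMH1-MATNR = SOURCEHEAD-MATNR.
    BMMH1-XEIK1 = 'X'.                  " Basic Data view
    * Alternate-unit-of-measure fields map 1:1 from the line record to BMMH6
    BMMH6-MEINH = SOURCELINE-MEINH.     " alternative unit of measure
    BMMH6-UMREZ = SOURCELINE-UMREZ.     " numerator for conversion
    BMMH6-UMREN = SOURCELINE-UMREN.     " denominator for conversion
    BMMH6-EAN11 = SOURCELINE-EAN11.     " EAN/UPC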
    I just made a test now and it works fine for me.
    Hope this helps.
    Best Regards, Murugesh AS

  • Data control issue for content repository when running apps in WebCenter.

    Hi All,
    I have created a content repository connection in my local JDeveloper and
    exposed it as a data control.
    From the data control I am displaying some paths and names based on a search criterion.
    Whenever I run this application I get the following exceptions and no data is displayed.
    Since I have defined the connection for the content server locally in my JDeveloper, do I need to create a JNDI resource
    on the server side?
    TestContentServer is the content repository connection I have created in JDeveloper.
    If yes, please tell me how I can do that and how it will map to my data control.
    [2010-10-05T09:34:39.245-07:00] [wc_custom] [WARNING] [] [oracle.adf.controller.faces.lifecycle.Utils] [tid: [ACTIVE].ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: anonymous] [ecid: 0000IhxFdvi4ulWpTwp2ic1CemrZ0000fT,0:1] [WEBSERVICE_PORT.name: WSRP_v2_Markup_Service] [APP: application1] [J2EE_MODULE.name: TestContentService-ViewController-context-root] [WEBSERVICE.name: WSRP_v2_Service] [J2EE_APP.name: application1] ADF: Adding the following JSF error message: TestContentServer[[
    javax.naming.NameNotFoundException: TestContentServer; remaining name 'TestContentServer'
    Thanks,
    Arun

    Hi Yanic,
    1. Code for my jspx page:
    <?xml version='1.0' encoding='UTF-8'?>
    <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1"
    xmlns:f="http://java.sun.com/jsf/core"
    xmlns:h="http://java.sun.com/jsf/html"
    xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
    <jsp:directive.page contentType="text/html;charset=UTF-8"/>
    <f:view>
    <af:document id="d1">
    <af:messages id="m1"/>
    <af:form id="f1">
    <af:inputText value="#{bindings.path.inputValue}" simple="true"
    required="#{bindings.path.hints.mandatory}"
    columns="#{bindings.path.hints.displayWidth}"
    maximumLength="#{bindings.path.hints.precision}"
    shortDesc="#{bindings.path.hints.tooltip}" id="it1">
    <f:validator binding="#{bindings.path.validator}"/>
    </af:inputText>
    <af:inputText value="#{bindings.type.inputValue}" simple="true"
    required="#{bindings.type.hints.mandatory}"
    columns="#{bindings.type.hints.displayWidth}"
    maximumLength="#{bindings.type.hints.precision}"
    shortDesc="#{bindings.type.hints.tooltip}" id="it2">
    <f:validator binding="#{bindings.type.validator}"/>
    </af:inputText>
    <af:commandButton actionListener="#{bindings.getItems.execute}"
    text="getItems"
    disabled="#{!bindings.getItems.enabled}" id="cb1"
    partialSubmit="true"/>
    <af:table value="#{bindings.Items.collectionModel}" var="row"
    rows="#{bindings.Items.rangeSize}"
    emptyText="#{bindings.Items.viewable ? 'No data to display.' : 'Access Denied.'}"
    fetchSize="#{bindings.Items.rangeSize}"
    rowBandingInterval="0" id="t1" partialTriggers="::cb1">
    <af:column sortProperty="name" sortable="false"
    headerText="#{bindings.Items.hints.name.label}" id="c2">
    <af:inputText value="#{row.bindings.name.inputValue}"
    label="#{bindings.Items.hints.name.label}"
    required="#{bindings.Items.hints.name.mandatory}"
    columns="#{bindings.Items.hints.name.displayWidth}"
    maximumLength="#{bindings.Items.hints.name.precision}"
    shortDesc="#{bindings.Items.hints.name.tooltip}"
    id="it3">
    <f:validator binding="#{row.bindings.name.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="path" sortable="false"
    headerText="#{bindings.Items.hints.path.label}" id="c3">
    <af:inputText value="#{row.bindings.path.inputValue}"
    label="#{bindings.Items.hints.path.label}"
    required="#{bindings.Items.hints.path.mandatory}"
    columns="#{bindings.Items.hints.path.displayWidth}"
    maximumLength="#{bindings.Items.hints.path.precision}"
    shortDesc="#{bindings.Items.hints.path.tooltip}"
    id="it8">
    <f:validator binding="#{row.bindings.path.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="URI" sortable="false"
    headerText="#{bindings.Items.hints.URI.label}" id="c5">
    <af:inputText value="#{row.bindings.URI.inputValue}"
    label="#{bindings.Items.hints.URI.label}"
    required="#{bindings.Items.hints.URI.mandatory}"
    columns="#{bindings.Items.hints.URI.displayWidth}"
    maximumLength="#{bindings.Items.hints.URI.precision}"
    shortDesc="#{bindings.Items.hints.URI.tooltip}"
    id="it7">
    <f:validator binding="#{row.bindings.URI.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="primaryType" sortable="false"
    headerText="#{bindings.Items.hints.primaryType.label}"
    id="c7">
    <af:inputText value="#{row.bindings.primaryType.inputValue}"
    label="#{bindings.Items.hints.primaryType.label}"
    required="#{bindings.Items.hints.primaryType.mandatory}"
    columns="#{bindings.Items.hints.primaryType.displayWidth}"
    maximumLength="#{bindings.Items.hints.primaryType.precision}"
    shortDesc="#{bindings.Items.hints.primaryType.tooltip}"
    id="it6">
    <f:validator binding="#{row.bindings.primaryType.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="title" sortable="false"
    headerText="#{bindings.Items.hints.title.label}" id="c1">
    <af:inputText value="#{row.bindings.title.inputValue}"
    label="#{bindings.Items.hints.title.label}"
    required="#{bindings.Items.hints.title.mandatory}"
    columns="#{bindings.Items.hints.title.displayWidth}"
    maximumLength="#{bindings.Items.hints.title.precision}"
    shortDesc="#{bindings.Items.hints.title.tooltip}"
    id="it9">
    <f:validator binding="#{row.bindings.title.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="docType" sortable="false"
    headerText="#{bindings.Items.hints.docType.label}" id="c6">
    <af:inputText value="#{row.bindings.docType.inputValue}"
    label="#{bindings.Items.hints.docType.label}"
    required="#{bindings.Items.hints.docType.mandatory}"
    columns="#{bindings.Items.hints.docType.displayWidth}"
    maximumLength="#{bindings.Items.hints.docType.precision}"
    shortDesc="#{bindings.Items.hints.docType.tooltip}"
    id="it5">
    <f:validator binding="#{row.bindings.docType.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="displayName" sortable="false"
    headerText="#{bindings.Items.hints.displayName.label}"
    id="c4">
    <af:inputText value="#{row.bindings.displayName.inputValue}"
    label="#{bindings.Items.hints.displayName.label}"
    required="#{bindings.Items.hints.displayName.mandatory}"
    columns="#{bindings.Items.hints.displayName.displayWidth}"
    maximumLength="#{bindings.Items.hints.displayName.precision}"
    shortDesc="#{bindings.Items.hints.displayName.tooltip}"
    id="it4">
    <f:validator binding="#{row.bindings.displayName.validator}"/>
    </af:inputText>
    </af:column>
    </af:table>
    </af:form>
    </af:document>
    </f:view>
    </jsp:root>
    2. Code for the binding (page definition):
    <?xml version="1.0" encoding="UTF-8" ?>
    <pageDefinition xmlns="http://xmlns.oracle.com/adfm/uimodel"
    version="11.1.1.56.60" id="Test1PageDef"
    Package="com.heiwip.cs.view.pageDefs">
    <parameters/>
    <executables>
    <variableIterator id="variables">
    <variable Type="java.lang.String" Name="getItems_path"
    IsQueriable="false"/>
    <variable Type="java.lang.String" Name="getItems_type"
    IsQueriable="false"/>
    </variableIterator>
    <methodIterator Binds="getItems.result" DataControl="TestDC1" RangeSize="25"
    BeanClass="com.heiwip.cs.view.TestDC1.getItems_return"
    id="getItemsIterator"/>
    </executables>
    <bindings>
    <methodAction id="getItems" RequiresUpdateModel="true" Action="invokeMethod"
    MethodName="getItems" IsViewObjectMethod="false"
    DataControl="TestDC1" InstanceName="TestDC1"
    ReturnName="TestDC1.methodResults.getItems_TestDC1_getItems_result">
    <NamedData NDName="path" NDType="java.lang.String"
    NDValue="${bindings.getItems_path}"/>
    <NamedData NDName="type" NDType="java.lang.String"
    NDValue="${bindings.getItems_type}"/>
    </methodAction>
    <attributeValues IterBinding="variables" id="path">
    <AttrNames>
    <Item Value="getItems_path"/>
    </AttrNames>
    </attributeValues>
    <attributeValues IterBinding="variables" id="type">
    <AttrNames>
    <Item Value="getItems_type"/>
    </AttrNames>
    </attributeValues>
    <attributeValues IterBinding="getItemsIterator" id="displayName">
    <AttrNames>
    <Item Value="displayName"/>
    </AttrNames>
    </attributeValues>
    <attributeValues IterBinding="getItemsIterator" id="name">
    <AttrNames>
    <Item Value="name"/>
    </AttrNames>
    </attributeValues>
    <attributeValues IterBinding="getItemsIterator" id="path1">
    <AttrNames>
    <Item Value="path"/>
    </AttrNames>
    </attributeValues>
    </bindings>
    </pageDefinition>
    3. The table is displayed with no data.
    Whenever I try to pass something in the search criteria to get a result, it
    throws a connection error.
    4. I am using Oracle UCM (Oracle Content Server).
    For this I am using identity propagation.
    Thanks,
    Arun.

  • I am having an issue with getting data usage alerts for my iPhone 4s

    I am having an issue with getting data usage alerts for my iPhone 4s from AT&T. I do not download anything huge at all.
    I looked into it and figured out that the phone dials out at 12:29am every night. I went into Settings > General > About > Diagnostics & Usage > Diagnostics & Usage Data to see this. I then selected Don't Send... but I am still getting usage alerts. Can anyone help me please...
    Thanks

    Honestly, from reading the thread linked, they all come off as a bunch of whiney people that cannot be bothered to help themselves.
    Little to nothing in that thread indicates an issue beyond inept consumers.  Yes, I read several pages on the incessant gripes.  Very few made any actual attempts to troubleshoot issues before whining about the "Apple issue" and those that did actual troubleshooting got their issues resolved.
    So no, Apple has nothing to fix beyond a few specific devices that are experiencing hardware issues.
    If you have actually put forth effort and done the basic troubleshooting, take the device to Apple for evaluation and possible replacement.  Whining will get nothing accomplished.

  • Issue for creation date of Billing document in third party ???

    Hello SAP Folks,
    This issue relates to the third-party scenario. The basic requirement is to create a billing document with a creation date the same as the
    goods issue date of the sales order.
    For your information, with the help of an enhancement we have passed the posting date of the material document generated
    from MIGO on to the goods issue date in the sales order. I am very much aware that setting up a new copying routine would help me, but
    relating it to the GI date field of the sales order from the technical point of view is a complete work stopper. Could anyone throw light on
    a workaround for mapping this case ???
    Regards,
    Sarthak

    Hello,
    Thanks Alex for your input on the topic !!
    But I would like to say that the billing relevance of the item category has to be "B", as this third-party sales process involves creating the billing document after MIGO. And I am basically interested in populating the posting date of MIGO as the billing creation date.
    In the sales order, the GI date is modified to the posting date of MIGO through an enhancement. I have been looking at the relevant tables to fetch the GI date of the sales order, but I am not getting on the right track...
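    For what it is worth, the direction I was exploring is a custom data-transfer routine in copy control (VOFM), roughly like the untested sketch below. The assumption that the enhanced GI date can be read from the schedule line field VBEP-WADAT (and the preceding order from VBRP-AUBEL) is mine and still needs to be verified.
    * Untested sketch of a custom VOFM data-transfer routine (copy control
    * sales order -> billing document). Routine number and field sources
    * are assumptions.
    FORM daten_kopieren_901.
      DATA lv_gi_date TYPE vbep-wadat.
    * Read the GI date that the MIGO enhancement wrote into the schedule line
      SELECT MIN( wadat ) FROM vbep
        INTO lv_gi_date
        WHERE vbeln = vbrp-aubel          " preceding sales order of the item
          AND wadat <> '00000000'.
      IF lv_gi_date IS NOT INITIAL.
        vbrk-fkdat = lv_gi_date.          " billing date = goods issue date
      ENDIF.
    ENDFORM.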
    Could anyone provide any other workaround to implement this sort of requirement ???
    Eager to hear from Lakshmipathi ...
    Regards,
    Sarthak

  • Reminder app not working correctly, issue started around Sept. 1, 2014. Will not allow me to set a date and time for a reminder.

    Today I noticed an issue with my Reminders app on my iPhone 5s, using iOS 7.1.2. I called Apple, and the tech didn't know what he was doing and was just trying different things. The Reminders app will not let me set a date and time for a reminder. Finally he escalated the call to a senior advisor who was rude, arrogant and unprofessional. He insisted the Reminder App was updated with the new update of IOS September/October 2013. Not true, the issue just started about 5 days ago August 31,2014. Now there is an additional reminder line for reminder entries that I have to use, and I have to transfer all my reminders manually from the other part of the app. The feature does exist; if I try to set a date and time for the reminder, it appears for 1-2 seconds and then disappears.
    Has anyone else noticed this issue?

    He insisted the Reminder App was updated with the new update of IOS September/October 2013. Not true
    Yes, it is true.
    iOS 7 was released 18 Sep 2013. At this time, the Reminders app was updated (with most everything else).
    the issue just started about 5 days ago August 31,2014.
    Your issue just started but the app was not updated 5 days ago.
    Now there is an additional reminder line for reminder entries that I have to use
    I don’t understand what this means...
    Open Reminders, type the reminder, tap the i at the end, turn on Remind me on a day and enter the date/time.

  • ECC to CRM Mapping Issue for vendor data

    Hi,
    I am facing an issue replicating vendor data from ECC to CRM. One of the vendors is not confirmed on the ECC side, and one vendor's time zone is not maintained in the time zone master data on the CRM side. This may be the cause.
    But where exactly can we see the mapping of the tables between ECC and CRM? That way we can maintain the mandatory data in future based on the mapping and avoid BDoc failures.

    Hello Rahul ,
    The field (MARA-BISMT) is not used in the CRM system. If you want this
    field to be maintained in CRM as well, and to be transferred from R/3
    to CRM, you must enhance the standard with customer extensions.
    The procedure for customer enhancements for Download from R/3
    to CRM is described in the following document:
    SAPNet, alias crm,   > Media Library   > Documentation   >
    Key Capabilities   > Master Data   > CRM Product: Customer
    Enhancements for Download/Upload
    For further information about Product Master enhancements and how to
    find technical documentation about them, please refer to the attached
    Note 428989.
    Thanks & regards,
    Krishnen

  • Has anyone found a solution for iPhone 5 data leak issues?

    Up until about a week ago I was using a 3GS, and the data leak issues seemed to be fixed with the newest iOS 6 update. However, I recently got an iPhone 5 and I've noticed it uses around 1 MB per hour no matter what I'm actually doing on the phone. I went to sleep last night, turning off cellular data AND wifi, and it STILL used about 4 MB of data!! What is up with this?? I am a pretty conservative user of data when not on wifi, but I'm only 2 days into my bill cycle and already on pace to go over my 2 GB limit by the end of the month. Please help! I do not want to switch my plan and pay more! I am on AT&T, by the way.

    Have you tried these basic troubleshooting steps?
    Restart / Reset
    http://support.apple.com/en-us/HT201559
    Restore from backup
    Restore as new
    http://support.apple.com/en-us/HT201252
    If no joy, make an appointment with the Apple genius bar for an evaluation.

  • Issue while Installing Oracle Data Access Software for Windows

    All,
    I am getting the following error while installing Oracle Data Access Software for Windows. I am installing on Windows XP, with the Oracle 9i Release 9.2.0.7.0 database and client in the same box.
    It shows:
    The specified key was not found while trying to GetValue
    * Stop installation of all products
    * Stop installation of this component only.
    Kindly let me know why this error is showing up.
    Regards
    Ramesh

    Most probably you have hit this issue:
    "If you have more than one Oracle Home installed on the same machine (e.g. Oracle8i client and Oracle9i Release 2 client), use the Oracle Home Selector to run your applications with Oracle9i Release 2 client. "
    As documented on the Oracle Data Access Software for Windows. Release 9.2.0.4.0
    ~ Madrid.
