Link to Segment

Dear Experts,
I have a Z assignment block (config table) with SEGMENT_DESCR entries. Now I want to create a link to the OVELSegments view. I already did this for Partner to get to the Partner Overview, which works fine. For that I defined the
getter - get_p_partnerid,
event handler - eh_ontopartner,
view outbound plug - op_partner and
window outbound plug - op_partner.
In the event handler I call the method cl_crm_uiu_bt_partner=>get_partner_navigation_advs to create my navigation collection.
Is there something similar for segments? Do you have an idea how to create this link?
Thanks and best regards,
Tobias Meisersick

Hi Masood,
thanks for your answer. I have now used the method you mentioned, but it does not have any effect. Here is my event handler method:
METHOD eh_ontosegment.
  DATA: lr_desc_object     TYPE REF TO if_bol_bo_property_access,
        lr_data_collection TYPE REF TO if_bol_bo_col,
        lr_nav             TYPE REF TO if_crm_ui_navigation_service,
        lr_entity          TYPE REF TO cl_crm_bol_entity,
        lr_listent         TYPE REF TO if_bol_bo_property_access,
        lr_core            TYPE REF TO cl_crm_bol_core,
        lv_desc            TYPE string,
        lv_guid            TYPE crmt_targetgrp_guid,
        lv_index           TYPE i.

* Get the index of the clicked row in the result list
  cl_thtmlb_util=>get_event_info(
    EXPORTING iv_event = htmlb_event_ex
    IMPORTING ev_index = lv_index ).

* Get the entity of the result list
  lr_listent ?= me->typed_context->zphkolfull->get_bo_by_index(
                    iv_index        = lv_index
                    iv_change_focus = abap_false ).
  CHECK lr_listent IS BOUND.

* Get the target group description
  lv_desc = lr_listent->get_property_as_string( 'TARGETGROUPID' ).

* Get the GUID of the segment
  SELECT SINGLE guid FROM crmd_mkttg_tg_t INTO lv_guid
    WHERE tg_descr = lv_desc.

* Get the BOL core instance and the root entity
  lr_core = cl_crm_bol_core=>get_instance( ).
  lr_entity = lr_core->get_root_entity( iv_object_name = 'SEGTg'
                                        iv_object_guid = lv_guid ).
  CHECK lr_entity IS BOUND.

* Create the UI object descriptor for the entity
  lr_desc_object = cl_crm_ui_descriptor_obj_srv=>create_entity_based(
                       ir_entity           = lr_entity
                       iv_ui_object_type   = 'SEG_TARGETGROUP'
                       iv_ui_object_action = 'B' ).          "display

* Create a BOL collection to be passed to the inbound plug of
* the called component and add the UI descriptor to it
  CREATE OBJECT lr_data_collection TYPE cl_crm_bol_bo_col.
  lr_data_collection->add( lr_desc_object ).

* Navigate to the target component via the navigation service
  lr_nav = cl_crm_ui_navigation_service=>get_instance( ).
  IF lr_nav->is_dynamic_nav_supported( lr_desc_object ) = abap_true.
    lr_nav->navigate_dynamically( lr_data_collection ).
  ENDIF.
ENDMETHOD.
While debugging I saw that this code runs to the end with no exception!
Is it right that I only have to code the GET_P method of the attribute plus my event handler?
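For reference, the GET_P method usually just has to declare the field as an event link and name the event. A minimal sketch (the attribute and event names below are assumptions based on your context node; adjust them to your component):

```abap
METHOD get_p_targetgroupid.
* Sketch: render the TARGETGROUPID attribute as a clickable event link.
* Attribute and event names are assumptions; adjust to your context node.
  CASE iv_property.
    WHEN if_bsp_wd_model_setter_getter=>fp_fieldtype.
      rv_value = cl_bsp_dlc_view_descriptor=>field_type_event_link.
    WHEN if_bsp_wd_model_setter_getter=>fp_onclick.
      rv_value = 'ONTOSEGMENT'.     "dispatches to eh_ontosegment
  ENDCASE.
ENDMETHOD.
```

With this in place, clicking the field raises the event and the eh_ontosegment handler takes over.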
Thanks,
Tobias Meisersick

Similar Messages

  • Projects data by GL segments

    Hi All,
    1. We have a requirement to pull project expenditure cost, revenue, budget and forecast data from Oracle with respect to GL code combinations. As expenditures and revenues are posted to GL (also based on cost distribution lines and the PA events table), we are able to pull cost and revenue for all GL code combinations.
    But we also need to pull the budget and forecast information on a GL code combination basis. Can you please let me know how we can link GL segments with budget and forecast data?
    2. We need to list the above data on a monthly basis. As I am extracting cost and revenue from GL, I can list the data monthly.
    We also need to provide the budget and forecast data monthly, but we are using PA weekly periods for budget and forecast data. Can I use PA_PERIODS_ALL.GL_PERIOD_NAME to group the data monthly, although we are budgeting weekly?
    3. Last question: how is GL_PERIOD_NAME populated in the PA_PERIODS_ALL table?
    Thanks a lot for your time and help in advance.

    Hi Dina,
    Thanks a lot for your time and sorry for the delay.
    I understand what you are saying. Let me elaborate on my issue so that you can suggest another approach. I am working on a few extracts for Hyperion, for which I need to extract the data from EBS R12. For Hyperion, all the data files need to be sent by GL account (like segment1: company, segment2: cost center, segment4: account, etc., plus amount) for a particular period. For example, for AR data I am extracting all the revenue invoice amounts that were posted to GL, by GL account and period.
    In the same way, I need to extract all the projects-related data by GL account and period. I am able to extract the project actuals (expenditures) by GL account. Similarly, I also need to extract the project budget and forecast amounts by GL account and period.
    I tried to implement the solution you proposed. I think I can extract the first three segments (company, location, cost center) based on the project type and org; the only issue is getting segment4 (account). We have one accounting rule set up for the expenditure types, so I thought of deriving the account based on the expenditure types. But I found that budget and forecast are allocated by (task, resource list). Some of the resource members have an expenditure type associated with them, but some of them don't, so I cannot rely on the expenditure type accounting rule.
    Can you please let me know if I went wrong anywhere and suggest the right path.
    Thanks again for your time.

  • DTW Template for Segmented COA

    Hi All
    I want to know whether it is possible to import segmented COA data, including Active and Title accounts.
    When I try, I am able to import only Titles, because I can create a Title account without a segment.
    But in the case of Active accounts, the fields I used in the template are
    Segment_0, Segment_1, . . . . . AccntName etc., but in DTW all the segment fields were unmapped.
    Can anybody provide me a template for a segmented COA?
    Thanks
    Anubha

    Hi Anubha,
    Refer to these links:
    Importing segmented COA
    Uploading of Chart of Accounts through DTW
    DTW Chart of Accounts error
    Thanks,
    Srujal patel

  • Segment & Profit Center Activation

    Hello Gurus,
    Could you please forward me some good links on Segment & Profit Center activation,
    i.e. how and where to configure segments,
    and how and where to configure Profit Center activation?
    If you can help me with this, it would be very kind of you.
    Thanks in Anticipation,
    Warm Regards,
    Debojit Dey

    Hi,
    The Profit Center field should normally be available in the sales order line item on the Account Assignment tab. If it is not visible, please check the field status group of the revenue GL code (the Profit Center field should be made optional). Also check the settings of movement type 601 (I assume this is the movement type used when PGI is done for this sales order).
    Please revert if it helped.
    Regards
    Vineet
    Edited by: Vineet Bhardwaj on May 1, 2010 7:31 PM

  • How to change Field SEGMENT in Profit Centre Master

    Dear Experts,
    We have wrongly created a profit centre with the wrong segment in the master, and we have already posted some transactions with this profit centre. Now we want to correct the SEGMENT field, but the field is greyed out when using transaction KSE2.
    Please advise how we can change the segment field.
    Regards,
    Alok

    Hi
    Please go through the links below:
    Modify segment in Profit Center
    Segment in document posting
    Hope they help you solve your problem.
    Regards
    Praveen P C

  • Profitability Segment : FI Linkage to PA

    Dear All,
    Could you help me link the profitability segment number from the BSEG table to the CO-PA area?
    Example: FI document no. 1400003 (in this document the profitability segment value is 12789).
    CO: with respect to the above document, the document posted to PA is 100569.
    How does the system relate profitability segment 12789 to CO-PA document 100569, given that we don't have a master table for profitability segments?
    Thanks in advance.
    Regards,
    Venkat

    Hello,
    The profitability segment derived in the FI document follows the CO-PA derivation strategy.
    Your CO-PA document is generated with the profitability segment you mentioned.
    There is no direct relationship between the profitability segment and the CO-PA document number.
    When you display the mentioned CO-PA document, the system shows the full information, including the profitability segment.
    Hope I have clarified your doubts.
    Please revert for further clarifications, if any.
    Thanks,
    Santosh

  • Profitability segment-- Linkage FI to CO

    Dear All,
    Could you help me link the profitability segment number from the BSEG table to the CO-PA area?
    Example: FI document no. 1400003 (in this document the profitability segment value is 12789).
    CO: with respect to the above document, the document posted to PA is 100569.
    How does the system relate profitability segment 12789 to CO-PA document 100569?
    Thanks in advance.
    Regards,
    Venkat

    The link is based on the reference document (RBELN) in transaction KE24.
    Thanks and regards
    Kedar
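    For what it's worth, the characteristics behind a profitability segment number can also be read directly from the CO-PA segment table. A minimal sketch, assuming operating concern 1000 (so the segment table is CE41000; adjust to your operating concern) and that lv_paobjnr holds the segment number from BSEG-PAOBJNR:

    ```abap
    * Sketch: read the characteristics stored for a profitability segment.
    * Assumption: operating concern 1000, hence table CE41000.
    DATA: ls_ce4     TYPE ce41000,
          lv_paobjnr TYPE rkeobjnr.       "BSEG-PAOBJNR of the FI line item

    SELECT SINGLE * FROM ce41000 INTO ls_ce4
      WHERE paobjnr = lv_paobjnr.
    IF sy-subrc = 0.
      " ls_ce4 now contains the characteristic values of the segment
    ENDIF.
    ```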

  • Creating a multiclip sequence, HERE'S A REAL STUMPER for all you Pros...

    Ok,
    this one is a real doozy. We've tried everything we can think of.
    Workflow: Three HDX-900 cameras, jam-synced to clocktime and shooting a sports event at 30p (59.94fps) DVCProHD. Recording onto FireStore-100s and backing up onto DVCProHD tape. Dumping firestore DVCProHD footage directly into FCS2. Everything looks and sounds great. We labeled each angle field in FCP according to the camera (A, B, C). We carefully labelled the reel numbers according to our project (P1031308A0101).
    I selected the 3 bins (each bin holds the footage from one camera) and made a multiclip sequence. It was UNABLE to make a 60-minute multiclip with the footage from the three cameras spread out according to their timecode. It kept trying to make about 30 multiclips, few of them with overlapping timecode. A couple of the clips did sync together into a 3-camera segment, but most of them failed to find each other. We played with the overlap slider and all of the settings in the multiclip window, and at various points FCP tried to offer us 4 or even 5 camera angles of footage, all from our three angles, each with its own embedded and (seemingly) functional TC. It would NOT simply do the obvious: take the free-run timecode, arrange each clip along a linear timeline within that timeframe, and multiclip them together to form a three-angle view of the sports event.
    Why not?????
    Now, one issue we have with this workflow is that the FS-100 drives are unfortunately FAT32, meaning every 2.2 minutes (2 gigs) the FS-100 automatically cuts the footage into 2-gig chunks. But in a regular non-multiclip sequence this split-up doesn't seem to affect the footage or timecode at all, and the first frame of the new clip is one TC frame after the last frame of the previous clip, the way it is supposed to be. I don't know if this is relevant.
    Also, I don't know if this is relevant: We have been shooting in 59.94 fps, 30p, and we have been using the 60@30 method of viewing TC in the viewer and the project settings.
    I am thinking that if there was a way to take all my 2.2 minute clips and put them together into a giant master clip (without exporting and losing quality) then my simplified work environment would allow for a working multiclip.
    There are three of us in the edit room and we have been trying to multiclip this stuff for hours. We need this solved today. Is there anyone who has an idea? I think we've tried everything obvious. Our computer is a new MacPro with 4gb RAM, capturing onto an eSATA hard drive

    Kevan,
    Thanks for the hey hey.
    TonyTony,
    I have lots and lots of 2-minute clips. But we finally figured it out. It's a failing of FCP's multiclip feature to recognize segmented files that originated on a FAT32 drive. Basically, as of FCS2, you simply can't make multiclips from several files that were automatically split into 2-gig segments, even if their timecodes were maintained.
    So the solution: I took each camera's footage, lots and lots of 2-minute clips, and pasted the clips together in QuickTime Pro to make a longer sequence. Then I saved a reference file for the long sequence and brought that into FCP. I did the same for the files from the other 2 cameras, then multiclipped those reference movies.
    The newest version of the FireStore firmware has a setting that lets you create a QuickTime reference movie from within the FireStore that links the segments together. So my workflow for the future involves making these reference movies, bringing only the reference movies into FCP, and multiclipping them. Problem solved.
    Probably 2 people who are reading this understand what I'm talking about; I'm not particularly great at explaining things. But we're back up and running. Thanks for reading.

  • Why is recording a V/O in FCP X such a pain?

    I am trying to lay in a V/O track that I will then cut my video to match. FCP X, though, keeps insisting on linking each segment of my audio to a blank placeholder in the video track! What is this nonsense? Now I have 10 or 12 separate audio clips, each with the little dongle connecting them to NOTHING, yet when I attempt to lay in the video clip that corresponds with the V/O, I CAN'T DO IT! Is there not a way to just lay in a V/O track? Please help.

    Thanks Jim - Your response brings up another thorn in my side.  I think the new layout of these forums really stinks!  I thought that I WAS in the FCP-X forum.  Thanks Jobs....

  • ALE/IDOC[custom table transfer]

    Hi All,
    I have a doubt regarding the type of programs used to post IDocs using ALE in the outbound process.
    Please also tell me which program I should use to post a custom table of mine to another server using ALE/IDoc,
    and the procedure to do the same.
    Please help, it's an urgent requirement.
    Thanks in advance.

    Hi Guru,
    If you cannot use a standard IDoc, then you'll need to create a custom one, starting with the segment (WE31), where you can create an IDoc segment from your own Z table structure using the wizard.
    Then you'll need to create the IDoc type (WE30) and link the segment to the IDoc.
    Then you'll need to create a message type and link the message type to the IDoc type (WE81, WE82).
    Then you'll use the function module MASTER_IDOC_DISTRIBUTE to populate the IDoc and send the data.
    This is sample code, just to give you an idea:
    DATA: Z_SEGNAME(7) TYPE C VALUE 'SEGMENT',    "your segment name (WE31)
          Z_MESTYPE(9) TYPE C VALUE 'MESSAGE',    "your message type (WE81)
          Z_IDOC_TYPE(8) TYPE C VALUE 'IDOC'.     "your IDoc type (WE30)
    DATA: IDOC_CONTROL LIKE EDIDC,
          T_COMM_CONTROL LIKE EDIDC OCCURS 0 WITH HEADER LINE,
          IDOC_DATA LIKE EDIDD OCCURS 0 WITH HEADER LINE,
          L_ZTABLE LIKE ZTABLE OCCURS 0 WITH HEADER LINE.
    *Read data from the Z table
    SELECT *
      FROM ZTABLE
      INTO TABLE L_ZTABLE.
    *Set the control data info required for the distribution
    IDOC_CONTROL-MESTYP = Z_MESTYPE.
    IDOC_CONTROL-DOCTYP = Z_IDOC_TYPE.
    *Populate the IDoc data records
    LOOP AT L_ZTABLE.
      CLEAR IDOC_DATA.
      IDOC_DATA-SEGNAM = Z_SEGNAME.
      IDOC_DATA-SDATA = L_ZTABLE.    "header line of L_ZTABLE
      APPEND IDOC_DATA.
    ENDLOOP.
    *Deliver the IDoc as defined in the distribution model / partner profile
    CALL FUNCTION 'MASTER_IDOC_DISTRIBUTE' IN UPDATE TASK
      EXPORTING
        MASTER_IDOC_CONTROL        = IDOC_CONTROL
      TABLES
        COMMUNICATION_IDOC_CONTROL = T_COMM_CONTROL
        MASTER_IDOC_DATA           = IDOC_DATA
      EXCEPTIONS
        ERROR_IN_IDOC_CONTROL          = 1
        ERROR_WRITING_IDOC_STATUS      = 2
        ERROR_IN_IDOC_DATA             = 3
        SENDING_LOGICAL_SYSTEM_UNKNOWN = 4
        OTHERS                         = 5.
    IF SY-SUBRC = 0.
      COMMIT WORK.
    ENDIF.

  • EDI Message type issue

    Dear Friends,
    In my current project my issue is related to EDI Message type the problem is as below
    If I use output type (message type) NEU, which is SAP standard, and I select this output type with medium EDI, then after saving I can see all the data segments in the outbound IDoc.
    The actual issue is that if I use the custom output type ZNET, which is an extension of NEU, with medium EDI, then after saving, when I check the IDoc (WE02), I am not able to see a few of the purchase order's data segments, like E1EDKT1 (header text) and E1EDPT1 (item text).
    What could be the reasons? Is there any setting where output types and data segments are linked?
    Your help will be highly appreciated
    With Best Regards,
    Ashish Vats

    Hi,
    Thanks for your reply.
    I have maintained all the parameters in the partner profile. I think I could not explain the issue in detail; it is as below.
    When I select output type ZNET in the purchase order with medium EDI (a partner profile for the vendor is maintained for the outbound PO), then after saving, when I check the outbound IDoc generated by the system, some segments are missing, while with output type NEU all the segments come through properly.
    ZNET is the extension of NEU. So my question to the forum is: where are output types linked with segments, so that I can compare ZNET with NEU, find the missing segments, and link them to ZNET?
    Hope I could explain it to you.
    Regards,
    Ashish

  • Help needed in retrieving account description present in PO distributions

    Hi All,
    I have written a query to retrieve purchase orders and their distributions. I need to retrieve the account description related to the distributions. I have got the account number from the gl_code_combinations table (segment1 to segment5). I am not able to proceed further in getting the account description. Please suggest which tables to use and what the link between them is.
    Thanks and Regards,
    Mahesh

    Hi,
    Please find my comments below:
    Thanks for the reply
    Welcome
    If we link fnd_flex_values_vl with gl_code_combinations, the number of records increases, i.e. records are repeating.
    Add a condition on flex_value_set_id.
    Also, the account number will be like 01-0101-0101-1010, so the account number is a flexfield. Then I think we have to use the flexfield tables, but I am not able to find out which tables to use and what the link between them is.
    The account number is a combination of segments. You cannot find the total combination anywhere. You have to take the code combination id, get the segments, link each segment with fnd_flex_values_vl using the particular value_set_id, and display the description against that segment.
    Finally, you have to concatenate all the segment descriptions.
    Regards,
    Sridhar

  • Creating merged clips from the timeline not working, any advice?

    I am syncing HDV video captured with guide audio with four-track audio recorded onto a hard disk recorder. I need to sync in the timeline, as there is not always a clap, but there is plenty of guide sound with the picture. I want to make a merged clip that I can then work with.
    I bring everything in, sync it, and highlight everything, but find that the Link command is dimmed; it is not available.
    The 4-track audio and the video with guide sound both come into the timeline already linked, so I tried unlinking them when I first paste them into the timeline, making them independent clips before I synced them together, but that made no difference. The linking tool (Modify > Link) is dimmed, and thus when I drag the clips into the browser they become many different files instead of one merged clip. Can anyone suggest a reason why this is not working for me?
    I'm running Final Cut Pro 5.0.4.
    thanks very much..

    I think I can answer my own question. It seems that several separate files were made for the audio but only one take was made for the video, and FCP will not allow the clips to be linked if I'm trying to link one take of video with three files of audio (pasted onto the timeline one after the other under the video). But if I cut the video and link the segments one at a time, then I can merge the clips independently. It's not ideal, but it makes sense.

  • Workaround for playing audio under several slides

    There is a workaround (albeit a bit clumsy) for playing audio tracks under a series of slides throughout a presentation. Divide the slideshow into individual slideshow segments (A, B, C, D, etc.) and whenever you need to play audio under a series of slides, create this series as a separate segment, with the audio track running as the "soundtrack." Then, you can link each segment to the next via a "hyperlink."
    BEWARE, however, that although the audio will play throughout the segment, you still must do all of your Builds and Transitions manually (or automatically) because they will NOT playback with the "automation" feature of a "recorded" soundtrack presentation.
    You also will not be able to play (or preview) any "recorded" slideshow from anywhere but the beginning.
    I said it was cumbersome, but it does work ... sort of ...

    They probably need QuickTime 7 installed on their machines. Here's how to install QuickTime 7 on Snow Leopard: http://support.apple.com/kb/HT3678
    To be honest, your best bet is to go to Telestream's website and ask them what the fix for this problem is: http://www.telestream.net/telestream-home.htm

  • Problem with a slow-running query

    Hi all,
    I have a problem. I am using Oracle Database 10g Release 2 Enterprise Edition on Solaris. Some queries from the SQL client normally run smoothly, but today some of them are running very slowly. I captured the AWR and ADDM reports. The particular query that is taking a lot of time is:
    select idtr_broc_code, sum(idtr_val)/10000000 from import_dtr where idtr_rpt_date between '01-APR-10' and '31-DEC-10' and substr(idtr_cty_cons, 3, 3)='187' group by idtr_broc_code order by idtr_broc_code
    Here is the ADDM report:
    FINDING 1: 56% impact (1052 seconds)
    Individual database segments responsible for significant user I/O wait were
    found.
    RECOMMENDATION 1: Segment Tuning, 31% benefit (582 seconds)
    ACTION: Run "Segment Advisor" on TABLE "FTSSDEV.EXPORT_DTR" with object
    id 53014.
    RELEVANT OBJECT: database object with id 53014
    ACTION: Investigate application logic involving I/O on TABLE
    "FTSSDEV.EXPORT_DTR" with object id 53014.
    RELEVANT OBJECT: database object with id 53014
    RATIONALE: The I/O usage statistics for the object are: 8 full object
    scans, 5198937 physical reads, 11923 physical writes and 0 direct
    reads.
    RATIONALE: The SQL statement with SQL_ID "gsvs1ybcp476p" spent
    significant time waiting for User I/O on the hot object.
    RELEVANT OBJECT: SQL statement with SQL_ID gsvs1ybcp476p
    select sum(EDTR_VAL) from export_dtr
    where EDTR_SLDT between '01-apr-2009' and '31-mar-2010'
    RATIONALE: The SQL statement with SQL_ID "25ankjpssqcj6" spent
    significant time waiting for User I/O on the hot object.
    RELEVANT OBJECT: SQL statement with SQL_ID 25ankjpssqcj6
    RATIONALE: The SQL statement with SQL_ID "2hfu33hbrdr72" spent
    significant time waiting for User I/O on the hot object.
    RELEVANT OBJECT: SQL statement with SQL_ID 2hfu33hbrdr72
    RATIONALE: The SQL statement with SQL_ID "71vdzkq9j2j1q" spent
    significant time waiting for User I/O on the hot object.
    RELEVANT OBJECT: SQL statement with SQL_ID 71vdzkq9j2j1q
    RATIONALE: The SQL statement with SQL_ID "b5ww83x3qf8s8" spent
    significant time waiting for User I/O on the hot object.
    RELEVANT OBJECT: SQL statement with SQL_ID b5ww83x3qf8s8
    select edtr_broc_code, sum(edtr_val)/10000000 from export_dtr
    where edtr_sldt between '01-APR-10' and '31-DEC-10'
    and substr(edtr_cty_code,3,3)='187'
    group by edtr_broc_code
    order by edtr_broc_code
    RECOMMENDATION 2: Segment Tuning, 14% benefit (266 seconds)
    ACTION: Run "Segment Advisor" on TABLE "FTSSDEV.IMPORT_DTR" with object
    id 53220.
    RELEVANT OBJECT: database object with id 53220
    ACTION: Investigate application logic involving I/O on TABLE
    "FTSSDEV.IMPORT_DTR" with object id 53220.
    RELEVANT OBJECT: database object with id 53220
    RATIONALE: The I/O usage statistics for the object are: 3 full object
    scans, 1389746 physical reads, 119 physical writes and 0 direct
    reads.
    RATIONALE: The SQL statement with SQL_ID "3k7qjrwwqu1gt" spent
    significant time waiting for User I/O on the hot object.
    RELEVANT OBJECT: SQL statement with SQL_ID 3k7qjrwwqu1gt
    select idtr_broc_code, sum(idtr_val)/10000000 from import_dtr
    where idtr_rpt_date between '01-APR-10' and '31-DEC-10'
    and substr(idtr_cty_cons,3,3)='187'
    group by idtr_broc_code
    order by idtr_broc_code
    RATIONALE: The SQL statement with SQL_ID "c68ts7wh4nq3q" spent
    significant time waiting for User I/O on the hot object.
    RELEVANT OBJECT: SQL statement with SQL_ID c68ts7wh4nq3q
    SELECT COUNT(IDTR_VAL),SUM(IDTR_VAL),COUNT(IDTR_QTY),SUM(IDTR_QTY)
    FROM IMPORT_DTR WHERE idtr_amy_msft = :1 and idtr_itchs = :2 and
    idtr_cty_cons = :3 order by
    idtr_itchs,substr(idtr_cty_cons,3,3),idtr_seq_no
    RATIONALE: The SQL statement with SQL_ID "bf1g8zdfr9bu6" spent
    significant time waiting for User I/O on the hot object.
    RELEVANT OBJECT: SQL statement with SQL_ID bf1g8zdfr9bu6
    SELECT COUNT(IDTR_VAL),SUM(IDTR_VAL),COUNT(IDTR_QTY),SUM(IDTR_QTY)
    FROM IMPORT_DTR WHERE idtr_amy_msft = :1 and idtr_itchs = :2 order
    by idtr_itchs,substr(idtr_cty_cons,3,3),idtr_seq_no
    RECOMMENDATION 3: Segment Tuning, 8.1% benefit (153 seconds)
    ACTION: Run "Segment Advisor" on TABLE "FTSSDEV.EXP_BRC" with object id
    53046.
    RELEVANT OBJECT: database object with id 53046
    ACTION: Investigate application logic involving I/O on TABLE
    "FTSSDEV.EXP_BRC" with object id 53046.
    RELEVANT OBJECT: database object with id 53046
    RATIONALE: The I/O usage statistics for the object are: 38 full object
    scans, 1871936 physical reads, 1957 physical writes and 0 direct
    reads.
    RATIONALE: The SQL statement with SQL_ID "71vdzkq9j2j1q" spent
    significant time waiting for User I/O on the hot object.
    RELEVANT OBJECT: SQL statement with SQL_ID 71vdzkq9j2j1q
    RECOMMENDATION 4: Segment Tuning, 2.7% benefit (51 seconds)
    ACTION: Investigate application logic involving I/O on INDEX
    "FTSSDEV.IDTR_CTY_CONS_IDX" with object id 56342.
    RELEVANT OBJECT: database object with id 56342
    RATIONALE: The I/O usage statistics for the object are: 0 full object
    scans, 21347 physical reads, 22 physical writes and 0 direct reads.
    RATIONALE: The SQL statement with SQL_ID "c68ts7wh4nq3q" spent
    significant time waiting for User I/O on the hot object.
    RELEVANT OBJECT: SQL statement with SQL_ID c68ts7wh4nq3q
    SELECT COUNT(IDTR_VAL),SUM(IDTR_VAL),COUNT(IDTR_QTY),SUM(IDTR_QTY)
    FROM IMPORT_DTR WHERE idtr_amy_msft = :1 and idtr_itchs = :2 and
    idtr_cty_cons = :3 order by
    idtr_itchs,substr(idtr_cty_cons,3,3),idtr_seq_no
    SYMPTOMS THAT LED TO THE FINDING:
    SYMPTOM: Wait class "User I/O" was consuming significant database time.
    (64% impact [1205 seconds])
    FINDING 2: 36% impact (681 seconds)
    SQL statements consuming significant database time were found.
    RECOMMENDATION 1: SQL Tuning, 17% benefit (327 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "9749w96kkmuju".
    RELEVANT OBJECT: SQL statement with SQL_ID 9749w96kkmuju and
    PLAN_HASH 3254866456
    UPDATE EXP_BRC SET E_PORT_ORDER = :B4 , E_PRT_DESC = :B3 WHERE
    E_PORT_CODE = :B2 AND E_AMY_BRC = :B1
    RATIONALE: SQL statement with SQL_ID "9749w96kkmuju" was executed 34
    times and had an average elapsed time of 8.5 seconds.
    RECOMMENDATION 2: SQL Tuning, 7.4% benefit (139 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "cy02xtwqtbw1c".
    RELEVANT OBJECT: SQL statement with SQL_ID cy02xtwqtbw1c and
    PLAN_HASH 475583467
    UPDATE EXPORT_DTR SET EDTR_GROSS_WT = NVL(:1 , EDTR_GROSS_WT )
    ,EDTR_PORT = NVL(:1 , EDTR_PORT ) ,EDTR_ITCHS = NVL(:1 , EDTR_ITCHS )
    ,EDTR_CTY_CODE = NVL(:1 , EDTR_CTY_CODE ) ,EDTR_VAL = NVL(:1 ,
    EDTR_VAL ) ,EDTR_QTY = NVL(:1 , EDTR_QTY ) ,EDTR_BROC_CODE =
    DECODE(:1 , NULL , EDTR_BROC_CODE , '99' ) WHERE EDTR_SEQ_NO =
    :1 AND ( ( TO_NUMBER (TO_CHAR (EDTR_AMY_BRC , 'MM' ) ) =
    TO_NUMBER (:1 ) AND TO_NUMBER (TO_CHAR (EDTR_AMY_BRC , 'RRRR' ) )
    = TO_NUMBER (:1 ) ) OR ( TO_NUMBER (TO_CHAR (EDTR_AMY_MSFT , 'MM'
    ) ) = TO_NUMBER (:1 ) AND TO_NUMBER (TO_CHAR (EDTR_AMY_MSFT ,
    'RRRR' ) ) = TO_NUMBER (:1 ) ) )
    RATIONALE: SQL statement with SQL_ID "cy02xtwqtbw1c" was executed 30069
    times and had an average elapsed time of 0.0029 seconds.
    RECOMMENDATION 3: SQL Tuning, 6.9% benefit (129 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "3k7qjrwwqu1gt".
    RELEVANT OBJECT: SQL statement with SQL_ID 3k7qjrwwqu1gt and
    PLAN_HASH 728549793
    select idtr_broc_code, sum(idtr_val)/10000000 from import_dtr
    where idtr_rpt_date between '01-APR-10' and '31-DEC-10'
    and substr(idtr_cty_cons,3,3)='187'
    group by idtr_broc_code
    order by idtr_broc_code
    RATIONALE: SQL statement with SQL_ID "3k7qjrwwqu1gt" was executed 2
    times and had an average elapsed time of 58 seconds.
    RECOMMENDATION 4: SQL Tuning, 6.3% benefit (120 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "gsvs1ybcp476p".
    RELEVANT OBJECT: SQL statement with SQL_ID gsvs1ybcp476p and
    PLAN_HASH 3481633577
    select sum(EDTR_VAL) from export_dtr
    where EDTR_SLDT between '01-apr-2009' and '31-mar-2010'
    RATIONALE: SQL statement with SQL_ID "gsvs1ybcp476p" was executed 2
    times and had an average elapsed time of 52 seconds.
    RECOMMENDATION 5: SQL Tuning, 4.7% benefit (88 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "7xph3c32acw08".
    RELEVANT OBJECT: SQL statement with SQL_ID 7xph3c32acw08 and
    PLAN_HASH 2314778580
    select substr(idtr_cty_cons,3,3), sum(idtr_val)/10000000 from
    import_dtr
    where idtr_rpt_date between '01-APR-10' and '31-DEC-10'
    and substr(idtr_cty_cons,3,3)='187'
    group by substr(idtr_cty_cons,3,3)
    RATIONALE: SQL statement with SQL_ID "7xph3c32acw08" was executed 1
    times and had an average elapsed time of 79 seconds.
    FINDING 3: 30% impact (558 seconds)
    The SGA was inadequately sized, causing additional I/O or hard parses.
    RECOMMENDATION 1: DB Configuration, 30% benefit (558 seconds)
    ACTION: Increase the size of the SGA by setting the parameter
    "sga_target" to 2688 M.
    ADDITIONAL INFORMATION:
    The value of parameter "sga_target" was "1536 M" during the analysis
    period.
    SYMPTOMS THAT LED TO THE FINDING:
    SYMPTOM: Wait class "User I/O" was consuming significant database time.
    (64% impact [1205 seconds])
    FINDING 4: 19% impact (361 seconds)
    Individual SQL statements responsible for significant user I/O wait were
    found.
    RECOMMENDATION 1: SQL Tuning, 7.4% benefit (139 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "cy02xtwqtbw1c".
    RELEVANT OBJECT: SQL statement with SQL_ID cy02xtwqtbw1c and
    PLAN_HASH 475583467
    UPDATE EXPORT_DTR SET EDTR_GROSS_WT = NVL(:1 , EDTR_GROSS_WT )
    ,EDTR_PORT = NVL(:1 , EDTR_PORT ) ,EDTR_ITCHS = NVL(:1 , EDTR_ITCHS )
    ,EDTR_CTY_CODE = NVL(:1 , EDTR_CTY_CODE ) ,EDTR_VAL = NVL(:1 ,
    EDTR_VAL ) ,EDTR_QTY = NVL(:1 , EDTR_QTY ) ,EDTR_BROC_CODE =
    DECODE(:1 , NULL , EDTR_BROC_CODE , '99' ) WHERE EDTR_SEQ_NO =
    :1 AND ( ( TO_NUMBER (TO_CHAR (EDTR_AMY_BRC , 'MM' ) ) =
    TO_NUMBER (:1 ) AND TO_NUMBER (TO_CHAR (EDTR_AMY_BRC , 'RRRR' ) )
    = TO_NUMBER (:1 ) ) OR ( TO_NUMBER (TO_CHAR (EDTR_AMY_MSFT , 'MM'
    ) ) = TO_NUMBER (:1 ) AND TO_NUMBER (TO_CHAR (EDTR_AMY_MSFT ,
    'RRRR' ) ) = TO_NUMBER (:1 ) ) )
    RATIONALE: SQL statement with SQL_ID "cy02xtwqtbw1c" was executed 30069
    times and had an average elapsed time of 0.0029 seconds.
    RATIONALE: Average time spent in User I/O wait events per execution was
    0.0026 seconds.
    RECOMMENDATION 2: SQL Tuning, 6.9% benefit (129 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "3k7qjrwwqu1gt".
    RELEVANT OBJECT: SQL statement with SQL_ID 3k7qjrwwqu1gt and
    PLAN_HASH 728549793
    select idtr_broc_code, sum(idtr_val)/10000000 from import_dtr
    where idtr_rpt_date between '01-APR-10' and '31-DEC-10'
    and substr(idtr_cty_cons,3,3)='187'
    group by idtr_broc_code
    order by idtr_broc_code
    RATIONALE: SQL statement with SQL_ID "3k7qjrwwqu1gt" was executed 2
    times and had an average elapsed time of 58 seconds.
    RATIONALE: Average time spent in User I/O wait events per execution was
    44 seconds.
    RECOMMENDATION 3: SQL Tuning, 6.3% benefit (120 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "gsvs1ybcp476p".
    RELEVANT OBJECT: SQL statement with SQL_ID gsvs1ybcp476p and
    PLAN_HASH 3481633577
    select sum(EDTR_VAL) from export_dtr
    where EDTR_SLDT between '01-apr-2009' and '31-mar-2010'
    RATIONALE: SQL statement with SQL_ID "gsvs1ybcp476p" was executed 2
    times and had an average elapsed time of 52 seconds.
    RATIONALE: Average time spent in User I/O wait events per execution was
    35 seconds.
    RECOMMENDATION 4: SQL Tuning, 4.6% benefit (87 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "2j4wgybgmsq78".
    RELEVANT OBJECT: SQL statement with SQL_ID 2j4wgybgmsq78 and
    PLAN_HASH 3886674073
    select substr(edtr_cty_code,3,3), sum(edtr_val)/10000000 from
    export_dtr
    where edtr_sldt between '01-APR-10' and '31-DEC-10'
    and substr(edtr_cty_code,3,3)='187'
    group by substr(edtr_cty_code,3,3)
    RATIONALE: SQL statement with SQL_ID "2j4wgybgmsq78" was executed 1
    times and had an average elapsed time of 78 seconds.
    RATIONALE: Average time spent in User I/O wait events per execution was
    61 seconds.
    RECOMMENDATION 5: SQL Tuning, 3.3% benefit (62 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "bf1g8zdfr9bu6".
    RELEVANT OBJECT: SQL statement with SQL_ID bf1g8zdfr9bu6 and
    PLAN_HASH 2952524195
    SELECT COUNT(IDTR_VAL),SUM(IDTR_VAL),COUNT(IDTR_QTY),SUM(IDTR_QTY)
    FROM IMPORT_DTR WHERE idtr_amy_msft = :1 and idtr_itchs = :2 order
    by idtr_itchs,substr(idtr_cty_cons,3,3),idtr_seq_no
    RATIONALE: SQL statement with SQL_ID "bf1g8zdfr9bu6" was executed 41
    times and had an average elapsed time of 1.5 seconds.
    RATIONALE: Average time spent in User I/O wait events per execution was
    1.4 seconds.
    SYMPTOMS THAT LED TO THE FINDING:
    SYMPTOM: Wait class "User I/O" was consuming significant database time.
    (64% impact [1205 seconds])
    FINDING 5: 9.2% impact (173 seconds)
    Cluster multi-block requests were consuming significant database time.
    RECOMMENDATION 1: SQL Tuning, 7.8% benefit (146 seconds)
    ACTION: Investigate the SQL statement with SQL_ID "71vdzkq9j2j1q" for
    possible performance improvements. Look for an alternative plan that
    does not use object scans.
    RELEVANT OBJECT: SQL statement with SQL_ID 71vdzkq9j2j1q
    SYMPTOMS THAT LED TO THE FINDING:
    SYMPTOM: Inter-instance messaging was consuming significant database
    time on this instance. (11% impact [209 seconds])
    SYMPTOM: Wait class "Cluster" was consuming significant database
    time. (11% impact [209 seconds])
    FINDING 6: 9.2% impact (173 seconds)
    Global Cache Service Processes (LMSn) in other instances were not processing
    requests fast enough.
    RECOMMENDATION 1: DB Configuration, 9.2% benefit (173 seconds)
    ACTION: Increase throughput of the Global Cache Service (LMSn)
    processes. Increase the number of Global Cache Service processes by
    increasing the value of the parameter "gcs_server_processes".
    Alternatively, if the host is CPU bound consider increasing the OS
    priority of the Global Cache Service processes.
    RATIONALE: The value of parameter "gcs_server_processes" was "2" during
    the analysis period.
    SYMPTOMS THAT LED TO THE FINDING:
    SYMPTOM: Inter-instance messaging was consuming significant database
    time on this instance. (11% impact [209 seconds])
    SYMPTOM: Wait class "Cluster" was consuming significant database
    time. (11% impact [209 seconds])
    FINDING 7: 9% impact (169 seconds)
    SQL statements responsible for significant inter-instance messaging were found
    RECOMMENDATION 1: SQL Tuning, 17% benefit (327 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "9749w96kkmuju".
    RELEVANT OBJECT: SQL statement with SQL_ID 9749w96kkmuju and
    PLAN_HASH 3254866456
    UPDATE EXP_BRC SET E_PORT_ORDER = :B4 , E_PRT_DESC = :B3 WHERE
    E_PORT_CODE = :B2 AND E_AMY_BRC = :B1
    RATIONALE: SQL statement with SQL_ID "9749w96kkmuju" was executed 34
    times and had an average elapsed time of 8.5 seconds.
    RATIONALE: Average time spent in Cluster wait events per execution was
    3.2 seconds.
    RECOMMENDATION 2: SQL Tuning, 7.4% benefit (139 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "cy02xtwqtbw1c".
    RELEVANT OBJECT: SQL statement with SQL_ID cy02xtwqtbw1c and
    PLAN_HASH 475583467
    UPDATE EXPORT_DTR SET EDTR_GROSS_WT = NVL(:1 , EDTR_GROSS_WT )
    ,EDTR_PORT = NVL(:1 , EDTR_PORT ) ,EDTR_ITCHS = NVL(:1 , EDTR_ITCHS )
    ,EDTR_CTY_CODE = NVL(:1 , EDTR_CTY_CODE ) ,EDTR_VAL = NVL(:1 ,
    EDTR_VAL ) ,EDTR_QTY = NVL(:1 , EDTR_QTY ) ,EDTR_BROC_CODE =
    DECODE(:1 , NULL , EDTR_BROC_CODE , '99' ) WHERE EDTR_SEQ_NO =
    :1 AND ( ( TO_NUMBER (TO_CHAR (EDTR_AMY_BRC , 'MM' ) ) =
    TO_NUMBER (:1 ) AND TO_NUMBER (TO_CHAR (EDTR_AMY_BRC , 'RRRR' ) )
    = TO_NUMBER (:1 ) ) OR ( TO_NUMBER (TO_CHAR (EDTR_AMY_MSFT , 'MM'
    ) ) = TO_NUMBER (:1 ) AND TO_NUMBER (TO_CHAR (EDTR_AMY_MSFT ,
    'RRRR' ) ) = TO_NUMBER (:1 ) ) )
    RATIONALE: SQL statement with SQL_ID "cy02xtwqtbw1c" was executed 30069
    times and had an average elapsed time of 0.0029 seconds.
    RATIONALE: Average time spent in Cluster wait events per execution was
    0.00021 seconds.
    RECOMMENDATION 3: SQL Tuning, 6.3% benefit (120 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "gsvs1ybcp476p".
    RELEVANT OBJECT: SQL statement with SQL_ID gsvs1ybcp476p and
    PLAN_HASH 3481633577
    select sum(EDTR_VAL) from export_dtr
    where EDTR_SLDT between '01-apr-2009' and '31-mar-2010'
    RATIONALE: SQL statement with SQL_ID "gsvs1ybcp476p" was executed 2
    times and had an average elapsed time of 52 seconds.
    RATIONALE: Average time spent in Cluster wait events per execution was
    6.2 seconds.
    RECOMMENDATION 4: SQL Tuning, 4.7% benefit (88 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "7xph3c32acw08".
    RELEVANT OBJECT: SQL statement with SQL_ID 7xph3c32acw08 and
    PLAN_HASH 2314778580
    select substr(idtr_cty_cons,3,3), sum(idtr_val)/10000000 from
    import_dtr
    where idtr_rpt_date between '01-APR-10' and '31-DEC-10'
    and substr(idtr_cty_cons,3,3)='187'
    group by substr(idtr_cty_cons,3,3)
    RATIONALE: SQL statement with SQL_ID "7xph3c32acw08" was executed 1
    times and had an average elapsed time of 79 seconds.
    RATIONALE: Average time spent in Cluster wait events per execution was
    32 seconds.
    RECOMMENDATION 5: SQL Tuning, 3.4% benefit (64 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "21b7t3t31uwv7".
    RELEVANT OBJECT: SQL statement with SQL_ID 21b7t3t31uwv7 and
    PLAN_HASH 747082824
    INSERT INTO EXP_GRP_ADD_TEMP(ID, ITC2, ITC4, ITC6, ITC8, GRP) SELECT
    ROWID, SUBSTR(EDTR_ITCHS,1,2), SUBSTR(EDTR_ITCHS,1,4),
    SUBSTR(EDTR_ITCHS,1,6), EDTR_ITCHS, EDTR_BROC_CODE FROM EXPORT_DTR
    WHERE TO_CHAR(EDTR_AMY_BRC,'DD-MM-YYYY') = :B3 AND EDTR_BROC_CODE =
    '99' AND EDTR_TOD BETWEEN :B2 AND :B1
    RATIONALE: SQL statement with SQL_ID "21b7t3t31uwv7" was executed 1
    times and had an average elapsed time of 61 seconds.
    RATIONALE: Average time spent in Cluster wait events per execution was 8
    seconds.
    SYMPTOMS THAT LED TO THE FINDING:
    SYMPTOM: Wait class "Cluster" was consuming significant database time.
    (11% impact [209 seconds])
    FINDING 8: 2.4% impact (45 seconds)
    Wait event "Streams AQ: qmn coordinator waiting for slave to start" in wait
    class "Other" was consuming significant database time.
    RECOMMENDATION 1: Application Analysis, 2.4% benefit (45 seconds)
    ACTION: Investigate the cause for high "Streams AQ: qmn coordinator
    waiting for slave to start" waits. Refer to Oracle's "Database
    Reference" for the description of this wait event.
    RECOMMENDATION 2: Application Analysis, 2.4% benefit (45 seconds)
    ACTION: Investigate the cause for high "Streams AQ: qmn coordinator
    waiting for slave to start" waits in Service "SYS$BACKGROUND".
    SYMPTOMS THAT LED TO THE FINDING:
    SYMPTOM: Wait class "Other" was consuming significant database time.
    (4.8% impact [90 seconds])
    FINDING 9: 2.1% impact (40 seconds)
    Time spent on the CPU by the instance was responsible for a substantial part
    of database time.
    RECOMMENDATION 1: SQL Tuning, 6.9% benefit (129 seconds)
    ACTION: Run SQL Tuning Advisor on the SQL statement with SQL_ID
    "3k7qjrwwqu1gt".
    RELEVANT OBJECT: SQL statement with SQL_ID 3k7qjrwwqu1gt and
    PLAN_HASH 728549793
    select idtr_broc_code, sum(idtr_val)/10000000 from import_dtr
    where idtr_rpt_date between '01-APR-10' and '31-DEC-10'
    and substr(idtr_cty_cons,3,3)='187'
    group by idtr_broc_code
    order by idtr_broc_code
    RATIONALE: SQL statement with SQL_ID "3k7qjrwwqu1gt" was executed 2
    times and had an average elapsed time of 58 seconds.
    RATIONALE: Average CPU used per execution was 19 seconds.
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ADDITIONAL INFORMATION
    Wait class "Application" was not consuming significant database time.
    Wait class "Commit" was not consuming significant database time.
    Wait class "Concurrency" was not consuming significant database time.
    Wait class "Configuration" was not consuming significant database time.
    Wait class "Network" was not consuming significant database time.
    Session connect and disconnect calls were not consuming significant database
    time.
    Hard parsing of SQL statements was not consuming significant database time.
    The analysis of I/O performance is based on the default assumption that the
    average read time for one database block is 10000 micro-seconds.
    An explanation of the terminology used in this report is available when you
    run the report with the 'ALL' level of detail.
    Now it's showing that this particular query needs segment tuning. But how should I approach tuning this problem?

    Apparently, if you're able to run ADDM, you must have something like Grid Control.
    On the main page of each database there is a link called "Segment Advisor".
    Behind this link is a page with all segment recommendations for that particular database.
    Also check whether the tables have correct indexes, up-to-date statistics, etc.
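    To check the indexes and refresh the statistics from SQL*Plus, something like the following sketch should work for the tables named in the ADDM report (the schema owner "DTR_OWNER" is an assumption here; substitute the actual owner):

    ```sql
    -- List the indexed columns on one of the tables flagged by ADDM
    SELECT index_name, column_name
      FROM dba_ind_columns
     WHERE table_name = 'EXPORT_DTR';

    -- Refresh optimizer statistics; cascade => TRUE also gathers
    -- statistics on the table's indexes
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname => 'DTR_OWNER',   -- assumed owner, replace as needed
        tabname => 'EXPORT_DTR',
        cascade => TRUE);
    END;
    /
    ```
    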
    HTH
    FJFranken
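    By the way, you don't strictly need Grid Control for the SQL Tuning recommendations either — you can run the SQL Tuning Advisor manually from SQL*Plus with DBMS_SQLTUNE against one of the SQL_IDs from the report. A rough sketch (the task name is arbitrary):

    ```sql
    -- Create and execute a tuning task for one SQL_ID flagged by ADDM
    DECLARE
      l_task VARCHAR2(64);
    BEGIN
      l_task := DBMS_SQLTUNE.CREATE_TUNING_TASK(
                  sql_id    => 'gsvs1ybcp476p',
                  task_name => 'tune_gsvs1ybcp476p');
      DBMS_SQLTUNE.EXECUTE_TUNING_TASK(task_name => l_task);
    END;
    /

    -- Review the advisor's findings and recommendations
    SELECT DBMS_SQLTUNE.REPORT_TUNING_TASK('tune_gsvs1ybcp476p') FROM dual;
    ```
    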
