How to relate data of tables returned by FM for reading data from ST03N

Hello,
I am using the FM SWNC_COLLECTOR_GET_AGGREGATES to retrieve data displayed in T-Code ST03N into my report.
I am able to get the required data in different tables like Times, Memory, Usertcode, Userworkload, Hitlist_database.
However, I am unable to understand how to relate the data in these different tables, prepare another internal table with only the fields required for my report, and then display those details.
For example, I want to loop through one of these internal tables, read data from the other internal tables and consolidate my required data. But I do not understand on what criteria I can read the other internal tables inside a loop over any one of them.
I searched a lot and found many threads and blogs related to ST03N and learnt a lot about this T-code, but I could not find how to relate the data of all these tables.
Request you to please help.
Thanks and Best Regards,
Eswar

Hi,
I can see common fields like TASKTYPE and ENTRY_ID. But I want to make sure that these fields are enough to read the matching records, or whether there is some other way I can be sure that my READ statement fetches the correct record. So my question is specific to the tables returned by the FM SWNC_COLLECTOR_GET_AGGREGATES and not a generic one.
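To make my question clearer, this is roughly the logic I have in mind. It is only a minimal sketch: I am assuming the SWNCAGG* structure names for the TABLES parameters and that TASKTYPE/ENTRY_ID is the common key, which is exactly what I would like to confirm (the parameter and structure names can be checked in SE37).

DATA: lt_usertcode TYPE STANDARD TABLE OF swncaggusertcode,
      lt_times     TYPE STANDARD TABLE OF swncaggtimes,
      ls_usertcode TYPE swncaggusertcode,
      ls_times     TYPE swncaggtimes.

CALL FUNCTION 'SWNC_COLLECTOR_GET_AGGREGATES'
  EXPORTING
    component  = 'TOTAL'        " or a specific instance name
    periodtype = 'M'            " monthly aggregate
    periodstrt = '20100801'     " first day of the analysed month
  TABLES
    usertcode  = lt_usertcode
    times      = lt_times.

LOOP AT lt_usertcode INTO ls_usertcode.
  " Read the matching TIMES record via the (assumed) common key
  READ TABLE lt_times INTO ls_times
       WITH KEY tasktype = ls_usertcode-tasktype
                entry_id = ls_usertcode-entry_id.
  IF sy-subrc = 0.
    " move the required fields of both structures into my own internal table here
  ENDIF.
ENDLOOP.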
Thank you for your help.
Best Regards,
Eswar.
Edited by: eswar praveen on Aug 20, 2010 12:19 PM

Similar Messages

  • Dynamic Internal Table for reading data from external file

    Hello All,
    The task was to create an internal table with dynamic columns.
    Actually, this is my first task on WebAS 6.20. My program is based on an input file provided by the user with certain effort figures; the file can contain effort for anything from a one-year to a five-year time frame.
    I need to read the raw data from the file, create an internal table based on the months to hold the data, and after that validate the data.
    I have browsed through the dynamic internal table topic, but could not find a way to append fields to a structure dynamically; the dynamic structure would contain 12 month fields.
    Can anyone help me complete this task?
    Thanks
    Kumar

    Hi,
    I see that you posted the same question a couple of days ago at Dynamic Internal Table for reading data from external file. Didn't Charles's response address your problem?
    Regards
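    In case it helps, here is a minimal sketch of one classic way to build such a table at runtime with cl_alv_table_create. The twelve field names MONTH01..MONTH12, their type and length are only illustrative assumptions, and the file reading and validation are left out:
    DATA: lt_fcat   TYPE lvc_t_fcat,
          ls_fcat   TYPE lvc_s_fcat,
          lr_table  TYPE REF TO data,
          lv_num(2) TYPE n.
    FIELD-SYMBOLS: <lt_data> TYPE STANDARD TABLE.
    " Build a field catalog with one column per month found in the file
    DO 12 TIMES.
      lv_num = sy-index.
      CLEAR ls_fcat.
      CONCATENATE 'MONTH' lv_num INTO ls_fcat-fieldname.
      ls_fcat-datatype = 'CHAR'.
      ls_fcat-intlen   = 20.
      APPEND ls_fcat TO lt_fcat.
    ENDDO.
    " Create the internal table type at runtime and get a reference to it
    CALL METHOD cl_alv_table_create=>create_dynamic_table
      EXPORTING
        it_fieldcatalog = lt_fcat
      IMPORTING
        ep_table        = lr_table.
    ASSIGN lr_table->* TO <lt_data>.
    " <lt_data> can now be filled from the file and validated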

  • MDW Disk Usage for Database Report Error - A data source has not been supplied for the data source DS_TraceEvents

    Hello,
    On the MDW Disk Usage Collection Set report, I get the following error when I click on a database hyperlink.
    A data source has not been supplied for the data source DS_TraceEvents
    SQL profiler shows the following SQL statements are executed (I've replaced the database name with databaseX)
    1. exec sp_executesql N'SELECT
    dtb.name AS [Name]
    FROM
    master.sys.databases AS dtb
    WHERE
    (dtb.name=@_msparam_0)',N'@_msparam_0 nvarchar(4000)',@_msparam_0=N'databaseX'
    This returns zero rows, as databaseX does not exist on my MDW central server; it is a database on a target server (i.e. one that is being monitored and uploaded into the MDW central server).
    2. USE [databaseX]
    This produces the following error:
    Msg 911, Level 16, State 1, Line 1
    Database 'databaseX' does not exist. Make sure that the name is entered correctly.
    Why is the report looking for this database on my MDW server?
    thanks
    Jag
    Environment: MDW (Management Data Warehouse) on SQL 2008 R2

    Hi Jag,
    Based on my test, this issue occurs while the database is offline. When you click a particular database in the "Disk Usage Collection Set" report, the report queries some information from that database; if the database is offline, that information cannot be retrieved and this error is generated.
    Therefore I recommend that you check the status of the database by using the system view sys.databases. If it is not online, please execute the following statements in a new query window to bring the database online:
    USE master
    GO
    ALTER DATABASE <database name> SET ONLINE
    GO
    If anything is unclear, please let me know.
    Regards,
    Tom Li

  • How do I get a wifi connection at home for my iPad from an iMac

    How do I get a wifi connection at home for my iPad from an iMac?

    If you have a wired connection to the iMac and you are asking about using the iMac for the Internet connection for the iPad, you will need to use Internet Sharing on the Mac.
    I don't know which OS version you run on the Mac, but here is a link to help with 10.6 that will get you started. You can Google "Internet sharing with Mac OS ..." and come up with a number of support articles.
    http://docs.info.apple.com/article.html?path=Mac/10.6/en/8156.html

  • How do I remove "DNA Software tethered shooting demo for A99" disclaimer from all my images during capture?

    how do I remove "DNA Software tethered shooting demo for A99" disclaimer from all my images during capture?

    I guess the obvious answer is to PURCHASE the DNA Software so it won't be a demo. If you have already purchased it and it still won't work, then you should contact DNA Software.

  • Master data time dependant attribute in query for several dates

    Hello all,
    I need to create a query to display prices for a material in different periods. 
    The user will enter a period interval. For each period, they want to know the material price on the last day of that period.
    Material price is a time-dependent attribute of the material, and I created a formula variable to display it. The problem is that using the key date of the query only works for a single date, not for several.
    Is there any other way to do it?
    If not, the only solution I can find is to store the prices I need in the InfoProvider, but I don't think this is good practice.
    Any suggestions?
    Thanks!

    I have just seen something: in fact the created lines are:
    Key1   Compound1   fromdate     todate       attr1     attr2
    0001   L           01.01.1000   01.01.2006   <empty>
    So, as you can see, the right interval is created but the attributes are not up to date.
    Cheers.
    Cyril.

  • I want to display static data in table in web dynpro for java

    Hi,
    I have to display static (TextView) data in a table. What I have done is create a node named "table", create its attributes and bind it to the table; in the table, under the "column editor" option of "text", is where I can write the required text. But I am not able to do the same for the other rows, i.e. I am restricted to the first row only. How can I enter text in the other rows? I want to write some content in all the rows of the table and display it in a TextView when I execute it, just as if you were presenting an Excel sheet to the user as output.
    Thanks

    Hi
    for( int i = 0 ; i <= 5 ; i++ ) {
      IPrivateTestView.ITableElement element = wdContext.nodeTable().createTableElement();
      element.set<TextViewAttribute>("TEXT"); // placeholder: use the generated setter of your attribute, e.g. setText("TEXT")
      wdContext.nodeTable().addElement(element);
    }
    The error is on the 3rd line of the code, in "element.set". The error is "invalid expression as statement".
    Thanks

  • Data in table control not seen for the Standard Transaction Iview

    Hi
    I am creating a Standard Transaction iView for CATS.
    While doing a print preview in IE 6, I am not able to see the data in the table control (data entry area).
    Can you please provide a solution so that I can see the data in the table control?
    Regards
    Ruturaj

    Hi David,
    I too struggled a lot to find the solution and at last got it: it is possible by exporting and importing the table control values to a database index.
    1. The AT SELECTION-SCREEN OUTPUT event triggers when you SAVE and GET the variant.
    2. So write the logic in the AT SELECTION-SCREEN OUTPUT event.
    CONSTANTS: c_vari TYPE char30 VALUE
                             '(SAPLSVAR)RSVAR-VARIANT'.
      FIELD-SYMBOLS: <lfs_vari> TYPE ANY.
      ASSIGN: (c_vari) TO <lfs_vari>.
      IF sy-subrc = 0.
        IF <lfs_vari> IS NOT INITIAL.
          IF ok_code = 'SPOS'.
            EXPORT gt_chars[] TO DATABASE vari(tc) ID <lfs_vari>.
          ELSEIF ok_code = space.
            IMPORT gt_chars[] FROM DATABASE vari(tc) ID <lfs_vari>.
          ENDIF.
        ENDIF.
      ENDIF.
    In the above logic, if OK_CODE is 'SPOS', the variant is being saved under the name <lfs_vari>.
    Similarly, if OK_CODE is initial, the variant is being retrieved. When getting the variant, OK_CODE is not filled with 'GET'; only the variant name is filled, so we use the filled variant name as the trigger, as done above.
    It worked for me.

  • How to avoid record rejection due to failed conversion of dirty data

    There is a source delimited flat file created with codepage CP936, simplified
    Chinese. Data Services XI on HP Unix has to extract this file to a UTF-8 db2
    table.
    However, because the flat file contains a lot of dirty or corrupted
    characters in the Chinese text columns, some records fail to convert and are
    rejected by DS instead of being loaded into DB2. Typically the corrupted character
    causes DS to think there are fewer delimiters on the record than there should be.
    I used a less-than-perfect workaround of telling DS that the input file is in the UTF-8 codepage
    instead of the CP936 codepage; DS no longer complains about missing delimiters, but the
    extracted Chinese columns are not readable.
    However, the problem resurfaces when DS encounters this gibberish data value in
    one column of the record:
      u20AC\ \u2021 u20ACxu2019 u20AC     
    The record is rejected entirely, and DS generates no error and no warning
    to explain the rejection. Even if I use a text delimiter of double quotes to enclose
    this data value, DS is still unable to extract it and rejects the whole record without warning.
    Is there a way to use the Validation transform or other transforms to catch all the conversion
    errors beforehand and assign a null value for dirty data values, rather than
    letting the records be rejected?
    Or is there a way to set the options and relax the conversion errors to let
    all data records be successfully extracted? Currently, in the Error handling section
    of the flat file format object, all error handling and logging options are
    set to Yes.
    Edited by: Chow Ming Darren Cheeng on Aug 7, 2009 10:31 AM

    Thank you for your reply. Currently I am using this ifthenelse function to check whether the data value contains
    the escape character and set the whole value to null to avoid the rejection:
    ifthenelse(index(column_value, '
    ', 1) IS NOT NULL, NULL, column_value)
    The source file team says that whenever they have nonprintable characters inside the data value, the character will be escaped with
    I agree with what you said: if the record is broken up too much, DS has no choice but to skip it.

  • Insert data into table 1 but remove the duplicate data

    hello friends,
    I am trying to insert data into table tab0 using hints.
    The query is like this:
    INSERT INTO /*+ APPEND PARALLEL(tab0) */ tab NOLOGGING
    (select /*+ parallel(tab1)*/
    colu1,col2
    from tab1 a
    where a.rowid =(select max (b.rowid) from tab2 b))
    But this query takes too much time, around 5 hours, because the data volume is almost 40-50 lakh (4-5 million) rows.
    I am using
    a.rowid = (select max(b.rowid) from tab2 b)
    to remove the duplicate data,
    but it takes too much time, so please can you suggest any other option to remove the duplicate data and resolve the performance problem?
    thanks in advance.

    In the code you posted, you're inserting two columns into the destination table. Are you saying that you are allowed to have duplicates in those two columns but you need to filter out duplicates based on additional columns that are not being inserted?
    If you've traced the session, please post your tkprof results.
    What does "table makes bulky" mean? You understand that the APPEND hint is forcing the insert to happen above the high water mark of the table, right? And you understand that this prevents the insert from reusing space that has been freed up because of deleted in the table? And that this can substantially increase the cost of full scans on the table. Did you benchmark the INSERT without the APPEND hint?
    Justin

  • Get table partition name dynamically for given date range

    Dear All,
    Could you please tell me how to get the partition name dynamically for a given date range?
    Thank you.

    SQL> select table_name,
                partition_name,
                to_date (
                   trim (
                      '''' from regexp_substr (
                                   extractvalue (
                                      dbms_xmlgen.getxmltype (
                                         'select high_value from all_tab_partitions where table_name='''
                                         || table_name
                                         || ''' and table_owner = '''
                                         || table_owner
                                         || ''' and partition_name = '''
                                         || partition_name
                                         || ''''),
                                      '//text()'),
                                   '''[^'']*''')),
                   'syyyy-mm-dd hh24:mi:ss')
                   high_value_in_date_format
           from all_tab_partitions
          where table_name = 'SALES' and table_owner = 'SH';
    TABLE_NAME                     PARTITION_NAME                 HIGH_VALUE_IN_DATE_FORMAT
    SALES                          SALES_1995                     01-JAN-96               
    SALES                          SALES_1996                     01-JAN-97               
    SALES                          SALES_H1_1997                  01-JUL-97               
    SALES                          SALES_H2_1997                  01-JAN-98               
    SALES                          SALES_Q1_1998                  01-APR-98               
    SALES                          SALES_Q2_1998                  01-JUL-98               
    SALES                          SALES_Q3_1998                  01-OKT-98               
    SALES                          SALES_Q4_1998                  01-JAN-99               
    SALES                          SALES_Q1_1999                  01-APR-99               
    SALES                          SALES_Q2_1999                  01-JUL-99               
    SALES                          SALES_Q3_1999                  01-OKT-99               
    SALES                          SALES_Q4_1999                  01-JAN-00               
    SALES                          SALES_Q1_2000                  01-APR-00               
    SALES                          SALES_Q2_2000                  01-JUL-00               
    SALES                          SALES_Q3_2000                  01-OKT-00               
    SALES                          SALES_Q4_2000                  01-JAN-01               
    SALES                          SALES_Q1_2001                  01-APR-01               
    SALES                          SALES_Q2_2001                  01-JUL-01               
    SALES                          SALES_Q3_2001                  01-OKT-01               
    SALES                          SALES_Q4_2001                  01-JAN-02               
    SALES                          SALES_Q1_2002                  01-APR-02               
    SALES                          SALES_Q2_2002                  01-JUL-02               
    SALES                          SALES_Q3_2002                  01-OKT-02               
    SALES                          SALES_Q4_2002                  01-JAN-03               
    SALES                          SALES_Q1_2003                  01-APR-03               
    SALES                          SALES_Q2_2003                  01-JUL-03               
    SALES                          SALES_Q3_2003                  01-OKT-03               
    SALES                          SALES_Q4_2003                  01-JAN-04               
    28 rows selected.

  • How to restrict the generation of Calib. Maint Order for previous dates

    Dear PM Guru's
    I require a restriction option to prevent the generation of calibration maintenance orders for previous dates. Please see the full information in the following example.
    One of my gauges has a calibration frequency of every 6 months, so the user wants one calibration order every 6 months. In between, the user checks the gauge every month as an inspection/verification of the gauge. While uploading the data to SAP, the user entered 02.02.2011 as the start of the cycle, but we uploaded the data on 02.05.2011 and will go live on 05.05.2011. The system will therefore generate an order every month: 02.03.2011, 02.04.2011, 02.05.2011, 02.06.2011, 02.07.2011, 02.08.2011. As per our condition, 02.08.2011 has inspection/verification and calibration, while the remaining months are inspection/verification only. Now, how can I restrict the order generation for the 02.03.2011, 02.04.2011 and 02.05.2011 dates? Please give me a suggestion and solution to avoid this order generation; if I leave it as it is, too many orders will be generated.
    Please help me. Thanks so much in advance to all PM gurus.
    regards
    PM

    First method:
    You can define an offset in the strategy so that the first call is created only after the offset period (in your case, it is 4 months).
    Second method:
    You can define the start date of the cycle as 02.05 instead of 02.02, as the system is going live only on 05.05.

  • Table EABL Question SORT with  reading date ADATTATS and  ADAT

    Hi all,
    I have selection criteria to query EABL.
    After the select, I have to sort to get the latest 10 records from EABL.
    I was told that if the value of ADATTATS is null, I should display ADAT.
    But to select the latest 10 records I need to sort based on a date field, and if the date field is null I will not get a proper descending sort.
    Any suggestions and help ?
    Display, for a particular contract account:
    Current meter reading date 1
    previous meter reading date 2
    previous meter reading date 3
    Is there any other field in table EABL that could be used for the sort here, for this display?
    Thanks in advance...
    Edited by: Sona on Feb 26, 2008 7:29 PM

    Yes,
    My case is:
    Record     ADATTATS   ADAT    Result
    record1    date1      0       date1
    record2    date2      0       date2
    record3    null       date3   date3
    record4    null       date4   date4
    record5    date5      0       date5
    I was told that only if the date in ADATTATS is null should I go for ADAT (that means only then will a value be in ADAT). The majority of the time there will be a date in ADATTATS; if an estimate or manual MR is entered, the date will come from ADAT.
    My test table has all null values in ADATTATS; I have values only in ADAT in the test table.
    Another question:
    Can we change the data in the test tables?
    And what is the testing process to check the billing run; what are the steps involved?
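    For what it is worth, here is a minimal sketch of one way to handle the null dates with a derived sort field. The helper type and the use of ADATTATS/ADAT follow the description above and are only illustrative; the SELECT itself is left out:
    TYPES: BEGIN OF ty_sort,
             sortdate TYPE eabl-adat,       " derived date used only for sorting
             adattats TYPE eabl-adattats,
             adat     TYPE eabl-adat,
           END OF ty_sort.
    DATA: lt_eabl TYPE STANDARD TABLE OF eabl,
          ls_eabl TYPE eabl,
          lt_sort TYPE STANDARD TABLE OF ty_sort,
          ls_sort TYPE ty_sort.
    " ... SELECT the records from EABL into lt_eabl as per the selection criteria ...
    LOOP AT lt_eabl INTO ls_eabl.
      CLEAR ls_sort.
      ls_sort-adattats = ls_eabl-adattats.
      ls_sort-adat     = ls_eabl-adat.
      IF ls_eabl-adattats IS INITIAL.
        ls_sort-sortdate = ls_eabl-adat.    " fall back to ADAT when ADATTATS is empty
      ELSE.
        ls_sort-sortdate = ls_eabl-adattats.
      ENDIF.
      APPEND ls_sort TO lt_sort.
    ENDLOOP.
    SORT lt_sort BY sortdate DESCENDING.
    DELETE lt_sort FROM 11.                 " keep only the latest 10 entries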

  • Return Code = 12 for LSMW data read

    Hi all,
    I am executing an LSMW that will change 27 classifications for each material. I have tested running a file with only 100 materials and all is good. Now I am testing a file with 40k materials (90 MB file size), but when trying to "read data" from the imported file I get 'Error when uploading file 'J:\Unilever\Test Load LSMW files for 10-01-08\40k'  'Message no. /SAPDMC/LSMW_OBJ_070010'.
    I am guessing there is a file size maximum which I have exceeded and that is why the LSMW cannot read it, but there have been successful executions of 25k - 50k materials in the past...
    Any help is appreciated, thanks!

    There is also a limit on the length of the file name (with the directory structure). Please try reading the same file from the J drive (not under any subdirectory).
    Hope this helps.
    Lakshman

  • Server return code 0x80020004 : "Error reading file" from ABAP MDM API

    Hello,
    I am trying to use the ABAP APIs to create records in a qualifier table in the SRM MDM catalog. The table I am trying to update is Contract Price; it has 4 non-qualifier fields and 5 qualifier fields. I am using the code below to create a record.
    CALL METHOD lr_api->mo_core_service->create_simple
              EXPORTING
                iv_object_type_code = 'MDMSRM_CONTRACT_PRICE'
                is_ddic_structure   = ls_retrive
              IMPORTING
                ev_new_internal_id  = lv_key.
    But it returns the exception cx_mdm_server_rc_code: Server return code 0x80020004 : Error reading file. My ID has admin authorization in MDM Data Manager and I am able to create records directly in Data Manager, but the API returns an error.
    Request you to help me with this. Please advise if I have posted this in the wrong section.
    Thanks,
    Divya.

    Done. This API works from the MDM 7 version onwards; it is not available in MDM 5.

Maybe you are looking for

  • Scheduled report never leaves 'scheduled' status

    I have a BIP report that is supposed to burst to three pdf files. I schedule it to run, it hits the scheduler queue, and it never leaves the queue. The Status column shows 'Scheduled' and it just sits there - never completes, never fails. I can click

  • CS6 PhotoShop hanging when opening PDFs - help?

    Hi all, I'm IT support to a team of studio artists who are having some issues with our newly upgraded CS6 Premium suite (Mac OS 10.6 + Mac OS 10.8). Situation: (Mac OS 10.6 problem) someone will be working on a 2GB network file, and tries to open a PD

  • Dropping a column in composite xmltype table with virtual column

    Hello, I found some interesting behavior. I have a table with an xmltype column and a virtual column. If I drop a column, which has an index smaller than the index for the xmltype column, then the virtual column reference for the xmltype column is not chang

  • Messages staying in offline

    My iMac Messages keeps saying "connecting" and won't go online.

  • Adobe photoshop elements 10 - cab13 corrupt

    The software fails to install, reporting a corrupt Data13.cab file. Can I download and install the trial and use the serial number to activate?