Definitions of time-related fields of usage tracking data

Hi all,
After reading the related documentation on usage tracking, I'm still confused about the true meaning of the following fields:
COMPILE_TIME_SEC The time in seconds required to compile the query.
CUM_DB_TIME_SEC The total amount of time in seconds that the Oracle BI Server waited for back-end physical databases on behalf of a logical query.
TOTAL_TIME_SEC The time in seconds that the Oracle BI Server spent working on the query while the client waited for responses to its query requests.
START_TS The date and time the logical query was submitted.
END_TS The date and time the logical query finished. The start and end timestamps also reflect any time the query spent waiting for resources to become available.
My questions are:
1. Is TOTAL_TIME_SEC = CUM_DB_TIME_SEC + COMPILE_TIME_SEC? Why are there cases where CUM_DB_TIME_SEC > TOTAL_TIME_SEC?
2. I've logged cases where (END_TS - START_TS) > TOTAL_TIME_SEC. Why would this happen?
Could someone clarify the meaning of these fields for me please? Thanks very much!!
=================
I have logged the following data with usage tracking, posting here as examples:
1.
compile_time_sec = 0
cum_db_time_sec = 23
total_time_sec = 12
start_ts = 18-AUG-09 (14:15:47)
end_ts = 18-AUG-09 (14:16:00)
end_ts - start_ts (in sec) = 13
cache_ind_flg = N
num_cache_hits = 0
num_cache_inserted = 1
2.
compile_time_sec = 0
cum_db_time_sec = 125
total_time_sec = 110
start_ts = 21-AUG-09 (15:45:37)
end_ts = 21-AUG-09 (16:47:35)
end_ts - start_ts (in sec) = 3718
cache_ind_flg = N
num_cache_hits = 0
num_cache_inserted = 0
3.
compile_time_sec = 0
cum_db_time_sec = 1252
total_time_sec = 932
start_ts = 24-AUG-09 (17:23:40)
end_ts = 24-AUG-09 (17:49:27)
end_ts - start_ts (in sec) = 1547
cache_ind_flg = N
num_cache_hits = 0
num_cache_inserted = 0
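
For reference, here is a minimal query sketch against S_NQ_ACCT (Oracle syntax; it assumes START_TS and END_TS are DATE columns, as in the default usage tracking DDL) that puts the three durations side by side so cases like the ones above stand out:

-- Hedged sketch: wall-clock elapsed time vs. TOTAL_TIME_SEC vs. CUM_DB_TIME_SEC
SELECT start_ts,
       end_ts,
       ROUND((end_ts - start_ts) * 86400) AS elapsed_sec,  -- wall-clock duration in seconds
       total_time_sec,
       compile_time_sec,
       cum_db_time_sec,
       num_db_query,       -- number of physical queries; parallel DB queries can make the cumulative DB time exceed the wall-clock time
       cache_ind_flg
FROM   s_nq_acct
WHERE  cum_db_time_sec > total_time_sec
    OR (end_ts - start_ts) * 86400 > total_time_sec
ORDER  BY start_ts DESC;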

Check out DOC ID 973090.1 in metalink.

Similar Messages

  • Accuracy of Usage Tracking Data

    Has anybody used and compared the Usage Tracking data with the actual user experience?
    I'm noticing that Usage Tracking is not being logged consistently for all the users. Some of the entries are missing. What could the reason be for these missing entries?
    When I compare the Elapsed Time from the session logs with TOTAL_TIME_SEC from Usage Tracking, they do not always match up. Is there any difference between those two?
    What is the best measure for tracking the performance of the application that will accurately reflect the actual user experience?
    Any help shedding some light on overall Usage Tracking accuracy would be greatly appreciated.
    Thanks!
    Edited by: VNC on Sep 18, 2009 12:24 PM

    Check out DOC ID 973090.1 in metalink.
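
    One way to quantify the accuracy question is to compare the wall-clock duration with TOTAL_TIME_SEC directly in S_NQ_ACCT. A hedged sketch (Oracle syntax, assuming START_TS and END_TS are DATE columns):

    -- Average gap between (END_TS - START_TS) and TOTAL_TIME_SEC per subject area,
    -- plus a count of queries where the gap exceeds one minute.
    SELECT subject_area_name,
           COUNT(*)                                                     AS logical_queries,
           ROUND(AVG((end_ts - start_ts) * 86400 - total_time_sec), 1)  AS avg_gap_sec,
           SUM(CASE WHEN (end_ts - start_ts) * 86400 - total_time_sec > 60
                    THEN 1 ELSE 0 END)                                  AS gaps_over_60_sec
    FROM   s_nq_acct
    GROUP  BY subject_area_name
    ORDER  BY avg_gap_sec DESC;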

  • Query of some time related fields

    Hi experts
    In the PM order operation "Dates" view, there are several time fields on the screen, including Earliest scheduled start time,
    Earliest scheduled finish time, Latest scheduled start time, and Latest scheduled finish time,
    as well as actual start time and actual finish time.
    I found that the earliest and latest times are calculated automatically and are display-only, so they cannot be changed. I want to know how the system calculates these times.
    Also, after technically completing the PM order, I found the actual times are still blank. What should I do to get the actual times populated and displayed on the "Dates" screen?
    thanks in advance!

    Mark,
    Useful reading: [Understanding Maintenance Order Scheduling|http://www.sapfans.com/forums/viewtopic.php?p=711224]
    Maybe you need to activate the "forward scheduling with time" in the IMG???
    PeteA

  • Time Related Dimension(table) in communication data model

    Hi,
    To me, the most common and important thing in a communication data model is time metrics.
    Things like mean up time or mean down time are common KPIs we are tracking.
    To make this model efficient, we need a time dimension that can calculate these means precisely and efficiently.
    How do we create such a time dimension with the common year, quarter, month, date, and time levels, along with UTC-to-local conversion?

    Not sure I follow your question.
    Can you give some examples of the KPIs you mentioned - Mean up time and Mean down time - as detailed as possible (fact data details)?
    On the face of it, it looks like these KPIs should be calculated measures involving time expressions on date columns (a DATE datatype with day + time information present in the source columns), but nothing in them requires a customized time dimension. Maybe not out of the box, but it is very much doable in the analytical layer (either using OLAP or a relational RPD). So it should be doable without any changes to the default time dimension available with OCDM.
    HTH
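
    If a dedicated calendar table is still wanted, here is a hypothetical sketch (Oracle syntax; the one-year range and the target zone 'America/New_York' are placeholder assumptions) of generating a day-level time dimension with year, quarter, and month levels plus a UTC-to-local conversion:

    -- Hypothetical day-level time dimension with UTC-to-local conversion
    SELECT d                                    AS cal_date,
           EXTRACT(YEAR FROM d)                 AS cal_year,
           TO_NUMBER(TO_CHAR(d, 'Q'))           AS cal_quarter,
           EXTRACT(MONTH FROM d)                AS cal_month,
           FROM_TZ(CAST(d AS TIMESTAMP), 'UTC')
               AT TIME ZONE 'America/New_York'  AS local_ts   -- target zone is an assumption
    FROM  (SELECT DATE '2009-01-01' + LEVEL - 1 AS d
           FROM   dual
           CONNECT BY LEVEL <= 365);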

  • Usage Tracking table column definition

    Hi Gurus,
    I am not able to visualize the difference / significance of these two columns in S_NQ_ACCT table for usage tracking:
    NUM_CACHE_HITS - {Indicates the number of times existing cache was returned.}
    NUM_CACHE_INSERTED - {Indicates the number of times query generated cache was returned.}
    I understand NUM_CACHE_HITS, which gives the number of times a cache match has occurred, but what is the role / meaning of NUM_CACHE_INSERTED? The documentation says "Indicates the number of times query generated cache was returned" - what is query-generated cache?
    Thanks,
    Sri

    Hi jups,
    By default that column will be null, but when a query's results are inserted into the cache its value is updated; that is how you get num_cache_inserted = 1.
    This thread may also help you: Definitions of time-related fields of usage tracking data
    Hope this helps.
    Cheers,
    KK
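
    As a practical illustration of the difference between the two columns, a summary over S_NQ_ACCT can be run like this (a minimal sketch, Oracle syntax; column semantics as quoted above):

    -- NUM_CACHE_HITS: queries answered from the existing cache
    -- NUM_CACHE_INSERTED: queries whose result set was written into the cache
    SELECT subject_area_name,
           COUNT(*)                AS logical_queries,
           SUM(num_cache_hits)     AS cache_hits,
           SUM(num_cache_inserted) AS cache_inserts,
           ROUND(100 * SUM(num_cache_hits) / NULLIF(COUNT(*), 0), 1) AS hit_pct
    FROM   s_nq_acct
    GROUP  BY subject_area_name;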

  • Oracle Usage Tracking dialog

    hi
    After installing JDeveloper 11.1.1.5.0 (and other 11g versions before that) and starting JDeveloper for the first time, an "Oracle Usage Tracking" dialog is shown with the message
    "In order to continuously improve our products, Oracle is interested in learning about product usage. To that end, automated reports can occasionally be sent to Oracle describing the product features in use. No personally identifiable information will be sent and the report will not affect performance. You can review Oracle's privacy policy on our website. "
    and a checkbox "Allow automated usage reporting to Oracle".
    see the screenshot at http://www.consideringred.com/files/oracle/img/2011/OracleUsageTracking111150.png
    - (q1) Where can details be found about this feature, like which information it sends exactly, or when?
    - (q2) Has Oracle published anything related to the results of this "Usage Tracking" feature?
    many thanks
    Jan Vervecken

    Hi Jan,
    I'm not aware of a public wiki page or anything like that which explains the data being gathered on this feature.
    What it does is create an XML file after each close of the IDE. It records some of the API usage during a session (defined as start to close of the IDE).
    The file is saved to this location by default on a windows machine with an install with the "single user" option selected.
    C:\Documents and Settings\<username>\Application Data\JDeveloper\system11.1.1.5.37.60.13\o.ide.usages-tracking
    You can see what this file gathers, if you like, by setting some bogus proxy info under Tools --> Preferences in JDev to stop JDev from being able to connect to the tracking server, then just doing some normal work and stopping JDeveloper. The file will stay there until it can connect again.
    Of course this will only happen if you have opted in to the usage reporting in the first place.
    The data that is gathered is used internally by the product management team to see which features are used more often in relation to each other, etc. It is not made available publicly.
    Here is a report that I just did on my own machine. This is the result of opening an extension.xml file and then opening a .java file.
    <?xml version = '1.0' encoding = 'UTF-8'?>
    <usages xmlns="http://xmlns.oracle.com/jdeveloper/110000/usages-tracking-data">
    <hash n="system-info">
    <value n="build-label" v="JDEVADF_11.1.1.5.0_GENERIC_110409.0025.6013"/>
    <value n="dev-build" v="false"/>
    <value n="guid" v="07dd49ee-012e-1000-8001-0a9a27d99b84"/>
    <value n="jdk-version" v="1.6.0_24"/>
    <value n="operating-system" v="Windows XP"/>
    <value n="product-edition" v="oracle.studio, oracle.j2ee, oracle.jdeveloper"/>
    <value n="product-name" v="Oracle JDeveloper 11g Release 1"/>
    <value n="product-version" v="11.1.1.5.37.60.13"/>
    <value n="session-end-time" v="1305043200531"/>
    <value n="session-id" v="1305043105550"/>
    <value n="session-start-time" v="1305043105550"/>
    <value n="user-role" v="<none>"/>
    </hash>
    <hash n="usage-data">
    <list n="activities">
    <hash>
    <value n="extension-product-id" v="oracle.jdevimpl.extensiondt.editor.ExtensionManifestEditor"/>
    <value n="property-id" v="oracle.jdeveloper.extensiondt.model.ExtensionManifestNode"/>
    <value n="time-stamp" v="1305043182769"/>
    <value n="usage-type" v="OPEN_EDITOR"/>
    </hash>
    <hash>
    <value n="extension-product-id" v="oracle.jdevimpl.extensiondt.editor.ExtensionManifestEditor"/>
    <value n="property-id" v="oracle.jdeveloper.extensiondt.model.ExtensionManifestNode"/>
    <value n="time-stamp" v="1305043182832"/>
    <value n="usage-type" v="ACTIVATE_EDITOR"/>
    </hash>
    <hash>
    <value n="extension-product-id" v="oracle.ide.ceditor.CodeEditor"/>
    <value n="property-id" v="oracle.jdeveloper.model.JavaSourceNode"/>
    <value n="time-stamp" v="1305043190096"/>
    <value n="usage-type" v="OPEN_EDITOR"/>
    </hash>
    <hash>
    <value n="extension-product-id" v="oracle.ide.ceditor.CodeEditor"/>
    <value n="property-id" v="oracle.jdeveloper.model.JavaSourceNode"/>
    <value n="time-stamp" v="1305043190767"/>
    <value n="usage-type" v="ACTIVATE_EDITOR"/>
    </hash>
    </list>
    </hash>
    </usages>
    Hope that helps explain things a bit. Let me know if you have any other Q's.
    --jb

  • Clear up Usage tracking in Version 8

    Is there a way to clear up the usage tracking records in the V8_USAGE_ATTRIBUTES table? The table has 2.6 million rows and is slowing down the migration to System 9. Any thoughts on this?

    Hi Mathias
    Yes I did. It appears that with the new version we also get some new Usage Tracking data. These columns were created when you updated your schemas, so you need to reimport the usage tracking metadata. OBIEE gives the errors because it cannot write the new data fields. Once I did that, most of my reporting started working again. The added data is there for Exalytics usage; you might not need it, but you cannot get around it, so reimport the metadata and you should be fine.
    It's just not very clear from the documentation Oracle provides that you need to do this; at least it wasn't for you and me anyway. Hope this helps! :-)
    Regards, Jacob

  • Really weird problem with Usage Tracking!

    Hi guys
    We are running OBIEE 11.1.1.6.5 on Exalytics and almost everything is fine and dandy. We do have one weird problem, though. When we first enabled usage tracking, we suddenly got errors on some of our dashboards saying that we didn't have the privileges to create temporary tables. All right, we gave our user (which is a shared user) the privileges to create temporary tables, and the reports with the problems started working again! Awesome! But lo and behold, Usage Tracking then stopped working, with not a single mention of it in the log files - normally you would at least see an error or the acknowledgement that Usage Tracking has started.
    So basically we are running without Usage Tracking because the reports that give us the problems are deemed more important... which they might be, but that's a different story...
    Any idea what is going on here? We did collect Usage Tracking data while the reports were broken, so I'm quite confident that the setup is correct. But why would it stop when we gave the "create table" privilege to our shared user?
    Best regards, Jacob

    What is the setting for UsageTrackingDirectInsert?
    Have you tried changing the output to a file instead of the database?
    Please see:
    http://docs.oracle.com/cd/E23943_01/bi.1111/e10541/usage_track.htm#i1017237
    Section 9.3, "Setting Up a Log File to Collect Information for Usage Tracking"
    Michael
    PS:
    Hi Christian! :)

  • Usage Tracking for BI Server Hosting Multiple Repositories

    Hi all,
    I've got my BI Server hosting two repositories:
    [ REPOSITORY ]
    DemoApp1 = DemoApp1.rpd, DEFAULT;
    DemoApp2 = DemoApp2.rpd;
    I started two Presentation Services connecting to these two repositories respectively, and two web front-ends at /analytics and /analytics2 to display the two different catalogs.
    Next, I enabled usage tracking with direct insert to the following physical table:
    PHYSICAL_TABLE_NAME = "OBIEEREP".""."OBIEEREP"."S_NQ_ACCT" ;
    CONNECTION_POOL = "OBIEEREP"."Connection Pool" ;
    Both repositories have the above physical table and connection pool defined. In DemoApp1.rpd, "OBIEEREP"."Connection Pool" points to a database OBIEEDB1, and in DemoApp2.rpd, it points to a database OBIEEDB2.
    But then the problem occurs. All usage tracking data are inserted to OBIEEDB1 only, no matter which web front-end they are from. I tried removing the "DEFAULT" from NQSConfig.ini, but this doesn't help.
    Is there any way to configure BI Server to insert usage tracking data from /analytics to OBIEEDB1, and those from /analytics2 to OBIEEDB2?
    Thanks very much!!!
    Edited by: Kaphenda on Jan 18, 2010 2:34 PM

    Hi Kaphenda,
    That's probably just a side-effect of having two repositories. Keep in mind that having multiple repositories and multiple Presentation Services is not supported by Oracle, so they were not on the hook to modify the Usage Tracking algorithm to write to two different databases. It seems to me like they either didn't anticipate this scenario, or they didn't want to program for it.
    I don't think you have too many options in terms of making OBIEE insert the traffic logs into the appropriate database. However, once the data has been inserted, you can do just about anything you want to it in terms of ETL, triggers, materialized views, etc.
    Here's what I would recommend:
    1) Have the OBIEE usage tracking information logged into a neutral schema.
    2) Write a materialized view in database 1: CREATE MATERIALIZED VIEW S_NQ_ACCT AS SELECT * from S_NQ_ACCT@NEUTRAL_SERVER WHERE REPOSITORY_NAME = 'DemoApp1'
    3) Write a materialized view in database 2: CREATE MATERIALIZED VIEW S_NQ_ACCT AS SELECT * from S_NQ_ACCT@NEUTRAL_SERVER WHERE REPOSITORY_NAME = 'DemoApp2'
    You can specify the refresh frequency of these materialized views to be as frequent as you want. The great thing about this strategy is that you get the best of all worlds. Each database has the traffic information broken down by their personal RPD. But the neutral schema can provide Global reporting across both repositories which can be very valuable itself.
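    To make step 2 concrete, here is a hedged sketch of the materialized view with an explicit refresh schedule (Oracle syntax; the database link name NEUTRAL_SERVER and the hourly interval are assumptions):

    -- Materialized view over the neutral schema, refreshed hourly
    CREATE MATERIALIZED VIEW s_nq_acct
      BUILD IMMEDIATE
      REFRESH COMPLETE
      START WITH SYSDATE
      NEXT SYSDATE + 1/24    -- hourly; adjust the interval as needed
    AS
    SELECT *
    FROM   s_nq_acct@neutral_server
    WHERE  repository_name = 'DemoApp1';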
    Good luck and if you found this post useful, please award points!
    Best regards,
    -Joe
    Edited by: Joe Bertram on Jan 18, 2010 9:59 AM

  • Time input field

    How do I create a time input field similar to a date input field with a calendar?

    Hi,
    We currently do not support a time input field.
    Regards,
    Brian

  • Trouble shoot Usage Tracking

    I configured Usage Tracking but for some reason the S_NQ_ACCT table is not being populated. I checked the error log and I can see there has been a problem with usage tracking. I used the 10.1.3.4 usage tracking RPD and updated the connection pools to point to the right database.
    The RPD metadata columns and the tables created under the schema are consistent. I do not see any errors under the Usage Tracking canned reports or their columns. All I see is "No results", because the S_NQ_ACCT table is always empty even though I ran some reports.
    Here is my usage tracking settings under nqsconfig.ini file
    ENABLE = YES;
    DIRECT_INSERT = YES;
    PHYSICAL_TABLE_NAME = "OBI Usage Tracking"."Catalog"."dbo"."S_NQ_ACCT" ;
    CONNECTION_POOL = "OBI Usage Tracking"."Usage Tracking Writer Connection Pool" ;
    BUFFER_SIZE = 10 MB ;
    BUFFER_TIME_LIMIT_SECONDS = 5 ;
    NUM_INSERT_THREADS = 5 ;
    MAX_INSERTS_PER_TRANSACTION = 1 ;
    My NqServer log details:
    2010-07-08 14:53:14
    [59055] Usage Tracking started.
    2010-07-08 14:53:16
    [16020] Metadata Database Type: Oracle8.1
    Data Source Name: Dev
    Data Source Type: Oracle 10.02.0020
    2010-07-08 14:53:16
    Data Source Name: Dev
    Client Driver Version 10.02.0001
    2010-07-08 14:53:16
    [59048] Usage Tracking encountered an insert statement execution error. This error has occurred 1 times and resulted in the loss of 1 insert statements since this message was last logged.
    [nQSError: 16001] ODBC error state: S1000 code: 984 message: [Oracle][ODBC][Ora]ORA-00984: column not allowed here.
    [nQSError: 16015] SQL statement execution failed.
    I tried to debug that error but was not successful. One thing I wanted to mention here: when creating the NQ_LOGIN_GROUP view, I edited the default script to rename the LOGIN column to USER_NAME, since LOGIN is a keyword in the Oracle database. I don't think that is causing the issue here, as I renamed the physical column of the table in the RPD as well. That was the only change I made; the rest are default settings.
    Any suggestions here....
    Thanks
    Prash

    A lot of things smell fishy in your post. For a start, your BI Server log says "Metadata Database Type: Oracle8.1" but then it says you are using "Client Driver Version 10.02.0001". Which database version are you running? If you are running 10g, make sure you set the correct database type in the Physical layer. If you are using Oracle, you should be using the Oracle native drivers in your connection pool (OCI); however, the PHYSICAL_TABLE_NAME definition has the "dbo" which seems to suggest you are using ODBC. The error you are getting also seems to suggest you are using ODBC. The insert the BI Server sends looks like this:
    INSERT INTO S_NQ_ACCT (USER_NAME, REPOSITORY_NAME, SUBJECT_AREA_NAME, NODE_ID, START_TS, START_DT, START_HOUR_MIN, END_TS, END_DT, END_HOUR_MIN, QUERY_TEXT, SUCCESS_FLG, ROW_COUNT, TOTAL_TIME_SEC, COMPILE_TIME_SEC, NUM_DB_QUERY, CUM_DB_TIME_SEC, CUM_NUM_DB_ROW, CACHE_IND_FLG, QUERY_SRC_CD, SAW_SRC_PATH, SAW_DASHBOARD, SAW_DASHBOARD_PG, PRESENTATION_NAME, RUNAS_USER_NAME, NUM_CACHE_INSERTED, NUM_CACHE_HITS)
    So try that manually to see if it works, but it seems you need to make some RPD configuration changes first, as it all looks too fishy to me.
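
    If you want to test the table by hand before changing the RPD, a minimal hedged sketch like the following can help (the reduced column list and the literal values are assumptions, not the exact statement the BI Server sends; add any NOT NULL columns your S_NQ_ACCT DDL enforces):

    -- Minimal manual insert to check that the table accepts a plain row
    INSERT INTO s_nq_acct
      (user_name, repository_name, subject_area_name, node_id,
       start_ts, end_ts, success_flg, row_count,
       total_time_sec, compile_time_sec, num_db_query, cum_db_time_sec)
    VALUES
      ('test_user', 'Star', 'Usage Test', 'node1',
       SYSDATE, SYSDATE, 0, 0,
       0, 0, 0, 0);
    ROLLBACK;   -- keep the table clean after the test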

  • FIELD CAT USAGE?

    Hi experts,
    Can anybody explain to me the usage of the field catalog in ALV?
    Please describe the usage and functions of the field catalog (FIELDCAT) in ALV.
    Thanks,
    Sakthi
    *Valuable posts will be rewarded*

    The field catalog is used to create the column headings. You need to fill an internal table which contains the heading of each of the fields that need to be displayed, e.g. customer number, customer name, etc.
    Field_catalog:
    Field catalog with field descriptions
    2.7.1. Description
    Field catalog containing descriptions of the list output fields (usually a subset of the internal output table fields). A field catalog is required for every ALV list output.
    The field catalog for the output table is built-up in the caller's coding. The build-up can be completely or partially automated by calling the REUSE_ALV_FIELDCATALOG_MERGE module
    See also the documentation of the function module REUSE_ALV_FIELDCATALOG_MERGE.
    The minimal field catalog is documented under 'default'. The caller can use the other optional parameters to assign output attributes to a field which differ from the default.
    A field catalog need not be built-up and passed explicitly only under the following conditions:
    • The internal table to be output has the same structure as a Data Dictionary structure which is referred to in the internal table declaration using LIKE or INCLUDE STRUCTURE.
    • all fields in this structure are to be output
    • the structure name is passed to ALV in the parameter I_STRUCTURE_NAME.
    See also the documentation of the IMPORTING parameter I_STRUCTURE_NAME.
    Positioning
    • row_pos (row position)
    value set: 0, 1 - 3
    Only relevant if the list output is to be multi-line (two or three lines) by default.
    A multi-line list can also be defined by the user interactively if the default list is one-line.
    The parameter specifies the relative output line of the column in a multi-line list.
    • col_pos (column position)
    value set: 0, 1 - 60
    only relevant when the default relative column positions differ from the field catalog field sequence. The parameter specifies the relative column position of the field in the list output. The column order can be changed interactively by the user. If this parameter is initial for all field catalog entries, columns appear in the field catalog field sequence.
    Identification
    • fieldname (field name)
    value set: internal output table field name (required parameter)
    Name of the internal output table field which is described by this field catalog entry
    • tabname (internal output table)
    value set: SPACE, internal output table name
    This parameter is used in 'manual' field catalog build-up only for hierarchical-sequential lists.
    Name of the internal output table which contains the field FIELDCAT-FIELDNAME.
    Data Dictionary reference
    • ref_fieldname (reference field name)
    value set: SPACE, Data Dictionary field name
    Name of the Data Dictionary field referred to.
    This parameter is only used when the internal output table field described by the current field catalog entry has a reference to the Data Dictionary (not a program field), and the field name in the internal output table is different from the name of the field in the Data Dictionary. If the field names are identical, naming the Data Dictionary structure or table in the FIELDCAT-REF_TABNAME parameter is sufficient.
    • ref_tabname (reference table/structure field name)
    value set: SPACE, name of a Data Dictionary structure or table
    Structure or table name of the referred Data Dictionary field.
    This parameter is only used when the internal output table field described by the current field catalog entry has a Data Dictionary reference (not a program field).
    Reference to fields with currency/measurement unit
    Each internal output table sum or quantity field whose decimal places are to be formatted appropriately for the unit in the list must follow the convention:
    • the field is of data type QUAN or CURR (internal type P) (the field must really have this physical data type. Overwriting the physical data type with the parameter FIELDCAT-DATATYPE has no effect)
    • There is a field in the internal output table which contains the associated unit.
    • There is also an entry in the field catalog for the unit field.
    (If the unit is not to appear as a column in the list, and cannot be interactively displayed as a column, e.g. because it is always unambiguous and is therefore explicitly output by the caller in the list header, the field catalog units field entry can take the parameter FIELDCAT-TECH = 'X'.
    The association of a value field to a unit affects the output as follows:
    • appropriate decimal places display for the unit
    • an initialized field with a link to a non-initial unit is output as '0' for the unit (if FIELDCAT-NO_ZERO is initial). When this field is summed, this unit affects whether the units are homogeneous.
    • an initialized field with a link to an initial unit is output as SPACE. When this field is summed, the unit SPACE does not affect the homogeneity of the units.
    • When non-initial fields with an initial unit are summed, the unit SPACE is considered to be a unit.
    Link to currency unit
    • cfieldname (currency unit field name)
    value set: SPACE, output table field name
    Only relevant for amount columns with associated unit.
    Name of the internal output table field containing the currency unit associated with the amount field FIELDCAT-FIELDNAME. The field in FIELDCAT-CFIELDNAME must have its own field catalog entry.
    • ctabname (internal currency unit field output table)
    value set: SPACE, output table field name
    only relevant for hierarchical-sequential lists
    Name of the internal output table containing the FIELDCAT-CFIELDNAME field.
    Link to measurement unit
    • qfieldname (measurement unit field name)
    value set: SPACE, output table field name
    only relevant for quantity columns with unit link.
    Name of the internal output table field containing the measurement unit associated with the quantity field FIELDCAT-FIELDNAME.
    The field in FIELDCAT-QFIELDNAME must have its own field catalog entry.
    • qtabname (internal measurement unit field output table)
    value set: SPACE, output table field name
    only relevant for hierarchical-sequential lists
    Name of the internal output table containing the FIELDCAT-QFIELDNAME field.
    Column output options
    • outputlen (column width)
    value set: 0 (initial), n
    For fields with a Data Dictionary link this parameter can be left initial.
    For fields without a Data Dictionary link (program field) the parameter must be given the value of the desired field list output length (column width).
    initial = column width is the output length of the referred Data Dictionary field (domain).
    n = column width is n characters
    • key (key column)
    value set: SPACE, 'X' 'X' = key field (key field output in color)
    Key fields can not be interactively hidden. Parameter FIELDCAT-NO_OUT must be left initial.
    For exceptions see the documentation of the FIELDCAT-KEY_SEL parameter.
    • key_sel (hideable key column)
    value set: SPACE, 'X'
    only relevant when FIELDCAT-KEY = 'X'
    Key field which can be hidden interactively.
    The key column sequence cannot be changed interactively by the user.
    The output is controlled by the FIELDCAT-NO_OUT parameter analogously to non-key fields.
    • no_out (field in field list)
    value set: SPACE, 'X' 'X' = field is not displayed in the current list.
    The user can interactively choose the field for output from the field list.
    The user can display the contents of these fields at line level using the 'Detail' function.
    See also the 'Detail screen' documentation of the parameter IS_LAYOUT.
    • tech (technical field)
    value set: SPACE, 'X' 'X' = technical field
    Field cannot be output in the list and cannot be displayed interactively.
    Field can only be used in the field catalog (not in IT_SORT, ...).
    • emphasize (highlight columns in color)
    value set: SPACE, 'X' or 'Cxyz' (x:'1'-'9'; y,z: '0'=off '1'=on)
    'X' = column is colored with the default column highlight color.
    'Cxyz' = column is colored with a coded color:
    • C: Color (coding must begin with C)
    • x: color number
    • y: bold
    • z: inverse
    • hotspot (column as hotspot)
    value set: SPACE, 'X'
    'X' = column cells are output as hotspots
    • fix_column (fix column)
    value set: SPACE, 'X'
    Not relevant for block lists (output of several lists consecutively)
    'X' = column fixed (does not scroll horizontally)
    All columns to be fixed must have this flag, starting from the left. If a column without this flag is output, only the columns to the left of this column are fixed. The user can change the column fixing interactively. See also the documentation of the Layout parameter
    IS_LAYOUT-NO_KEYFIX of the IMPORTING parameter IS_LAYOUT.
    • do_sum (sum over column)
    value set: SPACE, 'X' 'X' = a sum is to be calculated over this internal output table field.
    This function can also be called by the user interactively.
    • no_sum (sums forbidden)
    value set: SPACE, 'X' 'X' = no sum can be calculated over this field, although the data type of the field would allow summing.
    • input (column ready for input)
    Function not available
    Format column contents
    • icon
    value set: SPACE, 'X' 'X' = column contents to be output as an icon.
    The caller must consider the printability of icons.
    • symbol
    value set: SPACE, 'X' 'X' = column contents are to be output as a symbol.
    The internal output table column must be a valid symbol character.
    The caller must consider the printability of symbols.
    Symbols can usually be printed, but may not always be output correctly, depending on the printer configuration.
    • just (justification)
    value set: SPACE, 'R', 'L', 'C'
    Only relevant for fields of data type CHAR or NUMC
    ' ' = default justification for this data type
    'R' = right-justified output
    'L' = left-justified output
    'C' = centered output
    The justification of the column header always follows the justification of the columns. Independent justification of the column header is not possible.
    • lzero (leading zeros)
    value set: SPACE, 'X'
    Only relevant for fields of data type NUMC
    ALV outputs NUMC fields right-justified without leading zeros by default.
    'X' = output with leading zeros
    Note: If a NUMC field is output left-justified or centered by FIELDCAT-JUST, leading zeros are output. If the output of leading zeros is suppressed by a Data Dictionary reference ALPHA conversion exit, the output is always left-justified.
    • no_sign (no +/- sign) Only relevant for value fields
    value set: SPACE, 'X' 'X' = value output without +/- sign
    • no_zero (suppress zeros) Only relevant for value fields
    value set: SPACE, 'X' 'X' = suppress zeros
    • edit_mask (field formatting)
    value set: SPACE, template
    template = see documentation of WRITE formatting option USING EDIT MASK template
    The output conversion conv can be made by template = '== conv'.
    Texts
    The following text parameters should be specified for program fields without a Data Dictionary reference. The texts are taken from the Data Dictionary for fields with a Data Dictionary reference. If this is not desired, the text parameters can also be specified. The Data Dictionary texts are then ignored. If the user changes the column width interactively, the column header text with the appropriate length is always used. The interactive function 'Optimize column width' takes account of both the field contents and the column headers: if all field contents are shorter than the shortest column header, the column width depends on the column header.
    The 'long field label' is also used in display variant definition, sort, etc. popups.
    • seltext_l (long field label)
    • seltext_m (medium field label)
    • seltext_s (short field label)
    • reptext_ddic (header)
    analogous to the Data element maintenance 'Header'
    The specified text is not necessarily output in the list, an optimum among all texts is sought.
    • ddictxt (specify text)
    value set: SPACE, 'L', 'M', 'S'
    You can specify with values 'L', 'M', and 'S', the keyword that should always be used as column header. If the column width changes, no attempt is made in this case to find an appropriate header for the new output width.
    Parameters for program fields without Data Dictionary reference
    see also 'Text' parameters
    • datatype (data type)
    value set: SPACE, Data Dictionary data type (CHAR, NUMC,...)
    Only relevant for fields without Data Dictionary reference
    Program field data type
    • ddic_outputlen (external output length)
    value set: 0 (initial), n
    Only relevant for fields without Data Dictionary reference whose output is nevertheless to be modified by a conversion exit.
    Prerequisites:
    • FIELDCAT-EDIT_MASK = '==conv'
    see also the documentation of the parameter FIELDCAT-EDIT_MASK
    • FIELDCAT-INTLEN = n
    see also the documentation of the parameter FIELDCAT-INTLEN
    n = external format field output length
    The column width FIELDCAT-OUTPUTLEN need not be the same as the external format output length (FIELDCAT-DDIC_OUTPUTLEN).
    • intlen (internal output length)
    value set: 0 (initial), n
    Only relevant for fields without Data Dictionary reference whose output is nevertheless to be modified by a conversion exit.
    Prerequisites:
    • FIELDCAT-EDIT_MASK = '==conv'
    see also the documentation of the parameter FIELDCAT-EDIT_MASK
    • FIELDCAT-DDIC_OUTPUTLEN = n
    see also the documentation of the parameter FIELDCAT-DDIC_OUTPUTLEN
    n = internal format field output length
    • rollname (data element)
    value set: SPACE, Data Dictionary data element name
    F1 help can be provided for a program field without a Data Dictionary reference, or F1 help which differs from the Data Dictionary help can be provided for a field with a Data Dictionary reference, using this parameter.
    When F1 help is called for this field, the documentation of the specified data element is displayed.
    If the FIELDCAT-ROLLNAME is initial for fields with a Data Dictionary reference, the documentation of the data element of the referred Data Dictionary field is output.
    Others
    • sp_group (field group key)
    value set: SPACE, CHAR(1)
    Field group key.
    Keys are assigned to group names in the IT_SPECIAL_GROUPS parameter (see also the documentation of the parameter IT_SPECIAL_GROUPS).
    When such an assignment is made in the field catalog and in IT_SPECIAL_GROUPS, the fields are grouped correspondingly in the display variant popup.
    • reprep (Report/Report interface selection criterion)
    value set: SPACE, 'X'
    Prerequisites:
    • The system contains the Report/Report interface (function group RSTI, table TRSTI)
    • Parameter LAYOUT-REPREP = 'X'
    (see also the documentation of the parameter LAYOUT-REPREP of the IMPORTING parameter IS_LAYOUT )
    'X' = When the Report/Report interface is called, the value of this field is passed in the selected interface start record as a selection criterion.
    2.7.2. Default
    • The following entries are usually sufficient for internal table fields with a reference to a field defined in the Data Dictionary :
    • fieldname
    • ref_tabname
    Notes:
    ALV gets the remaining information from the Data Dictionary.
    If no relative column position (COL_POS) is specified, the fields are output in the list in the order in which they were added to the field catalog.
    REF_FIELDNAME need only be specified when the name of the internal table field differs from the name of the referred Data Dictionary field.
    Information which is explicitly entered in the field catalog is not overwritten by information from the Data Dictionary.
    Priority rule:
    Entries in the field catalog have priority over differing entries in the Data Dictionary.
    • The following entries are usually sufficient for internal table fields without a reference to the Data Dictionary (program fields):
    • fieldname
    • outputlen
    • datatype
    • seltext_s
    • seltext_m
    • seltext_l
    Notes:
    F1 help can be provided for program fields by assigning a data element to the parameter ROLLNAME.
    If the parameters SELTEXT_S, SELTEXT_M, SELTEXT_L, and REPTEXT_DDIC contain appropriate field labels, the program field column headers are also adjusted appropriately when the column width changes.

  • Date / Time System fields - inconsistent

    This is a weird one...please bear with me
    Before I begin...note that this is not a problem that is difficult to fix, I'm just interested in better understanding why the system fields are behaving the way they are.
    We have 20 or so programs that use a function module to create a file header that includes a date/time stamp.  The programs also place a trailer with a date/time stamp that should match the one in the header.  However, the programmer used different system fields (after a GET TIME statement) to create the date/time stamp in the trailer.  On occasion, the date/time stamp comes out different. 
    Here's the code (sorry about all caps):
      GET TIME.
      CONCATENATE SY-DATUM SY-UZEIT INTO BATCHID.
      CALL FUNCTION 'Z_YADAYADA'
           EXPORTING
                FILE_ID              = P_FILEID
                REFRESH_TYPE         = P_REFTYP
                START_PERIOD         = START_PER
                END_PERIOD           = END_PER
           IMPORTING
                STRING255            = OUTPUTSTRING
           EXCEPTIONS
                INVALID_REFRESH_TYPE = 1
                INVALID_BATCH_ID     = 2.
      ...sy-subrc check...
      TRANSFER OUTPUTSTRING TO P_FILE1.  <--header output
      CONCATENATE L_TRL BATCHID COUNT T_TOTAL SPACE
                  INTO OUTPUTSTRING SEPARATED BY L_DEL.
      CONDENSE OUTPUTSTRING NO-GAPS.
      TRANSFER OUTPUTSTRING TO P_FILE1.  <--trailer output
    inside function module *****
      CONCATENATE: SY-DATLO SY-TIMLO INTO DATESTAMP,
                   SY-SYSID SY-MANDT INTO MACHINE_NAME.
      CONDENSE: DATESTAMP, MACHINE_NAME NO-GAPS.
      CONCATENATE C_HDR DATESTAMP FILE_ID MACHINE_NAME
                  SY-CPROG ' ' REFRESH_TYPE START_PERIOD
                  END_PERIOD DATESTAMP
                  INTO STRING255 SEPARATED BY '~'.
      CONDENSE: STRING255 NO-GAPS.
    You can see that the function uses SY-DATLO and SY-TIMLO for the creation of the header, while SY-DATUM and SY-UZEIT are used for the trailer. What's weird is that sometimes the trailer has a date/time stamp that is 1 second greater than the header, despite the fact that it is stored in the BATCHID variable before the header is created.
    Any help, references, explanations greatly appreciated (and rewarded with points)
    BMV

    Hi Brian
    Here is the related part from the weblog of ABAP expert Horst Keller. Hope it may help. For the full document, you can visit: http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.sdn.wcm.compound.docs/library/uuid/f1391fc3-0301-0010-d384-fccd036f1c67
    The values of all system fields in this table are implicitly set when the program is started, every time a screen layout of a screen is sent, and when the internal mode is set. The GET TIME command explicitly updates the system fields, except for sy-dayst, sy-fdayw and sy-tzone.
    With the exception of sy-datlo and sy-timlo, all system fields refer to the local date and time of the current SAP system. The ABAP runtime environment clock is synchronized with the database server clock at regular intervals in order to calculate the local time of the SAP system. During the synchronization process, the ABAP runtime environment clock is set to the database server clock. Because this happens on all application servers in an SAP system, the ABAP runtime environment clock is synchronous with the clocks on all other application servers and with the database system clock, and thus shows the local time of the entire SAP system. The time zone on which the local time of an SAP system is based is the only entry in the database table TTZCU.
    The content of sy-zonlo is taken from the user master record of the current user. The values of sy-datlo and sy-timlo are calculated from sy-datum and sy-uzeit and from the time zone of the SAP system for the time zone in sy-zonlo. If the user master record does not contain a time zone, or if it contains an invalid or an inactive time zone, sy-datlo and sy-timlo are set to the values of sy-datum and sy-uzeit. All valid time zones are defined in table TTZZ.
    Isn't the naming of sy-datum and sy-uzeit really geeky?
    Time Stamps
    The above system fields for date and time are not sufficient for many requirements of determining unique points in time: They represent local times and the values are measured in seconds. For more exact date and time determination, you use time stamps.
    A time stamp represents date and time in the form YYYYMMDDHHMMSS. YYYY is the year, MM the month, DD the day, HH the hour, MM the minutes and SS the seconds. There is a short form and a long form. In the long form, the format specified above additionally contains 7 decimal places for fractions of seconds, which allows for an accuracy of up to 100 ns. The maximum time resolution depends on the operating system of the application server and may be less.
    A valid time stamp must contain values whose date and time specifications before the decimal separator correspond to valid values for the data types d and t. Time stamps in this form are always considered as UTC (Universal Time Coordinated, basis for calculating worldwide time specifications; the UTC reference time is based on Greenwich Mean Time, GMT, but is not a time zone; it has no daylight saving time or summer time) time stamps when processed with the corresponding statements. You use the statement GET TIME STAMP to create a time stamp that represents the current UTC reference time.
    Regards
    *--Serdar [ BC ]

  • Error in implementing Usage Tracking in OBIEE

    Hi All,
    I want to implement Usage Tracking in my project. I have done all the steps as per the link http://obiee101.blogspot.com/2008/08/obiee-setting-up-usage-tracking.html
    But I am getting the following error in the NQServer.log file:
    [59048] Usage Tracking encountered an insert statement execution error. This error has occurred 1 times and resulted in the loss of 1 insert statements since this message was last logged.
    [nQSError: 16001] ODBC error state: S1000 code: 1456 message: [Oracle][ODBC][Ora]ORA-01456: may not perform insert/delete/update operation inside a READ ONLY transaction.
    [nQSError: 16015] SQL statement execution failed.
    I have also checked by changing the Call Interface in the connection pool of the Admin Tool from ODBC 3.5 to OCI 10g. With this change I get the following error:
    [59048] Usage Tracking encountered an insert statement execution error. This error has occurred 1 times and resulted in the loss of 1 insert statements since this message was last logged.
    [nQSError: 17001] Oracle Error code: 3114, message: ORA-03114: not connected to ORACLE
    at OCI call OCIStmtExecute.
    [nQSError: 17011] SQL statement execution failed.
    Please help....
    Thanks in Advance...
    Regards,
    Avi

    Problem is solved....
    Actually, in the NQSConfig.ini file I was using the actual database name instead of the Admin Tool physical layer name for PHYSICAL_TABLE_NAME: <database>.<schema>.<tablename>
    and for CONNECTION_POOL: <database name>.<connection pool>
    thanks...

  • Usage tracking problem : not started due to non-existent Usage Tracking tab

    Hello,
    I need some help after following the necessary steps to set up 'Usage Tracking'.
    My table S_NQ_ACCT stays empty.
    The structure in the rpd files looks like
    * OBI Usage Tracking
    Connection Pool
    Usage Tracking Writer Connection Pool
    Catalog
    dbo
    S_NQ_ACCT
    My nqsconfig has the following entries related to usage tracking
    # Usage Tracking Section
    # Collect usage statistics on each logical query submitted to the
    # server.
    [ USAGE_TRACKING ]
    ENABLE = YES;
    DIRECT_INSERT = YES;
    PHYSICAL_TABLE_NAME = "OBI Usage Tracking"."Catalog"."dbo"."S_NQ_ACCT";
    CONNECTION_POOL = "OBI Usage Tracking"."Usage Tracking Writer Connection Pool" ;
    BUFFER_SIZE = 10 MB ;
    BUFFER_TIME_LIMIT_SECONDS = 5 ;
    NUM_INSERT_THREADS = 5 ;
    MAX_INSERTS_PER_TRANSACTION = 1 ;
    When making queries on the Usage Tracking subject area, I still receive the following error in the NQServer.log file:
    +[59049] Usage Tracking not started due to non-existent Usage Tracking table "<Database>"."<Catalog>"."<Schema>"."<Table>".+
    What am I doing wrong??
    Txs for your quick help on this!
    Kr,
    A.

    Can you post your entire NQSConfig.INI file?
    According to the message guide (http://download.oracle.com/docs/cd/E12103_01/books/AnyMsg/AnyMsg_Messages.html#wp1007961), the error you're getting should specify the table that OBIEE is trying to use, so the fact that yours reports literally "<Database>"."<Catalog>"."<Schema>"."<Table>" makes me think the wrong config is being picked up. Your comment that using a file instead of the DB doesn't work either also points that way.
