Analytics server 1.1

I installed Analytics Server 1.1, and the installation completed successfully. But when I load the Analytics Console community, I get the JavaScript error "PTHTTPGetRequest is undefined". Any idea what is causing this problem?
I have Windows 2003 Server, Portal 5.0.3 .NET, and SQL Server 2000 SP3a.
Thanks

I have been facing exactly the same problem since Friday, when I installed Analytics Server on a pre-production platform. As suggested by Eric, I searched the Analytics log files for anything wrong, and found the following in analyticsui.log:
10 Mar 2006 16:08:27,408 FATAL JSIncludesHandler - com.plumtree.openfoundation.io.XPFileNotFoundException: jscomponent file for jsutil not found or failed to load. Exceptions encountered: - com.plumtree.openfoundation.io.XPFileNotFoundException: Failed to load http://img-rec.portal.groupe-casino.fr/imageserver/plumtree/common/private/js/jsutil/jscomponent.xml. Loader exception: Connection refused: connect
My first understanding was that this JavaScript file was missing or corrupted, so the Analytics Remote Server was hitting exactly the same kind of error as the client browser. But I checked, and the file was present on the imageserver.
So I tried the exact HTTP request logged on the Analytics Remote Server (http://img-rec.portal.groupe-casino.fr/imageserver/plumtree/common/private/js/jsutil/jscomponent.xml). The request never finished. In my case, the imageserver was unreachable (a problem with the VLAN configuration and the hardware load balancer in front of the imageserver). After solving that, the problem disappeared. I encourage you to check that the imageserver is accessible from the Analytics Remote Server.
I was encouraged to look in this direction because I once had the same kind of error with Content Server. Content Server was requesting a JavaScript file from the imageserver and never got it, which left Content Server in a blocked state. In that case the JavaScript file really was missing. But I learned this time that remote servers such as Content Server also request imageserver components.
My understanding of the missing PTHTTPGETRequest function is the following. The PTHTTPGETRequest function lives in the PTXML.js JavaScript file, located in the E:\sysapp\plumtree\ptimages\imageserver\plumtree\common\private\js\jsutil\XXXXXX directory. The beginning of the path depends on your installation, and XXXXXX is the current version id (something like 157085). The same holds for the other fundamental client-side JavaScript functions.
So the client needs to include the appropriate JavaScript files for your installation and the products you are using. The names of the files to include are determined by the remote server (Analytics, in our case). To find the right version, Analytics has to fetch http://img-rec.portal.groupe-casino.fr/imageserver/plumtree/common/private/js/jsutil/jscomponent.xml; this jscomponent.xml file contains the id of the current version (157085, for instance). Because the Analytics Remote Server cannot get this file, I guess it does not know which version is correct, so it puts no include clause into the HTML it returns to the client. Yet for some reason it still emits the code that calls PTHTTPGETRequest. The resulting HTML page contains the PTHTTPGETRequest call but no include, which causes the blocking problem.
I also based this analysis on the name of the component logging the error message: JSIncludesHandler. I take it this component is in charge of determining the right JavaScript includes for the client.
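To sanity-check this chain yourself, here is a small sketch that fetches jscomponent.xml the way the Analytics Remote Server would and pulls out the version id. The "version" attribute name is only a guess for illustration; inspect your own jscomponent.xml for the real layout.

```python
# Sketch: fetch jscomponent.xml as the Analytics Remote Server would, and
# extract the version id it advertises. The attribute name "version" is a
# hypothetical example; check your own jscomponent.xml for the real structure.
import urllib.request
import xml.etree.ElementTree as ET

def fetch_jscomponent(url, timeout=5):
    """Fetch the raw XML; a connection failure here mirrors the FATAL log entry."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.read().decode("utf-8")

def extract_version(xml_text):
    """Extract the current version id (e.g. 157085) from the XML."""
    root = ET.fromstring(xml_text)
    return root.get("version")

# Stand-in document; the real file lives on the imageserver.
sample = '<jscomponent version="157085"><file>PTXML.js</file></jscomponent>'
print(extract_version(sample))  # → 157085
```

If fetch_jscomponent raises a connection error when pointed at your imageserver URL, you are seeing the same failure that JSIncludesHandler logs.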
Let me know if this is not clear.

Similar Messages

  • Dashboard error – “There is no AA Analytics Server available. Contact your

    Hi,
    We are using BO XI R3.1, and we have never used the dashboard here. Now, when we try to access the dashboard in InfoView, it gives the error:
    There is no AA Analytics Server available. Contact your Business administrator to verify status. AA2213
    The same message appears even when we try to view the Dashboard and Analytics settings.
    We tried restarting the dashboard-related servers and then all servers, but the problem still exists.
    Please advise.

    Hi,
    We have servers named Dashboard Analytics Server and Dashboard Server related to the dashboard. Both are up and running.
    All other servers under Performance Management are up and running.
    We do not find an AA Analytics Server in the server list; is this a specific server?

  • Analytics Server

    Hello,
    I am installing Analytics Server version 1.1.
    I checked, and unfortunately I only see statistics for today.
    Can someone tell me how I can see historical statistics?
    Nir

  • Siebel Analytics

    I installed Siebel Analytics 7.8.5 with the complete set of components.
    The Siebel Analytics web server is not showing.
    Could anyone help me solve this issue?
    Thanks in advance.

    Hi,
    I would like to clarify that it is the Siebel Analytics web services that are not showing (not the Server); I have IIS as well.
    Regarding the Siebel Analytics Server error, the NQServer log file shows the following:
    2008-02-06 13:13:12
    [nQSError: 47007] Invalid repository file: D:\SiebelAnalytics\Repository\AAA.rpd.
    2008-02-06 13:13:12
    [43032] Siebel Analytics Server could not be started. Initialization failed.
    2008-02-06 13:13:43
    [nQSError: 47007] Invalid repository file: D:\SiebelAnalytics\Repository\AAA.rpd.
    2008-02-06 13:13:43
    [43032] Siebel Analytics Server could not be started. Initialization failed.

  • Analytics not recording usage statistics

    I installed G6 and Analytics 1.2. The installation was successful, and the sync job updates the database with new communities and users. However, no portal usage statistics are being recorded in the database.
    I created new communities and users, ran the sync job, and accessed the communities as the new user/community.
    I reviewed the analytics database: no entry for user logins. I ran the sync job again and verified; still no entries. How and when do the portal usage statistics get recorded into the system?
    thanks

    Our Plumtree Portal G6 and Analytics server are installed on the same computer. Analytics does not record usage statistics; it shows "no data to display". Why? What should I do?
    I have installed Analytics many times, with the same result: "no data to display".
    Once, I got analytics data for one day, the day I installed Analytics. After that day, nothing. The same problem as Michael Capuano.

  • Analytics 2.5 - Analytics Collector

    With what product would you generally install the Analytics Collector? Should we install the Analytics Collector on the Analytics Server with the Analytics Console?
    Below is our current Analytics 2.1 configuration for ALUI 6.1, which didn't include the Analytics Collector at the time.
    Interaction Analytics Component - Portal Server
    Image Service Component - Image Server
    Analytics Console - Analytics Server
    Analytics Automation Jobs - Automation Server

    The Collector can indeed be installed on the same machine; that is how we run it in our Prod environment.

  • Not able to see the Siebel Analytics login screen

    Hi,
    I have installed Siebel Analytics 7.8.4 on my machine. IIS, Siebel Analytics Server, and Siebel Analytics Web services are up and running. I am able to open the repository files in online mode.
    When I try to call "http://localhost/analytics/saw.dll?Answers", it throws the error "you are not authorised to view this page".
    For the URL "http://localhost:9703/analytics/saw.dll?Answers", I get the error "Internet Explorer cannot display the webpage".
    Could anyone please suggest how I can overcome this problem?
    Thanks in advance.
    Regards,
    Kiran
    Edited by: user653357 on Mar 4, 2009 8:44 PM

    Hi Kiran,
    Have you started OC4J before opening Presentation Services?

  • Not able to login into presentations server

    Hi,
    I have installed OBIEE on my local machine. As part of report creation, I created an rpd file and changed the NQSConfig.ini file as described in
    http://www.oracle.com/technology/obe/obe_bi/bi_ee_1013/bi_admin/biadmin.html.
    Then I restarted the OC4J server and all the OBIEE services. When I then open Presentation Services, I am not able to log in. The error shown is:
    Error Codes: WH4KCFW6:OPR4ONWY:U9IM8TAC
    Odbc driver returned an error (SQLDriverConnectW).
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 12008] Unable to connect to port 9703 on machine . [nQSError: 12010] Communication error connecting to remote end point: address = ; port = 9703. [nQSError: 12002] Socket communication error at call=: (Number=-1) Unknown (HY000)
    I checked the log file (NQServer.log in the /OracleBI/server/Log folder). The entries in the log file look like this:
    2009-10-27 18:00:42
    [36007] Loading repository C:\OBIEE\OracleBI\server\Repository\EventHub.rpd.
    2009-10-27 18:00:42
    [14055] Loading subject area: EventHub ...
    2009-10-27 18:00:42
    [14056] Finished loading subject area: EventHub.
    2009-10-27 18:00:42
    [43030] :     Oracle BI Server started. Version: 10.1.3.4.1.090414.1900.
    2009-10-29 14:09:42
    [nQSError: 12002] Socket communication error at call=send: (Number=10038) An operation was attempted on something that is not a socket.
    2009-10-29 14:29:23
    [43031] :     Oracle BI Server shutdown.
    2009-10-29 14:29:26
    [14058] Unloaded all subject areas.
    2009-10-29 15:01:55
    [36007] Loading repository C:\OBIEE\OracleBI\server\Repository\EventHub.rpd.
    2009-10-29 15:01:55
    [14055] Loading subject area: EventHub ...
    2009-10-29 15:01:55
    [14056] Finished loading subject area: EventHub.
    2009-10-29 15:01:55
    [43030] :     Oracle BI Server started. Version: 10.1.3.4.1.090414.1900.
    I have already gone through the forum threads reporting the same error, but I did not find any pointers to a solution.
    Please let me know what might have gone wrong and how I can log in to the Presentation Server and continue with report creation. Please let me know if you need further information.
    Thanks in advance for your time,
    Raj Kumar
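    For what it's worth, the [nQSError: 12008] "Unable to connect to port 9703" above can be checked independently of OBIEE with a plain TCP probe. A minimal sketch; the host and port here are just the NQSConfig.ini defaults, not values confirmed in this thread:

```python
# Probe whether anything is listening on the BI Server port (9703 by default,
# per RPC_SERVICE_OR_PORT in NQSConfig.ini). False here means Presentation
# Services would hit the same nQSError: 12008 when it tries to connect.
import socket

def port_open(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print(port_open("localhost", 9703))
```

    If the probe fails while the BI Server service claims to be running, look at firewalls or at SERVER_HOSTNAME_OR_IP_ADDRESSES binding the server to a different interface.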

    Hi,
    The following is the NQSConfig.ini file.
    # NQSConfig.INI
    # Copyright (c) 1997-2006 Oracle Corporation, All rights reserved
    # INI file parser rules are:
    # If values are in literals, digits or _, they can be
    # given as such. If values contain characters other than
    # literals, digits or _, values must be given in quotes.
    # Repository Section
    # Repositories are defined as logical repository name - file name
    # pairs. ODBC drivers use logical repository name defined in this
    # section.
    # All repositories must reside in OracleBI\server\Repository
    # directory, where OracleBI is the directory in which the Oracle BI
    # Server software is installed.
    [ REPOSITORY ]
    Star     =     EventHub.rpd, DEFAULT;
    # Query Result Cache Section
    [ CACHE ]
    ENABLE     =     NO;
    // A comma separated list of <directory maxSize> pair(s)
    // e.g. DATA_STORAGE_PATHS = "d:\OracleBIData\nQSCache" 500 MB;
    DATA_STORAGE_PATHS     =     "C:\OBIEE\OracleBIData\cache" 500 MB;
    MAX_ROWS_PER_CACHE_ENTRY = 100000; // 0 is unlimited size
    MAX_CACHE_ENTRY_SIZE = 1 MB;
    MAX_CACHE_ENTRIES = 1000;
    POPULATE_AGGREGATE_ROLLUP_HITS = NO;
    USE_ADVANCED_HIT_DETECTION = NO;
    MAX_SUBEXPR_SEARCH_DEPTH = 7;
    // Cluster-aware cache
    // GLOBAL_CACHE_STORAGE_PATH = "<directory name>" SIZE;
    // MAX_GLOBAL_CACHE_ENTRIES = 1000;
    // CACHE_POLL_SECONDS = 300;
    // CLUSTER_AWARE_CACHE_LOGGING = NO;
    # General Section
    # Contains general server default parameters, including localization
    # and internationalization, temporary space and memory allocation,
    # and other default parameters used to determine how data is returned
    # from the server to a client.
    [ GENERAL ]
    // Localization/Internationalization parameters.
    LOCALE     =     "English-usa";
    SORT_ORDER_LOCALE     =     "English-usa";
    SORT_TYPE = "binary";
    // Case sensitivity should be set to match the remote
    // target database.
    CASE_SENSITIVE_CHARACTER_COMPARISON = OFF ;
    // SQLServer65 sorts nulls first, whereas Oracle sorts
    // nulls last. This ini file property should conform to
    // that of the remote target database, if there is a
    // single remote database. Otherwise, choose the order
    // that matches the predominant database (i.e. on the
    // basis of data volume, frequency of access, sort
    // performance, network bandwidth).
    NULL_VALUES_SORT_FIRST = OFF;
    DATE_TIME_DISPLAY_FORMAT = "yyyy/mm/dd hh:mi:ss" ;
    DATE_DISPLAY_FORMAT = "yyyy/mm/dd" ;
    TIME_DISPLAY_FORMAT = "hh:mi:ss" ;
    // Temporary space, memory, and resource allocation
    // parameters.
    // You may use KB, MB for memory size.
    WORK_DIRECTORY_PATHS     =     "C:\OBIEE\OracleBIData\tmp";
    SORT_MEMORY_SIZE = 4 MB ;
    SORT_BUFFER_INCREMENT_SIZE = 256 KB ;
    VIRTUAL_TABLE_PAGE_SIZE = 128 KB ;
    // Analytics Server will return all month and day names as three
    // letter abbreviations (e.g., "Jan", "Feb", "Sat", "Sun").
    // To use complete names, set the following values to YES.
    USE_LONG_MONTH_NAMES = NO;
    USE_LONG_DAY_NAMES = NO;
    UPPERCASE_USERNAME_FOR_INITBLOCK = NO ; // default is no
    // Aggregate Persistence defaults
    // The prefix must be between 1 and 8 characters long
    // and should not have any special characters ('_' is allowed).
    AGGREGATE_PREFIX = "SA_" ;
    # Security Section
    # Legal value for DEFAULT_PRIVILEGES are:
    # NONE READ
    [ SECURITY ]
    DEFAULT_PRIVILEGES = READ;
    PROJECT_INACCESSIBLE_COLUMN_AS_NULL     =     NO;
    MINIMUM_PASSWORD_LENGTH     =     0;
    #IGNORE_LDAP_PWD_EXPIRY_WARNING = NO; // default is no.
    #SSL=NO;
    #SSL_CERTIFICATE_FILE="servercert.pem";
    #SSL_PRIVATE_KEY_FILE="serverkey.pem";
    #SSL_PK_PASSPHRASE_FILE="serverpwd.txt";
    #SSL_PK_PASSPHRASE_PROGRAM="sitepwd.exe";
    #SSL_VERIFY_PEER=NO;
    #SSL_CA_CERTIFICATE_DIR="CACertDIR";
    #SSL_CA_CERTIFICATE_FILE="CACertFile";
    #SSL_TRUSTED_PEER_DNS="";
    #SSL_CERT_VERIFICATION_DEPTH=9;
    #SSL_CIPHER_LIST="";
    # There are 3 types of authentication. The default is NQS
    # You can select only one of them
    #----- 1 -----
    #AUTHENTICATION_TYPE = NQS; // optional and default
    #----- 2 -----
    #AUTHENTICATION_TYPE = DATABASE;
    # [ DATABASE ]
    # DATABASE = "some_data_base";
    #----- 3 -----
    #AUTHENTICATION_TYPE = BYPASS_NQS;
    # Server Section
    [ SERVER ]
    SERVER_NAME = Oracle_BI_Server ;
    READ_ONLY_MODE = NO;     // default is "NO". That is, repositories can be edited online.
    MAX_SESSION_LIMIT = 2000 ;
    MAX_REQUEST_PER_SESSION_LIMIT = 500 ;
    SERVER_THREAD_RANGE = 40-100;
    SERVER_THREAD_STACK_SIZE = 0; // default is 256 KB, 0 for default
    DB_GATEWAY_THREAD_RANGE = 40-200;
    DB_GATEWAY_THREAD_STACK_SIZE = 0; // default is 256 KB, 0 for default
    MAX_EXPANDED_SUBQUERY_PREDICATES = 8192; // default is 8192
    MAX_QUERY_PLAN_CACHE_ENTRIES = 1024; // default is 1024
    MAX_DRILLDOWN_INFO_CACHE_ENTRIES = 1024; // default is 1024
    MAX_DRILLDOWN_QUERY_CACHE_ENTRIES = 1024; // default is 1024
    INIT_BLOCK_CACHE_ENTRIES = 20; // default is 20
    CLIENT_MGMT_THREADS_MAX = 5; // default is 5
    # The port number specified with RPC_SERVICE_OR_PORT will NOT be considered if
    # a port number is specified in SERVER_HOSTNAME_OR_IP_ADDRESSES.
    RPC_SERVICE_OR_PORT = 9703; // default is 9703
    # If port is not specified with a host name or IP in the following option, the port
    # number specified at RPC_SERVICE_OR_PORT will be considered.
    # When port number is specified, it will override the one specified with
    # RPC_SERVICE_OR_PORT.
    SERVER_HOSTNAME_OR_IP_ADDRESSES = "ALLNICS"; # Example: "hostname" or "hostname":port
    # or "IP1","IP2":port or
    # "hostname":port,"IP":port2.
    # Note: When this option is active,
    # CLUSTER_PARTICIPANT should be set to NO.
    ENABLE_DB_HINTS = YES; // default is yes
    PREVENT_DIVIDE_BY_ZERO = YES;
    CLUSTER_PARTICIPANT = NO; # If this is set to "YES", comment out
    # SERVER_HOSTNAME_OR_IP_ADDRESSES. No specific NIC support
    # for the cluster participant yet.
    // Following required if CLUSTER_PARTICIPANT = YES
    #REPOSITORY_PUBLISHING_DIRECTORY = "<dirname>";
    #REQUIRE_PUBLISHING_DIRECTORY = YES; // Don't join cluster if directory not accessible
    DISCONNECTED = NO;
    AUTOMATIC_RESTART = YES;
    # Dynamic Library Section
    # The dynamic libraries specified in this section
    # are categorized by the CLI they support.
    [ DB_DYNAMIC_LIBRARY ]
    ODBC200 = nqsdbgatewayodbc;
    ODBC350 = nqsdbgatewayodbc35;
    OCI7 = nqsdbgatewayoci7;
    OCI8 = nqsdbgatewayoci8;
    OCI8i = nqsdbgatewayoci8i;
    OCI10g = nqsdbgatewayoci10g;
    DB2CLI = nqsdbgatewaydb2cli;
    DB2CLI35 = nqsdbgatewaydb2cli35;
    NQSXML = nqsdbgatewayxml;
    XMLA = nqsdbgatewayxmla;
    ESSBASE = nqsdbgatewayessbasecapi;
    # User Log Section
    # The user log NQQuery.log is kept in the server\log directory. It logs
    # activity about queries when enabled for a user. Entries can be
    # viewed using a text editor or the nQLogViewer executable.
    [ USER_LOG ]
    USER_LOG_FILE_SIZE = 10 MB; // default size
    CODE_PAGE = "UTF8"; // ANSI, UTF8, 1252, etc.
    # Usage Tracking Section
    # Collect usage statistics on each logical query submitted to the
    # server.
    [ USAGE_TRACKING ]
    ENABLE = NO;
    //==============================================================================
    // Parameters used for writing data to a flat file (i.e. DIRECT_INSERT = NO).
    STORAGE_DIRECTORY = "<full directory path>";
    CHECKPOINT_INTERVAL_MINUTES = 5;
    FILE_ROLLOVER_INTERVAL_MINUTES = 30;
    CODE_PAGE = "ANSI"; // ANSI, UTF8, 1252, etc.
    //==============================================================================
    DIRECT_INSERT = YES;
    //==============================================================================
    // Parameters used for inserting data into a table (i.e. DIRECT_INSERT = YES).
    PHYSICAL_TABLE_NAME = "<Database>"."<Catalog>"."<Schema>"."<Table>" ; // Or "<Database>"."<Schema>"."<Table>" ;
    CONNECTION_POOL = "<Database>"."<Connection Pool>" ;
    BUFFER_SIZE = 10 MB ;
    BUFFER_TIME_LIMIT_SECONDS = 5 ;
    NUM_INSERT_THREADS = 5 ;
    MAX_INSERTS_PER_TRANSACTION = 1 ;
    //==============================================================================
    # Query Optimization Flags
    [ OPTIMIZATION_FLAGS ]
    STRONG_DATETIME_TYPE_CHECKING = ON ;
    # CubeViews Section
    [ CUBE_VIEWS ]
    DISTINCT_COUNT_SUPPORTED = NO ;
    STATISTICAL_FUNCTIONS_SUPPORTED = NO ;
    USE_SCHEMA_NAME = YES ;
    USE_SCHEMA_NAME_FROM_RPD = YES ;
    DEFAULT_SCHEMA_NAME = "ORACLE";
    CUBE_VIEWS_SCHEMA_NAME = "ORACLE";
    LOG_FAILURES = YES ;
    LOG_SUCCESS = NO ;
    LOG_FILE_NAME     =     "C:\OBIEE\OracleBI\server\Log\CubeViews.Log";
    # MDX Member Name Cache Section
    # Cache subsystem for mapping between unique name and caption of
    # members for all SAP/BW cubes in the repository.
    [ MDX_MEMBER_CACHE ]
    // The entry to indicate if the feature is enabled or not, by default it is NO since this only applies to SAP/BW cubes
    ENABLE = NO ;
    // The path to the location where cache will be persisted, only applied to a single location,
    // the number at the end indicates the capacity of the storage. When the feature is enabled,
    // administrator needs to replace the "<full directory path>" with a valid path,
    // e.g. DATA_STORAGE_PATH = "C:\OracleBI\server\Data\Temp\Cache" 500 MB ;
    DATA_STORAGE_PATH     =     "C:\OBIEE\OracleBIData\cache" 500 MB;
    // Maximum disk space allowed for each user;
    MAX_SIZE_PER_USER = 100 MB ;
    // Maximum number of members in a level will be able to be persisted to disk
    MAX_MEMBER_PER_LEVEL = 1000 ;
    // Maximum size for each individual cache entry size
    MAX_CACHE_SIZE = 100 MB ;
    # Oracle Dimension Export Section
    [ ORA_DIM_EXPORT ]
    USE_SCHEMA_NAME_FROM_RPD = YES ; # NO
    DEFAULT_SCHEMA_NAME = "ORACLE";
    ORA_DIM_SCHEMA_NAME = "ORACLE";
    LOGGING = ON ; # OFF, DEBUG
    LOG_FILE_NAME     =     "C:\OBIEE\OracleBI\server\Log\OraDimExp.Log";
    One more thing: I have kept the BI server (OC4J server) running.
    Does the time of the error in NQServer.log match when you try to log in to Presentation Services?
    I do not see any error matching the time of login to Presentation Services.

  • Why do Google Analytics and Sharepoint give difference in numbers?

    Hi all,
    We are using Google Analytics with our SharePoint intranet portal. Recently we decided to compare the numbers with the real server logs, as the numbers from Google were a little suspicious. After some data analysis, it looks like the numbers are totally different, and my boss no longer believes the numbers from Google Analytics. The problem is that we use Google Analytics on our public website as well (hosted elsewhere and based on a different CMS), but because of the differences on SharePoint my boss is questioning the numbers for the web site too! Is it possible that SharePoint is the problem here and Google Analytics does not work properly with it? I don't know; perhaps it is the way it is structured, caching servers, etc. Please, could you explain this a little if you have had a similar experience or know why it works like that?
    Thank you very much in advance for your help.
    stone28

    The Google Analytics Tracking Code is executed (behind the scenes) in the client's browser, and if the browser can access the Internet to register the hit on the Google Analytics server, GA should collect the intranet traffic just fine. I can't see why the GA script would not fire when someone visits the page.
    If you have one HTML page with no images or external files (such as stylesheets or JavaScript) called from it, that is one line in the IIS log each time it is loaded. If you have an HTML page with an external stylesheet and 3 images, that is logged as 5 lines in the log file each time the page is requested, because you are requesting 5 files from the web server. Usually your statistics software filters out the requests for images, stylesheets, and other files and just looks at the pages that were loaded, because we usually don't care how many times the logo on each page was loaded.
    When you type a URL to a specific page on a sub-site, that folder hierarchy and file don't really exist in the physical file system. The request is intercepted by SharePoint/WSS and routed to a DLL that renders the file from another location (a ghosted page) or renders the page from the SQL database (an unghosted page). So your IIS logs are going to contain a whole lot of requests for DLLs like owssrv.dll.
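    To make that filtering concrete, here is a small sketch of what log-analysis tools do with an IIS log. The sample lines and the field position are made up for illustration; real W3C logs declare their fields in a #Fields header.

```python
# Sketch of how a stats tool counts "page" hits in an IIS log: skip header
# lines and requests for images, stylesheets, scripts, and SharePoint DLLs.
# Field positions and sample lines are hypothetical, for illustration only.
SKIP_EXTENSIONS = (".gif", ".png", ".jpg", ".css", ".js", ".ico", ".dll")

def page_requests(log_lines):
    """Return only the URIs that look like page loads."""
    pages = []
    for line in log_lines:
        if line.startswith("#"):   # W3C log header lines
            continue
        uri = line.split()[1]      # assume field 2 is cs-uri-stem
        if not uri.lower().endswith(SKIP_EXTENSIONS):
            pages.append(uri)
    return pages

sample = [
    "#Fields: date cs-uri-stem sc-status",
    "2014-01-01 /default.aspx 200",
    "2014-01-01 /style.css 200",
    "2014-01-01 /logo.png 200",
    "2014-01-01 /_layouts/owssrv.dll 200",
]
print(page_requests(sample))  # → ['/default.aspx']
```

    This is one reason raw IIS log counts run far higher than Google Analytics pageviews: only one of the five requests above is a page.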
    Reference -
    http://www.sharepointsos.com/category/sharepoint-analytics/
    Hope that helps !!

  • XMII Connector for SQL Analytics (OLAP Database Provided by MS SQL)

    Hi,
    I need to create a connector for the SQL Analytics server in xMII. I have never created one before.
    Please provide me some information on this. I followed the steps given for the regular OLAP connector, but no data is retrieved from the server, not even the metadata.
    Regards,
    Prashant Kiran A

    Prashant,
    The MSOLAP connection properties are defined by the MSOLAP interface, a simple Google search will direct you to the values:
    http://www.google.com/search?hl=en&q=MSOLAPSOAP
    As for the logs, depends on your version...
    In 11.5 they are under the Admin menu -> System Administration -> General Log
    In v12 they are the NetWeaver logs (Filter: Application = xmii)
    Hope this helps.
    Sam

  • Portal 5.x use of stored procedures in SQL Server

    Does anyone know if stored procedures are used in the Plumtree, Collab, Content, Studio, or Analytics Server databases for Portal 5.x on SQL Server? We are running Portal 5.0.4 .NET.

    In SQL 2005 there is, sort of. This query lists the last execution
    time for all SQL modules in a database:
       SELECT object_name(m.object_id), MAX(qs.last_execution_time)
       FROM   sys.sql_modules m
       LEFT   JOIN (sys.dm_exec_query_stats qs
                    CROSS APPLY sys.dm_exec_sql_text (qs.sql_handle) st) 
              ON m.object_id = st.objectid
             AND st.dbid = db_id()
       GROUP  BY object_name(m.object_id)
    But there are tons of caveats. The starting point of this query is
    the dynamic management view dm_exec_query_stats, and its contents are
    per *query plan*. If a stored procedure contains several queries,
    there is more than one entry for the procedure in dm_exec_query_stats.
    Best Regards, Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/

  • Error while updating account dim

    I get the below error while updating the Account dimension in the outline. This error occurs only when updating the Account dimension; it does not happen with other dimensions, so it doesn't seem to be the connection being down. It works again if we recycle the analytics server, but we are not able to find out the actual cause of this issue.
    Error message:
    Cannot build dimension. Analytic Server Error(1042017): Network error: The client or server timed out waiting to receive data using TCP/IP. Check network connections. Increase the NetRetryCount and/or NetDelay values in the ESSBASE.CFG file. Update this file on both client and server. Restart the client and try again
    Additional information
    Account is defined as a dense dimension with 1800 level-0 members. We are updating it using a rules file.
    Potential Number of Data Blocks     :     27762609853440
    Number of Existing Data Blocks     :     2625360
    Index Cache: 40960
    Data cache : 20480
    I appreciate any help in resolving this issue
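    For reference, the two settings the error message points at live in ESSBASE.CFG. A sketch of the change, with illustrative values only (not tuned recommendations); per the message, update the file on both client and server and restart both:

```
; ESSBASE.CFG - network retry settings named in error 1042017.
; Values below are illustrative; defaults and sensible ranges are
; version-dependent, so check the Essbase Technical Reference.
NETDELAY 2000
NETRETRYCOUNT 1000
```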

    There is probably something in your dim build source file that Essbase doesn't like, and it isn't giving you a good error message. Did this dim build work in the past, and did you recently add something to it?
    Try excluding certain columns, like member properties and data storage, just to see if you can get the members to load. Then slowly bring back the other columns to find the one with the problem. Similarly, try keeping all the columns but loading just one record to see if it succeeds; this will help you narrow down whether it is a load-rule or source issue. Another option is to create a new load rule; perhaps there is something wrong with the one you are using.
    Unfortunately there is no magic bullet here; you just have to work through small samples to try to isolate the problem.

  • How to get a List of Users Currently Logged into the portal

    Hi,
    I'm trying to get the list of all users logged into the portal to build a web service, but I can't find a way to do this. Is there any way to get this info through a Java class or some object in the RCU schema / WCP database?
    Greetings
    Mike

    Try the analytics tables. Before you can use them, you need to set up the analytics server and configure it for your application:
    http://docs.oracle.com/cd/E23943_01/webcenter.1111/e12405/wcadm_analytics.htm#BEIDBHHG
    This gives an overview for building reports: http://docs.oracle.com/cd/E23943_01/webcenter.1111/e10148/jpsdg_analytics.htm#BABFDGEG
    Login metrics: http://docs.oracle.com/cd/E23943_01/webcenter.1111/e10148/jpsdg_analytics.htm#BABFFHGD
    You can use these queries to get the info you want.
    Edited by: Yannick Ongena on Oct 8, 2012 8:00 AM

  • OBIEE 11g "WITH SAWITH0 AS" subquery factoring clause in the generated sql

    I've observed that OBIEE 11g generates the physical query in the query log using the WITH (sub-query factoring) clause, to make the generated SQL elegantly readable. This is great! Thanks to the developers. However, I have some questions about this.
    __Background__
    Oracle Database's default behaviour is that if you have only one sub-query in the WITH section, it executes it as an inline view and does not materialize it before the main SQL is executed. If you have more than one, by default the database engine materializes all of them in the order of definition. In some cases this can completely blow up the SGA and make the query never-ending. To divert this behaviour you can apply two hints that work both in inline views and in sub-queries: /*+ MATERIALIZE */ and /*+ INLINE */. However, Analytics 11g does not seem to have hint capabilities at the logical table level, only at the physical table level.
    If we go with the current defaults, I'm afraid developers unaware of this behaviour can run into serious performance issues for the sake of some syntax candy in the generated SQL.
    __Questions__
    * Is it possible to make the Analytics server use inline views instead of WITH?
    * Is there any way to sneak in hints that would place the /*+ INLINE */ hint in the appropriate spot in the generated sub-queries when needed?
    * Does the Oracle Database have any initialization parameter that can influence this sub-query factoring behaviour and divert from the default?

    The WITH statement is not added to make the query more elegant; it's added for performance reasons. If your queries take long to run, you may have a design issue. In a typical DWH database, the SGA needs to be seriously increased, since the queries run are much larger and more complex than on an OLTP database. In any case, you can disable the WITH statement in the Admin Tool by double-clicking your database object in the physical layer and going to the Features tab. The feature is called WITH_CLAUSE_SUPPORTED.

  • Analysis takes more than 1hr to show results : Obiee 11g (11.1.1.5.0)

    Hi,
    I have an analysis which takes too long to show results. I executed the physical query of this analysis (generated by the BI Server) against the database, and it takes 3 seconds.
    Has anyone faced this issue? How to fix this?
    Thanks in advance.

    Hi Raj,
    Thanks for your reply.
    1. "Check the concurrent settings in the connection pool. What OS are you using? Make sure the query threads are not in wait mode." It is only happening to a particular analysis; all others are running fine.
    2. "Is the analytics server merging multiple query results from the database? If so, check the memory settings on the analytics server." At the moment only this analysis is running, and there is no problem with the memory.
    3. "Also, you mentioned the query at the database comes back in 3 seconds. If you ran the query directly from the database server, you might want to check for network issues." There are no network issues, as the problem is only with this analysis.
    Thanks in advance.
