Tricky Analytics

Hello Gurus
The following analytical SQL query, when executed, gives the result below.
LEAPYEAR is a function to validate the dates.
SELECT SRNO,
       MAX(TRANSACTIONDATE) TRANSACTIONDATE,
       (CASE WHEN MAX(PNARRATION) IS NULL THEN ' '
             ELSE MAX(PNARRATION)
        END) PNARRATION,
       (CASE WHEN MAX(PRINCADV) IS NULL THEN ''
             ELSE TO_CHAR(NVL(MAX(PRINCADV), 0), 'FM99,99,99,99,99,990.009')
        END) PADV,
       (CASE WHEN MAX(PRINCRECPT) IS NULL THEN ''
             ELSE TO_CHAR(NVL(MAX(PRINCRECPT), 0), 'FM99,99,99,99,99,990.009')
        END) PRECT,
       (CASE WHEN MAX(PRINCBAL) IS NULL THEN ''
             WHEN MAX(PRINCBAL) = 0 THEN TO_CHAR(NVL(MAX(PRINCBAL), 0), 'FM99,99,99,99,99,990.009') || ' '
             WHEN MAX(PRINCBAL) < 0 THEN TO_CHAR(NVL(MAX(ABS(PRINCBAL)), 0), 'FM99,99,99,99,99,990.009') || ' Cr '
             ELSE TO_CHAR(NVL(MAX(PRINCBAL), 0), 'FM99,99,99,99,99,990.009') || ' Dr '
        END) PRINCBAL,
       MAX(NODAYS) NODAYS,
       TO_CHAR(NVL(ROUND((LAG(MAX(PRODUCTS), 1) OVER (ORDER BY MAX(TRANSACTIONDATE)))
                         * (SELECT UNIQUE INTRATE FROM HEADER_M WHERE SL_LOAN_AC = 'JA0904221')
                         / (LEAPYEAR(TO_DATE('31/MAR/2010')) * 100), 0), 0),
               'FM99,99,99,99,99,990.009') PRODUCTS
  FROM LEDGER_TMP
 GROUP BY SRNO
SRNO  TRANSACTIONDATE  PNARRATION                   PADV          PRECT        PRINCBAL          NODAYS  PRODUCTS
1     01/04/2009       Opening Balance              68,04,192.00  0.00         68,04,192.00 Dr    77.00  0.00
2     17/06/2009       DN200906170000015 : BY AMT.  0.00          1,60,062.00  66,44,130.00 Dr    12.00  1,07,655.00
3     29/06/2009       DN200906290000014 : BY AMT.  0.00          10,633.00    66,33,497.00 Dr   156.00  16,383.00
4     02/12/2009       DN200912020000021 : BY AMT.  0.00          55,708.00    65,77,789.00 Dr     0.00  2,12,635.00
8     02/12/2009       DN200912020000018 : BY AMT.  0.00          1,10,417.00  62,75,195.00 Dr     0.00  0.00
10    02/12/2009       XV200912020000002 : TO AMT.  55,708.00     0.00         63,04,851.00 Dr     0.00  0.00
12    02/12/2009       XV200912020000002 : TO AMT.  26,052.00     0.00         64,41,320.00 Dr     0.00  0.00
11    02/12/2009       XV200912020000002 : TO AMT.  1,10,417.00   0.00         64,15,268.00 Dr     0.00  0.00
9     02/12/2009       DN200912020000018 : BY AMT.  0.00          26,052.00    62,49,143.00 Dr     0.00  0.00
7     02/12/2009       DN200912020000018 : BY AMT.  0.00          55,708.00    63,85,612.00 Dr     0.00  0.00
5     02/12/2009       DN200912020000021 : BY AMT.  0.00          1,10,417.00  64,67,372.00 Dr     0.00  0.00
6     02/12/2009       DN200912020000021 : BY AMT.  0.00          26,052.00    64,41,320.00 Dr     0.00  0.00
13    31/03/2010       BALANCE ON CURRENT DATE      0.00          0.00         64,41,320.00 Dr   120.00  0.00
14                                                                                                       1,58,827.00

How can the above query be rectified to get all the Products adjacent to the number of days (NODAYS), as well as the last Product (which currently comes on a separate line, against the balance as on 31/03/2010)?
Thanks

user3308033 wrote:
Hello Gurus
The following analytical SQL query, when executed, gives the result below.
[snip - query and output quoted from the original post above]
How can the above query be rectified to get all the Products adjacent to the number of days (NODAYS), as well as the last Product (which currently comes on a separate line, against the balance as on 31/03/2010)?
Thanks

How about you show us the data output you expect, based on what I assume is your given input above?
Also, an Oracle version is nearly always helpful:
select * from v$version;
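
For what it's worth, pending the expected output: the LAG(..., 1) OVER (ORDER BY MAX(TRANSACTIONDATE)) shifts each computed product one row down, which is why the interest figures land against the following transaction rather than beside their own NODAYS, and why the final figure ends up alone on row 14. A minimal, untested sketch of the PRODUCTS column without the offset (table, column, and LEAPYEAR function names taken from the post; DISTINCT swapped in for the non-standard UNIQUE, and an explicit date mask added so TO_DATE does not depend on NLS settings):

-- Sketch only: each row's product now pairs with its own NODAYS,
-- so the last product lands on the 31/03/2010 balance row itself.
SELECT SRNO,
       MAX(TRANSACTIONDATE) TRANSACTIONDATE,
       MAX(NODAYS) NODAYS,
       TO_CHAR(NVL(ROUND(MAX(PRODUCTS)
                         * (SELECT DISTINCT INTRATE
                              FROM HEADER_M
                             WHERE SL_LOAN_AC = 'JA0904221')
                         / (LEAPYEAR(TO_DATE('31/03/2010', 'DD/MM/YYYY')) * 100), 0), 0),
               'FM99,99,99,99,99,990.009') PRODUCTS
  FROM LEDGER_TMP
 GROUP BY SRNO
 ORDER BY MAX(TRANSACTIONDATE), SRNO;

An explicit ORDER BY is worth adding either way - without one, the 02/12/2009 rows can come back in any order, as they do in the output above.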

Similar Messages

  • Oracle BI Vs Siebel Analytics 7.8.x in Query Generation.

    Hi All,
    In Siebel Analytics 7.8.x, if you have 10 columns in the Criteria and you use only two of those columns in the Pivot Table view, the query is generated using only the columns used in the Pivot view. Whereas in Oracle BI, it uses all 10 columns in the Criteria.
    Is this a bug? Or any instanceconfig.xml parameter exists to control this behavior?
    Our aim is to have different views (each view using 2 or 3 columns) and to use a View Selector.
    This is causing a major issue in our project. Please let me know if anyone has faced this issue and how to resolve it.
    Thank you.
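
    For illustration of the difference (columns here are hypothetical): say the Criteria holds REGION, PRODUCT, CHANNEL, and YEAR plus a REVENUE measure, but the pivot view displays only REGION against REVENUE. The described behavior amounts to:

    -- Siebel Analytics 7.8.x (only the columns the pivot view uses):
    SELECT REGION, SUM(REVENUE) FROM SALES_FACT GROUP BY REGION;

    -- Oracle BI (all criteria columns, used in the view or not):
    SELECT REGION, PRODUCT, CHANNEL, YEAR, SUM(REVENUE)
      FROM SALES_FACT
     GROUP BY REGION, PRODUCT, CHANNEL, YEAR;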

    Yes, you can keep your apps at 7.8 but use OBI EE 10g. For OBI EE, the DWH created through the BI Apps is just a data source.
    One thing though: in order to be able to make full use of the 10g functionalities regarding time series functions (todate and ago), you will need to fit your W_DAY_D dimension to the 7.9 standard. For that, please open a new post in the Apps forum, since this discussion doesn't belong here. Business Intelligence Applications
    I'll post my documentation on the upgrade steps there then.
    As for upgrading Infa, this is a bit more tricky. You need to make sure that DAC 7.8 can still fire all its commands at an 8.1.1 instance of Infa. (Haven't tried that one yet.)
    Cheers,
    C.

  • BFC Extended Analytics or Cube Designer

    Hello
    We're building a business case on the benefits of using Extended Analytics over the reporting capabilities of BFC itself. Does anybody have a high-level document or presentation outlining the functionalities of the application that he/she would like to share?
    Thanks beforehand.
    Ashwin

    Hi Deepesh,
    The same applies to Star Schemas, the main difference being that the star schema itself (the fact and dimension tables) is also stored in the relational DB.
    1. Yes, the member will be present, but I believe it will appear as unselected (unless of course you have selected your members by some filter or characteristic and this new value is within that selection)
    2. This is a tricky one.
    If you introduce a new dimension (which can only be a user-defined dimension and therefore you need dimensional analysis) it depends what was done with 'old' data.
    In Designer, the new dimension will be visible, but if you are using multi measures, you would need to select this new dimension in the "Dimensional Analysis" tab.
    You would now be in a situation where you have a new measure, and as a first step you would need to deploy. Now, if in BFC you introduced this new dimension in (for example) 2014.04, all your data prior to this period would not have any values for this DA.
    In the cube this means that if you selected this new measure, you would not see any data for 2014.03, 2014.02 etc.
    This means the end-user needs to select both measures in the report, and this can be quite confusing. You can use 'single measure' mode to avoid this.
    3. If you want to use this new characteristic as a hierarchy, you would need to select it in the hierarchy tab and re-deploy
    Thanks
    Marc

  • SharePoint Foundation Site Web Analytics reports

    Hi,
    I am currently attempting to get the Site Web Analytics reports to work in SharePoint Foundation.  However, there seems to be little available information regarding this.  What is necessary for this to display a report?  To start with, what
    SharePoint & Windows services and SharePoint service applications need to be started?  We use SQL Authentication for the content database because all the databases are in a different domain.
    Any light someone can shed on this would be helpful.
    Currently, when I navigate to the page it states "A web analytics report is not available for this site.  Usage processing may be disabled on this server or the usage data for this site has not been processed yet."  I have looked up this
    information on the net, but nearly all information applies to SharePoint Server.  I need help setting this up with SharePoint Foundation.
    Thank you much.

    Hi Steven,
    Ahhh, I had a feeling that was a problem.
    I had similar issues after installing SP1 so I could have some of the new features, like the storage report back for sites, and deleted sites going into the recycle bin. However, I had read that I *had* to install the June 30 CU as well, which may have been
    a mistake. Because after installing both and running a psconfigui, it made things as flaky as they were during the beta.
    It took my good, stable, trustworthy and solid SPF install and made it unstable. Managed Accounts started acting weird, permissions were odd, BDC service crapped out, user code errors, the works. Bummed me out. I basically blew out the server, did a fresh
    install and restored from a farm backup (actually a full then differential, gotta love scheduling backups). Mind you, I could afford to do that, and I simply don't mess around with bad updates.
    (I also found a post that said that if you have SP1 installed, a common error is if you delete a list, then delete the site, you can restore the site from recycle bin, but the list that was deleted from that site earlier cannot be restored-- so to avoid/fix
    that problem, install the August CU... it's always something...)
    Do you have backups of your farm pre-updates, preferably both, but at least the CU? Maybe you can do a restore on a virtual machine to see if it fixes your issues?
    Also, here's another question. After you installed SP1 and the August CU, did you do a psconfigUI.exe (or at the command line, psconfig.exe)? It might be good and bad that you didn't (if you didn't). The way to check to see if you didn't is check the database
    statuses (review database status under Upgrade and Patch Management in Central Admin). The statuses will say that the databases can be/should be updated if you didn't run psconfig. That means that SharePoint has updated, but the databases in SQL haven't.
    Maybe you can make a copy of those databases, and try to attach those databases to a pre-update (sp1 and CU) SharePoint Foundation server to see if web analytics work there again. Although, if I am understanding correctly, some changes can occur on databases
    during updates, but they're not complete til psconfig is run. But hey, it's worth a shot.
    Or you can use psconfigui.exe (or psconfig.exe if you want to use the command line) to update the databases and see if that fixes the problem.
    I'm sorry again about your issues if they are update related. It is so frustrating to think you are trying to stay up to date on your server, being a good admin, and instead it breaks important things without apology or a way to fix it (since cumulative updates
    cannot be uninstalled, and it is really difficult and potentially expensive to have two fully identical farms running side by side for testing).
    CA Callahan | Author: Mastering Windows SharePoint Services 3.0 and Mastering Microsoft SharePoint Foundation 2010 | Community Launch Leader |

  • Project Analytics 7.9.6.1 - Error while running a full load

    Hi All,
    I am performing a full load for Projects Analytics and get the following error,
    =====================================
    ERROR OUTPUT
    =====================================
    1103 SEVERE Wed Nov 18 02:49:36 WST 2009 Could not attach to workflow because of errorCode 36331 For workflow SDE_ORA_CodeDimension_Gl_Account
    1104 SEVERE Wed Nov 18 02:49:36 WST 2009
    ANOMALY INFO::: Error while executing : INFORMATICA TASK:SDE_ORA11510_Adaptor:SDE_ORA_CodeDimension_Gl_Account:(Source : FULL Target : FULL)
    MESSAGE:::
    Irrecoverable Error
    Error while contacting Informatica server for getting workflow status for SDE_ORA_CodeDimension_Gl_Account
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :
    The session log initialises a NULL value for mapping parameter MPLT_ADI_CODES.$$CATEGORY. This is then used in subsequent SQL and results in an ORA-00936: missing expression error. Following are the initialization section and the load section containing the error in the log.
    Initialisation
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_11_5_10] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [ORA_11_5_10.DATAWAREHOUSE.SDE_ORA11510_Adaptor.SDE_ORA_CodeDimension_Gl_Account_Segments.log] for session parameter:[$PMSessionLogFile].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[MPLT_ADI_CODES.$$CATEGORY].
    DIRECTOR> VAR_27028 Use override value [4] for mapping parameter:[MPLT_SA_ORA_CODES.$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27028 Use override value [DEFAULT] for mapping parameter:[MPLT_SA_ORA_CODES.$$TENANT_ID].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_CodeDimension_Gl_Account_Segments] at [Wed Nov 18 02:49:11 2009].
    DIRECTOR> TM_6683 Repository Name: [repo_service]
    DIRECTOR> TM_6684 Server Name: [int_service]
    DIRECTOR> TM_6686 Folder: [SDE_ORA11510_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_CodeDimension_Gl_Account_Segments] Run Instance Name: [] Run Id: [17]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_CodeDimension_GL_Account_Segments [version 1].
    DIRECTOR> TM_6963 Pre 85 Timestamp Compatibility is Enabled
    DIRECTOR> TM_6964 Date format for the Session is [MM/DD/YYYY HH24:MI:SS]
    DIRECTOR> TM_6827 [C:\Informatica\PowerCenter8.6.1\server\infa_shared\Storage] will be used as storage directory for session [SDE_ORA_CodeDimension_Gl_Account_Segments].
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6703 Session [SDE_ORA_CodeDimension_Gl_Account_Segments] is run by 32-bit Integration Service [node01_ASG596138], version [8.6.1], build [1218].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [UNICODE]
    MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6151 The session sort order is [Binary].
    MAPPING> TM_6185 Warning. Code page validation is disabled in this session.
    MAPPING> TM_6156 Using low precision processing.
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM error log disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> DBG_21075 Connecting to database [orcl], user [DAC_REP]
    MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_Master_Map] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_Master_Code] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_W_CODE_D] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_CodeDimension_Gl_Account_Segments]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Wed Nov 18 02:49:14 2009)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Wed Nov 18 02:49:14 2009)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 32000000 bytes and Block size is 128000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [asgdev], user [APPS]
    READER_1_1_1> BLKR_16051 Source database connection [ORA_11_5_10] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8147 Writer: Target is database [orcl], user [DAC_REP], bulk mode [OFF]
    WRITER_1_*_1> WRT_8221 Target database connection [DataWarehouse] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL INSERT statement:
    INSERT INTO W_CODE_D(DATASOURCE_NUM_ID,SOURCE_CODE,SOURCE_CODE_1,SOURCE_CODE_2,SOURCE_CODE_3,SOURCE_NAME_1,SOURCE_NAME_2,CATEGORY,LANGUAGE_CODE,MASTER_DATASOURCE_NUM_ID,MASTER_CODE,MASTER_VALUE,W_INSERT_DT,W_UPDATE_DT,TENANT_ID) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL UPDATE statement:
    UPDATE W_CODE_D SET SOURCE_CODE_1 = ?, SOURCE_CODE_2 = ?, SOURCE_CODE_3 = ?, SOURCE_NAME_1 = ?, SOURCE_NAME_2 = ?, MASTER_DATASOURCE_NUM_ID = ?, MASTER_CODE = ?, MASTER_VALUE = ?, W_INSERT_DT = ?, W_UPDATE_DT = ?, TENANT_ID = ? WHERE DATASOURCE_NUM_ID = ? AND SOURCE_CODE = ? AND CATEGORY = ? AND LANGUAGE_CODE = ?
    WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL DELETE statement:
    DELETE FROM W_CODE_D WHERE DATASOURCE_NUM_ID = ? AND SOURCE_CODE = ? AND CATEGORY = ? AND LANGUAGE_CODE = ?
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_CODE_D]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    READER_1_1_1> BLKR_16007 Reader run started.
    WRITER_1_*_1> WRT_8005 Writer run started.
    WRITER_1_*_1> WRT_8158
    Load section
    *****START LOAD SESSION*****
    Load Start Time: Wed Nov 18 02:49:16 2009
    Target tables:
    W_CODE_D
    READER_1_1_1> RR_4029 SQ Instance [mplt_BC_ORA_Codes_GL_Account_Segments.Sq_Fnd_Flex_Values] User specified SQL Query [SELECT
    FND_FLEX_VALUES.FLEX_VALUE_SET_ID,
    FND_FLEX_VALUES.FLEX_VALUE,
    MAX(FND_FLEX_VALUES_TL.DESCRIPTION),
    FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM,
    FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME
    FROM
    FND_FLEX_VALUES,
    FND_FLEX_VALUES_TL,
    FND_ID_FLEX_SEGMENTS,
    FND_SEGMENT_ATTRIBUTE_VALUES
    WHERE
    FND_FLEX_VALUES.FLEX_VALUE_ID = FND_FLEX_VALUES_TL.FLEX_VALUE_ID AND FND_FLEX_VALUES_TL.LANGUAGE ='US' AND
    FND_ID_FLEX_SEGMENTS.FLEX_VALUE_SET_ID =FND_FLEX_VALUES.FLEX_VALUE_SET_ID AND
    FND_ID_FLEX_SEGMENTS.APPLICATION_ID = 101 AND
    FND_ID_FLEX_SEGMENTS.ID_FLEX_CODE ='GL#' AND
    FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM =FND_SEGMENT_ATTRIBUTE_VALUES.ID_FLEX_NUM AND
    FND_SEGMENT_ATTRIBUTE_VALUES.APPLICATION_ID =101 AND
    FND_SEGMENT_ATTRIBUTE_VALUES.ID_FLEX_CODE = 'GL#' AND
    FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME=FND_SEGMENT_ATTRIBUTE_VALUES.APPLICATION_COLUMN_NAME AND
    FND_SEGMENT_ATTRIBUTE_VALUES.ATTRIBUTE_VALUE ='Y'
    GROUP BY
    FND_FLEX_VALUES.FLEX_VALUE_SET_ID,
    FND_FLEX_VALUES.FLEX_VALUE,
    FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM,
    FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME]
    READER_1_1_1> RR_4049 SQL Query issued to database : (Wed Nov 18 02:49:17 2009)
    READER_1_1_1> RR_4050 First row returned from database to reader : (Wed Nov 18 02:49:17 2009)
    LKPDP_3> DBG_21312 Lookup Transformation [mplt_ADI_Codes.Lkp_W_CODE_D]: Lookup override sql to create cache: SELECT W_CODE_D.SOURCE_NAME_1 AS SOURCE_NAME_1, W_CODE_D.SOURCE_NAME_2 AS SOURCE_NAME_2, W_CODE_D.MASTER_DATASOURCE_NUM_ID AS MASTER_DATASOURCE_NUM_ID, W_CODE_D.MASTER_CODE AS MASTER_CODE, W_CODE_D.MASTER_VALUE AS MASTER_VALUE, W_CODE_D.W_INSERT_DT AS W_INSERT_DT, W_CODE_D.TENANT_ID AS TENANT_ID, W_CODE_D.DATASOURCE_NUM_ID AS DATASOURCE_NUM_ID, W_CODE_D.SOURCE_CODE AS SOURCE_CODE, W_CODE_D.CATEGORY AS CATEGORY, W_CODE_D.LANGUAGE_CODE AS LANGUAGE_CODE FROM W_CODE_D
    WHERE
    W_CODE_D.CATEGORY IN () ORDER BY DATASOURCE_NUM_ID,SOURCE_CODE,CATEGORY,LANGUAGE_CODE,SOURCE_NAME_1,SOURCE_NAME_2,MASTER_DATASOURCE_NUM_ID,MASTER_CODE,MASTER_VALUE,W_INSERT_DT,TENANT_ID
    LKPDP_3> TE_7212 Increasing [Index Cache] size for transformation [mplt_ADI_Codes.Lkp_W_CODE_D] from [1000000] to [4734976].
    LKPDP_3> TE_7212 Increasing [Data Cache] size for transformation [mplt_ADI_Codes.Lkp_W_CODE_D] from [2000000] to [2007040].
    READER_1_1_1> BLKR_16019 Read [625] rows, read [0] error rows for source table [FND_ID_FLEX_SEGMENTS] instance name [mplt_BC_ORA_Codes_GL_Account_Segments.FND_ID_FLEX_SEGMENTS]
    READER_1_1_1> BLKR_16008 Reader run completed.
    LKPDP_3> TM_6660 Total Buffer Pool size is 609824 bytes and Block size is 65536 bytes.
    LKPDP_3:READER_1_1> DBG_21438 Reader: Source is [orcl], user [DAC_REP]
    LKPDP_3:READER_1_1> BLKR_16051 Source database connection [DataWarehouse] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    LKPDP_3:READER_1_1> BLKR_16003 Initialization completed successfully.
    LKPDP_3:READER_1_1> BLKR_16007 Reader run started.
    LKPDP_3:READER_1_1> RR_4049 SQL Query issued to database : (Wed Nov 18 02:49:18 2009)
    LKPDP_3:READER_1_1> CMN_1761 Timestamp Event: [Wed Nov 18 02:49:18 2009]
    LKPDP_3:READER_1_1> RR_4035 SQL Error [
    ORA-00936: missing expression
    Could you please suggest what the issue might be and how it can be fixed?
    Many thanks,
    Kiran
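
    (Side note on the error itself: ORA-00936 here comes from the unresolved $$CATEGORY parameter leaving an empty IN list in the generated lookup SQL - "W_CODE_D.CATEGORY IN ()" is not valid Oracle syntax. A minimal illustration, with a hypothetical category value:)

    -- As generated with $$CATEGORY empty: raises ORA-00936 missing expression
    SELECT * FROM W_CODE_D WHERE CATEGORY IN ();

    -- With $$CATEGORY populated (the value below is hypothetical): valid
    SELECT * FROM W_CODE_D WHERE CATEGORY IN ('GL_ACCOUNT');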

    I have continued the related details in the following thread:
    Mapping Parameter  $$CATEGORY not included in the parameter file (7.9.6.1)
    Apologies for the inconvenience.
    Thanks,
    Kiran

  • Error while using between operator with sql stmts in obiee 11g analytics

    Hi All,
    When I try to use the BETWEEN operator with two SELECT queries in OBIEE 11g Analytics, I'm getting the below error:
    Error Codes: YQCO4T56:OPR4ONWY:U9IM8TAC:OI2DL65P
    Location: saw.views.evc.activate, saw.httpserver.processrequest, saw.rpc.server.responder, saw.rpc.server, saw.rpc.server.handleConnection, saw.rpc.server.dispatch, saw.threadpool.socketrpcserver, saw.threads
    Odbc driver returned an error (SQLExecDirectW).
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 27002] Near <select>: Syntax error [nQSError: 26012] . (HY000)
    Can anyone help me out in resolving this issue?

    Hi All,
    Thank you all for your replies, but I didn't find the exact solution for what I'm searching for.
    If I use the condition as
    "WHERE "Workforce Budget"."Used Budget Amount" BETWEEN MAX("Workforce Budget"."Total Eligible Salaries") AND MAX("Workforce Budget"."Published Worksheet Budget Amount")",
    all the data gets grouped by the two columns which I'm using in the condition.
    My actual requirement with this query is to get the required date from a table to generate the report as either a daily, weekly, or monthly report. If I use repository variables, the variables are not getting refreshed until I regenerate the server (which I should not do in my project). Hence I have created a table to hold the weekly start and end dates and the monthly start and end dates, to pass the value to the actual report using the BETWEEN operator.
    Please could anyone help me on this? My release date is fast approaching.
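
    One way to sketch that pattern (table and column names here are hypothetical, not from the poster's repository): keep the period boundaries in a small control table and filter with scalar subqueries, so no repository variable is involved:

    -- Hypothetical control table: one row per period type with its current window
    -- REPORT_PERIODS(PERIOD_TYPE, START_DATE, END_DATE)
    SELECT f.*
      FROM BUDGET_FACT f
     WHERE f.TXN_DATE BETWEEN (SELECT p.START_DATE
                                 FROM REPORT_PERIODS p
                                WHERE p.PERIOD_TYPE = 'WEEKLY')
                          AND (SELECT p.END_DATE
                                 FROM REPORT_PERIODS p
                                WHERE p.PERIOD_TYPE = 'WEEKLY');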

  • Report Developer Control Of Applying Hints to Analytics Queries

    There are numerous ways to apply hints to the queries generated by Analytics:
    - Table level
    - Join level
    - Evaluate calculation
    Each has its advantages and drawbacks.
    - Table level: applies the hint to every query that references the table.
    - Join level: applies the hint whenever the join is used in the query.
    - Evaluate: allows the report developer to include a hint, but can't control where Analytics decides to apply the hint.
    I propose another method for the report developer to apply hints, when needed, that uses join level hints. All the report developer
    does is add the hint column to the Answer or add a filter based on the hint column to the Answer to apply the hint.
    Setup
    NOTE: I suggest you do consistency checks along the way, especially before starting work in the next Layer, to be sure that all setup errors are resolved early.
    1) Start by defining a Logical SQL table in the Physical Layer using the following SQL: Select 1 Hint from dual
    2) Alias this table for each hint to be defined for report developer usage. As an example, alias the hint table, creating
    No Star and Parallel alias tables.
    3) Join each alias to the physical layer fact tables, using a complex join, where the hint could be applied. In the Join definition screen, put the hint in the Hint field and enter 1=1
    in the Expression section. Yes, we are creating a cartesian join between the hint table and the other joining table. As the hint table always returns one and only one row, there
    is no effect on the rows returned by the query. For No Star, you
    put NO_STAR_TRANSFORMATION in the Hint field. For Parallel, you put PARALLEL(<physical table name>, default, default), where the physical table name
    is the name of the actual database table, not the name of the alias table (Analytics will put the alias in the place of the database table name
    when it generates the SQL). Additionally, for hints that have no parameters, you only need to join the hint table
    to the fact tables in a query, and not necessarily to the dimensions. If you include fields from multiple fact tables, the hint will be applied
    for each fact table. So, you may see the hint multiple times in the SQL (something like SELECT /*+ NO_STAR_TRANSFORMATION NO_STAR_TRANSFORMATION */ t00001.col1...)
    4) Add the hint alias tables to the BMM Layer.
    5) Rename the Hint field in each of the BMM hint tables to identify the hint being applied. For No Star, change the column name from Hint to No Star Hint. For Parallel,
    change the column name from Hint to Parallel Hint.
    6) Set the hint column as a key.
    7) Join the BMM hint tables to the appropriate fact tables, using a complex join.
    8) Define each hint table as a dimension.
    9) Set the Logical Level in the Content tab in each of the sources of the joined tables to use the Detail of the hint dimension.
    10) Create a folder in the Presentation Layer called Hints
    11) Place each BMM hint field into the Presentation Layer Hints folder.
    To apply a hint to your Answer, either add the Hint field to your Answer or create a filter where the Hint field is equal to/is in 1 (the number one). To check that the generated SQL
    contains the hint, go in Answers into Administration, Session Manager, and view the log for the user (the user's log level will need to have been set to 7 to see the generated SQL).
    Use of hints in more complex setups can be done by performing a setup of the hints that parallels the fact table setup. As an example, if you specify fragmentation content and a where
    clause in your BMM for your fact tables, you would set up parallel physical layer hint tables and joins, BMM objects and joins, fragmentation content, and where clauses based on the
    hint tables (only hint tables are referenced in the where clause).
    As any database person knows, hints can either help or degrade the performance of queries. So, taking the SQL of the pre-hint Answer and figuring out which hints give the best
    performance is suggested, prior to adding the hint fields/filters to the Answer.
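
    To make the moving parts concrete, a rough sketch in SQL terms (all names hypothetical; in practice the join is defined in the repository, not hand-written): the hint "table" from step 1 is just a one-row query, and once the 1=1 complex join brings it into an Answer, the generated SQL takes roughly this shape:

    -- Step 1's physical "hint table": always exactly one row
    SELECT 1 HINT FROM DUAL;

    -- Hypothetical shape of the SQL Analytics generates once the hint join is used
    -- (T12345/T99999 are made-up aliases; the cartesian 1=1 join adds no rows):
    SELECT /*+ NO_STAR_TRANSFORMATION */ T12345.REGION, SUM(T12345.REVENUE)
      FROM SALES_FACT T12345,
           (SELECT 1 HINT FROM DUAL) T99999
     WHERE 1 = 1
     GROUP BY T12345.REGION;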

    Hi Oliver,
    I would suggest you have a look at the below WLST script, which would give you the required report of the active threads, and it would send an email too.
    Topic: Sending Email Alert for Threads Pool Health Using WLST
    http://middlewaremagic.com/weblogic/?p=5433
    Topic: Sending Email Alert for Hogger Threads Count Using WLST
    http://middlewaremagic.com/weblogic/?p=5423
    Also, you can use the below script in case of stuck threads; this script would send you an email with the thread dumps from when the issue occurred.
    Topic: Sending Email Alert For Stuck Threads With Thread Dumps
    http://middlewaremagic.com/weblogic/?p=5582
    Regards,
    Ravish Mody

  • Analytics Load Times

    I have extracted and created the analytics data using OrgChart 2.1 staged, but when I log in with a manager or executive user and go to view the analytics or demographics in the details panel, the analytics take a very long time to load; sometimes it seems like it requires me to swap to a different org unit and back to the original before it loads. The analytics never load, and after a while, the Loading icon swaps to "Not Applicable". I changed the logging level to 1 in the Settings.xml, and the error I get in the log file is this:
    INFO Nakisa.SAP.Custom.OTFProcessor_CacheList.logCommand() : WhereClause {(OrgUnitId = '50105367'})
    INFO BAPI_SAP_OTFProcessor_CahceList.CacheDataSetGet() : Dataset not found in cache. Key : SAPOTFProcessorSessionKey ###2342abcd##
    INFO Nakisa.SAP.Custom.OTFProcessor_CacheList.logCommand() : WhereClause {(OrgUnitId = '50105367'})
    WARNING Detail - CurrentAgeLinkedDetails : FetchData : Could not fetch data for details
    WARNING Details - CurrentAgeTableDetails : Render : Failed because no data is available
    I have checked the views in the database directly, and they are populated with data, so I'm not sure what is causing these errors. Any ideas?

    Hi Luke,
    Are all of those transactions near each other in time? What I'm wondering is whether the application is trying to read the analytics from SAP instead of your database. In the data elements in SAP, you need to define whether the Live or Staged connection is being used. I'm not sure if there is a global parameter in the Admin Console that defines this, generally because I only configure using the XML files etc.
    Luke

  • Portal events are not getting loaded into the Analytics database tables

    Analytics database ASFACT tables (ASFACT_PAGEVIEWS,ASFACT_PORLETVIEW) are not getting populated with data.
    Possible diagnosis/workarounds tried:
    -Checked the analytics configuration in configuration manager, Enable Analytics Communication option checked
    -Registered Portal Events during analytics installation
    -Verified that UDP events are sent out from the portal: Test: OK
    -Reinstalled Interaction analytics component
    Any inputs highly appreciated.
    Cheers,
    Sandeep
    In collector.log, found the exception:
    08 Jul 2010 07:12:54,613 ERROR PageViewHandler - could not retrieve user: com.plumtree.analytics.collector.exception.DimensionManagerException: Could not insert dimension in the database
    com.plumtree.analytics.collector.exception.DimensionManagerException: Could not insert dimension in the database
    at com.plumtree.analytics.collector.cache.DimensionManager.insertDB(DimensionManager.java:271)
    at com.plumtree.analytics.collector.cache.DimensionManager.manageDBImage(DimensionManager.java:139)
    at com.plumtree.analytics.collector.cache.DimensionManager.handleNewDimension(DimensionManager.java:85)
    at com.plumtree.analytics.collector.eventhandler.BaseEventHandler.insertDimension(BaseEventHandler.java:63)
    at com.plumtree.analytics.collector.eventhandler.BaseEventHandler.getUser(BaseEventHandler.java:198)
    at com.plumtree.analytics.collector.eventhandler.PageViewHandler.handle(PageViewHandler.java:71)
    at com.plumtree.analytics.collector.DataResolver.handleEvent(DataResolver.java:165)
    at com.plumtree.analytics.collector.DataResolver.run(DataResolver.java:126)
    Caused by: org.hibernate.MappingException: Unknown entity: com.plumtree.analytics.core.persist.BaseCustomEventDimension$$BeanGeneratorByCGLIB$$6a0493c4
    at org.hibernate.impl.SessionFactoryImpl.getEntityPersister(SessionFactoryImpl.java:569)
    at org.hibernate.impl.SessionImpl.getEntityPersister(SessionImpl.java:1086)
    at org.hibernate.event.def.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:83)
    at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.saveWithGeneratedOrRequestedId(DefaultSaveOrUpdateEventListener.java:184)
    at org.hibernate.event.def.DefaultSaveEventListener.saveWithGeneratedOrRequestedId(DefaultSaveEventListener.java:33)
    at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.entityIsTransient(DefaultSaveOrUpdateEventListener.java:173)
    at org.hibernate.event.def.DefaultSaveEventListener.performSaveOrUpdate(DefaultSaveEventListener.java:27)
    at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.onSaveOrUpdate(DefaultSaveOrUpdateEventListener.java:69)
    at org.hibernate.impl.SessionImpl.save(SessionImpl.java:481)
    at org.hibernate.impl.SessionImpl.save(SessionImpl.java:476)
    at com.plumtree.analytics.collector.cache.DimensionManager.insertDB(DimensionManager.java:266)
    ... 7 more
    In analyticsui.log, found the exception below:
    08 Jul 2010 06:50:25,910 ERROR Configuration - Could not compile the mapping document
    org.hibernate.MappingException: duplicate import: com.plumtree.analytics.core.persist.BaseCustomEventFact$$BeanGeneratorByCGLIB$$6a896b0d
    at org.hibernate.cfg.Mappings.addImport(Mappings.java:105)
    at org.hibernate.cfg.HbmBinder.bindPersistentClassCommonValues(HbmBinder.java:541)
    at org.hibernate.cfg.HbmBinder.bindClass(HbmBinder.java:488)
    at org.hibernate.cfg.HbmBinder.bindRootClass(HbmBinder.java:234)
    at org.hibernate.cfg.HbmBinder.bindRoot(HbmBinder.java:152)
    at org.hibernate.cfg.Configuration.add(Configuration.java:362)
    at org.hibernate.cfg.Configuration.addXML(Configuration.java:317)
    at com.plumtree.analytics.core.HibernateUtil.loadEventMappings(HibernateUtil.java:796)
    at com.plumtree.analytics.core.HibernateUtil.loadEventMappings(HibernateUtil.java:652)
    at com.plumtree.analytics.core.HibernateUtil.refreshCustomEvents(HibernateUtil.java:496)
    at com.plumtree.analytics.ui.common.AnalyticsInitServlet.init(AnalyticsInitServlet.java:104)
    at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1161)
    at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:981)
    at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4045)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4351)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:525)
    at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:920)
    at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:883)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:492)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1138)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:311)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:719)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
    at org.apache.catalina.core.StandardService.start(StandardService.java:516)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:566)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at com.plumtree.container.Bootstrap.start(Bootstrap.java:531)
    at com.plumtree.container.Bootstrap.main(Bootstrap.java:254)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at org.tanukisoftware.wrapper.WrapperStartStopApp.run(WrapperStartStopApp.java:238)
    at java.lang.Thread.run(Thread.java:595)
    08 Jul 2010 06:50:25,915 ERROR Configuration - Could not configure datastore from XML
    org.hibernate.MappingException: duplicate import: com.plumtree.analytics.core.persist.BaseCustomEventFact$$BeanGeneratorByCGLIB$$6a896b0d
    at org.hibernate.cfg.Mappings.addImport(Mappings.java:105)
    at org.hibernate.cfg.HbmBinder.bindPersistentClassCommonValues(HbmBinder.java:541)
    at org.hibernate.cfg.HbmBinder.bindClass(HbmBinder.java:488)
    at org.hibernate.cfg.HbmBinder.bindRootClass(HbmBinder.java:234)
    at org.hibernate.cfg.HbmBinder.bindRoot(HbmBinder.java:152)
    at org.hibernate.cfg.Configuration.add(Configuration.java:362)
    at org.hibernate.cfg.Configuration.addXML(Configuration.java:317)
    at com.plumtree.analytics.core.HibernateUtil.loadEventMappings(HibernateUtil.java:796)
    at com.plumtree.analytics.core.HibernateUtil.loadEventMappings(HibernateUtil.java:652)
    at com.plumtree.analytics.core.HibernateUtil.refreshCustomEvents(HibernateUtil.java:496)
    at com.plumtree.analytics.ui.common.AnalyticsInitServlet.init(AnalyticsInitServlet.java:104)
    at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1161)
    at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:981)
    at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4045)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4351)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:525)
    at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:920)
    at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:883)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:492)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1138)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:311)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:719)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
    at org.apache.catalina.core.StandardService.start(StandardService.java:516)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:566)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at com.plumtree.container.Bootstrap.start(Bootstrap.java:531)
    at com.plumtree.container.Bootstrap.main(Bootstrap.java:254)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at org.tanukisoftware.wrapper.WrapperStartStopApp.run(WrapperStartStopApp.java:238)
    at java.lang.Thread.run(Thread.java:595)
    wrapper_collector.log
    INFO | jvm 1 | 2009/11/10 17:25:22 | at com.plumtree.analytics.collector.eventhandler.PortletViewHandler.handle(PortletViewHandler.java:46)
    INFO | jvm 1 | 2009/11/10 17:25:22 | at com.plumtree.analytics.collector.DataResolver.handleEvent(DataResolver.java:165)
    INFO | jvm 1 | 2009/11/10 17:25:22 | at com.plumtree.analytics.collector.DataResolver.run(DataResolver.java:126)
    INFO | jvm 1 | 2009/11/10 17:25:22 | Caused by: java.sql.SQLException: [plumtree][Oracle JDBC Driver][Oracle]ORA-00001: unique constraint (ANALYTICSDBUSER.IX_USERBYUSERID) violated
    INFO | jvm 1 | 2009/11/10 17:25:22 |
    INFO | jvm 1 | 2009/11/10 17:25:22 | at com.plumtree.jdbc.base.BaseExceptions.createException(Unknown Source)

    Key words from the error messages suggest that a reinstallation of Analytics is needed to resolve this. The Analytics database is failing to get updated with the correct event mapping, and this is why no data is being inserted:
    "Could not insert dimension in the database"
    "ERROR Configuration - Could not configure datastore from XML
    org.hibernate.MappingException: duplicate import: com.plumtree.analytics.core.persist.BaseCustomEventFact$$BeanGeneratorByCGLIB$$6a896b0d"
    "ORA-00001: unique constraint (ANALYTICSDBUSER.IX_USERBYUSERID) violated"
    "ERROR Configuration - Could not compile the mapping document"

  • Google Analytics OPT-OUT Browser Add-on 0.9.6 doesn't retain OPT-OUT settings...never did really. Also JAVA now refuses to create updates for WinXP/FireFox.

    I use PaleMoon now, trying to get rid of advertising sharks. I still have WinXP - I need an assistive-technology computer for pain, so I'm taking time on getting a new computer.
    I have been severely sabotaged by advertising conglomerates selling my private information. They have no right - yet they do whatever they want. Anyhow... I've tried to OPT-OUT of EVERYTHING POSSIBLE. I've tried the 5 places I found. Google, NAI, DAA, & FIREFOX HELP said to go to about:config, where I was instructed to turn "privacy.tracking.protection.enabled" from false to true. But I couldn't keep this setting, because it also has to be set to false to use any saved log-ons to my accounts. And the add-on AD BLOCK PLUS 2.6.7 had some ticks when I was on Firefox, and now it is disabled because it won't work with PaleMoon 25.2.1.
    This case is about GOOGLE OPT-OUT. Starting here: http://www.google.com/settings/ads?hl=en&sig=ACi0TCgWymel0CNWxdcnEWPzWNs9wDd7cVQHxHsmtz4w30CAyHk7MqyzzZdGh3m6FpDaJsKmunTLEvJZY5LAm3h6gIdzE30L-Q
    There are 2 columns. The left one, "ADS ON GOOGLE", NEVER RETAINS AFTER LOG-OFF. The right column, "Google ads across the web", seems to retain sometimes - if I remember right it falls off after a period of time, as opposed to after just LOGGING-OFF. Below the columns there are options, but I don't have a GOOGLE ACCOUNT.
    Also down there is this link: https://support.google.com/adsense/troubleshooter/1631343?hl=en ...you see the box with this: Advertising industry program participation
    Google is a participating member of groups that have developed industry privacy standards like the Ad-Choices icon for online advertising.
    Group Location
    Network Advertising Initiative United States
    Digital Advertising Alliance (DAA) United States
    European Digital Advertising Alliance Europe
    Digital Advertising Alliance of Canada Canada
    ....I clicked OPTED OUT FOR NAI on their link http://www.networkadvertising.org/
    ...I clicked OPTED OUT FOR DAA on their link http://www.aboutads.info/
    ...I clicked PROTECT MY CHOICES and downloaded the app.
    These 1st 2 links used to have the same trouble and didn't retain ANYTHING and wouldn't re OPT-OUT ANY UNTIL THEY WERE SATIATED - saying I didn't have 3rd party cookies allowed when I did - this took hours.
    I sent numerous trouble reports to them that they ignored - then I think I sent one to Firefox and it is much better but still not completely right. Today DAA retains 106 out of 122 and if I repeatedly RE-OPT-OUT successively I can get it to 121 out of 122 - never completely. (currently the one not taking is Accuen Inc. but it changes periodically - makes me think they set one up to take my info and then share it with all of them - lol )
    Same for NAI currently I started with 94 out of 96 - next re-OPTED-OUT got it to 95 out of 96 and no further and same co. Accuen Inc (accuenmedia.com ) wouldn't update.
    NOTE: I was copy/pasting the list of NON-OPT-OUT'S to them and my own document file to try and find trends/problems...now NAI programmers protected the list from being able to be COPIED AND PASTED!!! That's the opposite of being open and interested in making this work, right? Why would they want to allow us to OPT-OUT of their criminal espionage behavior...RIGHT? (lol) FYI: I recorded that the big one that was always retained before this one was...MEDIA MATH INC.
    MANY Opt-outs would drop off while on-line and ALL would drop-off once logged-off. Now it's retaining most once logged off - but it hasn't been long enough to tell if they'll fall back into their bad habits as they've done before...
    This has been going on forever for me - I've been reporting this malfunction since 2013.
    Of course I downloaded PROTECT MY CHOICES FOR FIREFOX via NAI AND DAA'S links ...http://www.aboutads.info/PMC#protect-your-choices-for-firefox ............they both go to the same place???.....PROTECT MY CHOICES http://www.aboutads.info/plugin/install/firefox....that seems odd...it's as if they are the same entity then. ABOUTADS is DAA'S WEBSITE. NAI'S IS NETWORKADVERTISING.ORG so why would NAI use DAA'S PROTECT MY CHOICES app?
    Lastly, I also requested that the COOKIE NAMES BE UNIFORMLY IDENTIFIABLE. All the OPT-OUT COOKIES that are to be forevermore retained on my computer, and that I am to trust EVERY FUNCTION OF - well, they all have different COOKIE NAMES!! From most of the names you cannot tell that they are OPT-OUT COOKIES!! - SO NOWWW they have created another problem.
    We can no longer just "DELETE ALL COOKIES".
    PLUS, we can no longer go through our cookies and delete individual ones that snuck through.
    Every one of the OPT-OUT COOKIES SHOULD SAY "OPT-OUT" WITHIN THE NAME OF THE COOKIE - PERIOD. RIGHT? (LOL) For real. Why was this mess allowed?
    REALLY IN MY OPINION THESE COMPANIES SHOULD BE BLOCKED AT THE LOCAL PHONE COMPANIES LEVEL - IN OUR SERVERS. We should not have to deal with them. Same thing with viruses, malware and such beasts. But I digress.
    In summary:
    1.)Google Analytics OPT-OUT Browser Add-on 0.9.6 doesn't retain OPT-OUT settings
    2.) JAVA refuses to create updates for WinXP/FireFox/PaleMoon.
    3.) DAA & NAI still don't retain ALL OPT-OUT settings and never completely OPT-OUT all companies.
    4.) OPT-OUT cookies should be uniformly named, with the words OPT-OUT in their titles.
    5.) Ad Block Plus 25.6.7 doesn't work for Pale Moon and there is no alternative - (didn't work great in FireFox)
    Right now I'm mainly interested in #1) retaining my GOOGLE OPT-OUTS while online AND while logged off, since it is attacking my computer and stealing all the speed until it freezes my keyboard.
    Currently I am trying to remember to run through 3. OPT-OUTS every time I log on.
    Thanks so much!

    Hello, PaleMoon's support forums are here: http://forum.palemoon.org/

  • No data in Business Process Analytics after setting up BPA and BPMon

    Hi,
    I have followed the BPA setup guide and done all the configuration for BPA. I have set up BPMon for a solution, and in the logical component node I have configured many application monitors.
    Also, in the "Load Master Data" node, the master data is loaded. But still, in BPA, when I select the time frame and apply it, I get a message that no data is applicable in the selected time frame.
    How can I get the data into BPA?

    Hi Vishal,
    Please see note 1435043 - E2E Workload Analysis - No applicable data found,
    especially section 3) EFWK Resource Manager Job, where you will find several
    hints for checking this job and the EFWK Administration.
    Also check note 894279 - Background processing in SAP Solution Manager.
    Please have a look at the following chapter of the BPA Setup
    guide to fully understand the data flow in this application:
    2 Architecture of Business Process Analytics
      > 2.1 Data Retrieval and Reporting
    Please also review the following chapter, to see if there are any errors
    extracting the data from the managed systems to SAP Solution Manager:
    3.2.3 Checking status of Solution Manager Diagnostics Extractor
    Framework (EFWK)
    Also check notes 1570282, 1643303 and 1602437.
    thanks
    Regards,
    Vikram

  • How do I Add Google's Universal Analytics to a Muse site

    I'm trying to add Google's new Universal Analytics code (.js code) to my muse site. I've added the js code into the Master page that every page uses but that doesn't seem to work. I've also tried adding it individually to each page (removing it from the master) but that also doesn't work. Google's tag manager says it is installed, but Analytics says it isn't.
    Does anyone know how I'd install it? Here is the website in question.
    Thanks in advance.

    Hi Tristen,
    Please refer to this similar thread to know more on this, Re: Where do I paste my tracking code from google analytics?
    - Abhishek Maurya

  • SharePoint Foundation 2013 Web Analytics

    Problem:
    I am using SharePoint Foundation 2013 (no patches) and trying to get Web Analytics working, but running into difficulty. The error I am getting is the dreaded, "A web analytics report
    is not available for this site. Usage processing may be disabled on this server or the usage data for this site has not been processed yet.", when I click on Site Settings -> Site Web analytics reports
    Items completed:
    Enabled Site Usage and data collection in Central Admin.
    Logs are full of data in the hive
    I can see the data in dbo.requestusage in the WSS logging database.
    A possible problem is that Web Analytics has been deprecated in SharePoint 2013; however, why would Microsoft keep the Web Analytics link if it was deprecated?
    Question:
    How can I get Web Analytics to show data, or has it been completely removed from SharePoint Foundation 2013 and the link is just there for frustration?
    Jay

    We’re having the same problem.
    According to the following links, “Usage Analytics” and “Usage Reporting and Logging” are
    available in SharePoint Foundation 2013:
    http://www.khamis.net/blog/Lists/Posts/Post.aspx?ID=96
    http://blog.blksthl.com/2013/01/14/sharepoint-2013-feature-comparison-chart-all-editions/
    Are they wrong?
    I don’t mean the search related Reports (Most Popular Items), I mean the very simple Reports like
    Number of Page Views, Number of Unique Visitors, Number of Referrers, Top Pages/Visitors/Referrers/Browsers (URL: /_layouts/15/usageDetails.aspx)
    SharePoint Hosting and Consulting - www.sharepoint-foundation-hosting.ch

  • SharePoint 2010 web analytics not working - not all web requests hitting the logging database

    Hi there, I have 2 windows 2008 web front end servers, 1 windows 2008 application server and 2 clustered windows 2008, SQL Server 2008 SP2 clustered servers in my farm. 
    I am having trouble getting Web Analytics configured; I have googled for days and followed everything I have found, but still no luck. The problem is that the logging database is only logging requests to the Central Admin URL; if I hit any other
    site in my farm, it doesn't get logged in the database. Hence I am not getting any reports in the front end.
    These are also missing from the ULS logs - only Central Admin entries appear in the logs.
    Anyone seen this before?
    Thanks

    Hi CharlieBoy,
    Please enable the verbose ULS log for the category "Web Analytics Services", then see if more related ULS log messages appear (you may need to wait a day).
    Also check the following 2 articles about troubleshooting the web analytics issue, which should be helpful.
    http://blogs.msdn.com/b/sharepoint_strategery/archive/2012/03/16/troubleshooting-sharepoint-2010-web-analytics.aspx
    http://blogs.technet.com/b/manharsharma/archive/2012/10/13/sharepoint-2010-web-analytics-troubleshooting-reporting-db.aspx
    Thanks
    Daniel Yang
    TechNet Community Support
    Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact
    [email protected]

  • SharePoint 2010 web analytics not working

    Hello guys - I have set up a SharePoint farm: 1 app server, 2 WFEs (Windows Server 2008 R2) and a SQL Server 2008 R2 DB. All service applications are functioning except Web Analytics. I have deleted and recreated it, but still to no effect. It is creating
    usage files.
    Interestingly, the Central Admin site collection web analytics are working, but when I check a web application site collection's web analytics report, I get the below message. Not sure what I am missing. Can anyone help me trace the issue?
    There is no data available for this report. Here are some possible reasons: (1) Web Analytics has not been enabled long enough to generate data; (2) There is insufficient data to generate this report; (3) Data logging required for this report might
    not be enabled; (4) Data aggregation might not be enabled at the level required for this report
    Appreciate your help.
    wm.

    Yes, they are also started.
    To make sure, I have unchecked and re-enabled usage data collection, restarted the above services, and also re-run the timer job "Web Analytics Trigger Workflows Timer Job". Nothing seems to help.
    I have set diagnostic logging for the Web Analytics Service to verbose for both event and trace, and observed the below:
    02/04/2011 14:30:00.19 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852e High Preparing to import '6' usage log files found with filter 'SHAREPOINTSER1-????????-?????.usage'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:01.13 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852n High Flushing usage entry cache to storage (count=1211). 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:01.13 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 85f9 High Calling ImportEntries method for usage definition 'Microsoft.SharePoint.Administration.SPTimerJobUsageDefinition' with
    '47' entries. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:02.36 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852u High Called ImportEntries method for usage definition 'Microsoft.SharePoint.Administration.SPTimerJobUsageDefinition'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:02.36 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 85f9 High Calling ImportEntries method for usage definition 'Microsoft.SharePoint.Administration.SPRequestUsageDefinition' with
    '1160' entries. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:02.91 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 884c High Calling 'UsageImported' method of Usage Receiver 'Microsoft.SharePoint.Administration.SPUsageReceiverDefinition' for
    Usage Definition 'Microsoft.SharePoint.Administration.SPRequestUsageDefinition'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:02.91 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 884d High Instantiating usage receiver 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:02.91 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 884k High Instantiated usage receiver 'Microsoft.Office.Server.WebAnalytics.UsageLogging.SPRequestUsageReceiver'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:02.91 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 884n High Executing receiver method 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:03.47 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 884o High Executed receiver method. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:03.47 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 884m High Called 'UsageImported' method of Usage Receiver 'Microsoft.SharePoint.Administration.SPUsageReceiverDefinition' for Usage
    Definition 'Microsoft.SharePoint.Administration.SPRequestUsageDefinition'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:03.47 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852u High Called ImportEntries method for usage definition 'Microsoft.SharePoint.Administration.SPRequestUsageDefinition'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:03.47 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 85f9 High Calling ImportEntries method for usage definition 'Microsoft.SharePoint.Administration.SPFeatureUsageDefinition' with
    '4' entries. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:03.56 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852u High Called ImportEntries method for usage definition 'Microsoft.SharePoint.Administration.SPFeatureUsageDefinition'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:03.56 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852v High Flushed usage entries cache to storage. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.02 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852n High Flushing usage entry cache to storage (count=432). 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.02 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 85f9 High Calling ImportEntries method for usage definition 'Microsoft.SharePoint.Administration.SPRequestUsageDefinition' with
    '259' entries. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.25 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 884c High Calling 'UsageImported' method of Usage Receiver 'Microsoft.SharePoint.Administration.SPUsageReceiverDefinition' for
    Usage Definition 'Microsoft.SharePoint.Administration.SPRequestUsageDefinition'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.25 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 884d High Instantiating usage receiver 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.25 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 884k High Instantiated usage receiver 'Microsoft.Office.Server.WebAnalytics.UsageLogging.SPRequestUsageReceiver'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.25 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 884n High Executing receiver method 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.38 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 884o High Executed receiver method. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.38 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 884m High Called 'UsageImported' method of Usage Receiver 'Microsoft.SharePoint.Administration.SPUsageReceiverDefinition' for Usage
    Definition 'Microsoft.SharePoint.Administration.SPRequestUsageDefinition'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.38 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852u High Called ImportEntries method for usage definition 'Microsoft.SharePoint.Administration.SPRequestUsageDefinition'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.38 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 85f9 High Calling ImportEntries method for usage definition 'Microsoft.SharePoint.Administration.SPTimerJobUsageDefinition' with
    '172' entries. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.47 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852u High Called ImportEntries method for usage definition 'Microsoft.SharePoint.Administration.SPTimerJobUsageDefinition'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.47 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 85f9 High Calling ImportEntries method for usage definition 'Microsoft.SharePoint.Administration.SPFeatureUsageDefinition' with
    '1' entries. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.53 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852u High Called ImportEntries method for usage definition 'Microsoft.SharePoint.Administration.SPFeatureUsageDefinition'. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.53 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852v High Flushed usage entries cache to storage. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.61 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852m High Deleting usage log file 'D:\SHRPT_UsageDataLogs\SHAREPOINTSER1-20110204-1356.usage' after data import. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.61 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852m High Deleting usage log file 'D:\SHRPT_UsageDataLogs\SHAREPOINTSER1-20110204-1401.usage' after data import. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.61 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852m High Deleting usage log file 'D:\SHRPT_UsageDataLogs\SHAREPOINTSER1-20110204-1406.usage' after data import. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.61 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852m High Deleting usage log file 'D:\SHRPT_UsageDataLogs\SHAREPOINTSER1-20110204-1411.usage' after data import. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.61 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852m High Deleting usage log file 'D:\SHRPT_UsageDataLogs\SHAREPOINTSER1-20110204-1416.usage' after data import. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:05.63 OWSTIMER.EXE (0x1840) 0x13A8 SharePoint Foundation Usage Infrastructure 852m High Deleting usage log file 'D:\SHRPT_UsageDataLogs\SHAREPOINTSER1-20110204-1421.usage' after data import. 03e6edfd-f852-4fa5-ae5a-04ae9bacc51c
    02/04/2011 14:30:06.25 w3wp.exe (0x1154) 0x1098 SharePoint Foundation Unified Logging Service b8fx High ULS Init Completed (w3wp.exe, onetnative.dll) 
    Sorry for dumping more logs... hoping this gives some insight into what is happening.
    Thanks.
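    A minimal sketch of the same checks from the SharePoint 2010 Management Shell may help here. The display-name wildcards below are assumptions -- timer job names vary slightly by build and language, so match them against the Get-SPTimerJob output on your own farm first.

    # Load the SharePoint snap-in if the shell was not started as the Management Shell
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    # Confirm the usage service is logging and where the .usage files are written
    Get-SPUsageService | Format-List LoggingEnabled, UsageLogDir

    # Locate the usage/analytics timer jobs and check when they last ran
    Get-SPTimerJob |
        Where-Object { $_.DisplayName -like "*Usage Data*" -or $_.DisplayName -like "*Web Analytics*" } |
        Select-Object DisplayName, LastRunTime

    # Kick off the import and processing jobs immediately instead of waiting for the schedule
    Get-SPTimerJob | Where-Object { $_.DisplayName -like "*Usage Data Import*" } | Start-SPTimerJob
    Get-SPTimerJob | Where-Object { $_.DisplayName -like "*Usage Data Processing*" } | Start-SPTimerJob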
