Why can't I enable capture of statistics performance data?

Hi,
Following some documents suggested here, I tried to enable capture of statistics performance data.
This is what I did:
Transaction RSA1
Tools (from the menu)
Settings for BI Statistics
Then I selected the InfoProvider tab; under the "Statistics" column, it looked like:
Default Value    X
Cube1               D
Cube2               D
Cube3               D
I tried to change the “D”s to X, and the options were:
X   – On
D   – Default
(blank) – Off
But when I switch a D to X, the save "check mark" remains gray, and I am unable to turn on the statistics for the cubes Cube1, Cube2, Cube3, etc.
What could be the reason why I can't seem to turn on the statistics in our BI7 development and quality environments?
Thanks

Hi,
Refer to:
How to Set up BW Statistics
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/5401ab90-0201-0010-b394-99ffdb15235b
Activating Data Transfers for BW Statistics 
http://help.sap.com/saphelp_nw04s/helpdata/en/e5/2d1e3bd129be04e10000000a114084/frameset.htm
Installing BW Statistics
http://help.sap.com/saphelp_nw04/helpdata/en/8c/131e3b9f10b904e10000000a114084/frameset.htm
Monitoring BI Statistics
http://help.sap.com/saphelp_nw70/helpdata/en/46/f9bd5b0d40537de10000000a1553f6/frameset.htm
Hope this helps.
Thanks,
JituK

Similar Messages

  • When creating a tablespace, why should we enable LOGGING when a database is already in ARCHIVELOG mode?

    Question :
    When creating a tablespace, why should we enable LOGGING when a database is already in ARCHIVELOG mode?
    Example:
    Create Tablespace
    CREATE SMALLFILE TABLESPACE "TEST_DATA"
    LOGGING
    DATAFILE '+DG_TEST_DATA_01(DATAFILE)' SIZE 10G
    AUTOEXTEND ON NEXT  500K MAXSIZE 31000M
    EXTENT MANAGEMENT LOCAL
    SEGMENT SPACE MANAGEMENT AUTO;
    LOGGING: Generates redo logs for the creation of tables, indexes, and partitions, and for subsequent inserts (recoverable).
    Are they not logged and not recoverable if we do not enable LOGGING? What does ARCHIVELOG mode do?

    What does ARCHIVELOG mode do?
    Whenever your database is in archive log mode, Oracle backs up the redo log files as archives, so that the database can be recovered to a consistent state in case of any failure.
    Archive logging is essential for production databases, where the loss of a transaction might be fatal.
    Why LOGGING?
    Logging is the safest method to ensure that all changes made in the tablespace are captured in the redo logs and available for recovery. It is just a matter of the level at which logging is defined:
    Force Logging at the database level
    Logging at the tablespace level
    Logging at the schema object level
    Before the existence of FORCE LOGGING, Oracle provided the LOGGING and NOLOGGING options. These options have higher precedence at the schema object level than at the tablespace level; therefore, it was possible to override a LOGGING setting at the tablespace level with a NOLOGGING setting at the schema object level.
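    A minimal sketch of those three levels (the object and file names here are made up for illustration; under FORCE LOGGING the object-level setting is ignored):

    ```sql
    -- Database level: force redo generation everywhere, overriding object settings
    ALTER DATABASE FORCE LOGGING;

    -- Tablespace level: LOGGING becomes the default for objects created in it
    CREATE TABLESPACE demo_data DATAFILE '/u01/demo_data01.dbf' SIZE 100M LOGGING;

    -- Object level: NOLOGGING overrides the tablespace default
    -- (but has no effect while FORCE LOGGING is active)
    CREATE TABLE demo_stage (id NUMBER) TABLESPACE demo_data NOLOGGING;
    ```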

  • Why is FaceTime not enabled in the Kingdom of Saudi Arabia? Why is it banned?

    Hello
      Why is FaceTime not enabled in the Kingdom of Saudi Arabia? Why is it banned?
    I purchased an iPhone 4 and was surprised to find this important feature removed!
    Is there no equality between us and Britain, Canada, and some European countries?
    I paid about $750 to enjoy the full features of this product, only to find some features, such as FaceTime, blocked.
    We would be happy if the FaceTime service were activated in Saudi Arabia.
    Thanks a lot

    Because your government doesn't want citizens making free calls over WiFi. Complain to them, as they are the ones who requested FaceTime be removed.

  • Why is there no Capture Pilot equivalent for Lightroom?

    Why is there no Capture Pilot equivalent for Lightroom?

    Nothing like it that I know of
    But I just +1'd the idea here: Capture Pilot equivalent for LightRoom? on Adobe's feedback site, and suggest you do too!

  • Log files/troubleshooting performance data collection

    Hello: 
    Trying to use MAP 9.0, 
    When doing performance data collection, I am getting errors. Is there a log file or event log that captures why the errors are occurring?
    One posting said to look in bin\log, but there seems to be no log directory under BIN in this version.
    Thank you, 
    Mustafa Hamid, System Center Consultant

    Hi Mark,
    There's no CLEANER_ADJUST_UTILIZATION in EnvironmentConfig for BDB JE 5.0.43, which I'm currently using. I also tried
       envConfig.setConfigParam("je.cleaner.adjustUtilization",
              "false");
    but it fails to start up with the error below:
    Caused by: java.lang.IllegalArgumentException: je.cleaner.adjustUtilization is not a valid BDBJE environment parameter
        at com.sleepycat.je.dbi.DbConfigManager.setConfigParam(DbConfigManager.java:412) ~[je-5.0.43.jar:5.0.43]
        at com.sleepycat.je.EnvironmentConfig.setConfigParam(EnvironmentConfig.java:3153) ~[je-5.0.43.jar:5.0.43]

  • Web Dynpro Console Doesnt display any Performance Data

    Hello,
    I set the JARM parameter to 10 in the Studio, rebuilt and deployed the new archive, ran the application, and accessed a couple of functions.
    When I go to the Web Dynpro Console and try to display the performance data, I don't see anything under Applications or Web Dynpro Framework.
    I can't guess what went wrong. Does anybody know why it doesn't display the performance data?
    Thanks,
    Sunita.

    Hi Sunita,
                    use this method for getting the performance data:
    /people/bertram.ganz/blog/2006/11/21/analyzing-web-dynpro-java-application-performance-with-a-simple-url-parameter
    regards
    Sumit

  • Creating dimensions enables redundancy of data?

    hi all,
    Creating dimensions enables redundancy of data. Can anyone explain why and how redundancy is created when creating dimensions for an InfoCube?
    Can anyone explain it to me with a scenario?
    regds
    hari

    hi
    A dimension is a grouping of characteristics whose content is logically connected.
    The customer dimension could, for example, be made up of the customer number, the customer group and the levels of the customer hierarchy.
    The sales dimension could contain the characteristics ‘sales person’, ‘sales group’ and ‘sales office’.
    The maximum number of dimensions in BW is 16.
    Check http://help.sap.com/saphelp_nw04/helpdata/en/fb/184538650fb10de10000009b38f842/frameset.htm
    hope this helps

  • Capturing the details of data in which an error occurred during uploading

    Hi, can anyone tell me how to capture the details of the data in which an error occurred while uploading the file using BDC in background mode? Please do the needful.
    Thanks & Regards.
    Aniruddha

    hi,
    This declaration is used to capture the error messages; BDCMSGCOLL is the standard structure that captures them. I have given sample code below; please check it out.
    data: err type standard table of bdcmsgcoll with header line.
    SAmple code:
    report z_aru_bdc_new4
           no standard page heading line-size 255.
    include bdcrecx1.
    * Generated data section with specific formatting - DO NOT CHANGE ***
    parameters: p_file like rlgrap-filename obligatory.
    data: err type standard table of bdcmsgcoll with header line.
    data: mess(200) type c.
    data: begin of it_err occurs 0,
    msg(200) type c,
    end of it_err.
    data: begin of record occurs 0,
    * data element:
            viewname_001(030),
    * data element: VIM_LTD_NO
            ltd_dta_no_002(001),
    * data element: ZEMPID3
            zempid3_003(004),
    * data element: ZENAME3
            zename3_008(040),
    * data element: ZEDEPID
            zedepid_009(004),
    * data element:
            zsalkey_010(005),
    * data element:
            salary_011(013),
          end of record.
    * End generated data section ***
    at selection-screen on value-request for p_file.
    call function 'WS_FILENAME_GET'
    exporting
      DEF_FILENAME           = ' '
      DEF_PATH               = ' '
       mask                   = '.,..'
       mode                   = 'O'  " O -- open, S -- Save.
       title                  = 'OPEN'
    importing
       filename               = p_file
      RC                     =
    exceptions
       inv_winsys             = 1
       no_batch               = 2
       selection_cancel       = 3
       selection_error        = 4
       others                 = 5.
    start-of-selection.
    call function 'UPLOAD'
    exporting
      CODEPAGE                      = ' '
       filename                      = p_file
       filetype                      = 'DAT'
      ITEM                          = ' '
      FILEMASK_MASK                 = ' '
      FILEMASK_TEXT                 = ' '
      FILETYPE_NO_CHANGE            = ' '
      FILEMASK_ALL                  = ' '
      FILETYPE_NO_SHOW              = ' '
      LINE_EXIT                     = ' '
      USER_FORM                     = ' '
      USER_PROG                     = ' '
      SILENT                        = 'S'
    IMPORTING
      FILESIZE                      =
      CANCEL                        =
      ACT_FILENAME                  =
      ACT_FILETYPE                  =
      tables
        data_tab                      = record
    exceptions
       conversion_error              = 1
       invalid_table_width           = 2
       invalid_type                  = 3
       no_batch                      = 4
       unknown_error                 = 5
       gui_refuse_filetransfer       = 6
        others                        = 7.
    if sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    endif.
    *perform open_dataset using dataset.
    *perform open_group.
    delete record index 1.
    loop at record.
    *read dataset dataset into record.
    *if sy-subrc <> 0. exit. endif.
    perform bdc_dynpro      using 'SAPMSVMA' '0100'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'VIEWNAME'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=UPD'.
    perform bdc_field       using 'VIEWNAME'
                                  record-viewname_001.
    perform bdc_field       using 'VIMDYNFLDS-LTD_DTA_NO'
                                  record-ltd_dta_no_002.
    perform bdc_dynpro      using 'SAPLZSHAP' '0001'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'ZEMPTAB1-ZEMPID3(01)'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=NEWL'.
    perform bdc_dynpro      using 'SAPLZSHAP' '0002'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'ZEMPTAB1-SALARY'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=SAVE'.
    perform bdc_field       using 'ZEMPTAB1-ZEMPID3'
                                  record-zempid3_003.
    perform bdc_field       using 'ZEMPTAB1-ZENAME3'
                                  record-zename3_008.
    perform bdc_field       using 'ZEMPTAB1-ZEDEPID'
                                  record-zedepid_009.
    perform bdc_field       using 'ZEMPTAB1-ZSALKEY'
                                  record-zsalkey_010.
    perform bdc_field       using 'ZEMPTAB1-SALARY'
                                  record-salary_011.
    perform bdc_dynpro      using 'SAPLZSHAP' '0002'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'ZEMPTAB1-ZENAME3'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=UEBE'.
    perform bdc_field       using 'ZEMPTAB1-ZENAME3'
                                  record-zename3_008.
    perform bdc_field       using 'ZEMPTAB1-ZEDEPID'
                                  record-zedepid_009.
    perform bdc_field       using 'ZEMPTAB1-ZSALKEY'
                                  record-zsalkey_010.
    perform bdc_field       using 'ZEMPTAB1-SALARY'
                                  record-salary_011.
    perform bdc_dynpro      using 'SAPLZSHAP' '0001'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'ZEMPTAB1-ZEMPID3(01)'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=SAVE'.
    perform bdc_dynpro      using 'SAPLZSHAP' '0001'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'ZEMPTAB1-ZEMPID3(01)'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=BACK'.
    perform bdc_dynpro      using 'SAPMSVMA' '0100'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '/EBACK'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'VIEWNAME'.
    perform bdc_transaction using 'SM30'.
    *enddo.
    *perform close_group.
    *perform close_dataset using dataset.
    endloop.
    loop at it_err.
    write : / it_err-msg.
    endloop.
    form error.
    call function 'FORMAT_MESSAGE'
    exporting
       id              = sy-msgid
        lang            = sy-langu   " message language ('-D' is not a valid language key)
       no              = sy-msgno
       v1              = sy-msgv1
       v2              = sy-msgv2
       v3              = sy-msgv3
       v4              = sy-msgv4
    importing
       msg             = mess
    exceptions
       not_found       = 1
        others          = 2.
    if sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    endif.
    it_err-msg = mess.
    append it_err.
    clear it_err.
    endform.
    If it is the session method, you can find the session in SM35, from where you can get the error log.
    Hope this helps,
    Regards,
    Arunsri

  • Why do newly created folders have random creation dates? This is on a Seagate GoFlex USB drive.

    Why do newly created folders have random creation dates? This is on a Seagate GoFlex USB drive. Is there a fix? It appears to work fine in the internal drive.

    So your hard drive is set up for a PC using NTFS, and you have the NTFS-3G driver installed (I wonder which version of the driver). I know many users have reported problems with file transfer speeds being slower on a Mac, but this is understandable due to the 3rd party driver you have to use to access the drive.
    There are other higher performance drivers available for a price, like Tuxera, but their site shows they haven't updated their drivers since late 2010. They do say that they are 64-bit drivers, but they want about $37 for them. They maintain a support site that has quite a bit of info about NTFS-3G issues.
    I assume you've got a freeware or open-source version of the driver from Seagate. It's probably OK for occasional use to access the NTFS drive, but not the way you want to use a drive all the time on MacOS.
    Fat32 is compatible with both OS's, but there are some restrictions like not being able to use video files larger than 4GB (actually 1byte smaller than 4GB) and the 2TB partition size limit. Both HFS+ and NTFS have advantages due to the way the OS's can optimize transfer speeds and reliability, but a drive that's plugged into many machines is always going to have compromises until we get native support for NTFS (possible in the future because it's already been tested), or the Windows side supports HFS+ (probably not going to happen). There are programs/drivers on both sides that can accomplish this compatibility, but not native solutions at this time.
    Using 3rd party drivers could mean problems with some drives that don't work well or completely break with a system update in the future. Apple doesn't seem to try to break compatibility features, but they do control the transfer protocols and optimize the OS for reliability. Flash drives are mostly formatted with Fat32 and they seem to work well for data transfer with just about any OS.
    For me, I've chosen to use a NetGear ReadyNAS NV+  in an Xraid configuration for local storage and Time Machine support, and Dropbox for some files that I can easily transfer to Mac, PC, or Linux by just using a web browser, and I quit installing proprietary drivers to any computer, Mac or PC. I've also got a couple of the USB Fat32 laptop drives (250 to 320 GB) laying around for large file transfers if needed.
    Sorry, I got a long way from your original Folder Creation Date issue.

  • Why can we not see the New Data Table for a standard DSO in SE16?

    Hi,
    It is said that there are three tables (New Data Table, Active Data Table, and Change Log Table) for a standard DSO, so why can we not see the New Data Table of a standard DSO in SE16?
    Regards,
    Sushant

    Hi Sushant,
    It is possible to see the data of the three DSO tables through SE16. Maybe you do not have authorization to see data through SE16.
    Sankar Kumar

  • Not able to extract performance data from .ETL file using xperf commands. getting error "Events were lost in this trace. Data may be unreliable ..."

    Not able to extract  performance data from .ETL file using xperf commands.
    Xperf Commands:
    xperf -i C:\TempFolder\Test.etl -o C:\TempFolder\BootData.csv -a process
    Getting following error after executing above command:
    "33288636 events were lost in this trace. Data may be unreliable.
    This is usually caused by insufficient disk bandwidth for ETW logging.
    Please try increasing the minimum and maximum number of buffers and/or the buffer size. Doubling these values would be a good first attempt.
    Please note, though, that this action increases the amount of memory reserved for ETW buffers, increasing memory pressure on your scenario.
    See "xperf -help start" for the associated command line options."
    I changed the page file size, but it does not work for me.
    Does anyone have an idea how to solve this problem and extract the ETL file data?

    I want to mention one point here. I have 4 machines in total; on 3 of them the above commands work properly. Only one machine has this problem.
    Hi,
    You can try using xperf to collect a new trace ETL file and see if it can be extracted on this computer.
    Refer to the following articles:
    start
    http://msdn.microsoft.com/en-us/library/windows/hardware/hh162977.aspx
    Using Xperf to take a Trace (updated)
    http://blogs.msdn.com/b/pigscanfly/archive/2008/02/16/using-xperf-to-take-a-trace.aspx
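    As the error text suggests, the buffer settings can be raised when recording a new trace. A command-line sketch (the kernel flags and values are just a starting point; see "xperf -help start" for the exact options on your version):

    ```
    xperf -on PROC_THREAD+LOADER -BufferSize 1024 -MinBuffers 256 -MaxBuffers 1024
    rem ... run the scenario being measured ...
    xperf -d C:\TempFolder\Test.etl
    ```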
    Kate Li
    TechNet Community Support

  • Report/Widget for combined performance data

    Hello.
    I have created a dashboard view in SCOM 2012 R2 for power consumption of a number of servers. These servers are all part of a group. The power consists of a couple of different performance monitors (VMware sensor for VMs and Power Meter from the Power Consumption MP).
    The dashboard view works well, however I am after a total figure for each server in the group. Ideally this figure would be on the dashboard view as well.
    E.g. if I have 3 servers with the below power draw
    Server 1 – 150 Watts
    Server 2 – 200 Watts
    Server 3 – 300 Watts
    I would like a box stating total power 650 Watts
    If this is not available in a dashboard view, a report would be okay
    Thanks

    Hi,
    As far as I know, there is no direct way; you may create a SQL query and sum the values for each server.
    You may refer to the helpful article below:
    http://blogs.technet.com/b/drewfs/archive/2014/08/17/scom-performance-data-and-power-view.aspx
    Regards,
    Yan Li
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact [email protected]
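    For instance, a rough sketch of such a query against the OperationsManager data warehouse (the counter name and time window are assumptions; adjust them to the management packs you actually use):

    ```sql
    -- Total the daily average of the power counter across all matching servers
    SELECT SUM(p.AverageValue) AS TotalWatts
    FROM Perf.vPerfDaily p
    JOIN vPerformanceRuleInstance pri
      ON pri.PerformanceRuleInstanceRowId = p.PerformanceRuleInstanceRowId
    JOIN vPerformanceRule pr
      ON pr.RuleRowId = pri.RuleRowId
    WHERE pr.CounterName = 'Power Meter'                 -- assumed counter name
      AND p.DateTime >= DATEADD(day, -1, GETUTCDATE());
    ```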

  • Performance data from system and DMS logs on PDW.

    I want all of the performance data listed below for a particular completed job, with the respective login ID, from the system or DMS logs on PDW:
    What total memory and CPU were used to complete the job?
    How many rows were processed?
    What is the read and write data size?
    What is the disk I/O?
    How much time was taken to complete the job?
    -Prasad KVSV

    Hi Prasad, you may want to have a look at the following links:
    http://download.microsoft.com/download/5/0/1/5015A62E-06BF-4DEC-B90A-37D52E279DE5/SQL_Server_2012_Parallel_Data_Warehouse_Breakthrough_Platform_White_Paper.pdf
    http://saldeloera.wordpress.com/2012/08/27/pdw-tip-query-optimization-techniques-to-avoid-shufflemoves-and-partition-moves/
    Regards, Leo

  • Why do I have to guess how much data I might use in a month when I select a plan?

    When I sign up for water service, I don't guess that I might use 1,000 gallons a month, and then pay three times the going rate for water if I take a few extra showers. I get charged for how much water I use that month.
    When I sign up for electricity service, I don't guess that I might use some set number of kilowatt hours, and then pay three times the going rate if we have a heat wave and I run my air conditioner for a few extra hours.
    Why do I have to guess how much data I might use, and then spend three times the going rate if I stay up watching youtube videos for a few extra nights?
    It is stressful to have to constantly monitor usage, and figure out ratios of how much I've used compared to how much time is left in the billing cycle to see if I need to curb my usage. It is not a good customer experience. I would prefer that data be charged a flat rate based on actual usage, perhaps with an option to throttle speeds at certain levels to prevent cases where a minor lapse in judgment could ruin my monthly budget, like forgetting to check data usage religiously on a device meant to simplify my life.
    Thank you for your consideration.

    Simple answer is Verizon is offering a block of data for usage.
    Where I live, the water bill states my wife and I are being charged for 30,000 gallons per quarter, and the price is, say, $60; however, we never use anywhere near that amount. I send the meter reading in and it shows the amount we use. But the hamlet we are in has a set amount of gallons. Not fair, but we live with it.
    As for your data: outside of the unlimited plan, which is no longer available to new customers, you may be a light user and never go over the 2 GB for $30 a month. If you do, you will pay through the nose for data. If you pay for more data, say 4 GB (I don't have the price for it, so let's say it's $70), then you don't have to watch as closely for overage fees. But you are paying for the data in advance so as not to go over and get hit with whopping fees.
    It is just the way it is.

  • What are Parameters? How are they different from Variables? Why can't we use variables for passing data from one sequence to another? What is the advantage of using Parameters instead of Variables?

    Hi All,
    I am new to TestStand. Still in the process of learning it.
    What are Parameters? How are they different from Variables? Why can't we use variables for passing data from one sequence to another? What is the advantage of using Parameters instead of Variables?
    Thanks in advance,
    LaVIEWan
    Solved!
    Go to Solution.

    Hi,
    Using the Parameters is the correct method to pass data into and out of a sub sequence. You assign your data to be passed into or out of a Sequence when you are in the Edit Sequence Call dialog and in the Sequence Parameter list.
    Regards
    Ray Farmer
