Performance of MULTI_COLUMN_DATASTORE and BASIC_SECTION_GROUP

In the following example there is a table with three text columns. A MULTI_COLUMN_DATASTORE covers all of these columns, and for each column there is a field section in a BASIC_SECTION_GROUP:
  CTX_DDL.CREATE_PREFERENCE ('my_pref', 'MULTI_COLUMN_DATASTORE');
  CTX_DDL.SET_ATTRIBUTE     ('my_pref', 'COLUMNS', 'col1, col2, col3' );
  CTX_DDL.CREATE_SECTION_GROUP ('my_section', 'BASIC_SECTION_GROUP');
  CTX_DDL.ADD_FIELD_SECTION ('my_section', 'col1',   'col1',   TRUE);
  CTX_DDL.ADD_FIELD_SECTION ('my_section', 'col2',   'col2',   TRUE);
  CTX_DDL.ADD_FIELD_SECTION ('my_section', 'col3',   'col3',   TRUE);
The table CTXSYS.DR$SECTION then contains three rows.
Each row in the index's own $I table is labeled with a TOKEN_TYPE that corresponds to the SEC_FID in DR$SECTION (a small verification query follows the listing):
  Table CTXSYS.DR$SECTION:
  SEC_FID = 16  'col1'
  SEC_FID = 17  'col2'
  SEC_FID = 18  'col3'
  Table DR$IDX_FUZZY$I:
  TOKEN_TYPE = 16 / 17 / 18
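To verify these assignments you can query the dictionary table directly. A minimal sketch: the SEC_FID column is taken from the listing above, while the SEC_NAME column name is an assumption, so check DESC ctxsys.dr$section on your release (and note that querying CTXSYS tables needs the appropriate privileges):
  -- hedged sketch: list the field-section ids; SEC_NAME is an assumed column name
  SELECT sec_fid, sec_name
  FROM   ctxsys.dr$section
  ORDER  BY sec_fid;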
CREATE INDEX idx_fuzzy ON mytable (dummycol)
INDEXTYPE IS CTXSYS.CONTEXT
PARAMETERS ('... SECTION GROUP ctxsys.my_section ...');
The query searches a text string in the column col1 and looks like this:
exec :bnd := '?mysearch% WITHIN col1';
select *
from mytable
where contains(dummycol, :bnd, 0) > 0;
A tracing analysis using tkprof shows that there is a query like the following, which is the most time-consuming statement (total CPU time here: 44 seconds):
SELECT/*+INDEX(T "DR$IDX_FUZZY$X")*/ DISTINCT TOKEN_TEXT FROM
  "DR$IDX_FUZZY$I" T WHERE TOKEN_TEXT LIKE :lkexpr and    (TOKEN_TYPE =
   0 OR TOKEN_TYPE = 4 OR TOKEN_TYPE BETWEEN 16 AND 74)
call     count       cpu    elapsed       disk      query    current        rows
Parse      115      0.00       0.01          0          0          0           0
Execute   4821      0.08       0.08          0          0          0           0
Fetch     8599     43.16      44.00       8013     633750          0     4257613
total    13535     43.24      44.09       8013     633750          0     4257613
The question: why is there still a range scan like "TOKEN_TYPE BETWEEN 16 AND 74"?
This means that the query scans the complete $I index table: all section types (TOKEN_TYPE = 16 / 17 / 18) are included, although the query is only interested in TOKEN_TYPE = 16 (i.e. column col1).
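To see how much extra data that range scan touches, you can count the rows per TOKEN_TYPE in the $I table. A minimal sketch, reusing the table and column names from the tkprof output above:
  -- rows per TOKEN_TYPE in the index table; types 16/17/18 are the three field sections
  SELECT token_type, COUNT(*) AS tokens
  FROM   dr$idx_fuzzy$i
  GROUP  BY token_type
  ORDER  BY token_type;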

This happens during processing of the fuzzy expansion. If you remove the fuzzy expression then you won't see this query.
Why isn't fuzzy expansion "section-aware"? I don't know. I guess it's possible that you might use a similar expression when searching other columns, so the system thinks it's best to cache all the possible expansions. Or maybe there's just something in the architecture of the query parser that makes this impossible.
I would recommend raising an SR with support, then we can have the developers look at it and say whether it's feasible to do this more efficiently.
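A quick way to confirm this in your own trace is to run the same statement once with and once without the fuzzy operator. A minimal sketch reusing the bind and table names from the example above; per the explanation above, only the first variant should produce the expensive expansion query:
  variable bnd varchar2(100)
  -- 1) fuzzy + wildcard, as in the original query: the expansion query shows up in the trace
  exec :bnd := '?mysearch% WITHIN col1';
  select count(*) from mytable where contains(dummycol, :bnd, 0) > 0;
  -- 2) same search with the fuzzy operator removed; per the reply above, the expansion query should disappear
  exec :bnd := 'mysearch% WITHIN col1';
  select count(*) from mytable where contains(dummycol, :bnd, 0) > 0;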

Similar Messages

  • Query question with MULTI_COLUMN_DATASTORE and BASIC_SECTION_GROUP

    Dear experts!
    Do I always need the WITHIN operator when using BASIC_SECTION_GROUP?
    I have created the following test case:
    connect ctxsys/ctxsys
    begin
    ctx_ddl.create_preference('otext_multi', 'MULTI_COLUMN_DATASTORE');
    ctx_ddl.set_attribute('otext_multi', 'columns', 'c2, c3');
    end;
    /
    begin
    ctx_ddl.create_section_group('otext_section', 'BASIC_SECTION_GROUP');
    ctx_ddl.add_field_section('otext_section','c2','c2');
    ctx_ddl.add_field_section('otext_section','c3','c3');
    end;
    /
    connect testuser/testuser
    create table otext (
    c1 number primary key,
    c2 varchar2(15),
    c3 clob,
    textidx char
    );
    insert into otext values (1, 'First line', 'This is an Oracle Text Index test', null);
    insert into otext values (2, 'Second line', 'My Oracle database version is 10.2.0.1', null);
    insert into otext values (3, 'Third line', 'I hope you will help me', null);
    commit;
    create index idx_otext on otext(textidx) indextype is ctxsys.context parameters
    ('datastore ctxsys.otext_multi section group ctxsys.otext_section');
    select c3 from otext where contains (textidx, 'Oracle within c3') > 0;
    C3
    This is an Oracle Text Index test
    My Oracle database version is 10.2.0.1
    select c3 from otext where contains (textidx, 'Oracle') > 0;
    no rows returned
    So what do you think? Shouldn't the last select also return 2 rows?
    I mean a workaround would be like the following:
    select c3 from otext where contains (textidx, 'Oracle within c2 OR Oracle within c3') > 0;
    But I would expect that, if I have no sections in my contains clause, Oracle would search all available sections...
    Thanks
    Markus

    There is a fourth boolean "visible" parameter to ctx_ddl.add_field_section. The default is FALSE. Specify TRUE to make the text visible without using a "within" clause.
    Also, I noticed that you created your multi_column_datastore in the ctxsys schema. In 9i that was required, but as of 10g it can be created in another schema.
    Please see the demonstration below.
    SCOTT@10gXE> begin
    2 ctx_ddl.create_preference ('otext_multi', 'MULTI_COLUMN_DATASTORE');
    3 ctx_ddl.set_attribute ('otext_multi', 'columns', 'c2, c3');
    4 ctx_ddl.create_section_group ('otext_section', 'BASIC_SECTION_GROUP');
    5 ctx_ddl.add_field_section ('otext_section', 'c2', 'c2', TRUE);
    6 ctx_ddl.add_field_section ('otext_section', 'c3', 'c3', TRUE);
    7 end;
    8 /
    PL/SQL procedure successfully completed.
    SCOTT@10gXE> create table otext (
    2 c1 number primary key,
    3 c2 varchar2(15),
    4 c3 clob,
    5 textidx char
    6 )
    7 /
    Table created.
    SCOTT@10gXE> insert all
    2 into otext values (1, 'First line', 'This is an Oracle Text Index test', null)
    3 into otext values (2, 'Second line', 'My Oracle database version is 10.2.0.1', null)
    4 into otext values (3, 'Third line', 'I hope you will help me', null)
    5 select * from dual
    6 /
    3 rows created.
    SCOTT@10gXE> create index idx_otext on otext (textidx) indextype is ctxsys.context parameters
    2 ('datastore otext_multi section group otext_section')
    3 /
    Index created.
    SCOTT@10gXE> select c3 from otext where contains (textidx, 'Oracle within c3') > 0
    2 /
    C3
    This is an Oracle Text Index test
    My Oracle database version is 10.2.0.1
    SCOTT@10gXE> select c3 from otext where contains (textidx, 'Oracle') > 0
    2 /
    C3
    This is an Oracle Text Index test
    My Oracle database version is 10.2.0.1
    SCOTT@10gXE> select c3 from otext where contains (textidx, 'Oracle within c2 OR Oracle within c3') > 0
    2 /
    C3
    This is an Oracle Text Index test
    My Oracle database version is 10.2.0.1
    SCOTT@10gXE>

  • Model wage type for performance pay and additional pay

    Hi gurus,
    My client wants performance pay and additional pay in the additional payments. I am confused about which model wage types should be copied for this. I tried copying M2SF for these, but the problem is that if the employee joins in the middle of the month, say on the 15th, and I try to give a performance pay of 10,000 Rs, it calculates only 15 days' pay and gives 5,000 in the payroll and payslip.
    If I try to copy other wage types like M200, M281 and M282, I have the same problem. But if I copy M280 for both of these wage types, it gives the flat amount irrespective of the employee's joining date. My question is whether this has any implications for the tax calculation, because I observed that different model wage types deduct different amounts of tax: if I copy M281 for performance pay and give the value as 10,000, it deducts some amount X of tax on 10,000, while if I copy some other model wage type for the same 10,000 it deducts a different amount of tax.
    I am really confused, please help.

    Hi ramm,
    I have checked the wage type amount in /434:
    /434 Total Incom                                                   182,060.00
    My other wage type amounts are:
    4MOB Mobile Reim                                                    25,000.00
    4PER Performance                                                    15,000.00
    5CEA Child Educa                                                       193.55
    5CON Conveyance                                                      1,451.61
    5HRA House Rent                                                      4,838.71
    5SPL Special All                                                    19,354.84
    /001 Valuation b01                          179.90
    /002 Valuation b01                          179.90
    /118 PTax Basis 01                                                  75,516.13
    /119 PTax Basis 01                       96,774.19                  35,516.13
    2CAN Canteen Ded01                                                   2,000.00-
    2NOT Notice Deuc01                                                   2,000.00-
    2OTH Other Deduc01                                                   1,500.00-
    5BAS BASIC      01                                                   9,677.42
    I don't know how to check the slabs; I am new to payroll. Can you guide me, please?

  • Performance Manager and Data Gatherer

    Hello,
    I am looking for some basic info (I think). I have been using the EM Console - standalone for some time now but would like to use the diagnostic pack and Performance Manager. Can this be used in a standalone configuration or do I need to buy Enterprise Manager? If it works in a standalone configuration and I do not need to buy it, could you point me in the direction of documentation on how to start the data gatherer on my 9i database on AIX 5?
    Thanks,
    Jeff Graham

    Post Author: Ferhat
    CA Forum: Performance Management and Dashboards
    You have to find the keyword to see...
    I suggest you uninstall BusinessObjects and install it again (otherwise it is very hard to add the Performance Management program).
    While you install, also check the option to install Performance Management (you have to have the keyword).

  • Perform static and dynamic check together on a customer?

    Hello All,
    Is it possible to perform static and dynamic check together on a customer?
    I believe standard SAP does not allow both options together.
    The scenario is: my client has customers who order products a year in advance and create sales orders with a future delivery date. A customer might have a balance due on their account which is close to the credit limit. As a result the sales order should be blocked, as if the automatic credit check were "Static". This would allow our customer service to call the customer and arrange for the payments.
    Then 11 months later, a dynamic check should be performed on the customer to make sure the past due is under the credit limit. If the past due is over the credit limit, the sales order must be blocked for delivery, as if the credit check were "Dynamic" with a 30-day horizon. This would again give our customer service time to settle the payment with the customer.
    If anyone has implemented such a scenario, can you please give me an overview of how you did it?
    Thanks very much in advance and Text Removed.
    With Regards,
    AM
    Edited by: Lakshmipathi on Nov 13, 2010 9:23 AM
    Please dont include this text

    Static and dynamic checks cannot be performed together; they are mutually exclusive.
    A credit check can be performed either at sales order level or at delivery / PGI level.
    FCV3 is a good report to see whether a customer would fail a credit check (suitable for your customer service).
    Coming to your point: assume the customer is eligible for a credit of 50 lacs and has placed an order for 60 lacs, and we set the customer's credit checking at order level. The customer's sales order would get blocked.
    Customer service will follow up for the payment and see to it that it is made. Assume the customer has made a payment of 70 lacs.
    Now let us have a look at the account statement of the customer.
    Account statement: Customer Cr: 70 lacs. Dr: 0 (because no delivery and no invoice has been made so far).
    So we are supposed to deliver material worth 60 lacs.
    The first sales order worth 60 lacs has been unblocked in VKM3. Why would you want to check the credit limit again at delivery level? It has already passed the credit check.
    It is not necessary to check the credit limit again, because you have already performed the check at order level.
    Now the customer comes again and raises another sales order for 80 lacs. The system would block it.
    The system will never block the previous delivery which is due to the customer.
    If we keep checking credit without delivering, we would end up only raising sales orders and making no deliveries to the customer at all; the debit side of the account would always be 0.
    So I do not think both static and dynamic checks are needed; you can go for either static or dynamic.

  • I have Photoshop CS6 Extended Students and Teachers Edition.  When I go into the Filter/Oil paint and try to use Oil paint a notice comes up "This feature requires graphics processor acceleration.  Please check Performance Preferences and verify that "Use

    I have Photoshop CS6 Extended Students and Teachers Edition. When I go into Filter/Oil Paint and try to use Oil Paint, a notice comes up: "This feature requires graphics processor acceleration. Please check Performance Preferences and verify that 'Use Graphics Processor' is enabled." When I go into Performance Preferences I get a notice "No GPU available with Photoshop Standard". Is there any way I can add this feature to my Photoshop, either by purchasing an addition or downloading something?

    Does your display adapter have a supported GPU with at least 512MB of VRAM? And do you have the latest device drivers installed, with OpenGL support? Use the CS6 menu Help>System Info..., use its copy button, and paste the information in here.

  • I want to use a mac mini as a server supporting storage. Can I pair my macair to it for when I need to perform updates and maintenance ?

    I want to use a mac mini as a server supporting storage. I have other devices such as an iMac and iPad that will access information from the server. I do not want to purchase a monitor and keyboard as the unit will sit in a cupboard out of sight. Can I pair my macair to it for when I need to perform updates and maintenance ?

    I have a 2010 Mac Mini running Yosemite and Server which I use
    as a headless home server.
    I have it set up to allow screen sharing and can connect to it and
    control it with my iMac, Macbook Pro, iPhone, and a 2011 Mini Server
    that I use as an HTPC.
    You can check this out for all the Yosemite Server capabilities:
    https://help.apple.com/advancedserveradmin/mac/4.0/
    I have iTunes Home Sharing set up on it and have my entire iTunes
    library on it.  I can then use any of my Macs to play Movies or Songs
    from it and only keep locally a select subset of that on my individual
    devices.
    Rather than an update server, I utilize Server's Caching service.  The caching
    server will duplicate any update download (system or Mac App Store purchases)
    any time a device connected to my network downloads one.  The update will
    then be stored locally and all other devices will download the update from it, which
    can be faster than from Apple directly.  This has the advantage of only having to download
    once on limited-bandwidth internet connections.  There is also an Update Server service
    available, but it is somewhat more involved to set up.  However, it will download
    and store all available updates.
    One other thing: if you do not care to sync things like Contacts, Calendars, etc.
    to iCloud, you can set Server up to sync these items across devices locally.

  • How to perform expand and collapse operation on click of link or button

    Hi
    Is there any way I can perform expand and collapse operations on a table on the click of a command link or button, without using the show detail component?
    Is there any other alternative to show detail?
    I do not want to use show detail for the following reason.
    I have a page where I pass some parameters in the URL, based on which details get displayed on the page.
    Now when I pass a parameter, change the parameter, and then once again send the previously used parameter (i.e. the parameter that I passed initially), the values in the table under show detail do not show any change.
    Once I pass values which I have already passed, the values in the table under the show detail component are not refreshed.
    If anyone has encountered a similar situation, please post a reply on how you overcame the problem.

    Hi
    Perhaps you could use an on-demand application process instead - you can specify these in the URL.
    The URL syntax (for your link) would be like...
    f?p=application_id:page_id:session:APPLICATION_PROCESS=process_id
    Is this what you mean?
    Cheers
    Ben

  • How to make use of byte array to perform credit and debit in Wallet Applet

    Hi,
    Can anyone please explain to me how to use byte arrays to perform credit and debit operations on a 6-byte amount? Say, for example, the amount contains 3 bytes for the whole number, 1 byte for the decimal dot and 2 bytes for the fractional part.
    It would be helpful if anyone could post a code snippet; that way I can understand it easily.
    Thanks in Advance
    Nara

    Hello
    As explained in the other topic, remember your high school years and how you learnt to add numbers;
    then reproduce these algorithms for base-256 numbers.
    Fractional? Just google fixed-point math and floating-point math. The first one is easier, as the values are managed as integers.
    For example, you want to store 1.42 €. Then you decide the real number to deal with is N*100, and you store 142. You get 2 decimal places of precision.
    In the computer world it is easier to store N*256, as that is just a byte shift.
    The addition operations are then the same, and you just interpret the values as fractional numbers.
    regards
    sebastien

  • Can i use "Oracle Database 12c: Performance Management and Tuning " training for getting certification on "Oracle Database 11g: Performance Tuning 1Z0-054"

    I have taken the "Oracle Database 12c: Performance Management and Tuning" training from Oracle University. Now I would like to get certified on the "Oracle Database 11g: Performance Tuning 1Z0-054" exam. Is it possible?

    I essentially endorse and refer you to Matthews' and John's post above.
    I would differ slightly with Matthew because my guess is you would often be able to use like-for-like 12c training for an 11g certification (I believe there are precedents). BEFORE ANYONE ASKS: THE OTHER WAY DOESN'T HAPPEN.
    ... but I totally concur with Matthew that you would be ill advised to proceed on that basis without one of:
    - This being advertised as possible on the website : e.g. https://education.oracle.com/pls/web_prod-plq-dad/db_pages.getpage?page_id=654&get_params=p_id:169 ... option 2 show courses.
    - Confirmation from Brandye
    - Confirmation from an Oracle Certification Support Web Ticket ( http://education.oracle.com/pls/eval-eddap-dcd/OU_SUPPORT_OCP.home?p_source=OCP )
    ... The more common (and in my opinion usually better) way would be to get your 11g DBA OCP (or higher) first and then take the 1Z0-054.  I am almost certain they will accept your 12c course for the 11g DBA OCP.
    If you are choosing the route of not being an 11g (or 12c) DBA OCP first, but are on option 2 and relying on the course for certification, then the issue is more in the balance and you are even more strongly advised to get confirmation before proceeding (remember, if the rules need to be changed for you alone, then any profit from the exam is lost).
    In general my understanding is Oracle would prefer to encourage people to train on the latest version of product that is available for training  and will prefer to avoid restrictions which would cause you to train at a lower version.  ( This is simply my guess at Oracle University Policy ... personal opinion only).
    Having said all I have said I'd encourage you to go with the advice of the earlier two posts.

  • Performance analysis and monitoring of a Forte application

    Hello,
    It would be good if one could do some performance analysis and monitoring of
    a Forte application at production time.
    By performance analysis I mean measure the time some selected methods take
    to return. In a CS application such a method would be some selected method
    of a key remote service representative of the application's activity.
    One would like to measure the min, max, and average time, and also give a
    threshold which, when reached, will automatically generate an alarm or some
    pre-defined processing.
    The most powerful way would be to use a SNMP application through a Forte -
    SNMP gateway. (see the G. Puterbaugh paper "Building a Forte-SNMP gateway"
    in the 96' Forte Forum proceedings).
    But before going that far some simple means accessible through EConsole
    would already be great.
    A colleague of mine went to that Forum and reported that Puterbaugh said that
    such an agent is currently missing, but that its implementation is not
    difficult.
    I looked at all the agents and their instruments. I came to the following
    conclusions :
    1) instrumented data are available at the granularity level of a partition,
    not at a smaller granularity. For example, the DistObjectMgr agent gives you
    very useful information : the number of events (sent/received) and the
    number of (remote) methods (called/invoked), but this for the entire
    partition. Thus it prevents making tuned observations (unless you partition
    in a special way your application to put in a dedicated partition the thing
    you want to observe and only this thing).
    2) there is no instrumented data related to processing time.
    This leads me to the point that no information observed by the standard
    agents helps me figure out my performance. Thus I have to add at
    development time some lines of code to the methods I potentially want to
    observe later at production time to generate the appropriate information a
    custom agent will then display (process) with the appropriate instruments.
    Does someone share this position ?
    Has someone implemented such an agent and associated means?
    PS: I will probably implement my own one if no other way around.
    best regards,
    Pierre Gelli
    ADP GSI
    Payroll and Human Resources Management
    72-78, Grande Rue, F-92310 SEVRES
    phone : +33 1 41 14 86 42 (direct) +33 1 41 14 85 00 (reception desk)
    fax : +33 1 41 14 85 99

  • How can I transfer a big .pdf file, e.g. 9 MB, from my Macintosh Performa 5200 and its OS 8.1 to a USB flash drive?

    How can I transfer a big .pdf file, e.g. 9 MB, from my Macintosh Performa 5200 and its OS 8.1 to a USB flash drive? For example, is there any adapter available to connect the SCSI port to USB? Or is it better to use compression software and transfer it to ten 3.5" floppy disks?
    Thank you
    Emanuel

    Hello Emanuel,
    The Performa 5200CD did not have built-in Ethernet as standard, so unless an Ethernet card (or an external SCSI or LocalTalk-to-Ethernet adapter) has been added, that method would not be available in this case.
    Your suggestion involving compression software (such as an appropriate version of StuffIt) with segmenting capabilities could of course be one alternative.
    If you have an internal or external modem for the Performa, another way could be to use the telephone lines for transfers. A communications program would have to be used on both sides (for example, ZTerm or the communications section of ClarisWorks on the Performa).
    It is even possible to connect two serial modems directly. A simple line simulator (in principle, a 9 V battery in series with a 680 ohm resistor in one of the leads in an RJ-11 to RJ-11 cable), which can be built in a couple of minutes, is sometimes needed. Do NOT use a line simulator for units connected to the public telephone network.
    Yet another solution could be a null-modem transfer to a PC with a (DB-9M) serial port. This would require a null-modem cable (can be designed by combining a Macintosh modem cable (MiniDIN-8M to DB-25M) with a normal PC-style (DB-25F to DB-9F) null-modem cable . HyperTerminal or another communications program can be used on the PC.
    What do you have to work with (other computers/models/platforms)? Is this a one-time transfer, or do you plan to send additional files later? Is the intention to continue to use the Performa 5200?
    Jan

  • [svn] 2263: Fixed WiseUI performance app and some mustella test failures.

    Revision: 2263
    Author: [email protected]
    Date: 2008-06-26 13:19:12 -0700 (Thu, 26 Jun 2008)
    Log Message:
    Fixed WiseUI performance app and some mustella test failures.
    tests Passed: checkintests
    Needs QA: YES
    Needs DOC: NO
    Bug fixes:
    API Change: NO
    Reviewer: pfarland
    Code-level description of changes:
    OffsetInputBuffer.java
    Added getLineText() override to take the offset into
    consideration.
    Modified Paths:
    flex/sdk/trunk/modules/compiler/src/java/flex2/compiler/as3/OffsetInputBuffer.java

  • How to get health and performance information and the services running on devices connected to System Center?

    Hi All,
    I want to know how to get health and performance information, and the services running on devices connected to System Center, into my C# application. I also need information about the databases connected to System Center.
    I will appreciate your feedback
    Thank you

    Hi,
    You can configure a service monitor for the required service on the server.
    Refer to the link below for how to configure service monitoring:
    http://www.bictt.com/blogs/bictt.php/2011/03/17/scom-monitoring-a-service-part3
    You can use the SCOM SDK to connect to the SCOM server using C# and get the required information:
    http://msdn.microsoft.com/en-us/library/hh329086.aspx
    You can find the database name in the registry path below on the management server:
    HKLM:\SOFTWARE\Microsoft\Microsoft Operations Manager\3.0\Setup\DatabaseName
    Regards
    sridhar v

  • UCS Performance Manager and NetApp c-mode filers

    We started an evaluation of UCS Performance Manager. I can't add any NetApp c-mode filers to the UCS Performance Manager infrastructure. Does anyone have the same problem?

    Hi John,
    From the output you shared it looks like you are using Performance Advisor and want to share a graph of a 7-mode system. The instructions link you shared is from Performance Manager, which is for clustered Data ONTAP only. In Performance Advisor there is unfortunately no way to share a link. You can 'save as' to PNG, as you probably already saw, or you can use the dfm CLI to generate the same graph and email it, perhaps as a daily report. There is also a data collector posted to the NetApp ToolChest that submits data to Graphite, with dashboards in Grafana. This solution would allow sharing of a URL like you desire. If you want more info on this, let me know.
    Kind Regards,
    Chris Madden
    Storage Architect, NetApp
