2.6.11.7 performance baseline

Looks like Arch has found just the right combination of patches for the 2.6.11.7 kernel. As far as performance goes, the Arch kernel stomps all over SUSE 9.3's kernel. I don't have any benchmarks, but the difference is so obvious that I don't need any to be convinced.
I modified the PKGBUILD to use 2.6.11.10, and the patches still apply, with some offset and fuzz.
It would be nice to keep maintaining the 2.6.11 kernel and patchset as a performance baseline, since performance does not always improve with new releases of the kernel; sometimes it gets worse.

Gullible Jones wrote: via82xx driver for the stock kernel is very annoying.
Annoying, good description.
If you wanted, you could get the broken-out -mm patches and apply the via82xx driver patch from -mm onto a CK kernel; that's what I did when the i915 graphics driver was only in -mm.
http://kernel.org/pub/linux/kernel/peop … 2-rc4-mm2/
The broken-out patches are there.

Similar Messages

  • Not able to perform baseline index - ProductCatalogSimpleIndexingAdmin

    Hi!
    I am not able to perform the baseline index; when I execute it on the component 'ProductCatalogSimpleIndexingAdmin' I get the exception below.
    Could anyone give some help on how to configure ATG/Endeca properly to perform the baseline index?
    Thank you very much!
    SEVERE: Error starting baseline crawl 'lojamococaen-last-mile-crawl'.
    Occurred while executing line 11 of valid BeanShell script:
    8|      Dgidx.cleanDirs();
    9|      
    10|      // run crawl and archive any changes in dvalId mappings
    11|      CAS.runBaselineCasCrawl("lojamococaen-last-mile-crawl");
    12|      CAS.archiveDvalIdMappingsForCrawlIfChanged("lojamococaen-last-mile-crawl");
    13|
    14|      // archive logs and run the indexer
    Oct 16, 2014 1:27:14 PM com.endeca.soleng.eac.toolkit.Controller execute
    SEVERE: Caught an exception while invoking method 'run' on object 'BaselineUpdate'. Releasing locks.
    java.lang.reflect.InvocationTargetException
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:606)
      at com.endeca.soleng.eac.toolkit.Controller.invokeRequestedMethod(Controller.java:931)
      at com.endeca.soleng.eac.toolkit.Controller.execute(Controller.java:269)
      at com.endeca.soleng.eac.toolkit.Controller.main(Controller.java:137)
    Caused by: com.endeca.soleng.eac.toolkit.exception.AppControlException: Error executing valid BeanShell script.
      at com.endeca.soleng.eac.toolkit.script.Script.runBeanShellScript(Script.java:179)
      at com.endeca.soleng.eac.toolkit.script.Script.run(Script.java:127)
      ... 7 more
    Caused by: com.endeca.soleng.eac.toolkit.exception.CasCommunicationException: Error starting baseline crawl 'lojamococaen-last-mile-crawl'.
      at com.endeca.eac.toolkit.component.cas.ContentAcquisitionServerComponent.startBaselineCasCrawl(ContentAcquisitionServerComponent.java:447)
      at com.endeca.eac.toolkit.component.cas.ContentAcquisitionServerComponent.runBaselineCasCrawl(ContentAcquisitionServerComponent.java:355)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:606)
      at bsh.Reflect.invokeMethod(Unknown Source)
      at bsh.Reflect.invokeObjectMethod(Unknown Source)
      at bsh.Name.invokeMethod(Unknown Source)
      at bsh.BSHMethodInvocation.eval(Unknown Source)
      at bsh.BSHPrimaryExpression.eval(Unknown Source)
      at bsh.BSHPrimaryExpression.eval(Unknown Source)
      at bsh.BSHBlock.evalBlock(Unknown Source)
      at bsh.BSHBlock.eval(Unknown Source)
      at bsh.BSHBlock.eval(Unknown Source)
      at bsh.BSHIfStatement.eval(Unknown Source)
      at bsh.Interpreter.eval(Unknown Source)
      at bsh.Interpreter.eval(Unknown Source)
      at bsh.Interpreter.eval(Unknown Source)
      at com.endeca.soleng.eac.toolkit.script.Script.runBeanShellScript(Script.java:165)
      ... 8 more
    Caused by: Crawl failed to start: Error retrieving attributes from the config repository: Unable to create JSON output for merge request: validation errors:
      ERROR: failure to add '/sites/lojamococaen/attributes/product.category' to merged output: missing property 'mergeAction'.
    . See config repository log for more details.
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

    Hi,
    Update <app>/config/index_config/index-config.json with the "mergeAction" property for product.category.
    Ex:
    "product.category" : {
              "propertyDataType" : "ALPHA",
              "jcr:primaryType" : "endeca:property",
              "mergeAction" : "UPDATE"
              "isRecordFilterable" : true
    Then, from <app>/control, run: index_config_cmd.bat set-config -f C:\Endeca\apps\CRS\config\index_config\index-config.json -o all
    I hope this helps.
    Thanks,
    Ravinder Pogulakonda

  • How to Set or decide the Performance Baseline?

    Hi Guys ,
    I'd like to know the different ways to set a performance baseline on a production SQL Server; it would be great if you could provide details.
    Which monitoring counters should I look at, and with what minimum values?
    What are the steps to decide a baseline for SQL Server?
    I am working on SQL Server 2008 R2 Enterprise.
    Thank you
    Please Mark As Answer if it is helpful. \\Aim To Inspire Rather to Teach A.Shah

    Have a look at this link please:
    http://www.sqlskills.com/blogs/erin/sql-server-baselines-series-on-sqlservercentral-com/
    sqldevelop.wordpress.com
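    As a starting point for the counters asked about above, here is a minimal, hedged sketch that samples a few commonly baselined values from the documented sys.dm_os_performance_counters DMV. It assumes the Microsoft JDBC driver; the host and credentials are placeholders. Persist such samples on a schedule to build the baseline over time.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class CounterSample {
        public static void main(String[] args) throws Exception {
            // Hypothetical host/credentials; adjust for your environment.
            String url = "jdbc:sqlserver://sqlhost;databaseName=master;encrypt=false";
            try (Connection c = DriverManager.getConnection(url, "perf_user", "secret");
                 Statement st = c.createStatement();
                 ResultSet rs = st.executeQuery(
                     "SELECT RTRIM(object_name) AS obj, RTRIM(counter_name) AS ctr, cntr_value " +
                     "FROM sys.dm_os_performance_counters " +
                     "WHERE counter_name IN ('Page life expectancy', 'Batch Requests/sec', 'User Connections')")) {
                while (rs.next()) {
                    // Note: per-second counters such as 'Batch Requests/sec' are cumulative
                    // in this DMV; sample twice and take the delta to get an actual rate.
                    System.out.printf("%s | %s | %d%n",
                            rs.getString("obj"), rs.getString("ctr"), rs.getLong("cntr_value"));
                }
            }
        }
    }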

  • Error in performing baseline update

    Hi all,
    I have done an ATG 10.2 installation and installed Endeca. During integration of ATG CRS 10.2 with Endeca:
    1. I successfully created an application named CRS102.
    2. I initialized the services.
    3. While running the baseline update I get the following errors:
    C:\Endeca\Apps\CRS102\control>baseline_update.bat
    [08.28.13 09:51:54] INFO: Checking definition from AppConfig.xml against existing EAC provisioning.
    [08.28.13 09:51:55] INFO: Updating provisioning for component 'DailyReportGenerator'.
    [08.28.13 09:51:55] INFO: Updating definition for component 'DailyReportGenerator'.
    [08.28.13 09:51:56] INFO: Definition updated.
    [08.28.13 09:51:56] INFO: Starting baseline update script.
    [08.28.13 09:51:56] INFO: Acquired lock 'update_lock'.
    [08.28.13 09:51:56] INFO: [ITLHost] Starting shell utility 'cleanDir_processing'.
    [08.28.13 09:51:57] INFO: [ITLHost] Starting shell utility 'cleanDir_forge-output'.
    [08.28.13 09:51:59] INFO: [ITLHost] Starting shell utility 'cleanDir_dgidx-output'.
    [08.28.13 09:52:00] INFO: [ITLHost] Starting shell utility 'move_-_to_processing'.
    [08.28.13 09:52:01] INFO: [ITLHost] Starting copy utility 'fetch_config_to_input_for_forge_Forge'.
    [08.28.13 09:52:03] INFO: [ITLHost] Starting backup utility 'backup_log_dir_for_component_ConfigurationGeneratorForge'.
    [08.28.13 09:52:05] INFO: [ITLHost] Starting component 'ConfigurationGeneratorForge'.
    [08.28.13 09:53:03] INFO: [ITLHost] Starting copy utility 'CopyRecsearchConfig'.
    [08.28.13 09:53:03] INFO: [ITLHost] Starting backup utility 'backup_log_dir_for_component_Forge'.
    [08.28.13 09:53:05] INFO: [ITLHost] Starting component 'Forge'.
    [08.28.13 09:53:19] INFO: [ITLHost] Starting backup utility 'backup_log_dir_for_component_Dgidx'.
    [08.28.13 09:53:19] INFO: [ITLHost] Starting component 'Dgidx'.
    [08.28.13 09:53:22] SEVERE: Batch component  'Dgidx' failed. Refer to component logs in C:\Endeca\Apps\CRS102\config\script\..\..\.\logs\dgidxs\Dgidx on host ITLHost.
    Occurred while executing line 53 of valid BeanShell script:
    50|
    51|        Dgidx.archiveLogDir();
    52|
    53|        Dgidx.run();
    54|
    55|
    56|
    [08.28.13 09:53:22] SEVERE: Caught an exception while invoking method 'run' on object 'BaselineUpdate'. Releasing locks.
    Caused by java.lang.reflect.InvocationTargetException
    sun.reflect.NativeMethodAccessorImpl invoke0 - null
    Caused by com.endeca.soleng.eac.toolkit.exception.AppControlException
    com.endeca.soleng.eac.toolkit.script.Script runBeanShellScript - Error executing valid BeanShell script.
    Caused by com.endeca.soleng.eac.toolkit.exception.EacComponentControlException
    com.endeca.soleng.eac.toolkit.component.BatchComponent run - Batch component  'Dgidx' failed. Refer to component logs in C:\Endeca\Apps\CRS102\config\script\..\..\.\logs\dgidxs\Dgidx on host ITLHost.
    [08.28.13 09:53:23] INFO: Released lock 'update_lock'.
    Errors in LOG FILE (Dgidx.log)
    Parsing XML dimensions data with validation turned on
    Parsing project file "C:\Endeca\Apps\CRS102\data\forge_output\CRS102.xml" (project="CRS102")
    XMLParser: Reading dimensions, dvals, and synonyms from file "C:\Endeca\Apps\CRS102\data\forge_output\\CRS102.dimensions.xml"
    ERROR 08/28/13 04:23:22.393 UTC (1377663802392) DGIDX {dgidx,baseline} Internal error while decompressing input stream: null
    FATAL 08/28/13 04:23:22.393 UTC (1377663802393) DGIDX {dgidx,baseline} Fatal error at file , line 0, char 0; Message: An exception occurred! Type:RuntimeException, Message:The primary document entity could not be opened. Id=C:\Endeca\Apps\CRS102\data\forge_output\\CRS102.dimensions.xml
    WARN 08/28/13 04:23:22.394 UTC (1377663802393) DGIDX {dgidx,baseline} Lexer/OLT log: level=-1: 2013/08/28 09:53:22 | INFO    | Disabling log callback
    I checked CRS102.dimensions.xml; the file is being created, but it is empty.
    Please help me resolve this error.

  • SAP performance baseline

    Hi all
    I need to create a baseline for:
    CPU idle time
    SAP response time
    Server uptime
    Workload
    What transactions can I use, and how can I export the data to .xls?
    Thanks, gurus

    Hi,
    A step is the elementary unit of time spent by an ABAP program in a SAP process.
    For an interactive ABAP program (a transaction), each user interaction generates a step.
    It means that the SAP GUI response time is the mean response time of a dialog step.
    So your statement,
    "To get the average response times for those time periods I take the Average response time/Dialog Step (ms) column reading,"
    is true.
    The Total Response Time (s) measures the load generated on the system, because a high total response time can come either from a few steps with a very high response time or from many steps with a small response time.
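    To make that relation concrete, here is a tiny illustration with made-up numbers (the figures are hypothetical, not from any real ST03 data): total response time (s) ≈ dialog steps × average response time per step (ms) / 1000.

    public class DialogStepMath {
        public static void main(String[] args) {
            long dialogSteps = 120_000;        // hypothetical number of dialog steps in the period
            double avgRespPerStepMs = 850.0;   // hypothetical "Av. response time / dialog step" in ms
            double totalRespTimeSec = dialogSteps * avgRespPerStepMs / 1000.0;
            // A few very slow steps or many fast steps can produce the same total.
            System.out.printf("Total response time: %.0f s (overall load on the system)%n", totalRespTimeSec);
        }
    }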
    Regards,
    Olivier

  • Measure performance of Windows 8 machines using the Windows Performance Toolkit (WPR tool)

    I want to create a performance baseline for Windows 8 machines (time to winlogon, time to desktop, total boot time, etc.). For this I used the Windows Performance Toolkit WPR tool to record performance data (using the boot scenario) into a log file (.etl). I
    opened the generated ETL file using WPA (Windows Performance Analyzer); in the processes section I always see winlogon.exe and explorer.exe times of more than 2 minutes on different machines. When I did the same for a Windows XP machine (using xperf),
    winlogon.exe always showed less than 30 seconds.
    Can you please let me know how I can get correct data for the following tasks using the WPR tool:
    1) Time to winlogon (winlogon.exe)
    2) Time to desktop (explorer.exe)
    3) Total boot time
    4) Time to Outlook start
    5) Time to full Outlook load

    Does anyone have an idea how to get correct performance data for these tasks using the Windows Performance Toolkit (WPRUI)?

  • Oracle 12c migration (performance before/after checks) - suggestion needed

    Hello Experts;
    We are going to upgrade our database from 10.2.0.4 to 12c via manual migration (direct migration is from 10.2.0.5), as per the documentation:
    http://www.oracle.com/technetwork/database/upgrade/upgrading-oracle-database-wp-12c-1896123.pdf
    I cannot find any guidance or tutorial for testing: how we should check our processes (performance, resource usage) before and after.
    Could you please suggest a way to test our production processes before and after the upgrade?
    Thanks in advance for your reply.
    Regards,
    Bolo

    What to test from a performance perspective is a very generic question; the answer lies in what is important in your application and what the performance expectations are around that important functionality.
    Applications can be of different natures (OLTP, DSS), so your requirements will differ accordingly. In short, the business will be looking for a no-impact situation from this upgrade, and there are many ways to ensure that; a more accurate way means more money and effort. For example, you can use the Real Application Testing option to test a production-like workload on 12c and see the impact; this requires effort to set it up plus a license fee. You can also use other available tools such as LoadRunner.
    Another option is to take a performance baseline on the existing 10.2.0.4 database and then compare it with a performance baseline on the 12c database. Doing this directly in production makes the most sense, since you are then working with real data volume and workload, but it is also the most risky if performance degrades for something very important.
    Hence it is recommended to run the performance baselining on a non-production environment that is production-like. By production-like, I mean the same data as production (a database refresh is strongly recommended to get this), a similar workload to production (this criterion becomes more important in OLTP systems), and similar hardware, OS, and database configuration to production. If you cannot do that, your approach is not risk-free.
    In any case, you will have the option of quick database tuning using ADDM, SQL Tuning Advisor, AWR, and so on.
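    For the AWR-based comparison, here is a minimal, hedged sketch of capturing snapshots around a representative workload and preserving them as a named baseline. It assumes the Oracle JDBC driver, a user with EXECUTE on DBMS_WORKLOAD_REPOSITORY, and a Diagnostics Pack license; the connection string, credentials, and snapshot IDs are placeholders.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class AwrBaseline {
        public static void main(String[] args) throws Exception {
            try (Connection c = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "perf_user", "secret")) {  // hypothetical
                try (CallableStatement cs = c.prepareCall(
                        "BEGIN DBMS_WORKLOAD_REPOSITORY.CREATE_SNAPSHOT(); END;")) {
                    cs.execute();                  // snapshot before the workload
                }
                runRepresentativeWorkload();       // placeholder: run your real batch/OLTP processes here
                try (CallableStatement cs = c.prepareCall(
                        "BEGIN DBMS_WORKLOAD_REPOSITORY.CREATE_SNAPSHOT(); END;")) {
                    cs.execute();                  // snapshot after the workload
                }
                // Preserve the snapshot pair as a baseline; the IDs are illustrative,
                // look up the real ones in DBA_HIST_SNAPSHOT first.
                try (CallableStatement cs = c.prepareCall(
                        "BEGIN DBMS_WORKLOAD_REPOSITORY.CREATE_BASELINE(" +
                        "start_snap_id => ?, end_snap_id => ?, baseline_name => ?); END;")) {
                    cs.setInt(1, 101);
                    cs.setInt(2, 102);
                    cs.setString(3, "pre_upgrade_baseline");
                    cs.execute();
                }
            }
        }

        private static void runRepresentativeWorkload() {
            // Intentionally empty in this sketch.
        }
    }

    Run the same capture on the production-like 12c environment and compare the AWR reports for the two baselines.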
    Hope it helps.
    Thanks,
    Abhi

  • Data Warehouse Infrastructure

    I have a requirement to build a Data Warehouse and Analytics / Reporting capability with the following requirements...
    Maximum of 1TB for Production Data + DR + Test/Dev Env.
    SSIS (up to 25 sources), SSAS (cubes, 5 concurrent users) and SSRS (2 concurrent users, max 500 reports).
    I need Production, DR and Test/Dev environments.
    I have been told that I will require 12 servers each having 4 cores and 12GB of storage (4 for Prod, 4 DR and 4 Test/Dev).
    To give you an idea of load we plan to have 1 full time ETL developer, 5 Data Analysts, 2 Reporting Analysts. We are quite a small business and don't have a particularly large
    amount of data. 
    The model has SQL Server, SSIS, SSAS, SSRS on different servers across each Environment. 
    Any idea if this is overkill? I also have an estimate of 110 days for Setting up the Servers, Installing the SQL Server software and general Infrastructure design activity.

    Agree. Overkill. Big overkill.
    I would recommend production/DR/Dev each have 2 servers. I'd put SSAS, SSRS and SSIS on one and the DB on the other.
    In production, SSAS/SSRS will be active during the daytime; SSIS will likely be active off hours. So putting all that on one box should be fine for sharing the load. The DB on a second box would be good since it will likely be busy during the daytime
    and night time. Four processors may be heavy depending on the types of queries and usage patterns. I suspect you can get by with 2 processor servers, but would recommend buying the 4 processor boxes for dev and production, get them configured and run
    some performance baselines before putting in the DR environment. Then, if you find the CPUs idling, you can always cut the DR environment to 2 processor boxes. Not sure it's worth the minor cost savings to save 2 processors on 2 boxes with that effort, but
    if you're looking to cut corners, you may find that a 2 processor per server DR environment is within your performance comfort zone.
    For the dev environment, one box may well handle it all, but I'd go for 2. On average, a Dev environment isn't all that busy, but when you need the horsepower, you need it. And since it's Development AND Test, you help yourself by having realistic production
    level performance on what you're testing. Four processors is fine, but max it out on memory.
    As for hard drives, be careful about configuration. You need the space on your DW server and maybe for the SSAS server depending on how the cubes are built (ROLAP/MOLAP). When you speak about amounts of data, be careful since you'll want a lot of indexes,
    and that can double the DB size for a DW. Your DW will also run faster if you have different filegroups for data/indexes/temp DB, but only if those different filegroups are on different physical media that work well in parallel. You can always get fancier
    with more filegroups to have different ones for staging tables, for segregating fact & dimension tables etc. But for this size DB, that's overkill as well.
    Mainly, I'd look at spending hardware $s on memory for the servers, but get less of them.
    Now... two questions...
    1) Can you clarify the disk space needs? How much total data space is there in one environment, without indexes? Based on that, add the same again for indexes, and add half as much (?) for TempDB, and you have the core disk needs (see the sketch after this list). Depending on how much it is,
    you can decide on RAID, filegroup configuration, etc. And if the disk space with indexes is small enough that it all fits in memory, then disk and filegroup configuration becomes inconsequential except for ETL loads.
    2) The 25 sources... can you clarify that? 25 source systems? Total of 25 source applications? Total of 25 tables? Curious, because I'm wondering about how long you'd keep 1 full time ETL developer busy.
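    To make the disk-space arithmetic in (1) concrete, here is a back-of-the-envelope sketch. The 1 TB figure echoes the requirement above; the multipliers are just the rule of thumb from this reply, not a guarantee.

    public class DwDiskEstimate {
        public static void main(String[] args) {
            double rawDataTb = 1.0;               // production data without indexes (from the requirement)
            double indexesTb = rawDataTb;         // assume indexes roughly double the database size
            double tempDbTb  = rawDataTb * 0.5;   // roughly half as much again for TempDB
            double coreTb    = rawDataTb + indexesTb + tempDbTb;
            System.out.printf("Core disk need per environment: ~%.1f TB%n", coreTb);
        }
    }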

  • Bandwidth test with ttcp

    Hi!
    I have an Internet connection on my Cisco 2811 router, and I want to test the bandwidth of this connection. Recently I heard about the ttcp tool. I tried it, but I saw a bandwidth of about 7 Mb/s, while the connection is really greater than 20 Mb/s (it should be 30 Mb/s). Maybe somebody knows how I can test the bandwidth? I can install a server on only one side. PS: I also tried the IP SLA FTP operation, but I achieved only 10 Mb/s and couldn't continue the test because the CPU load hit 95%.

    There are many, many variables that affect observed "bandwidth" (or, more accurately, throughput).
    The 2811 by itself is more than capable of handling the full rate of your connection. See this report.
    To properly test bandwidth definitively, you need to be in control of and able to isolate all of the variables - test equipment capabilities, local and remote LAN segment load, device under test, upstream devices, other systems currently using the same connection(s), etc.
    An easier path is to just measure your day-to-day performance (thus establishing a performance baseline) and monitor that for both performance (as observed by the polling and as perceived by your users) and availability. Something simple like a Nagios monitoring tool installation can be used for that. If performance is fine, there is no need to dig any deeper. If it isn't, then start looking into the variables I mentioned above.
    Hope this helps.
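    If you do want a quick application-level measurement between two hosts you control, here is a rough ttcp-style sketch (hedged: plain TCP sockets, one side runs "receive <port>", the other "send <host> <port> <MB>"; it measures application throughput only and will not match line rate exactly).

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    public class ThroughputTest {
        public static void main(String[] args) throws Exception {
            if (args[0].equals("receive")) {
                try (ServerSocket server = new ServerSocket(Integer.parseInt(args[1]));
                     Socket s = server.accept();
                     InputStream in = s.getInputStream()) {
                    byte[] buf = new byte[64 * 1024];
                    long total = 0, start = System.nanoTime();
                    int n;
                    while ((n = in.read(buf)) != -1) total += n;   // read until the sender closes
                    double secs = (System.nanoTime() - start) / 1e9;
                    System.out.printf("Received %d bytes in %.2f s = %.2f Mbit/s%n",
                            total, secs, total * 8 / secs / 1e6);
                }
            } else {  // send <host> <port> <MB>
                long bytesToSend = (long) Integer.parseInt(args[3]) * 1024 * 1024;
                try (Socket s = new Socket(args[1], Integer.parseInt(args[2]));
                     OutputStream out = s.getOutputStream()) {
                    byte[] buf = new byte[64 * 1024];
                    long start = System.nanoTime();
                    for (long sent = 0; sent < bytesToSend; sent += buf.length) out.write(buf);
                    out.flush();
                    double secs = (System.nanoTime() - start) / 1e9;
                    System.out.printf("Sent %d bytes in %.2f s = %.2f Mbit/s%n",
                            bytesToSend, secs, bytesToSend * 8 / secs / 1e6);
                }
            }
        }
    }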

  • Initial Service script in Endeca Application

    Hi All,
    After creating the Endeca application, there is a step that tells you to run initialize_services.sh. What it does is quoted below, but where can I see these files? Where is the location? Can you please tell me the exact use of this step?
    The initialize_services script creates the Record Store instances for product data,
    dimension values, precedence rules, and schema information with the names below. The Record
    Store instances are prefixed with the application name and language code specified in the
    deployment descriptor file. In this case, the application name is Discover and the language
    code is en:
     Discover_en_schema
     Discover_en_dimvals
     Discover_en_prules
     Discover_en_data

    Hi Bravo,
    Creation of the CAS record store instances is done in C:\Endeca\ToolsAndFrameworks\3.1.0\reference\discover-data-pci\control\initialize_rs_feeds.bat, which is invoked by the initialize_services.bat file. This step is available when you create an Endeca application using discover-data-pci, which is mainly used for Product Catalog Integration with ATG. This step creates four empty record stores (data, dimvals, schema and precedence rules) in CAS.
    If you are using ATG 10.1.1 or above to integrate with Endeca Commerce 3.1.0 or above, there are components in ATG such as DataDocumentSubmitter which populate the data into these record stores and then invoke the EndecaScriptsService to trigger the scripts that perform the baseline update. As part of the baseline update, Forge takes these CAS record stores as input and indexes the data.
    Thanks,
    Shabarinath Kande
    Edited by: shabari on Jan 10, 2013 1:10 AM

  • RAC Production Comprehensive Check List

    Dear RAC Gurus,
    I am putting together a daily health check list for a 4-node production RAC cluster hosting a multi-terabyte database. I would like to know if any of you have put together such a list and can share your thoughts on it and what to include in the list.
    Please respond with your suggestions and advice.
    Madhu

    Leverage DB Console or Grid Control, create a performance baseline, and create alerts for exceptions to that baseline (within an acceptable boundary). A sample query sketch for capturing some of these metrics follows the list below.
    CPU utilization per node and overall
    CPU queue size
    IOPS and throughput per node and overall.
    Response time for business critical queries
    Interconnect error rate
    Latency (interconnect and I/O)
    Interconnect I/O rate
    Space utilization
    TEMP space utilization and incidents
    UNDO space incidents
    Concurrent sessions
    Usable FRA
    Services incidents
    Cluster incidents
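    As referenced above, a hedged sketch for sampling a few instance-level metrics from GV$SYSMETRIC so they can be stored and compared against your baseline and alert thresholds. It assumes the Oracle JDBC driver and SELECT privilege on the GV$ views; connection details are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class RacMetricSample {
        public static void main(String[] args) throws Exception {
            try (Connection c = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//rac-scan:1521/PRODSVC", "perf_user", "secret");  // hypothetical
                 Statement st = c.createStatement();
                 ResultSet rs = st.executeQuery(
                     "SELECT inst_id, metric_name, ROUND(value, 2) AS value " +
                     "FROM gv$sysmetric " +
                     "WHERE metric_name IN ('Host CPU Utilization (%)', " +
                     "'Physical Reads Per Sec', 'Physical Writes Per Sec') " +
                     "AND group_id = 2 " +            // the 60-second metric group
                     "ORDER BY inst_id, metric_name")) {
                while (rs.next()) {
                    System.out.printf("inst %d | %-28s | %s%n",
                            rs.getInt("inst_id"), rs.getString("metric_name"), rs.getString("value"));
                }
            }
        }
    }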

  • Endeca Phrases not working

    Hi,
    I am currently working on Endeca 6.2.2. In the case of automatic phrasing, I have added an entry in the AppConfig.xml to take the phrases from Workbench.
    <property name="webStudioMaintainedFile4" value="phrases.xml" />
    Upon adding phrases in Workbench, phrases.xml is not being updated in the pipeline, even after performing a baseline update.
    I tried the other approach of importing phrases using Developer Studio. This does update phrases.xml in the pipeline.
    But the search is not working as expected after the baseline update. Suppose "red white wine" is added as a phrase entry:
    a search for red white wine gives results, whereas a search for "red white wine" does not give any results.
    Spelling correction and Did you mean are working as expected.
    Please let me know in case I missed any configuration settings.
    Also, please let me know whether phrase testing can be verified using the Endeca JSP reference implementation while performing a search under any match mode.

    Hi
    I think you might be mixing up phrasing with automatic phrasing. Phrasing is when the customer enters quotation marks around multiple search terms, so "red wine" would only match against those terms appearing in that sequence in the data (i.e. not wine red). Automatic phrasing is a function to add the quotations automatically even though the customer didn't enter them. This list can be manually defined in either Workbench or in Developer Studio, or can be calculated based on the contents of a dimension (or dimensions).
    Three things:
    1) You can test a phrase search by doing a search in the reference application and including the quotation marks around the phrase - if this isn't working for you, then there is a problem with the data, or (2) is relevant
    2) I believe phrased search only works with the standard Endeca analyser, so if you are using the new OLT analyser available in 6.4 with a non-latin language, phrased search won't work
    3) To enable automatic phrasing, you need to make some changes to your web application code as well - see Chapter 7 in the Advanced Developer's Guide (here: http://docs.oracle.com/cd/E38682_01/MDEX.640/pdf/AdvDevGuide.pdf), and specifically the methods setNavERecSearchComputeAlternativePhrasings() and setNavERecSearchRewriteQueryToAnAlternativePhrasing()
    Note (3) above is only for automatic phrasing - i.e. programmatically adding the quotation marks around two or more search terms to treat them as a phrase (so a search for Walking Dead would be transformed to "Walking Dead" and therefore not match Dead Man Walking, for example). You shouldn't need to do anything in the web application to support phrase searches when the customers include the quotation marks themselves.
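    For reference, here is a minimal, hedged sketch of turning on automatic phrasing in the web application, assuming the Endeca Java Presentation API (com.endeca.navigation) and the two methods named in the Advanced Developer's Guide; the MDEX host, port, search key and match mode are placeholders.

    import com.endeca.navigation.DimValIdList;
    import com.endeca.navigation.ENEQuery;
    import com.endeca.navigation.ENEQueryResults;
    import com.endeca.navigation.ERecSearch;
    import com.endeca.navigation.ERecSearchList;
    import com.endeca.navigation.HttpENEConnection;

    public class AutoPhrasingQuery {
        public static void main(String[] args) throws Exception {
            HttpENEConnection conn = new HttpENEConnection("localhost", "15000");  // hypothetical MDEX
            ENEQuery query = new ENEQuery();
            query.setNavDescriptors(new DimValIdList("0"));          // root navigation state
            ERecSearchList searches = new ERecSearchList();
            searches.add(new ERecSearch("All", "red white wine", "mode matchallpartial"));
            query.setNavERecSearches(searches);
            // The two Advanced Dev Guide methods: compute alternative phrasings and
            // rewrite the query to one of them (boolean arguments assumed here).
            query.setNavERecSearchComputeAlternativePhrasings(true);
            query.setNavERecSearchRewriteQueryToAnAlternativePhrasing(true);
            ENEQueryResults results = conn.query(query);
            System.out.println("Records: " + results.getNavigation().getTotalNumERecs());
        }
    }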
    Michael

  • IOS 5 (slower, battery, app crashes) Solution: Downgrade?

    I'm having problems since I updated my iPod Touch 4G to iOS 5.
    My battery life is shorter and some apps (that are updated for iOS 5) always crash! The device also got slower!
    I'm not as happy as when I bought my iPod with iOS 4.x.x. (It was faster, had more battery life, and all the apps worked properly.)
    So I thought that the best solution would be to downgrade from iOS 5 to iOS 4.x.x, but I read that Apple doesn't support it.
    So, what's the last option? Dear Apple, don't you have a solution for that? Should I jailbreak it?

    I keep reading posts with this same theme, and the answer is real simple:  You have something installed that iOS 5 doesn't like. 
    I've been dealing with computers for almost 30 years, and it's always the same thing: my computer doesn't work like I want it to... wha, wha, wha.
    Here's a clue for ya: the fastest a computer will ever be is when you turn it on for the first time. After that all bets are off. I've seen it with every computer and portable device I've ever owned....
    So, what you need to do is blow the iPod away (Restore it to factory defaults) and start over.
    DO NOT synch any music, apps, videos, pictures, contacts, or whatever. DO NOT restore from a backup. Just give it a name, set up the wifi, and "play" with it for a day or two. Surf the web, take some pictures, take some videos, use iMessage, but DO NOT install ANY apps....
    This will give you a better performance baseline to work from.
    Next, delete the pictures and videos so you have a "clean" device again.
    Connect to iTunes and synch music only. NO APPS! Get the album art squared away if you want to. Make a note of how much memory your songs are taking up.... Use the iPod for a couple of days, keeping track of what you are doing and how the device is performing.
    Now comes the "fun" part, synching your apps. To get this right you need to synch ONE APP AT A TIME, and thoroughly test your iPod. In other words, synch an app, test the iPod for a couple of days, synch another app, test the iPod, and so forth.
    If you get to a point where the iPod starts acting up, "unsynch" the last app you synched, and test again.  If the iPod is acting better, skip the "bad app" and continue one-by-one synching, followed by testing between each "new" app.
    If you follow this process I have no doubt you will find what is causing your iPod to misbehave.
    And, if you have neither the desire nor the discipline to follow this process, then don't come back whining.....

  • How is Lion's performance with the baseline 13" MBA (Late 2010) 2GB RAM?  I know it works fine with the baseline 11" 2GB model, but because that's a smaller machine, perhaps the 2GB limit is not so detrimental to performance.  MBA 13" 1.86, 2GB/2010

    How does the baseline 13" mba (late 2010, 1.86 ghz, 2GB RAM/128GB) work with OSX LIon? I know the baseline 11" is fine, but does the larger form factor of the MBA 13" require more RAM to be speedy under Lion? I wouldn't think so, but please advise.  Please answer this question ONLY with refrence to the specs listed (13" MBA (late 2010, 1.86 ghz, 2GB RAM, 128gb).  Thanks much!

    Over time, every PC/Mac gets unavoidably bloated with junk like unnecessary files/downloads/apps or left-over files from video streams/uninstalled apps/caches/... which slow down the overall system. That's why a PC/Mac should be formatted (= deleting everything, even including the operating system) once in a while to reinstall the operating system (here: Lion) from scratch, hence "clean install". Formatting a Mac is pretty easy, but all of your files (e.g. images, documents, ...) should be saved to external hard drives/CDs/DVDs/flash drives... because they are going to be deleted during the formatting.
    I'm NOT going to describe you how to format, because it might not be necessary for you.
    But what you NEED to do anyway is (if you want Lion):
    ((0. Optionally, but not necessarily: make a Time Machine backup to be able to return to Snow Leopard.))
    1. Download Lion, BUT DON'T install it (cancel the install-process).
    2. Follow the instructions on this site:
    http://lifehacker.com/5823096/how-to-burn-your-own-lion-install-dvd-or-flash-drive
    This will give you a legal copy of Lion on a DVD or flash drive, which btw allows you to install it on all of your Macs at home. Why do you need a copy of Lion? If you encounter problems with Lion after a "normal install" (= NOT formatting and NOT clean-installing the operating system), you still have the option to do a clean install. A clean install can't be done without the copy of Lion on a DVD or flash drive.

  • Best Practice Question Re: Tasks That Actually Finished before Predecessors in a Baselined Schedule

    Hindsight is indeed 20/20, but here is the situation:
    Set of Tasks A (Design IDD Tasks) - These tasks are not complete and are, in fact delayed
    Set of Tasks B (ARC Tasks) – In the baseline, they have Set of Tasks A as their immediate predecessors – some are marked as complete and I made those updates
    Set of Tasks C (Tasks that Depend on ARC Tasks) – in the baseline they have Set of Tasks B as their immediate predecessors
    The problem is that delays to Set of Tasks A currently do not affect Set of Tasks C (tasks that depend on ARC Tasks), and they should. I am looking for the best way to make Set of Tasks C reflect delays from Set of Tasks A, given that the project has been
    baselined with a certain set of predecessors and successors.

    Leslie - If you are looking for videos, you can try this. I delivered ten webinars on Microsoft Project 2010 last year and its recordings can be played for free.
    Session 1: Ready. Set. Go. Preparing Project: http://goo.gl/yWVGn
    Session 2: How to change working time and set holidays in Project 2010? http://goo.gl/QTRds
    Session 3: Structure the schedule by WBS and task dependencies: http://goo.gl/SPqkM
    Session 4: Setup people, cost and material resources in Project 2010: http://goo.gl/lBTUF
    Session 5: Assigning resources (people and material) and costs (fixed, variable): http://goo.gl/PPI18
    Session 6: Convert draft schedule to optimal schedule that meets stakeholders requirements: http://goo.gl/ptdTl
    Session 7: Keeping Your Project on Track by Leveraging the Baseline Features of Project 2010: http://goo.gl/TM8Gv
    Session 8: Track project actual against project baseline information: http://goo.gl/ZWJxP
    Session 9: Report project performance through Reporting Features: http://goo.gl/CC76e
    Session 10: Sharing Resources across Projects: http://goo.gl/JkZU01
    Sai PMI-SP, MCTS Project, MVP Project
