Response time of form

I developed a form with triggers such as key-up, key-down, post-query, and on-lock in three blocks. It works fine in a client/server environment, but when I run the form from the web server it slows down, and a job that should run on a single click of a button only runs after the button is clicked three times.
In the master-detail blocks, which are both tabular, I have placed an additional field that works as a current record indicator, and when I scroll through the records the indicator lags behind.
I have even removed most of the triggers, but the response time is still very poor.
I am using Forms Server with the Apache web server.


Similar Messages

  • Having response time issues using Studio to manage 3000+ forms

    We are currently using Documaker Studio to create and maintain our forms, of which we have thousands. Once we create the form we export it to a very old version of Documerge where it is then used in our policy production. 
    The problem is that because we have so many forms/sections, every time we click on "SECTIONS" in Studio it takes a significant amount of time to load the screen that lists all of the sections. Many of these forms/sections are old and will never change, but we still want to have access to them in the future.
    What is the best way to "back up" all these forms somewhere where they are still accessible? Ideally I would like to have one workspace (let's call it "PRODUCTION") that has all 3000+ forms, and delete the older resources from our existing workspace (called "FORMS") so that it has just the forms we are currently working on. This way the response time in the "FORMS" workspace would be much better. A couple of questions:
    1. How would I copy my existing workspace "FORMS" (and all the resources in it) to a new workspace called "PRODUCTION"?
    2. How would I delete from the "FORMS" workspace all of the older resources?
    3. Once I am satisfied with a new form/section in my "FORMS" workspace how would I move it to "PRODUCTION"?
    4. How could I move a form/section from "PRODUCTION" back into "FORMS" in order to make corrections, or use it as a base for a new form down the road?
    5. Most importantly....Is there a better way to do this?
    Again, we are only using this workspace for forms creation and not using it to generate output...we will be doing that in the future once we upgrade from the very old Documerge on the mainframe, to Documaker Studio.
    Many thanks to any of you who can help me with this!

    However, I am a little confused on the difference between extracting and promoting. Am I correct in assuming that I would go into my PROD workspace and EXTRACT the resources that I want to continue to work on. I would then go into my new, and empty, DEV workspace and IMPORT FILES (or IMPORT LIBRARY?) using the file(s) that I created with the EXTRACT? In effect, I would have two totally separate workspaces, one called DEV and one called PROD?
    Extraction is writing a copy of a resource from the library out to disk. Promotion is copying a resource from one library to another, with the option of modifying the metadata values of the source and target resources. You would use extract in a case where you don't have access to both libraries to do a promote.
    An example promotion scenario would go something like this. You have resources in the source (DEV) that you want to promote to the target (PROD). Items to be promoted are tagged with MODE = "To Promote". When you perform the promotion, you can select the items that you want to promote with the filter MODE = "To Promote". When you perform the promotion, you can also configure Studio to set the MODE of the resource(s) in the source to MODE = "To Delete", and set the MODE of the resource(s) in the target to MODE = "" (empty). Then you can go back and delete the resources from the source (DEV) where MODE = "To Delete".
    Once you have the libraries configured you could bypass the whole extract/import bit and just use promote. The source would be PROD, and the target would be DEV. During promotion, set the target MODE = "To Do", and source MODE = "In Development". In this fashion you will see which resources in PROD are currently being edited in DEV (because in PROD the MODE = "In Development"). When development is completed, change the MODE in DEV to "To Promote", then proceed with the promotion scenario described above.
    I am a bit confused about the PROMOTE function and the libraries that have the _DEV, _TEST, and _PROD suffixes. It looks like this duplicates the entire workspace to new _PROD libraries, but it is all part of the same workspace, not two separate workspaces? Any clarification here would be helpful.
    Those suffixes are just attached by default; these suffixes don't mean anything to Documaker. You could name your library PROD and use it for DEV. It might be confusing though ;-) The usual best practice is to name the library and subsequent tablespaces/schemas according to their use. It's possible to have multiple libraries within a single tablespace or schema (but not recommended to mix PROD and non-PROD libraries).
    Getting there, I think!
    -A

  • Unable to capture the Citrix network response time using OATS Load testing.

    Unable to capture the Citrix network response time using OATS load testing. Here is the scenario: in our project, users log into the Citrix network, select the Hyperion application, and perform their transactions, and the client wants us to simulate the same scenario for load testing. We have scripted the flow starting from the Citrix login and then launching the Hyperion application, but the time taken to launch the Hyperion application from the Citrix network is not being captured, whereas the Hyperion transaction times are recorded. Can anyone help resolve this issue?

    Hi keerthi,
    1. I have pasted the code for the first issue
    // OpenScript (OATS) web/ADF fragment as posted; the locators are unchanged.
    // It clicks the Search button, then sorts the results table by the Name column.
    web.button(122,
            "/web:window[@index='0' or @title='Manage Network Targets - Oracle Communications Order and Service Management - Order and Service Management']/web:document[@index='0' or @name='1824fhkchs_6']/web:form[@id='pt1:_UISform1' or @name='pt1:_UISform1' or @index='0']/web:button[@id='pt1:MA:0:n1:1:pt1:qryId1::search' or @value='Search' or @index='3']")
        .click();
    adf.table(
            "/web:window[@index='0' or @title='Manage Network Targets - Oracle Communications Order and Service Management - Order and Service Management']/web:document[@index='0' or @name='1c9nk1ryzv_6']/web:ADFTable[@absoluteLocator='pt1:MA:n1:pt1:pnlcltn:resId1']")
        .columnSort("Ascending", "Name");
    }

  • SQL tune (High response time)

    Hi,
    I am writing the following SQL, which is causing a high response time. Can you please help? Please advise; the DBMS_SQLTUNE report is below.
    GENERAL INFORMATION SECTION
    Tuning Task Name : BFG_TUNING1
    Tuning Task Owner : ARADMIN
    Scope : COMPREHENSIVE
    Time Limit(seconds) : 60
    Completion Status : COMPLETED
    Started at : 01/28/2013 15:48:39
    Completed at : 01/28/2013 15:49:43
    Number of SQL Restructure Findings: 7
    Number of Errors : 1
    Schema Name: ARADMIN
    SQL ID : 2d61kbs9vpvp6
    SQL Text : SELECT /*+no_merge(chg)*/ chg.CHANGE_REFERENCE,
    chg.Customer_Name, chg.Customer_ID, chg.Contract_ID,
    chg.Change_Title, chg.Change_Type, chg.Change_Description,
    chg.Risk, chg.Impact, chg.Urgency, chg.Scheduled_Start_Date,
    chg.Scheduled_End_Date, chg.Scheduled_Start_Date_Int,
    chg.Scheduled_End_Date_Int, chg.Outage_Required,
    chg.Change_Status, chg.Change_Status_IM, chg.Reason_for_change,
    chg.Customer_Visible, chg.Change_Source,
    chg.Related_Ticket_Type, chg.Related_Ticket_ID,
    chg.Requested_By, chg.Requested_For, chg.Site_ID, chg.Site_Name,
    chg.Element_id, chg.Element_Type, chg.Element_Name,
    chg.Search_flag, chg.remedy_id, chg.Change_Manager,
    chg.Email_Manager, chg.Queue, a.customer as CUSTOMER_IM,
    a.contract as CONTRACT_IM, a.cid FROM exp_cm_cusid1 a, (sELECT *
    FROM EXP_BFG_CM_JOIN_V WHERE CUSTOMER_ID = 14187) chg WHERE
    a.bfg_con_id IS NULL AND a.bfg_cus_id = chg.customer_id AND
    NOT EXISTS (SELECT a.bfg_con_id FROM exp_cm_cusid1 a WHERE
    a.bfg_con_id IS NOT NULL AND a.bfg_cus_id = chg.customer_id
    AND a.bfg_con_id = chg.contract_id ) UNION SELECT
    /*+no_marge(chg)*/ chg.CHANGE_REFERENCE, chg.Customer_Name,
    chg.Customer_ID, chg.Contract_ID, chg.Change_Title,
    chg.Change_Type, chg.Change_Description, chg.Risk, chg.Impact,
    chg.Urgency, chg.Scheduled_Start_Date, chg.Scheduled_End_Date,
    chg.Scheduled_Start_Date_Int, chg.Scheduled_End_Date_Int,
    chg.Outage_Required, chg.Change_Status, chg.Change_Status_IM,
    chg.Reason_for_change, chg.Customer_Visible, chg.Change_Source,
    chg.Related_Ticket_Type, chg.Related_Ticket_ID,
    chg.Requested_By, chg.Requested_For, chg.Site_ID, chg.Site_Name,
    chg.Element_id, chg.Element_Type, chg.Element_Name,
    chg.Search_flag, chg.remedy_id, chg.Change_Manager,
    chg.Email_Manager, chg.Queue, a.customer as CUSTOMER_IM,
    a.contract as CONTRACT_IM, a.cid FROM exp_cm_cusid1 a, (sELECT *
    FROM EXP_BFG_CM_JOIN_V WHERE CUSTOMER_ID = 14187) chg WHERE
    a.bfg_cus_id = chg.customer_id AND a.bfg_con_id =
    chg.contract_id AND a.bfg_con_id IS NOT NULL
    FINDINGS SECTION (7 findings)
    1- Restructure SQL finding (see plan 1 in explain plans section)
    The predicate REGEXP_LIKE ("T100"."C536871160",'^[[:digit:]]+$') used at
    line ID 26 of the execution plan contains an expression on indexed column
    "C536871160". This expression prevents the optimizer from selecting indices
    on table "ARADMIN"."T100".
    Recommendation
    - Rewrite the predicate into an equivalent form to take advantage of
    indices. Alternatively, create a function-based index on the expression.
    Rationale
    The optimizer is unable to use an index if the predicate is an inequality
    condition or if there is an expression or an implicit data type conversion
    on the indexed column.
    2- Restructure SQL finding (see plan 1 in explain plans section)
    The predicate TO_NUMBER(TRIM("T100"."C536871160"))=:B1 used at line ID 26 of
    the execution plan contains an expression on indexed column "C536871160".
    This expression prevents the optimizer from selecting indices on table
    "ARADMIN"."T100".
    Recommendation
    - Rewrite the predicate into an equivalent form to take advantage of
    indices. Alternatively, create a function-based index on the expression.
    Rationale
    The optimizer is unable to use an index if the predicate is an inequality
    condition or if there is an expression or an implicit data type conversion
    on the indexed column.
    3- Restructure SQL finding (see plan 1 in explain plans section)
    The predicate REGEXP_LIKE ("T100"."C536871160",'^[[:digit:]]+$') used at
    line ID 10 of the execution plan contains an expression on indexed column
    "C536871160". This expression prevents the optimizer from selecting indices
    on table "ARADMIN"."T100".
    Recommendation
    - Rewrite the predicate into an equivalent form to take advantage of
    indices. Alternatively, create a function-based index on the expression.
    Rationale
    The optimizer is unable to use an index if the predicate is an inequality
    condition or if there is an expression or an implicit data type conversion
    on the indexed column.
    4- Restructure SQL finding (see plan 1 in explain plans section)
    The predicate TO_NUMBER(TRIM("T100"."C536871160"))=:B1 used at line ID 10 of
    the execution plan contains an expression on indexed column "C536871160".
    This expression prevents the optimizer from selecting indices on table
    "ARADMIN"."T100".
    Recommendation
    - Rewrite the predicate into an equivalent form to take advantage of
    indices. Alternatively, create a function-based index on the expression.
    Rationale
    The optimizer is unable to use an index if the predicate is an inequality
    condition or if there is an expression or an implicit data type conversion
    on the indexed column.
    5- Restructure SQL finding (see plan 1 in explain plans section)
    The predicate REGEXP_LIKE ("T100"."C536871160",'^[[:digit:]]+$') used at
    line ID 6 of the execution plan contains an expression on indexed column
    "C536871160". This expression prevents the optimizer from selecting indices
    on table "ARADMIN"."T100".
    Recommendation
    - Rewrite the predicate into an equivalent form to take advantage of
    indices. Alternatively, create a function-based index on the expression.
    Rationale
    The optimizer is unable to use an index if the predicate is an inequality
    condition or if there is an expression or an implicit data type conversion
    on the indexed column.
    6- Restructure SQL finding (see plan 1 in explain plans section)
    The predicate TO_NUMBER(TRIM("T100"."C536871160"))=:B1 used at line ID 6 of
    the execution plan contains an expression on indexed column "C536871160".
    This expression prevents the optimizer from selecting indices on table
    "ARADMIN"."T100".
    Recommendation
    - Rewrite the predicate into an equivalent form to take advantage of
    indices. Alternatively, create a function-based index on the expression.
    Rationale
    The optimizer is unable to use an index if the predicate is an inequality
    condition or if there is an expression or an implicit data type conversion
    on the indexed column.
    7- Restructure SQL finding (see plan 1 in explain plans section)
    An expensive "UNION" operation was found at line ID 1 of the execution plan.
    Recommendation
    - Consider using "UNION ALL" instead of "UNION", if duplicates are allowed
    or uniqueness is guaranteed.
    Rationale
    "UNION" is an expensive and blocking operation because it requires
    elimination of duplicate rows. "UNION ALL" is a cheaper alternative,
    assuming that duplicates are allowed or uniqueness is guaranteed.
    ERRORS SECTION
    - The current operation was interrupted because it timed out.
    EXPLAIN PLANS SECTION
    1- Original
    Plan hash value: 1047651452
    | Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time | Inst |IN-OUT|
    | 0 | SELECT STATEMENT | | 2 | 28290 | 567 (37)| 00:00:07 | | |
    | 1 | SORT UNIQUE | | 2 | 28290 | 567 (37)| 00:00:07 | | |
    | 2 | UNION-ALL | | | | | | | |
    |* 3 | HASH JOIN RIGHT ANTI | | 1 | 14158 | 373 (5)| 00:00:05 | | |
    | 4 | VIEW | VW_SQ_1 | 1 | 26 | 179 (3)| 00:00:03 | | |
    | 5 | NESTED LOOPS | | 1 | 37 | 179 (3)| 00:00:03 | | |
    |* 6 | TABLE ACCESS FULL | T100 | 1 | 28 | 178 (3)| 00:00:03 | | |
    |* 7 | INDEX RANGE SCAN | I1451_536870913_1 | 1 | 9 | 1 (0)| 00:00:01 | | |
    | 8 | NESTED LOOPS | | 1 | 14132 | 193 (5)| 00:00:03 | | |
    |* 9 | HASH JOIN | | 1 | 14085 | 192 (5)| 00:00:03 | | |
    |* 10 | TABLE ACCESS FULL | T100 | 1 | 28 | 178 (3)| 00:00:03 | | |
    | 11 | VIEW | EXP_BFG_CM_JOIN_V | 3 | 42171 | 13 (24)| 00:00:01 | | |
    | 12 | UNION-ALL | | | | | | | |
    |* 13 | HASH JOIN | | 1 | 6389 | 5 (20)| 00:00:01 | | |
    | 14 | REMOTE | PROP_CHANGE_REQUEST_V | 1 | 5979 | 2 (0)| 00:00:01 | ARS_B~ | R->S |
    | 15 | REMOTE | PROP_CHANGE_INVENTORY_V | 1 | 410 | 2 (0)| 00:00:01 | ARS_B~ | R->S |
    | 16 | HASH UNIQUE | | 1 | 6052 | 6 (34)| 00:00:01 | | |
    |* 17 | HASH JOIN | | 1 | 6052 | 5 (20)| 00:00:01 | | |
    | 18 | REMOTE | PROP_CHANGE_REQUEST_V | 1 | 5979 | 2 (0)| 00:00:01 | ARS_B~ | R->S |
    | 19 | REMOTE | PROP_CHANGE_INVENTORY_V | 1 | 73 | 2 (0)| 00:00:01 | ARS_B~ | R->S |
    | 20 | HASH UNIQUE | | 1 | 5979 | 3 (34)| 00:00:01 | | |
    | 21 | REMOTE | PROP_CHANGE_REQUEST_V | 1 | 5979 | 2 (0)| 00:00:01 | ARS_B~ | R->S |
    | 22 | TABLE ACCESS BY INDEX ROWID| T1451 | 1 | 47 | 1 (0)| 00:00:01 | | |
    |* 23 | INDEX RANGE SCAN | I1451_536870913_1 | 1 | | 1 (0)| 00:00:01 | | |
    | 24 | NESTED LOOPS | | 1 | 14132 | 193 (5)| 00:00:03 | | |
    |* 25 | HASH JOIN | | 1 | 14085 | 192 (5)| 00:00:03 | | |
    |* 26 | TABLE ACCESS FULL | T100 | 1 | 28 | 178 (3)| 00:00:03 | | |
    | 27 | VIEW | EXP_BFG_CM_JOIN_V | 3 | 42171 | 13 (24)| 00:00:01 | | |
    | 28 | UNION-ALL | | | | | | | |
    |* 29 | HASH JOIN | | 1 | 6389 | 5 (20)| 00:00:01 | | |
    | 30 | REMOTE | PROP_CHANGE_REQUEST_V | 1 | 5979 | 2 (0)| 00:00:01 | ARS_B~ | R->S |
    | 31 | REMOTE | PROP_CHANGE_INVENTORY_V | 1 | 410 | 2 (0)| 00:00:01 | ARS_B~ | R->S |
    | 32 | HASH UNIQUE | | 1 | 6052 | 6 (34)| 00:00:01 | | |
    |* 33 | HASH JOIN | | 1 | 6052 | 5 (20)| 00:00:01 | | |
    | 34 | REMOTE | PROP_CHANGE_REQUEST_V | 1 | 5979 | 2 (0)| 00:00:01 | ARS_B~ | R->S |
    | 35 | REMOTE | PROP_CHANGE_INVENTORY_V | 1 | 73 | 2 (0)| 00:00:01 | ARS_B~ | R->S |
    | 36 | HASH UNIQUE | | 1 | 5979 | 3 (34)| 00:00:01 | | |
    | 37 | REMOTE | PROP_CHANGE_REQUEST_V | 1 | 5979 | 2 (0)| 00:00:01 | ARS_B~ | R->S |
    | 38 | TABLE ACCESS BY INDEX ROWID | T1451 | 1 | 47 | 1 (0)| 00:00:01 | | |
    |* 39 | INDEX RANGE SCAN | I1451_536870913_1 | 1 | | 1 (0)| 00:00:01 | | |
    Predicate Information (identified by operation id):
    3 - access("ITEM_0"="EXP_BFG_CM_JOIN_V"."CUSTOMER_ID" AND "ITEM_1"="EXP_BFG_CM_JOIN_V"."CONTRACT_ID")
    6 - filter("C536871050" LIKE '%FMS%' AND REGEXP_LIKE ("C536871160",'^[[:digit:]]+$') AND ("C536871088" IS NULL
    OR REGEXP_LIKE ("C536871088",'^[[:digit:]]+$')) AND TO_NUMBER(TRIM("C536871088")) IS NOT NULL AND
    TO_NUMBER(TRIM("C536871160"))=:SYS_B_0 AND "C536871160" IS NOT NULL AND "C536871050" IS NOT NULL AND "C7"=0)
    7 - access("C536870913"="C536870914")
    9 - access("EXP_BFG_CM_JOIN_V"."CUSTOMER_ID"=TO_NUMBER(TRIM("C536871160")))
    10 - filter("C536871050" LIKE '%FMS%' AND REGEXP_LIKE ("C536871160",'^[[:digit:]]+$') AND ("C536871088" IS NULL
    OR REGEXP_LIKE ("C536871088",'^[[:digit:]]+$')) AND TO_NUMBER(TRIM("C536871088")) IS NULL AND
    TO_NUMBER(TRIM("C536871160"))=:SYS_B_0 AND "C536871160" IS NOT NULL AND "C536871050" IS NOT NULL AND "C7"=0)
    13 - access("CHG"."PRP_CHG_REFERENCE"="INV"."PRP_CHG_REFERENCE")
    17 - access("CHG"."PRP_CHG_REFERENCE"="INV"."PRP_CHG_REFERENCE")
    23 - access("C536870913"="C536870914")
    25 - access("EXP_BFG_CM_JOIN_V"."CUSTOMER_ID"=TO_NUMBER(TRIM("C536871160")) AND
    "EXP_BFG_CM_JOIN_V"."CONTRACT_ID"=TO_NUMBER(TRIM("C536871088")))
    26 - filter("C536871050" LIKE '%FMS%' AND REGEXP_LIKE ("C536871160",'^[[:digit:]]+$') AND ("C536871088" IS NULL
    OR REGEXP_LIKE ("C536871088",'^[[:digit:]]+$')) AND TO_NUMBER(TRIM("C536871088")) IS NOT NULL AND
    TO_NUMBER(TRIM("C536871160"))=:SYS_B_1 AND "C536871160" IS NOT NULL AND "C536871050" IS NOT NULL AND "C7"=0)
    29 - access("CHG"."PRP_CHG_REFERENCE"="INV"."PRP_CHG_REFERENCE")
    33 - access("CHG"."PRP_CHG_REFERENCE"="INV"."PRP_CHG_REFERENCE")
    39 - access("C536870913"="C536870914")
    Remote SQL Information (identified by operation id):
    14 - SELECT "PRP_CHG_REFERENCE","CUS_ID","CUS_NAME","CNT_BFG_ID","PRP_TITLE","PRP_CHG_TYPE","PRP_DESCRIPTION","PR
    P_BTIGNITE_PRIORITY","PRP_CUSTOMER_PRIORITY","PRP_CHG_URGENCY","PRP_RESPONSE_REQUIRED_BY","PRP_REQUIRED_BY_DATE","P
    RP_CHG_OUTAGE_FLAG","PRP_CHG_STATUS","PRP_CHG_FOR_REASON","PRP_CHG_CUSTOMER_VISIBILITY","PRP_CHG_SOURCE_SYSTEM","PR
    P_RELATED_TICKET_TYPE","PRP_RELATED_TICKET_ID","CHANGE_INITIATOR","CHANGE_ORIGINATOR","CHANGE_MANAGER","QUEUE"
    FROM "PROP_OWNER2"."PROP_CHANGE_REQUEST_V" "CHG" WHERE "CUS_ID"=:1 (accessing 'ARS_BFG_DBLINK.WORLD' )
    15 - SELECT "PRP_CHG_REFERENCE","SIT_ID","SIT_NAME","ELEMENT_SUMMARY","PRODUCT_NAME" FROM
    "PROP_OWNER2"."PROP_CHANGE_INVENTORY_V" "INV" (accessing 'ARS_BFG_DBLINK.WORLD' )
    18 - SELECT "PRP_CHG_REFERENCE","CUS_ID","CUS_NAME","CNT_BFG_ID","PRP_TITLE","PRP_CHG_TYPE","PRP_DESCRIPTION","PR
    P_BTIGNITE_PRIORITY","PRP_CUSTOMER_PRIORITY","PRP_CHG_URGENCY","PRP_RESPONSE_REQUIRED_BY","PRP_REQUIRED_BY_DATE","P
    RP_CHG_OUTAGE_FLAG","PRP_CHG_STATUS","PRP_CHG_FOR_REASON","PRP_CHG_CUSTOMER_VISIBILITY","PRP_CHG_SOURCE_SYSTEM","PR
    P_RELATED_TICKET_TYPE","PRP_RELATED_TICKET_ID","CHANGE_INITIATOR","CHANGE_ORIGINATOR","CHANGE_MANAGER","QUEUE"
    FROM "PROP_OWNER2"."PROP_CHANGE_REQUEST_V" "CHG" WHERE "CUS_ID"=:1 (accessing 'ARS_BFG_DBLINK.WORLD' )
    19 - SELECT "PRP_CHG_REFERENCE","SIT_ID","SIT_NAME" FROM "PROP_OWNER2"."PROP_CHANGE_INVENTORY_V" "INV"
    (accessing 'ARS_BFG_DBLINK.WORLD' )
    21 - SELECT "PRP_CHG_REFERENCE","CUS_ID","CUS_NAME","CNT_BFG_ID","PRP_TITLE","PRP_CHG_TYPE","PRP_DESCRIPTION","PR
    P_BTIGNITE_PRIORITY","PRP_CUSTOMER_PRIORITY","PRP_CHG_URGENCY","PRP_RESPONSE_REQUIRED_BY","PRP_REQUIRED_BY_DATE","P
    RP_CHG_OUTAGE_FLAG","PRP_CHG_STATUS","PRP_CHG_FOR_REASON","PRP_CHG_CUSTOMER_VISIBILITY","PRP_CHG_SOURCE_SYSTEM","PR
    P_RELATED_TICKET_TYPE","PRP_RELATED_TICKET_ID","CHANGE_INITIATOR","CHANGE_ORIGINATOR","CHANGE_MANAGER","QUEUE"
    FROM "PROP_OWNER2"."PROP_CHANGE_REQUEST_V" "CHG" WHERE "CUS_ID"=:1 (accessing 'ARS_BFG_DBLINK.WORLD' )
    30 - SELECT "PRP_CHG_REFERENCE","CUS_ID","CUS_NAME","CNT_BFG_ID","PRP_TITLE","PRP_CHG_TYPE","PRP_DESCRIPTION","PR
    P_BTIGNITE_PRIORITY","PRP_CUSTOMER_PRIORITY","PRP_CHG_URGENCY","PRP_RESPONSE_REQUIRED_BY","PRP_REQUIRED_BY_DATE","P
    RP_CHG_OUTAGE_FLAG","PRP_CHG_STATUS","PRP_CHG_FOR_REASON","PRP_CHG_CUSTOMER_VISIBILITY","PRP_CHG_SOURCE_SYSTEM","PR
    P_RELATED_TICKET_TYPE","PRP_RELATED_TICKET_ID","CHANGE_INITIATOR","CHANGE_ORIGINATOR","CHANGE_MANAGER","QUEUE"
    FROM "PROP_OWNER2"."PROP_CHANGE_REQUEST_V" "CHG" WHERE "CUS_ID"=:1 (accessing 'ARS_BFG_DBLINK.WORLD' )
    31 - SELECT "PRP_CHG_REFERENCE","SIT_ID","SIT_NAME","ELEMENT_SUMMARY","PRODUCT_NAME" FROM
    "PROP_OWNER2"."PROP_CHANGE_INVENTORY_V" "INV" (accessing 'ARS_BFG_DBLINK.WORLD' )
    34 - SELECT "PRP_CHG_REFERENCE","CUS_ID","CUS_NAME","CNT_BFG_ID","PRP_TITLE","PRP_CHG_TYPE","PRP_DESCRIPTION","PR
    P_BTIGNITE_PRIORITY","PRP_CUSTOMER_PRIORITY","PRP_CHG_URGENCY","PRP_RESPONSE_REQUIRED_BY","PRP_REQUIRED_BY_DATE","P
    RP_CHG_OUTAGE_FLAG","PRP_CHG_STATUS","PRP_CHG_FOR_REASON","PRP_CHG_CUSTOMER_VISIBILITY","PRP_CHG_SOURCE_SYSTEM","PR
    P_RELATED_TICKET_TYPE","PRP_RELATED_TICKET_ID","CHANGE_INITIATOR","CHANGE_ORIGINATOR","CHANGE_MANAGER","QUEUE"
    FROM "PROP_OWNER2"."PROP_CHANGE_REQUEST_V" "CHG" WHERE "CUS_ID"=:1 (accessing 'ARS_BFG_DBLINK.WORLD' )
    35 - SELECT "PRP_CHG_REFERENCE","SIT_ID","SIT_NAME" FROM "PROP_OWNER2"."PROP_CHANGE_INVENTORY_V" "INV"
    (accessing 'ARS_BFG_DBLINK.WORLD' )
    37 - SELECT "PRP_CHG_REFERENCE","CUS_ID","CUS_NAME","CNT_BFG_ID","PRP_TITLE","PRP_CHG_TYPE","PRP_DESCRIPTION","PR
    P_BTIGNITE_PRIORITY","PRP_CUSTOMER_PRIORITY","PRP_CHG_URGENCY","PRP_RESPONSE_REQUIRED_BY","PRP_REQUIRED_BY_DATE","P
    RP_CHG_OUTAGE_FLAG","PRP_CHG_STATUS","PRP_CHG_FOR_REASON","PRP_CHG_CUSTOMER_VISIBILITY","PRP_CHG_SOURCE_SYSTEM","PR
    P_RELATED_TICKET_TYPE","PRP_RELATED_TICKET_ID","CHANGE_INITIATOR","CHANGE_ORIGINATOR","CHANGE_MANAGER","QUEUE"
    FROM "PROP_OWNER2"."PROP_CHANGE_REQUEST_V" "CHG" WHERE "CUS_ID"=:1 (accessing 'ARS_BFG_DBLINK.WORLD' )
    -------------------------------------------------------------------------------

    Please review the following threads:
    {message:id=9360002}
    {message:id=9360003}
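    As a rough illustration of the advisor's two main recommendations, here is a sketch only (not tested against this system; the index name is hypothetical, the table and column names are taken from the report above, and a function-based index is only usable when the query expression matches the indexed expression exactly):
    -- Findings 2, 4 and 6: a function-based index matching TO_NUMBER(TRIM(...)).
    -- Caveat: the CREATE INDEX will fail if any row of T100 holds a non-numeric
    -- value in C536871160, so the data would need cleansing (or a guarded
    -- virtual column) first.
    CREATE INDEX aradmin.t100_c536871160_fbi
        ON aradmin.t100 (TO_NUMBER(TRIM(c536871160)));
    -- Finding 7: if the two branches of the query cannot return duplicate rows
    -- (or duplicates are acceptable), replacing UNION with UNION ALL removes the
    -- blocking SORT UNIQUE step at line 1 of the plan.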

  • View responses in original form (PDF, Word format, etc.)

    Is there a way to download or view the completed form that an individual filled out? It's great having the responses in spreadsheet format, but I am also looking to keep the completed form on file for future reference.

    Hi,
    Will the "Download response as PDF form.." work for you? Open the form in FormsCentral, go to View Responses tab, then select a row and right click, you will see the item menu that I just mentioned in the list.  You will have to save it each row at a time.
    Hope this helps,
    Thanks,
    Lucia

  • Experiencing very slow response time using AirPort while wired response is fine. Suggestions?

    After a surfing session a few weeks ago the response time for my AirPort internet connection became painfully slow.  When I plug in to the wired connection it works fine.  Resetting the router and modem doesn't seem to fix it.  What might have happened and what can I do to fix it?

    It is very possible that you may have some form of Wi-Fi interference that appears during these hours that is preventing your AirPort Base Station from providing a clean RF signal.
    I suggest you perform a simple site survey, using utilities like iStumbler, Wi-Fi Explorer or AirRadar to determine potential areas of interference, and then, try to either eliminate or significantly reduce them where possible.

  • Any way to improve response time with iPhoto 8.1.2 using Mountain Lion ?

    Is there any way to improve response time with iPhoto 8.1.2 using Mountain Lion? Can you store photos on a separate hard drive and use a smaller library file for opening iPhoto?

    How did you move your iPhoto library to the new system? The recommended way is to connect the two Macs together (network, FireWire target disk mode, etc.) or use an external hard drive formatted Mac OS Extended (Journaled), and drag the iPhoto library intact, as a single entity, from the old Mac to the Pictures folder of the new Mac. Launch iPhoto on the new Mac and it will open the library and convert it as needed, and you will be ready to move forward.
    LN

  • Slow response times

    This last weekend we migrated off of our old Server 2003 CF7MX server to a new Server 2008 R2 x64 CF 9,0,1,274733 server. We are now experiencing slow response times on the server; according to the Windows Performance Monitor, our average request times are way up. I have been going through some logs, but nothing seems to leap out at me, and I am not the best CF admin. As I don't know what information would be useful, I have tried to give all the info.
    Here is our coldfusion-event log:
    08/14 16:22:58 user JSPServlet: init
    08/14 16:23:00 user ColdFusionStartUpServlet: init
    08/14 16:23:00 user ColdFusionStartUpServlet: ColdFusion: Starting application services
    08/14 16:23:00 user ColdFusionStartUpServlet: ColdFusion: VM version = 14.3-b01
    08/14 16:23:08 user ColdFusionStartUpServlet: ColdFusion: application services are now available
    08/14 16:23:08 user CFMxmlServlet: init
    08/14 16:23:08 user CFMxmlServlet: Macromedia Flex Build: 87315.134646
    08/14 16:23:10 user CFSwfServlet: init
    08/14 16:23:10 user CFCServlet: init
    08/14 16:23:12 user FlashGateway: init
    08/14 16:23:12 user MessageBrokerServlet: init
    08/14 16:23:13 user CFFormGateway: init
    08/14 16:23:13 user CFInternalServlet: init
    08/14 16:23:13 user WSRPProducer: init
    08/14 16:23:13 user ServerCFCServlet: init
    08/15 08:13:09 user GraphServlet: init
    08/15 10:25:45 user CFInternalServlet: destroy
    08/15 10:25:50 user FlashGateway: destroy
    08/15 10:25:50 user WSRPProducer: destroy
    08/15 10:25:50 user CFCServlet: destroy
    08/15 10:25:50 user GraphServlet: destroy
    08/15 10:25:50 user CFMxmlServlet: destroy
    08/15 10:25:50 user CFFormGateway: destroy
    08/15 10:25:50 user CFSwfServlet: destroy
    08/15 10:26:50 info No JDBC data sources have been configured for this server (see jrun-resources.xml)
    08/15 10:26:50 info JRun Proxy Server listening on *:51800
    08/15 10:26:50 info Deploying enterprise application "Adobe_ColdFusion_9" from: file:/C:/ColdFusion9/
    08/15 10:26:51 info Deploying web application "Adobe ColdFusion 9" from: file:/C:/ColdFusion9/
    08/15 10:26:52 user JSPServlet: init
    08/15 10:26:53 user ColdFusionStartUpServlet: init
    08/15 10:26:53 user ColdFusionStartUpServlet: ColdFusion: Starting application services
    08/15 10:26:53 user ColdFusionStartUpServlet: ColdFusion: VM version = 14.3-b01
    08/15 10:26:58 user ColdFusionStartUpServlet: ColdFusion: application services are now available
    08/15 10:26:58 user CFMxmlServlet: init
    08/15 10:26:58 user CFMxmlServlet: Macromedia Flex Build: 87315.134646
    08/15 10:26:59 user CFSwfServlet: init
    08/15 10:27:00 user CFCServlet: init
    08/15 10:27:03 user FlashGateway: init
    08/15 10:27:03 user MessageBrokerServlet: init
    08/15 10:27:04 user CFFormGateway: init
    08/15 10:27:04 user CFInternalServlet: init
    08/15 10:27:04 user WSRPProducer: init
    08/15 10:27:05 user ServerCFCServlet: init
    08/15 10:36:30 user CFInternalServlet: destroy
    08/15 10:36:36 user FlashGateway: destroy
    08/15 10:36:36 user WSRPProducer: destroy
    08/15 10:36:36 user CFCServlet: destroy
    08/15 10:36:36 user CFMxmlServlet: destroy
    08/15 10:36:36 user CFFormGateway: destroy
    08/15 10:36:36 user CFSwfServlet: destroy
    08/15 10:38:35 info No JDBC data sources have been configured for this server (see jrun-resources.xml)
    08/15 10:38:35 info JRun Proxy Server listening on *:51800
    08/15 10:38:35 info Deploying enterprise application "Adobe_ColdFusion_9" from: file:/C:/ColdFusion9/
    08/15 10:38:35 info Deploying web application "Adobe ColdFusion 9" from: file:/C:/ColdFusion9/
    08/15 10:38:36 user JSPServlet: init
    08/15 10:38:37 user ColdFusionStartUpServlet: init
    08/15 10:38:37 user ColdFusionStartUpServlet: ColdFusion: Starting application services
    08/15 10:38:37 user ColdFusionStartUpServlet: ColdFusion: VM version = 14.3-b01
    08/15 10:38:41 user ColdFusionStartUpServlet: ColdFusion: application services are now available
    08/15 10:38:41 user CFMxmlServlet: init
    08/15 10:38:41 user CFMxmlServlet: Macromedia Flex Build: 87315.134646
    08/15 10:38:44 user CFSwfServlet: init
    08/15 10:38:44 user CFCServlet: init
    08/15 10:38:46 user FlashGateway: init
    08/15 10:38:46 user MessageBrokerServlet: init
    08/15 10:38:47 user CFFormGateway: init
    08/15 10:38:47 user CFInternalServlet: init
    08/15 10:38:47 user WSRPProducer: init
    Here is our cf application log:
    "Severity","ThreadID","Date","Time","Application","Message"
    "Information","jrpp-0","06/25/11","19:24:19",,"C:\ColdFusion9\logs\application.log initialized"
    "Error","jrpp-0","06/25/11","19:24:19",,"File not found: /CFIDE/main/ide.cfm The specific sequence of files included or processed is: C:\inetpub\wwwroot\CFIDE\main\ide.cfm'' "
    "Error","jrpp-1","06/25/11","22:14:06",,"File not found: /CFIDE/main/ide.cfm The specific sequence of files included or processed is: C:\inetpub\wwwroot\CFIDE\main\ide.cfm'' "
    "Error","jrpp-2","06/25/11","22:32:05",,"File not found: /CFIDE/main/ide.cfm The specific sequence of files included or processed is: C:\inetpub\wwwroot\CFIDE\main\ide.cfm'' "
    "Information","jrpp-3","06/26/11","11:45:13","CFADMIN","Invalid login for Default User"
    "Error","jrpp-0","07/06/11","15:46:07",,"File not found: /CFIDE/main/ide.cfm The specific sequence of files included or processed is: C:\inetpub\wwwroot\CFIDE\main\ide.cfm'' "
    "Error","jrpp-1","08/14/11","20:56:25","impager","Element DEPT is undefined in FORM. The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\clinical\pagers\impager\sendpage.cfm, line: 1 "
    "Error","jrpp-1","08/14/11","21:04:33",,"File not found: /CFIDE/main/ide.cfm The specific sequence of files included or processed is: C:\inetpub\wwwroot\CFIDE\main\ide.cfm'' "
    "Error","jrpp-4","08/14/11","22:04:43","impager","Element DEPT is undefined in FORM. The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\clinical\pagers\impager\sendpage.cfm, line: 1 "
    "Error","jrpp-36","08/15/11","08:50:29","TEAMS_database","Variable MYPERIOD is undefined. The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\monthly_entry.cfm, line: 220 "
    "Error","jrpp-50","08/15/11","09:12:36",,"Invalid CFML construct found on line 2 at column 25.ColdFusion was looking at the following text:<p>=</p><p>The CFML compiler was processing:<ul><li>An expression that began on line 2, column 8.<br>The expression might be missing an ending #, for example, #expr instead of #expr#.<li>A cfset tag beginning on line 2, column 2.</ul> The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\clinpub\L2\ess\page3e.cfm, line: 2 "
    "Error","jrpp-50","08/15/11","09:13:39",,"Invalid CFML construct found on line 2 at column 25.ColdFusion was looking at the following text:<p>=</p><p>The CFML compiler was processing:<ul><li>An expression that began on line 2, column 8.<br>The expression might be missing an ending #, for example, #expr instead of #expr#.<li>A cfset tag beginning on line 2, column 2.</ul> The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\clinpub\L2\ess\page3e.cfm, line: 2 "
    "Error","jrpp-50","08/15/11","09:14:17",,"Invalid CFML construct found on line 42 at column 35.ColdFusion was looking at the following text:<p>=</p><p>The CFML compiler was processing:<ul><li>An expression that began on line 42, column 8.<br>The expression might be missing an ending #, for example, #expr instead of #expr#.<li>A cfset tag beginning on line 42, column 2.</ul> The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\clinpub\L2\ess\page4a.cfm, line: 42 "
    "Error","jrpp-78","08/15/11","09:56:53","TEAMS_database","Variable MYPERIOD is undefined. The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\monthly_entry.cfm, line: 220 "
    "Error","jrpp-75","08/15/11","10:10:12","TEAMS_database","Error Executing Database Query.Timed out trying to establish connection The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\monthlyentry1a.cfm, line: 9 "
    "Error","jrpp-80","08/15/11","10:10:46","TEAMS_database","Error Executing Database Query.Timed out trying to establish connection The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\monthlyentry1a.cfm, line: 9 "
    "Error","jrpp-78","08/15/11","10:11:28","TEAMS_database","The request has exceeded the allowable time limit Tag: CFQUERY The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\monthlyentry1a.cfm, line: 124 "
    "Error","jrpp-89","08/15/11","10:13:20","TEAMS_database","The request has exceeded the allowable time limit Tag: CFQUERY The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\monthlyentry1a.cfm, line: 418 "
    "Error","jrpp-88","08/15/11","10:13:52","TEAMS_database","Error Executing Database Query.Timed out trying to establish connection The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\login.cfm, line: 77 "
    "Error","jrpp-82","08/15/11","10:14:35","TEAMS_database","The request has exceeded the allowable time limit Tag: CFQUERY The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\monthlyentry1a.cfm, line: 9 "
    "Error","jrpp-92","08/15/11","10:16:25","TEAMS_database","The request has exceeded the allowable time limit Tag: CFQUERY The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\login.cfm, line: 77 "
    "Error","jrpp-84","08/15/11","10:18:11","impager","Error Executing Database Query.Timed out trying to establish connection The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\clinical\pagers\impager\pagerlist.cfm, line: 620 "
    "Error","jrpp-89","08/15/11","10:19:37","TEAMS_database","The request has exceeded the allowable time limit Tag: CFQUERY The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\entry_2011_1.cfm, line: 233 "
    "Error","jrpp-80","08/15/11","10:22:19","TEAMS_database","Error Executing Database Query.Timed out trying to establish connection The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\entry_2011_1.cfm, line: 154 "
    "Error","jrpp-92","08/15/11","10:23:24","TEAMS_database","Error Executing Database Query.Timed out trying to establish connection The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\login.cfm, line: 84 "
    "Error","jrpp-96","08/15/11","10:24:12","TEAMS_database","The request has exceeded the allowable time limit Tag: cfoutput The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\entry_2011_2.cfm, line: 157 "
    "Error","jrpp-95","08/15/11","10:24:12","TEAMS_database","The request has exceeded the allowable time limit Tag: CFQUERY The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\entry_2011_1.cfm, line: 233 "
    "Error","jrpp-14","08/15/11","10:31:03","TEAMS_database","Error Executing Database Query.Timed out trying to establish connection The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\monthlyentry1a.cfm, line: 9 "
    "Error","jrpp-2","08/15/11","10:31:33","TEAMS_database","Error Executing Database Query.Timed out trying to establish connection The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\sow.cfm, line: 144 "
    "Error","jrpp-15","08/15/11","10:32:04","TEAMS_database","Error Executing Database Query.Timed out trying to establish connection The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\TA_entry.cfm, line: 154 "
    "Error","jrpp-1","08/15/11","10:32:15","TEAMS_database","The request has exceeded the allowable time limit Tag: CFQUERY The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\monthlyentry1a.cfm, line: 124 "
    "Error","jrpp-12","08/15/11","10:32:58","TEAMS_database","The request has exceeded the allowable time limit Tag: CFQUERY The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\sow.cfm, line: 169 "
    "Error","jrpp-10","08/15/11","10:33:21","TEAMS_database","The request has exceeded the allowable time limit Tag: CFQUERY The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\monthlyentry1a.cfm, line: 418 "
    "Error","jrpp-4","08/15/11","10:33:43","TEAMS_database","The request has exceeded the allowable time limit Tag: CFQUERY The specific sequence of files included or processed is: C:\inetpub\wwwroot\cf\tpep\teendata\entry_2011_2.cfm, line: 233 "
    Please let me know if any other data would be useful.

    Hi,
    I expect the error details in CF9\runtime\logs\coldfusion-out.log will likely provide a clue as to the problem.
    For a longer-term plan, I think you should do what you can to migrate the Access data to SQL or some other database.
    HTH, Carl.

  • Minimizing the response time of the 10g Application

    Hi,
    We are developing an application using 10g iDS and are planning to deploy the app using 10g AS on the internet.
    While testing the deployment, the forms are not responding as desired.
    Please let me know if there are ways to minimize the response time so that the app runs faster.
    Can we compress the traffic in any way?
    best regards
    Arkesh

    Thanks for your reply.
    JInitiator version 1.3.1.22 is in place.
    I have checked the form and made it as thin as possible.
    The response time is still high.
    I would like to add a few more pieces of information:
    1. The AS has a real (public) IP.
    2. Users connect to that AS over the internet from a different site.
    Users are annoyed with the responses they are getting from the application.
    waiting for replies....
    Arkesh

  • I accidentally deleted the responses on my form, column D, U12 program; I need those responses back

    I accidentally deleted the responses on my form, column D, U12 program. I need those responses added back to my FormsCentral form.

    Hi;
    You can copy all of the rows of data from the last point in History where they existed in the View Responses table and paste them back into the current point in time all at once; you do not have to do this response by response.
    1) Go to the View Responses tab
    2) Click the History button (clock icon near the bottom right)
    3) Click on previous points in history and locate the best source for the responses that were deleted
    4) Select all rows that were deleted and are missing from the current time
    5) Ctrl+C to copy (Cmd+C on Mac)
    6) Go to the most recent point in History
    7a) If you simply need to put the data back into existing rows that were left empty (the data had been deleted but not the rows), click into the first cell (Time Submitted) and press Ctrl+V (Cmd+V on Mac). If you are not able to do that, let me know and I can work further with you on it.
    7b) If you need these at the bottom of some existing/new responses, simply click "Add Row" to add a new row to the bottom, click in the first cell, and use Ctrl+V to paste (Cmd+V on Mac)
    7c) If you need to insert the rows in between existing rows, use the "+" sign between the rows where you need the data inserted; repeat enough times to add the same number of rows you copied, then click in the first cell and use Ctrl+V to paste (Cmd+V on Mac)
    Let me know if you need further assistance.
    Thanks,
    Josh

  • WebDynpro SSR / Browser Response Time

    Good morning,
    When we display a WebDynpro view we see an unacceptable response time (almost 1 minute), and the CPU of the client machine rises to almost 100%.
    The view is composed of a menu on the left (which is an embedded view) and a main view, which consists of a group containing a Table inside a ScrollContainer. So the view is not very complex.
    The table is mapped to a simple structure whose attributes are simple types (string), and the maximum number of table records is 100.
    Additionally, whenever any event takes place, either in the menu on the left or in the table itself, the response time remains around 1 minute even though no business logic is executed.
    We have tried removing the ScrollContainer and showing the table directly, but the performance does not improve. We have also verified that there are no network problems.
    The performance of the client browser has been checked by enabling the SSR parameter ("sap.session.ssr.showInfo=true"). A document with an image is attached; it shows that the browser response time is 45 seconds to display 1 MB of content (isn't that too much? why does WD generate so much HTML?).
    SAP WAS 6.40 and SP15
    Browser:Internet Explorer 6.0.2800.1106 SP1
    Thanks in advance,
    Eloy

    Hi Eloy,
    We also faced a similar problem in our project. When the page size reaches 0.5 MB or more, the response becomes too slow.
    This is because WebDynpro gets marshalled data from the backend and unmarshals it based on your screen design. So in your case, if you have 100 rows * 50 columns, it will unmarshal all of these records at the front end, i.e. the client. That is why you see the client CPU reaching 100%.
    You have very few options:
    1) Decrease the number of visible rows on the screen at a time, say a maximum of 10. If you have 40-50 columns, explore using tab strips with 12-15 columns in each tab.
    2) Increase the RAM and processing capability of your client PCs. We were lucky that our customer agreed to this and got P4 1 GB machines.
    Let's hope the performance is improved in future releases.
    Regards,
    Shubham

  • Average Response Time for Reports

    Hi Gurus,
    I am using OAS 10.1.2.0.2 with Business Intelligence and Forms Installation.
    Previously I had never seen the Average Response Time of the Reports server exceed 10000 ms,
    but it keeps increasing, and within the last 2-3 days it has gone up to 114668 ms.
    CPU Usage (%)          N/A
         Memory Usage (MB)          N/A
         Average Response Time (ms)          114668
    Its maximum queue size is 1000.
    I am not able to find out why it is increasing in this manner.
    Please help.
    Thanx

    Hi,
    Today it has increased up to 170236 ms.
    Thanx,
    Santosh

  • The FormsListener response time is unacceptable

    We are getting this critical message for the Response Time metric, under All Metrics on the FORMS screen of Oracle Enterprise Manager Grid Control.
    Message:
    The FormsListener response time is unacceptable.
    On one of the servers we get this message every 2 minutes, followed by a message saying 'CLEARED - The FormsListener response time is unacceptable'.
    We are using Oracle Application Server 10g (9.0.4) on Windows 2003.
    Has anyone else got this message?
    I tried to search on Metalink and on Google but did not find anything helpful.
    Any help would be appreciated.
    Thanks,
    Mukesh

    Hi,
    you may want to check your network latency, which plays into the Listener response time.
    Frank

  • Email Response Time Tracking

    Hello good afternoon everyone.
    I am told that we need to start tracking our customer email response time for the customer service emails in our shared Outlook mailbox, which is maintained by three members of staff.
    I do not think we can do this with any of the add-ins for Outlook, and none of my IT colleagues in our Global Group currently appear to have this set up. I have found some free software on the net, but none so far that fits the bill. Does Microsoft
    partner with any software houses producing software of this nature, either free add-ons or reasonably priced, about which someone could advise me, please?
    We use Exchange through O365 with E3 licences; our mail is synchronized with our Active Directory, which is also synchronised through our European Group FM servers.
    Many thanks
    Kind regards, Nickowiz

    Hi Nickowiz,
    According to your description, I understand that you want to track the response time between service staff and customers by e-mail.
    If I misunderstand your concern, please do not hesitate to let me know.
    We can use the message tracking log, protocol log, or other Exchange logs to monitor inbound and outbound messages. These tools are usually used for mail flow troubleshooting.
    Unfortunately, these tools cannot intelligently query all messages with the same sender, recipient, subject, and so on, and then generate a form or list.
    This function may be achievable with a script. For more details about Exchange scripting, please refer to the relevant forums:
    https://technet.microsoft.com/en-us/scriptcenter/dd742246.aspx
    By the way, this forum is related to Exchange server. Please contact Office 365 Team so that you can get more professional suggestions, for your reference:
    http://community.office365.com/en-us/default.aspx
    Best Regards,
    Allen Wang

  • High Response Time : Weblogic v8.1

    Our application performs really badly in our production environment. The same code running on a different server (TEST or QA) performs better; QA is bad when compared to TEST. Inside the HTML there are references to images, CSS and JS, and the fetch time reported in Firebug (Firefox) for each of them is around 1300 ms in PROD; QA is around 700 ms and TEST less than 400 ms. The PROD server is not in the same building as TEST and QA, but there are other applications on the same box and their response time is less than 400 ms, so it is not the machine; it is a problem with the application or web server setup. We also timed requests using "wget" on Linux to bypass the browser and still see the same high response times. There is no encryption or any form of SSL here. Ping with a 4K packet size is in the range of 30 ms in PROD compared to 10-20 ms in QA.
    What can we check to make sure there are no obvious set up errors in our case?

    What supervisor are you running ?
    Be aware that the 6548 module is quite heavily oversubscribed and you could be running into this, i.e. servers will definitely be sending a lot more data than an IP phone.
    The 6548 has an 8 Gbps connection to the switch fabric (note this is assuming a Sup720). So you have 48 1 Gbps ports going in (in = from end devices) and only 8 Gbps going out (out = to the switch fabric). Each group of 6 ports shares a 1 Gbps connection to the switch fabric. So let's say that on one of those groupings you had 6 servers with 1 Gbps connections. If only three of those servers were sending out 400 Mbps of data each, you have contention, because 3 x 400 Mbps = 1.2 Gbps and you only have 1 Gbps to the switch fabric.
    If this is the problem, there are a couple of solutions -
    1) you can look into how the devices are spread across the port groupings; maybe you could move things around to make the traffic more even among the groupings. However, you mention servers and trunks, so you could just end up with the same problem except that the phones are not working either.
    2) you can look to migrate some connections off that module onto another, but this supposes you have spare capacity elsewhere. If those trunk links are connected to other switches, then you would definitely want to look into moving these, as the 6548 was never intended to be used as an uplink module between switches.
    3) you could upgrade the module to something like a 6748 (or whatever the latest is, as they keep introducing new ones), which has a much better connection to the switch fabric. Again this assumes a Sup720, as the 6748 wouldn't work with a Sup32.
    However, don't just rush into 3) without first doing some investigation work, e.g. what do the servers do, how much traffic are they churning out, etc. Have you looked at the interface stats for each connection to see if you are getting drops?
    Jon
