Oracle MGW taking more time while fetching data from MQ

Dear All,
Good morning.
I am using Oracle 11.2.0.3 and Messaging Gateway (MGW) to connect to IBM WebSphere MQ, version 7.5.0.0.
We are sometimes experiencing a delay while retrieving messages from MQ; it does not happen every time.
MQ2AQ
Found MQ MSG
>>2014-12-22 09:25:49  MGW  Engine  1  TRACE  Polling
Polling thread: JOB_MQ2AQ new msgCount = 1
2014-12-22 09:25:49
MQ2AQ
MQ MSG DEQ Retry
>>2014-12-22 09:25:49  MGW  Engine  1  TRACE  Polling
Polling thread: skip polling JOB_MQ2AQ msg count=1
2014-12-22 09:25:49
MQ2AQ
MQ MSG DEQ Retry
>>2014-12-22 09:25:50  MGW  Engine  1  TRACE  Polling
Polling thread: skip polling JOB_MQ2AQ msg count=1
2014-12-22 09:25:50
MQ2AQ
MQ MSG DEQ Retry
>>2014-12-22 09:25:51  MGW  Engine  1  TRACE  Polling
Polling thread: skip polling JOB_MQ2AQ msg count=1
2014-12-22 09:25:51
MQ2AQ
MQ MSG DEQ Retry
>>2014-12-22 09:25:52  MGW  Engine  1  TRACE  Polling
Polling thread: skip polling JOB_MQ2AQ msg count=1
2014-12-22 09:25:52
MQ2AQ
MQ MSG DEQ Retry
>>2014-12-22 09:25:53  MGW  Engine  1  TRACE  Polling
Polling thread: skip polling JOB_MQ2AQ msg count=1
2014-12-22 09:25:53
MQ2AQ
MQ MSG DEQ Retry
>>2014-12-22 09:25:54  MGW  Engine  1  TRACE  Polling
Polling thread: skip polling JOB_MQ2AQ msg count=1
2014-12-22 09:25:54
MQ2AQ
MQ MSG DEQ Retry
>>2014-12-22 09:25:55  MGW  Engine  1  TRACE  Polling
Polling thread: skip polling JOB_MQ2AQ msg count=1
2014-12-22 09:25:55
MQ2AQ
MQ MSG DEQ Retry
>>2014-12-22 09:25:56  MGW  Engine  1  TRACE  Polling
Polling thread: skip polling JOB_MQ2AQ msg count=1
2014-12-22 09:25:56
MQ2AQ
MQ MSG DEQ Retry
>>2014-12-22 09:25:57  MGW  Engine  1  TRACE  Polling
Polling thread: skip polling JOB_MQ2AQ msg count=1
2014-12-22 09:25:57
MQ2AQ
MQ MSG DEQ Retry
>>2014-12-22 09:25:58  MGW  Engine  1  TRACE  Polling
Polling thread: skip polling JOB_MQ2AQ msg count=1
2014-12-22 09:25:58
MQ2AQ
MQ MSG DEQ Start
>>2014-12-22 09:25:59  MGW  Engine  1  TRACE  worker0
entering deqMessages for job JOB_MQ2AQ
2014-12-22 09:25:59
MQ2AQ
MQ MSG DEQ Retry
>>2014-12-22 09:25:59  MGW  Engine  1  TRACE  Polling
Polling thread: skip polling JOB_MQ2AQ msg count=1
2014-12-22 09:25:59
MQ2AQ
MQ MSG DEQ Finsh
>>2014-12-22 09:25:59  MGW  Engine  1  TRACE  worker0
leaving deqMessages for job JOB_MQ2AQ with seqno 1544 deqMsgCount = 1
2014-12-22 09:25:59
MQ2AQ
MQ MSG ENQ to AQ
>>2014-12-22 09:25:59  MGW  Engine  1  TRACE  worker0
entering enqMessages for job JOB_MQ2AQ with request 1544
2014-12-22 09:25:59
MQ2AQ
ENQ MSG 2 AQ Finish
>>2014-12-22 09:26:02  MGW  Engine  1  TRACE  worker0
leaving enqMessages for job JOB_MQ2AQ with seqno 1544
2014-12-22 09:26:02
MQ2AQ
MSG REC
>>2014-12-22 09:26:02  MGW  Engine  1  TRACE  worker0
leaving commitMessages for job JOB_MQ2AQ with seqno 1544
2014-12-22 09:26:02
The log above took 11 seconds to retrieve the message from MQ,
but the log below took 1 second, and I used the same message both times.
MQ2AQ
Found MQ MSG
>>2014-12-22 09:35:01  MGW  Engine  1  TRACE  Polling
Polling thread: JOB_MQ2AQ new msgCount = 1
2014-12-22 09:35:01
MQ2AQ
MQ MSG DEQ Start
>>2014-12-22 09:35:01  MGW  Engine  1  TRACE  worker0
entering deqMessages for job JOB_MQ2AQ
2014-12-22 09:35:01
MQ2AQ
MQ MSG DEQ Finsh
>>2014-12-22 09:35:01  MGW  Engine  1  TRACE  worker0
leaving deqMessages for job JOB_MQ2AQ with seqno 1545 deqMsgCount = 1
2014-12-22 09:35:01
MQ2AQ
MQ MSG ENQ to AQ
>>2014-12-22 09:35:01  MGW  Engine  1  TRACE  worker0
entering enqMessages for job JOB_MQ2AQ with request 1545
2014-12-22 09:35:01
MQ2AQ
ENQ MSG 2 AQ Finish
>>2014-12-22 09:35:01  MGW  Engine  1  TRACE  worker0
leaving enqMessages for job JOB_MQ2AQ with seqno 1545
2014-12-22 09:35:01
MQ2AQ
MSG REC
>>2014-12-22 09:35:01  MGW  Engine  1  TRACE  worker0
leaving commitMessages for job JOB_MQ2AQ with seqno 1545
2014-12-22 09:35:01
The trace shows "Polling thread: skip polling JOB_MQ2AQ msg count=1" multiple times. What does it mean?
Can anyone share a good document I can refer to for debugging Oracle Messaging Gateway issues?
Thanks,
Shine.

Your Messaging Gateway question would be better addressed in the SQL and PL/SQL forum: All Places > Database > Oracle Database + Options > SQL and PL/SQL > Discussions
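
Reading the first trace, the repeated "skip polling" lines fall in the gap between the message being detected at 09:25:49 and worker0 entering deqMessages at 09:25:59, so the poller appears to already know about the pending message and to be waiting for a worker to pick it up; that is a reading of the trace, not a documented statement. As a hedged starting point for debugging on the database side, the sketch below checks the gateway agent and the MQ-to-AQ job through the MGW data-dictionary views (view and column names as documented for 11.2; verify them against your release).

-- Hedged sketch: check agent health and the JOB_MQ2AQ propagation job for
-- errors and message counts while the delay is occurring.
SELECT agent_status, agent_ping, last_error_msg
  FROM mgw_gateway;

SELECT job_name, enabled, status, propagated_msgs, failures, last_error_msg
  FROM mgw_jobs
 WHERE job_name = 'JOB_MQ2AQ';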

Similar Messages

  • BAM reports taking more time while fetching data from EDS

    I have created an external data source (EDS) in BAM, and when I create an object with that EDS, it takes very long to fetch data from the EDS (more than 20 minutes). The database is installed on my local system, and when I run a query there directly, it does not take much time. Please help me with this; what could be the possible reason?

  • SSRS report is consuming much more time to fetch data from DB even direct run SP takes less than a second

    Hi,
    We are using SQL Server 2008 R2 x64 RTM.
    One of the SSRS reports designed by a developer is consuming much more time (5 to 6 minutes) to fetch data from the DB, even though a direct run of the stored procedure (called in the report) takes less than a second to display the result set.
    Please help.
    Regards Naveen MSSQL DBA

    Hi Naveen,
    Based on my understanding, retrieving data with the stored procedure in the dataset designer takes little time, but running the report to display the data takes a long time, right?
    In Reporting Services, the total time to generate a report includes TimeDataRetrieval, TimeProcessing and TimeRendering. In your scenario, since you mentioned that retrieving the data costs little time, you should check the ExecutionLog3 view in the ReportServer database to find which section costs most of the time, TimeProcessing or TimeRendering. Then you can refer to this article to optimize your report:
    Troubleshooting Reports: Report Performance.
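    A hedged sketch of that ExecutionLog3 check (the ExecutionLog3 view in the ReportServer database exposes the three timing columns; the report path below is a placeholder to replace with your report name):
    -- Hedged sketch: recent executions of the slow report, with the phase timings.
    SELECT TOP (20)
           ItemPath, TimeStart,
           TimeDataRetrieval, TimeProcessing, TimeRendering, Status
      FROM dbo.ExecutionLog3
     WHERE ItemPath LIKE '%YourReportName%'   -- placeholder
     ORDER BY TimeStart DESC;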
    Besides, if parameters exist in the report, you should declare variables inside of the stored procedure and assign the incoming parameters to the variables. For more information, please refer to the similar thread:
    Fast query runs slow in SSRS.
    If you have any question, please feel free to ask.
    Best regards,
    Qiuyun Yu
    TechNet Community Support

  • How to use the FOR ALL ENTRIES clause while fetching data from archived tables

    How do I use the FOR ALL ENTRIES clause while fetching data from archived tables using the FM '/PBS/SELECT_INTO_TABLE'?
    I need to fetch data from an archived table for all the entries in an internal table.
    Kindly provide some inputs for the same.
    Thanks and Regards
    Ramesh

    Hi Ramesh,
    I have a query regarding accessing archived data through PBS.
    I have archived SAP FI data (object FI_DOCUMNT) using the SAP standard process through transaction SARA.
    Now please tell me: can I access this archived data through the PBS add-on FM '/PBS/SELECT_INTO_TABLE'?
    Do I need to do something else to access data archived through the SAP standard process or not? If so, please tell me, as I am not able to get the data using the above FM.
    The call to the above FM is as follows:
    CALL FUNCTION '/PBS/SELECT_INTO_TABLE'
      EXPORTING
        archiv           = 'CFI'
        option           = ''
        tabname          = 'BKPF'
        schl1_name       = 'BELNR'
        schl1_von        = belnr-low
        schl1_bis        = belnr-low
        schl2_name       = 'GJAHR'
        schl2_von        = gjahr-low
        schl2_bis        = gjahr-low
        schl3_name       = 'BUKRS'
        schl3_von        = bukrs-low
        schl3_bis        = bukrs-low
    *   schl4_name       =
    *   schl4_von        =
    *   schl4_bis        =
        clr_itab         = 'X'
    *   max_zahl         =
      TABLES
        i_tabelle        = t_bkpf
    *   schl1_in         =
    *   schl2_in         =
    *   schl3_in         =
    *   schl4_in         =
      EXCEPTIONS
        eof              = 1
        OTHERS           = 2.
    It gives me the following error :
    Index for table not supported ! BKPF BELNR.
    Please help ASAP.
    Thanks and Regards
    Gurpreet Singh

  • How can we improve the performance while fetching data from RESB table.

    Hi All,
    Can anybody suggest the right way to improve performance while fetching data from the RESB table? Below is the select statement.
    SELECT aufnr posnr roms1 roanz
        INTO (itab-aufnr, itab-pposnr, itab-roms1, itab-roanz)
        FROM resb
        WHERE kdauf  = p_vbeln
        AND   ablad  = itab-sposnr+2.
    Here I am using 'KDAUF' & 'ABLAD' in the condition. Can we use a secondary index to improve the performance in this case?
    Regards,
    Himanshu

    Hi ,
    Declare the internal table with only those four fields
    and try the below code:
    SELECT aufnr posnr roms1 roanz
    INTO  table itab
    FROM resb
    WHERE kdauf = p_vbeln
    AND ablad = itab-sposnr+2.
    yes, you can also use secondary index for improving the performance in this case.
    Regards,
    Anand .
    Reward if it is useful....

  • Eliminate duplicate while fetching data from source

    Hi All,
    CUSTOMER TRANSACTION
    CUST_LOC     CUT_ID          TRANSACTION_DATE     TRANSACTION_TYPE
    100          12345          01-jan-2009          CREDIT
    100          23456          15-jan-2000          CREDIT
    100          12345          01-jan-2010          DEBIT
    100          12345          01-jan-2000          DEBIT
    Now as per my requirement, I need to fetch data from the CUSTOMER_TRANSACTION table for those customers which have a transaction in the last 10 years. In my data above, customer 12345 has transactions in the last 10 years, whereas customer 23456 does not, so it will be eliminated.
    Now, the CUSTOMER_TRANSACTION table has approximately 100 million records, so we are fetching data in batches. Batching is divided into months, 120 months in total. Below is my query.
    select *
    FROM CUSTOMER_TRANSACTION CT left outer join
    (select distinct CUST_LOC, CUT_ID FROM CUSTOMER_TRANSACTION WHERE TRANSACTION_DATE >= ADD_MONTHS(SYSDATE, -120) and TRANSACTION_DATE < ADD_MONTHS(SYSDATE, -119)) CUST
    on CT.CUST_LOC = CUST.CUST_LOC and CT.CUT_ID = CUST.CUT_ID
    Through the shell script, the month numbers will change: -120:-119, -119:-118, ..., -1:-0.
    Now the problem is duplication of records.
    While fetching data for Jan-2009, it will get cust_id 12345, fetch all 3 records, and load them into the target.
    While fetching data for Jan-2010, it will again get cust_id 12345, fetch all 3 records, and load them into the target.
    So instead of having only 3 records for customer 12345, the target will have 6. Can someone help me with how I can stop duplicate records from getting in?
    As of now I have 2 ways in mind:
    1. Fetch all records at once, which is impossible as it will cause a space issue.
    2. After each batch, run a procedure which deletes duplicate records based on cust_loc, cut_id and transaction_date. But again it will have a performance problem.
    I want to eliminate the duplicates while fetching data from the source.
    Edited by: ace_friends22 on Apr 6, 2011 10:16 AM

    You can do it this way....
    SELECT DISTINCT cust_loc,
                    cut_id
      FROM customer_transaction
     WHERE transaction_date >= ADD_MONTHS(SYSDATE, -120)
       AND transaction_date < ADD_MONTHS(SYSDATE, -119)
    However, please note that if you want to get the transactions in a particular month, like the Jan-2009 and Jan-2010 you mentioned earlier, you might need to use TRUNC.
    Your date comparison could be like this. In this example I am checking whether the transaction date is in the month of Jan-2009:
    AND transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27))
    Your modified SQL:
    SELECT *
      FROM customer_transaction 
    WHERE transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27))
    Testing:
    --Sample Data
    CREATE TABLE customer_transaction (
    cust_loc number,
    cut_id number,
    transaction_date date,
    transaction_type varchar2(20)
    );
    INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2009','dd-MON-yyyy'),'CREDIT');
    INSERT INTO customer_transaction VALUES (100,23456,TO_DATE('15-JAN-2000','dd-MON-yyyy'),'CREDIT');
    INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2010','dd-MON-yyyy'),'DEBIT');
    INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2000','dd-MON-yyyy'),'DEBIT');
    --To have three records in the month of jan-2009
    UPDATE customer_transaction
       SET transaction_date = TO_DATE('02-JAN-2009','dd-MON-yyyy')
    WHERE cut_id = 12345
       AND transaction_date = TO_DATE('01-JAN-2010','dd-MON-yyyy');
    UPDATE customer_transaction
       SET transaction_date = TO_DATE('03-JAN-2009','dd-MON-yyyy')
    WHERE cut_id = 12345
       AND transaction_date = TO_DATE('01-JAN-2000','dd-MON-yyyy');
    commit;
    --End of sample data
    SELECT *
      FROM customer_transaction 
    WHERE transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27));
    Results:
    CUST_LOC     CUT_ID TRANSACTI TRANSACTION_TYPE
          100      12345 01-JAN-09 CREDIT
          100      12345 02-JAN-09 DEBIT
          100      12345 03-JAN-09 DEBIT
    As you can see, there are only 3 records for 12345.
    Regards,
    Rakesh
    Edited by: Rakesh on Apr 6, 2011 11:48 AM
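    One more hedged way to keep each customer in exactly one monthly batch, not taken from this thread: drive the batch by the month of each customer's most recent transaction within the 10-year window, so no customer is picked up by two batches. The bind variable :p_offset is hypothetical and stands for the shell-driven month offset (-120 .. -1).
    -- Hedged sketch: each qualifying customer is emitted only by the batch whose
    -- month matches that customer's latest transaction month.
    SELECT ct.*
      FROM customer_transaction ct
      JOIN (SELECT cust_loc, cut_id
              FROM customer_transaction
             WHERE transaction_date >= ADD_MONTHS(SYSDATE, -120)
             GROUP BY cust_loc, cut_id
            HAVING TRUNC(MAX(transaction_date), 'MONTH')
                   = TRUNC(ADD_MONTHS(SYSDATE, :p_offset), 'MONTH')) cust
        ON ct.cust_loc = cust.cust_loc
       AND ct.cut_id   = cust.cut_id;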

  • Fatal error while fetching data from BI

    Hi,
    I am getting the following error while fetching data from BI using a select statement.
    I have written the code this way:
    SELECT  [Measures].[D2GFTNHIOMI7KWV99SD7GPLTU] ON COLUMNS, NON EMPTY { [DEM_STATE].MEMBERS} ON ROWS FROM DEM_CUBE/TEST_F_8
    Error description when I click on test:
    Fatal Error
    com.lighthammer.webservice.SoapException: The XML for Analysis provider encountered an error

    Thanks for answering. But when I tried writing the statement in transaction 'MDXTEST' and clicked on check, I got the following error:
    Error occurred when starting the parser: timeout during allocate / CPIC-CALL: 'ThSAPCMRCV'
    Message no. BRAINOLAPAPI011
    Diagnosis
    Failed to start the MDX parser.
    System Response
    timeout during allocate / CPIC-CALL: 'ThSAPCMRCV'
    Procedure
    Check the Sys Log in Transaction SM21 and test the TCP-IP connection MDX_PARSER in Transaction SM59.
    So I went into SM59 to check the connection.
    Can you tell me what configuration I need to do to make the select statements work?

  • Sessions opening/closing in Oracle while fetching data from XI

    Hi Friends,
    I used the JDBC adapter to fetch data from Oracle. I set the poll interval to 86400 seconds because we need to run it on a daily basis.
    Now when XI fetches data from Oracle, it opens a session in Oracle, but the session is not closed automatically. I have nearly 100 channels, and all channels are activated, so for each channel one session is opened in Oracle. Because of this, Oracle server performance goes down and it does not work properly at that time.
    How can I close these sessions in Oracle?
    Regards,
    Narendra

    Did you try this parameter in the JDBC sender adapter?
    Disconnect from Database After Processing Each Message
    Set this indicator if the database connection is to be released and re-established before every poll interval.
    -Siva Maranani
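    To see the sessions those channels hold open on the Oracle side, a hedged sketch against V$SESSION ('XI_USER' is a hypothetical account name; replace it with the user the JDBC channels actually connect with):
    -- Hedged sketch: list the adapter's open sessions and when they logged on.
    SELECT sid, serial#, username, program, machine, status, logon_time
      FROM v$session
     WHERE username = 'XI_USER'
     ORDER BY logon_time;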

  • Select query taking too much time to fetch data from pool table a005

    Dear all,
    I am using 2 pool tables, A005 and A006, in my program. I am using a select query to fetch data from these tables; an example is mentioned below.
    select * from a005 into table t_a005 for all entries in it_itab
                       where vkorg in s_vkorg
                       and     matnr in  s_matnr
                       and     aplp   in  s_aplp
                       and     kmunh = it_itab-kmunh.
    Here I can't create an index either, as the tables are pool tables. If there is any solution, please help me with it.
    Thanks,

    It would be helpful to know what other fields are in the internal table you are using for the FOR ALL ENTRIES.
    In general, you should code the fields in your SELECT in the same order as they appear in the database. If you do not supply the top key field, then the entire table is read; if it is large, that is going to take a lot of time. The more key fields from the beginning of the key that you can supply, the faster the retrieval.
    Regards,
    Brent

  • Slow Speed While fetching data from SQL 2008 using DoQuery.

    Hello,
    I am working on an add-on and tried to use DoQuery for fetching data from SQL 2008 in C#.
    There are around 148 records which fulfill this query condition, but it takes a long time to fetch the data.
    I want to know whether there is any problem in this code that is making my application slower.
    I used breakpoints and checked it; I found that it is taking time while connecting to the server.
    Code:
    // Get an initialized SBObob object
    oSBObob = (SAPbobsCOM.SBObob)oCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.BoBridge);
    //// Get an initialized Recordset object
    oRecordset = (SAPbobsCOM.Recordset)oCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.BoRecordset);
    string sqlstring = "select DocEntry,ItemCode From OWOR  where OWOR.Status='R' and DocEntry not in ( Select distinct(BaseRef) from IGE1 where IGE1.BaseRef = OWOR.DocEntry)";
    oRecordset.DoQuery(sqlstring);
    var ProductList = new BindingList<KeyValuePair<string, string>>();
    ProductList.Add(new KeyValuePair<string, string>("", "---Please Select---"));
    while (!(oRecordset.EoF))
    {
        ProductList.Add(new KeyValuePair<string, string>(oRecordset.Fields.Item(0).Value.ToString(), oRecordset.Fields.Item(0).Value.ToString() + " ( " + oRecordset.Fields.Item(1).Value.ToString() + " ) "));
        oRecordset.MoveNext();
    }
    cmbProductionOrder.ValueMember = "Key";
    cmbProductionOrder.DisplayMember = "Value";
    Thanks and Regards,
    Ravi Sharma

    Hi Ravi,
    Your code and query look correct, but can you elaborate a little bit?
    It seems to be a DI API program (no UI API)?
    When you say it is taking time "while connecting to the server", do you mean the recordset query or the DI API connection to SBO? The latter would be "normal", since the connection can take up to 30 seconds.
    To get data it is usually better to use direct SQL connections.
    regards,
    Maik
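    If the recordset query itself (rather than the DI API connection) turns out to be the slow part, a hedged rewrite using NOT EXISTS is worth timing; it should be equivalent here because the NOT IN subquery is already correlated on DocEntry. This is a sketch, not something suggested in the thread:
    -- Hedged sketch: the same filter expressed with NOT EXISTS.
    SELECT T0.DocEntry, T0.ItemCode
      FROM OWOR T0
     WHERE T0.Status = 'R'
       AND NOT EXISTS (SELECT 1
                         FROM IGE1 T1
                        WHERE T1.BaseRef = T0.DocEntry);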

  • Error while fetching data from OWB Client using External Table.

    Dear All,
    I am using Oracle Warehouse Builder 11g & Oracle 10gR2 as repository database on Windows 2000 Server.
    I am facing an issue fetching data from a flat file using an external table from the OWB Client.
    I have performed all the steps without any error, but when I try to view the data, I get the following error.
    ======================================
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04040: file expense_categories.csv in SOURCE_LOCATION not found
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    java.sql.SQLException: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04040: file expense_categories.csv in SOURCE_LOCATION not found
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
         at oracle.jdbc.driver.SQLStateMapping.newSQLException(SQLStateMapping.java:70)
         at oracle.jdbc.driver.DatabaseError.newSQLException(DatabaseError.java:110)
         at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:171)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:455)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:413)
         at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:1030)
         at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:183)
         at oracle.jdbc.driver.T4CStatement.executeForDescribe(T4CStatement.java:774)
         at oracle.jdbc.driver.T4CStatement.executeMaybeDescribe(T4CStatement.java:849)
         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1186)
         at oracle.jdbc.driver.OracleStatement.executeQuery(OracleStatement.java:1377)
         at oracle.jdbc.driver.OracleStatementWrapper.executeQuery(OracleStatementWrapper.java:386)
         at oracle.wh.ui.owbcommon.QueryResult.<init>(QueryResult.java:18)
         at oracle.wh.ui.owbcommon.dataviewer.relational.OracleQueryResult.<init>(OracleDVTableModel.java:48)
         at oracle.wh.ui.owbcommon.dataviewer.relational.OracleDVTableModel.doFetch(OracleDVTableModel.java:20)
         at oracle.wh.ui.owbcommon.dataviewer.RDVTableModel.fetch(RDVTableModel.java:46)
         at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerPanel$1.actionPerformed(BaseDataViewerPanel.java:218)
         at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1849)
         at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2169)
         at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:420)
         at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:258)
         at javax.swing.AbstractButton.doClick(AbstractButton.java:302)
         at javax.swing.AbstractButton.doClick(AbstractButton.java:282)
         at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerPanel.executeQuery(BaseDataViewerPanel.java:493)
         at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerEditor.init(BaseDataViewerEditor.java:116)
         at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerEditor.<init>(BaseDataViewerEditor.java:58)
         at oracle.wh.ui.owbcommon.dataviewer.relational.DataViewerEditor.<init>(DataViewerEditor.java:16)
         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
         at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
         at oracle.wh.ui.owbcommon.IdeUtils._tryLaunchEditorByClass(IdeUtils.java:1412)
         at oracle.wh.ui.owbcommon.IdeUtils._doLaunchEditor(IdeUtils.java:1349)
         at oracle.wh.ui.owbcommon.IdeUtils._doLaunchEditor(IdeUtils.java:1367)
         at oracle.wh.ui.owbcommon.IdeUtils.showDataViewer(IdeUtils.java:869)
         at oracle.wh.ui.owbcommon.IdeUtils.showDataViewer(IdeUtils.java:856)
         at oracle.wh.ui.console.commands.DataViewerCmd.performAction(DataViewerCmd.java:19)
         at oracle.wh.ui.console.commands.TreeMenuHandler$1.run(TreeMenuHandler.java:188)
         at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:209)
         at java.awt.EventQueue.dispatchEvent(EventQueue.java:461)
         at java.awt.EventDispatchThread.pumpOneEventForHierarchy(EventDispatchThread.java:242)
         at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:163)
         at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:157)
         at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:149)
         at java.awt.EventDispatchThread.run(EventDispatchThread.java:110)
    ===========================
    The error says that the file expense_categories.csv in SOURCE_LOCATION was not found, but I am 100% sure the file is very much there.
    Has anybody faced the same issue?
    Do we need to configure something before loading data from a flat file from the OWB Client?
    Any help would be highly appreciated.
    Regards,
    Manmohan Sharma

    Hi Detlef / Gowtham,
    Now I am able to fetch data from flat files from the OWB Server as well as the OWB Client.
    One way I achieved it, as suggested by you:
    1) Created a location on the OWB Client
    2) Sampled the files at the client
    3) Created & configured the external table
    4) Copied all flat files to the OWB Server
    5) Updated the location which I created at the client.
    The other way:
    1) Created a location on the OWB Client
    2) Sampled the files at the client
    3) Created & configured the external table
    4) Copied the flat files to the server in the same drive & directory; for example, if all my flat files are in C:\data at the OWB Client, then I copied them to C:\data on the OWB Server. But this is feasible for non-Windows.
    Hence my problem solved.
    Thanks a lot.
    Regards,
    Manmohan
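    The underlying point is that an external table reads its file on the database server host, not on the OWB Client machine. A hedged sketch of how to confirm which directory and file the external table points at (standard dictionary views; the file name is taken from the KUP-04040 message above):
    -- Hedged sketch: the directory_path must exist on the database server host and
    -- the file must be readable there by the oracle OS user.
    SELECT el.table_name, el.location, el.directory_name, d.directory_path
      FROM all_external_locations el
      JOIN all_directories d
        ON d.directory_name = el.directory_name
     WHERE el.location = 'expense_categories.csv';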

  • SELECT is taking lot of time to fetch data from cluster table BSET

    <Modified the subject line>
    Hi experts,
    I want to fetch data for some fields from the BSET table, but it is taking a lot of time as the table is a cluster table.
    Can you please suggest any other way to fetch data from a cluster table? I am using native SQL to fetch the data.
    Regards,
    SURYA
    Edited by: Suhas Saha on Jun 29, 2011 1:51 PM

    Hi Subhas,
    As per your suggestion I am now using a normal SQL statement to select data from BSET, but it is still taking a lot of time.
    My SQL statement is:
    SELECT BELNR GJAHR BUZEI MWSKZ HWBAS KSCHL KNUMH
      FROM BSET
      INTO CORRESPONDING FIELDS OF TABLE IT_BSET
      FOR ALL ENTRIES IN IT_BKPF
      WHERE BELNR = IT_BKPF-BELNR
        AND BUKRS = IT_BKPF-BUKRS.
    <Added code tags>
    Can you suggest anything more?
    Regards,
    SURYA
    Edited by: Suhas Saha on Jun 29, 2011 4:16 PM

  • Facing problem in Dashboard 4.1 while fetching data from a BEx Query

    Hi Experts,
    I am facing an error message, "Failed to (de-)serialise data. (Xsl 000004)", while fetching data in Dashboard from a BEx query.
    The query connects, but when I drag and drop some dimensions and measures and then go for Refresh or Run Query, I get this error.
    The same query is working fine with other components like WebI and Crystal.
    If anybody has a solution for this, please let me know. I am stuck.
    Thank You

    Hi,
    Check the data in the InfoProvider. Reduce the BEx query characteristics and key figure fields. Try to identify which characteristic, when added to the BEx query, causes the issue, and check that characteristic's data in the InfoProvider.
    Regards,
    Venkat

  • Reg: Proxy error while fetching data from RFC's

    Hi All,
    I am fetching data from RFCs. When the data is bulky, I am getting an error like this:
    Proxy Error
    The proxy server received an invalid response from an upstream server.
    The proxy server could not handle the request POST /webdynpro/dispatcher/sap.com/pb/PageBuilder.
    Reason: Error reading from remote server
    When the data is small, I am not getting any error and am able to fetch it easily. Please suggest something in this regard.
    Thanks in advance,
    Gaurav

  • SSRS reports running slow while fetching data from SQL server DB

    We are currently facing issues with many of our SSRS reports running very slowly. Approximately, they are taking around 15-20 minutes compared to the 2-3 minutes they took earlier.
    Also, the database the SSRS reports fetch their data from is acting as a subscriber for a reporting server. That is, we have an OLTP DB server whose database, say 'A', is getting replicated onto the server holding, say, database 'B'. On this B we have our SSRS
    reports fetching the required data, and they are reported to be running very slow.
    As a start I have checked that the indexes and stats are properly updated, and yes they are.
    I am not sure now where to start analysing the issue.
    I believe replication is not the issue, as we also had replication set up earlier and never saw the slowness.
    Is memory or IO causing a problem? Please help, as I am not sure how to find a fix for this.
    Thanks in advance!

    Hi,
    You first need to find out what exactly is slow. Is the whole instance slow? How about other queries, replication, etc. against those databases? Are they slow?
    Check the reports which are slow, try to find the slowest ones, and then start looking at execution plans. Check whether any blocking is happening while the reports run. Also try running the reports manually and see if there is a difference
    in execution time. Take a look at the wait stats while the reports are running.
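    A hedged sketch of those two checks, blocking and wait stats, using standard DMVs:
    -- Hedged sketch: sessions currently blocked while the reports run.
    SELECT session_id, blocking_session_id, wait_type, wait_time, status, command
      FROM sys.dm_exec_requests
     WHERE blocking_session_id <> 0;
    -- Hedged sketch: top accumulated waits on the instance.
    SELECT TOP (10) wait_type, wait_time_ms, waiting_tasks_count
      FROM sys.dm_os_wait_stats
     ORDER BY wait_time_ms DESC;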
    Read this to get a better understanding:
    https://technet.microsoft.com/en-us/library/ms177500%28v=sql.105%29.aspx?f=255&MSPPError=-2147217396
    Regards, Ashwin Menon My Blog - http:\\sqllearnings.com
