Fetching Data... while querying composites in EM

Hi,
I installed SOA Suite 11.1.1.4 on Windows with an XE database. The installation was successful, but when I go to EM --> SOA --> soa-infra --> default, the page just shows "Fetching Data..." and never returns anything.
I even deployed a composite, but I can't query its instances because of the above issue. The database itself has no problems.
Has anyone encountered this issue before?
Thanks
Venkat

Hi Venkat,
What output do you get when you go to:
http://<hostname>:<soa-server-port-number>/soa-infra/
Cheers,
Vlad
If you think this is helpful, please consider giving points - it is good etiquette to award an answerer points (5 - helpful; 10 - correct) for a post that answers your question.

Similar Messages

  • How to make JScrollpane not to fetch data while scrollbar is adjusting?

    Hi,
    I need the JScrollPane to fetch data only when the scrollbar stops scrolling.
    For instance, if I hold the scrollbar's thumb and drag it past 1000 records, I want the view to wait until I release the thumb before taking any action (e.g. adjusting the view). In other words, while the scrollbar's getValueIsAdjusting() returns true, the JScrollPane should wait until it changes to false before doing anything. This is the same approach Outlook takes when browsing through the list of emails in your mailbox.
    I don't know how to solve this issue; any help would be appreciated.
    thanks
    Saba

    You are planning to mention your cross-post(s) somewhere in this post, correct?

  • Use APD to fetch data from Query's Structure

    Hi,
    I have created a query with a fixed structure in the rows area and two key figures.
    The query is built on a DSO.
    I want to store the query output in a table.
    How can I achieve this?
    I was thinking of using the APD, but since the query has a fixed structure, I suspect APD is not the solution - please correct me if I am wrong.
    Please suggest.
    Regards,
    Macwan James.

    Hi SVU123,
    Currently I am trying to load data into the ODS only, but I want to avoid this path. I could always extract data out of the ODS to a flat file through Open Hub, but that would take twice the time to extract the data:
    first from the query to the ODS, and second from the ODS to the flat file through Open Hub.
    Basically, I want to automate this process, which is why I don't want to go down the RSCRM_BAPI t-code path.
    - Danny

  • Facing problem in Dashboard 4.1 while fetching data from Bex Query

    Hi Experts,
    I am getting the error message "Failed to (de-)serialise data. (Xsl 000004)" while fetching data into a Dashboard from a Bex query.
    The query connects fine, but when I drag and drop some dimensions and measures and then do a Refresh or Run Query, I get this error.
    The same query works fine with other components such as WebI and Crystal.
    If anybody has a solution for this, please let me know - I am stuck.
    Thank You

    Hi,
    Check the data in the InfoProvider. Reduce the Bex query characteristics and key figure fields, and try to identify which characteristic, when added to the Bex query, causes the issue. Then check that characteristic's data in the InfoProvider.
    Regards,
    Venkat

  • Hyperion IR: Getting an out of memory error while fetching data for a whole year through the web client (workspace)

    Hi,
    While fetching data for a year (all 12 months) through the IR web client from Workspace, I get the error "Out of Memory. Advice: Close other applications or windows and try again".
    If I try the same through IR Studio, it gives no output and shows me the same report front page.
    If I select periods up to 8 months, I get the required data in both the IR web client and IR Studio.
    Could you please suggest how we can resolve this issue?
    Thanks,
    D.N.Rana

    Issue Cause:
    Sometimes this is due to excessive data, which brings the size of the BQY file up to around one gigabyte uncompressed (processing may take twice that in RAM, plus the memory space for the plugin, and the typical memory limit on a 32-bit system is 2 gigabytes).
    Solution:
    To avoid the BQY size exceeding available memory:
    Ensure that your computer has at least 2 GB of free RAM before running IR Studio.
    Put a limit on the number of rows that can be pulled down: right-click the Request label of the Query section and enter a value in "Return First xxx Rows" (and tick the check box).
    Do not pull down more than 750 MB of data (remember it may be duplicated while processing).
    Place limits or aggregations in the Query section (as opposed to the Results section) to limit the data entering the BQY.

  • Select query taking 30 sec to fetch data from table containing BLOB column.

    Hi Friends,
    Please help me...
    I have 15 columns in a table, and 2 of them contain BLOBs (images).
    There are more than 22 lakh records in the table.
    When I try to fetch data from this table, the query takes around 25-30 seconds to execute.
    When I deleted the two BLOB columns and tried again, it executed fast.
    I should not change the table schema.
    I have also created views, and there are indexes on the table.
    How can I improve the query execution speed?

    And how large is the BLOB data in your table? If it's several gigabytes, then the time is simply what is required to read that amount of data from disk and transfer it to the client.
    Olaf Helper
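    A quick, hedged way to sanity-check this (the table and column names below are placeholders; the queries are written for Oracle, and if this is SQL Server - which the answer's signature suggests - DATALENGTH() plays the same role as DBMS_LOB.GETLENGTH()):
    -- Placeholder names: your_table, img_col1, img_col2 stand in for the real table and its two image columns.
    -- How much BLOB data does a full scan actually have to read?
    SELECT COUNT(*) AS row_cnt,
           SUM(DBMS_LOB.GETLENGTH(img_col1)
             + DBMS_LOB.GETLENGTH(img_col2)) / 1024 / 1024 AS total_mb
      FROM your_table;
    -- If the result set does not need the images, list only the other columns
    -- instead of SELECT *, so the BLOB data is never read or sent to the client:
    SELECT col1, col2, col3   -- the 13 non-BLOB columns
      FROM your_table;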

  • How to use the FOR ALL ENTRIES clause while fetching data from archived tables

    How can I use the FOR ALL ENTRIES clause while fetching data from archived tables using the FM
    /PBS/SELECT_INTO_TABLE?
    I need to fetch data from an archived table for all the entries in an internal table.
    Kindly provide some inputs for the same.
    thanks n Regards
    Ramesh

    Hi Ramesh,
    I have a query regarding accessing archived data through PBS.
    I have archived SAP FI data (object FI_DOCUMNT) using the SAP standard process through TCODE SARA.
    Now please tell me: can I access this archived data through the PBS add-on FM '/PBS/SELECT_INTO_TABLE'?
    Do I need to do anything else to access data archived through the SAP standard process or not? If yes, then please tell me, as I am not able to get the data using the above FM.
    The call to the above FM is as follows :
    CALL FUNCTION '/PBS/SELECT_INTO_TABLE'
      EXPORTING
        archiv           = 'CFI'
        option           = ''
        tabname          = 'BKPF'
        schl1_name       = 'BELNR'
        schl1_von        = belnr-low
        schl1_bis        = belnr-low
        schl2_name       = 'GJAHR'
        schl2_von        = gjahr-low
        schl2_bis        = gjahr-low
        schl3_name       = 'BUKRS'
        schl3_von        = bukrs-low
        schl3_bis        = bukrs-low
*       schl4_name       =
*       schl4_von        =
*       schl4_bis        =
        clr_itab         = 'X'
*       max_zahl         =
      TABLES
        i_tabelle        = t_bkpf
*       schl1_in         =
*       schl2_in         =
*       schl3_in         =
*       schl4_in         =
      EXCEPTIONS
        eof              = 1
        OTHERS           = 2.
    It gives me the following error :
    Index for table not supported ! BKPF BELNR.
    Please help ASAP.
    Thanks and Regards
    Gurpreet Singh

  • Slow Speed While fetching data from SQL 2008 using DoQuery.

    Hello,
    I am working on an add-on and tried to use DoQuery to fetch data from SQL 2008 in C#.
    There are around 148 records which fulfill this query condition, but it takes a long time to fetch the data.
    I want to know whether there is any problem in this code that is making my application slower.
    I used breakpoints and checked it, and I found that it is taking time while connecting to the server.
    Code:
    // Get an initialized SBObob object
    oSBObob = (SAPbobsCOM.SBObob)oCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.BoBridge);
    // Get an initialized Recordset object
    oRecordset = (SAPbobsCOM.Recordset)oCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.BoRecordset);
    string sqlstring = "select DocEntry,ItemCode From OWOR  where OWOR.Status='R' and DocEntry not in ( Select distinct(BaseRef) from IGE1 where IGE1.BaseRef = OWOR.DocEntry)";
    oRecordset.DoQuery(sqlstring);
    var ProductList = new BindingList<KeyValuePair<string, string>>();
    ProductList.Add(new KeyValuePair<string, string>("", "---Please Select---"));
    // Loop over the recordset; without braces only the Add() call would repeat
    while (!(oRecordset.EoF))
    {
        ProductList.Add(new KeyValuePair<string, string>(oRecordset.Fields.Item(0).Value.ToString(), oRecordset.Fields.Item(0).Value.ToString() + " ( " + oRecordset.Fields.Item(1).Value.ToString() + " ) "));
        oRecordset.MoveNext();
    }
    cmbProductionOrder.ValueMember = "Key";
    cmbProductionOrder.DisplayMember = "Value";
    Thanks and Regards,
    Ravi Sharma

    Hi Ravi,
    your code and query look correct, but can you elaborate a little bit?
    It seems to be a DI API program (no UI API)?
    When you say "it is taking time while connecting to the server", do you mean the recordset query or the DI API connection to SBO? The latter would be "normal", since the connection can take up to 30 seconds.
    To get data it is usually better to use direct SQL connections.
    regards,
    Maik

  • Eliminate duplicate while fetching data from source

    Hi All,
    CUSTOMER TRANSACTION
    CUST_LOC     CUT_ID          TRANSACTION_DATE     TRANSACTION_TYPE
    100          12345          01-jan-2009          CREDIT
    100          23456          15-jan-2000          CREDIT
    100          12345          01-jan-2010          DEBIT
    100          12345          01-jan-2000          DEBIT
    Now as per my requirement, I need to fetch data from the CUSTOMER_TRANSACTION table for those customers which have a transaction in the last 10 years. In my data above, customer 12345 has transactions in the last 10 years, whereas customer 23456 does not, so it will be eliminated.
    Now, the CUSTOMER_TRANSACTION table has approximately 100 million records, so we are fetching data in batches. Batching is divided into months - 120 months in total. Below is my query.
    select *
    FROM CUSTOMER_TRANSACTION CT left outer join
    (select distinct CUST_LOC, CUT_ID FROM CUSTOMER_TRANSACTION WHERE TRANSACTION_DATE >= ADD_MONTHS(SYSDATE, -120) and TRANSACTION_DATE < ADD_MONTHS(SYSDATE, -119)) CUST
    on CT.CUST_LOC = CUST.CUST_LOC and CT.CUT_ID = CUST.CUT_ID
    Through the shell script, the month offsets will change: -120:-119, -119:-118, ..., -1:0.
    Now the problem is duplication of records.
    While fetching data for jan-2009, it will get cust_id 12345 and will fetch all 3 of its records and load them into the target.
    While fetching data for jan-2010, it will again get cust_id 12345 and will fetch all 3 records and load them into the target.
    So instead of having only 3 records for customer 12345, the target will have 6. Can someone help me with how I can stop duplicate records from getting in?
    As of now I have 2 ways in mind:
    1. Fetch all records at once - which is impossible, as it will cause space issues.
    2. After each batch, run a procedure which deletes duplicate records based on cust_loc, cut_id and transaction_date - but again that will have performance problems.
    I want to eliminate it while fetching data from source.
    Edited by: ace_friends22 on Apr 6, 2011 10:16 AM

    You can do it this way....
    SELECT DISTINCT cust_loc,
                    cut_id
      FROM customer_transaction
     WHERE transaction_date >= ADD_MONTHS(SYSDATE, -120)
       AND transaction_date < ADD_MONTHS(SYSDATE, -119)
    However, please note that if you want to get the transactions in a month - like what you said earlier, jan-2009 and jan-2010 and so on - you might need to use TRUNC...
    Your date comparison could be like this. In this example I am checking if the transaction date is in the month of jan-2009:
    AND transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27))
    Your modified SQL...
    SELECT *
      FROM customer_transaction
     WHERE transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27))
    Testing..
    --Sample Data
    CREATE TABLE customer_transaction (
    cust_loc number,
    cut_id number,
    transaction_date date,
    transaction_type varchar2(20)
    );
    INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2009','dd-MON-yyyy'),'CREDIT');
    INSERT INTO customer_transaction VALUES (100,23456,TO_DATE('15-JAN-2000','dd-MON-yyyy'),'CREDIT');
    INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2010','dd-MON-yyyy'),'DEBIT');
    INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2000','dd-MON-yyyy'),'DEBIT');
    --To have three records in the month of jan-2009
    UPDATE customer_transaction
       SET transaction_date = TO_DATE('02-JAN-2009','dd-MON-yyyy')
    WHERE cut_id = 12345
       AND transaction_date = TO_DATE('01-JAN-2010','dd-MON-yyyy');
    UPDATE customer_transaction
       SET transaction_date = TO_DATE('03-JAN-2009','dd-MON-yyyy')
    WHERE cut_id = 12345
       AND transaction_date = TO_DATE('01-JAN-2000','dd-MON-yyyy');
    commit;
    --End of sample data
    SELECT *
      FROM customer_transaction
     WHERE transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27));
    Results....
    CUST_LOC     CUT_ID TRANSACTI TRANSACTION_TYPE
          100      12345 01-JAN-09 CREDIT
          100      12345 02-JAN-09 DEBIT
          100      12345 03-JAN-09 DEBIT
    As you can see, there are only 3 records for 12345.
    Regards,
    Rakesh
    Edited by: Rakesh on Apr 6, 2011 11:48 AM

  • How to write a SQL query to fetch data from each sub-partition

    Hi All,
    Does anyone have any idea how to write a SQL query to fetch data from a sub-partition?
    Actually I have one table with a composite partition (range + list).
    Now if I want to fetch data from the main partition (range), the query would be
    SELECT * FROM emp PARTITION(q1_2005);
    Now I want to fetch data at the sub-partition level (list), but I am not able to find a SQL query for that.
    Please help me sort this out.
    Thanks in Advance.
    Anwar

    SELECT * FROM emp SUBPARTITION(sp1);
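    To expand on that one-liner: the subpartition names of a composite range-list table can be looked up in the data dictionary and then queried directly. The subpartition name below is made up for illustration:
    -- List the subpartitions that belong to each range partition of EMP:
    SELECT partition_name, subpartition_name
      FROM user_tab_subpartitions
     WHERE table_name = 'EMP';
    -- Then read one list subpartition directly (hypothetical name q1_2005_east):
    SELECT * FROM emp SUBPARTITION (q1_2005_east);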

  • Sys_refcursor not fetching any data although query returns value

    hi!!!
    I am using a sys_refcursor to return columns, and I am using the procedure below to do so, although the data is there in table_1 and table_2.
    PROCEDURE test_pro(abc_date     IN DATE,
                       cur_get_data OUT sys_refcursor)
    IS
    BEGIN
      OPEN cur_get_data
      for
        select A.col1, B.col2
        from table_1 A
        left outer join
        table_2 B
        on
        A.dis_date = B.dis_date
        where A.dis_date = abc_date;
      IF cur_get_data%rowcount = 0
      then
        raise e_error;
      END if;
    EXCEPTION
      when e_error
      then
        ------no_data_found;
      when others
      then
        --------(giving SQL error with error code);
    END test_pro;
    While running the SQL below in the SQL window of PL/SQL Developer, it does fetch
    data:
    select A.col1, B.col2
    from table_1 A
    left outer join
    table_2 B
    on
    A.dis_date = B.dis_date
    where A.dis_date = abc_date;
    But while testing test_pro in the test window of PL/SQL Developer, it does not fetch any data and raises e_error each time.
    Is there any problem with using IF cur_get_data%rowcount=0, since each time it goes to the exception block?
    Can somebody please share some ideas on what could be the possible reason for this?

    Welcome to the forum!
    Unfortunately you posted to the wrong forum. This question belongs in the SQL and PL/SQL forum.
    PL/SQL
    >
    sys_refcursor not fetching any data although query returns value
    But while testing test_pro in the test window of PL/SQL Developer, it does not fetch any data and raises e_error each time.
    Is there any problem with using IF cur_get_data%rowcount=0, since each time it goes to the exception block?
    >
    A cursor doesn't fetch data - your code has to do that. The code you posted doesn't have any FETCH statements so no data will be fetched.
    There is no problem using 'IF cur_get_data%rowcount=0' but it will always be 0 in your code because you are not fetching any data.
    I'm guessing that you are trying to determine if there are any rows for the query. That isn't going to work since a cursor doesn't fetch rows.
    You just have to return the cursor to the caller and the caller will have to perform at least one fetch to see if there are any rows.
    If the above answers your question then just mark the question ANSWERED. Otherwise, since you have posted in the wrong forum:
    1. repost the question in the SQL and PL/SQL forum
    2. Edit this post to add a link to the new thread in the other forum
    3. Mark this question ANSWERED so people will follow up in the other forum.
    Thanks.
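    To illustrate the point above, here is a minimal sketch of a caller block, assuming the procedure simply opens and returns the cursor (i.e. after removing the %rowcount check). The date literal and variable names are made up; only table_1, table_2 and test_pro come from the original post.
    DECLARE
      l_cur   SYS_REFCURSOR;
      l_col1  table_1.col1%TYPE;
      l_col2  table_2.col2%TYPE;
    BEGIN
      -- The procedure only OPENs the cursor; at this point no rows have moved yet.
      test_pro(abc_date => DATE '2011-01-01', cur_get_data => l_cur);
      -- It is the caller's FETCH that retrieves rows and drives %ROWCOUNT / %NOTFOUND.
      FETCH l_cur INTO l_col1, l_col2;
      IF l_cur%NOTFOUND THEN
        DBMS_OUTPUT.PUT_LINE('No rows for this date');
      ELSE
        DBMS_OUTPUT.PUT_LINE('First row: ' || l_col1 || ' / ' || l_col2);
      END IF;
      CLOSE l_cur;
    END;
    /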

  • Select query taking too much time to fetch data from pool table a005

    Dear all,
    I am using 2 pool tables, a005 and a006, in my program. I am using a select query to fetch data from these tables; an example is mentioned below.
    select * from a005 into table t_a005 for all entries in it_itab
                       where vkorg in s_vkorg
                       and     matnr in  s_matnr
                       and     aplp   in  s_aplp
                       and     kmunh = it_itab-kmunh.
    Here I can't create an index either, as the tables are pool tables... If there is any solution, please help me with this.
    Thanks,

    It would be helpful to know what other fields are in the internal table you are using for the FOR ALL ENTRIES.
    In general, you should code the fields in your select in the same order as they appear in the database table's key. If you do not supply the top key field, then the entire table is read. If it's large, that is going to take a lot of time. The more key fields from the beginning of the key that you can supply, the faster the retrieval.
    Regards,
    Brent

  • Error while fetching data into collection type.

    Hi all,
    I'm facing a problem while fetching data into a table type.
    Open c1;
    open c2;
    loop
    Fetch c1 into partition_name_1;
    fetch c2 into partition_name_2;
    exit when c1%notfound or c2%notfound;
    open C1_refcursor for 'select a.col1,b.col2 from table1 partition('||partition_name_1||') a, table2 partition('||partition_name_2||') b
    where a.col2=b.col2';
    loop
    fetch c1_refcursor BULK COLLECT into v1,v2 <-----This is the line where i'm getting the error as "ORA-01858: a non-numeric character was found where a numeric was expected"
    limit 100000;
    exit when v1.count=0;
    forall i in 1..v1.count
    end loop;
    I also checked the data type of the table variable; it's the same as the column selected in the ref cursor.
    Please help me out.
    Message was edited by:
    Sumit Narayan

    OK, I see that, but I don't think you can use associative arrays in this manner then, because you cannot select data directly into an associative array.
    As far as I'm aware, they must be assigned an index, and the error suggests to me that it's missing an index.
    You can select directly into records, and maybe this is where you need to go.
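    As a hedged sketch of that suggestion (the record/collection type, the target table and the partition-name values are invented; table1, table2 and the 100000 limit come from the original post), the dynamic cursor can be bulk-fetched into one collection of records instead of two scalar collections:
    DECLARE
      TYPE t_row IS RECORD (
        col1 table1.col1%TYPE,
        col2 table2.col2%TYPE
      );
      TYPE t_tab IS TABLE OF t_row;                  -- nested table of records
      l_rows           t_tab;
      c1_refcursor     SYS_REFCURSOR;
      partition_name_1 VARCHAR2(30) := 'P_2011_01';  -- placeholder partition names
      partition_name_2 VARCHAR2(30) := 'P_2011_01';
    BEGIN
      OPEN c1_refcursor FOR
        'select a.col1, b.col2
           from table1 partition (' || partition_name_1 || ') a,
                table2 partition (' || partition_name_2 || ') b
          where a.col2 = b.col2';
      LOOP
        FETCH c1_refcursor BULK COLLECT INTO l_rows LIMIT 100000;
        EXIT WHEN l_rows.COUNT = 0;
        FORALL i IN 1 .. l_rows.COUNT
          INSERT INTO target_table VALUES l_rows(i);  -- hypothetical target table
      END LOOP;
      CLOSE c1_refcursor;
    END;
    /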

  • BAM reports taking a long time while fetching data from EDS

    I have created an external data source (EDS) in BAM, and when I create an object with that EDS, it takes a very long time to fetch data from the EDS (more than 20 mins.). But since the database is installed on my local system, when I fire a query there directly, it doesn't take much time. Please help me with this - what could be the possible reason?

    Your messaging gateway question would be better addressed in the SQL PL/SQL forum
    All Places > Database > Oracle Database + Options > SQL and PL/SQL >  Discussions

  • Select query is taking a lot of time to fetch data.....

    The select query below is taking a lot of time to fetch data.
    SELECT a~lgnum a~tanum a~bdatu a~bzeit a~bname a~benum b~matnr b~maktx b~qdatu b~qzeit b~vlenr b~nlenr b~vltyp b~vlber b~vlpla
           b~nltyp b~nlber b~nlpla b~vsola b~vorga INTO TABLE it_final FROM ltak AS a
                   INNER JOIN ltap AS b ON  b~tanum EQ a~tanum AND a~lgnum EQ b~lgnum
                       WHERE a~lgnum = p_whno
                       AND a~tanum IN s_tono
                       AND a~bdatu IN s_tocd
                       AND a~bzeit IN s_bzeit
                       AND a~bname IN s_uname
                       AND a~betyp = 'P'
                       AND b~matnr IN s_mno
                       AND b~vorga <> 'ST'.
    Moderator message: Please Read before Posting in the Performance and Tuning Forum
    Edited by: Thomas Zloch on Mar 27, 2011 12:05 PM

    Hi Shiva,
    I am using two more select queries in the same manner.
    Here are the other two select queries:
    ***************1************************
    SELECT * APPENDING CORRESPONDING FIELDS OF TABLE tbl_summary
        FROM ztftelpt LEFT JOIN ztfzberep
         ON  ztfzberep~gjahr = st_input-gjahr
         AND ztfzberep~poper = st_input-poper
     AND ztfzberep~cntr  = ztftelpt~rprctr
        WHERE rldnr  = c_telstra_projects
          AND rrcty  = c_actual
          AND rvers  = c_ver_001
          AND rbukrs = st_input-bukrs
          AND racct  = st_input-saknr
          AND ryear  = st_input-gjahr
          and rzzlstar in r_lstar             
          AND rpmax  = c_max_period.
    and the second one is
    *************************2************************
      SELECT * APPENDING CORRESPONDING FIELDS OF TABLE tbl_summary
        FROM ztftelnt LEFT JOIN ztfzberep
         ON  ztfzberep~gjahr = st_input-gjahr
         AND ztfzberep~poper = st_input-poper
     AND ztfzberep~cntr  = ztftelnt~rprctr
        WHERE rldnr  = c_telstra_networks
          AND rrcty  = c_actual
          AND rvers  = c_ver_001
          AND rbukrs = st_input-bukrs
          AND racct  = st_input-saknr
          AND ryear  = st_input-gjahr
          and rzzlstar in r_lstar                              
          AND rpmax  = c_max_period.
    For both of the above tables, the program takes very little time, although the tables used in the above queries have a similar amount of data. And I cannot remove the APPENDING CORRESPONDING FIELDS, because I have to append the data after fetching from the tables; if I don't use it, it will delete all the data fetched earlier.
    Thanks in advance.
    Sourabh
