NiUsrp Fetch Rx data band

Hi all,
the "niUsrp Fetch Rx data.vi" does output signal in baseband or in the carrier frequency set-up by the "niUSRP Configure Signal.vi"?
I will use it to make a mixer "via software", in radar application, and I have to know if the correct way for mixing Tx and Rx signals is to take directly the output of the "niUsrp Fetch Rx data.vi" or if I should downconvert this signal.
Thanks,
Giube

Hi Giube,
have you tried the examples that ship with LabVIEW? A couple come up when you search for DDC, but the most appropriate one for your case is probably "niScope EX OSP Downconversion.vi".
Let me know if that works out for you!
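For what it's worth, here is a minimal NumPy sketch of what mixing the Tx reference against the fetched Rx stream "via software" can look like, assuming the fetched data is already complex baseband IQ; the tone frequencies, rates and array sizes below are made up purely for illustration:

import numpy as np

# Illustrative parameters only (not taken from this thread).
iq_rate = 1e6                  # IQ sample rate in Hz
n = 65536                      # samples per fetch
t = np.arange(n) / iq_rate

# Stand-ins for the Tx reference and the fetched Rx record; in LabVIEW these
# would be the complex (CDB) arrays written with niUSRP Write Tx Data and
# returned by niUSRP Fetch Rx Data.
tx = np.exp(2j * np.pi * 100e3 * t)                    # transmitted tone
rx = 0.5 * np.exp(2j * np.pi * 110e3 * t + 1j * 0.7)   # delayed/shifted echo

# Software mixer: multiply Rx by the complex conjugate of Tx. If both streams
# are already at baseband, the product contains the beat frequency directly
# and no extra downconversion stage is needed before the FFT.
beat = rx * np.conj(tx)

spectrum = np.fft.fftshift(np.fft.fft(beat * np.hanning(n)))
freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1 / iq_rate))
print("peak beat frequency ~ %.0f Hz" % freqs[np.argmax(np.abs(spectrum))])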

Similar Messages

  • Error -1074118610 occurred at niUSRP Fetch Rx Data (CDB Cluster).vi

    I researched this error in the forum; however, I have not found a solution.
    Error -1074118610 occurred at niUSRP Fetch Rx Data (CDB Cluster).vi
    The received data is out of sequence. This may be a symptom of overflows due to an inability to maintain streaming at the specified IQ Rate, or rearrangement of packets by an ethernet adapter or network switch.
    When I start the USRP, I get this error 6-7 minutes later. I tried an IQ rate of 2 MS/s with 4 samples per symbol, and also an IQ rate of approximately 192 kS/s with 20 samples per symbol for a 9600 baud rate.

    Hello ITUbahadir,
    I understand you are NOT testing a LabVIEW shipping example. If that is correct, could you try the shipping examples and see whether you get the same error? If that is incorrect, what is the name of the example?
    If the problem is specific to your code, could you share it? If you cannot, could you explain what you have changed in the code?
    Regards
    Frank R.

  • Data sequence to niUSRP write Tx Data

    Hi all,
    I am currently developing an 802.15.4 system and have a few questions about the data sequence that is fed into niUSRP Write Tx Data.vi.
    The VI (niUSRP Write Tx Data) accepts IQ data from the modulation block. In what order is the data transmitted? For example, if the bitstream is modeled as an array, which array element is transmitted first: the LSB or the MSB?
    The standard itself (802.15.4) specifies that the least significant bit (C0) is transmitted first.
    Thank you.
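    For reference, a minimal Python sketch of LSB-first bit ordering (each octet expanded least significant bit first, as 802.15.4 specifies); the function name and example value are made up for illustration:
    def bits_lsb_first(data: bytes) -> list:
        """Expand bytes into a bit list, LSB of each byte first."""
        bits = []
        for byte in data:
            for i in range(8):          # i = 0 is the least significant bit
                bits.append((byte >> i) & 1)
        return bits

    # Example: 0x03 comes out as 1, 1, 0, 0, 0, 0, 0, 0 (LSB first)
    print(bits_lsb_first(b"\x03"))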

    My TI CC2530 has successfully detected the signal, but the decoding is still messy.
    I also have a problem, because I get:
    "niUSRP Write Tx Data (CDB Cluster).vi<ERR>Underflow: the Tx buffer was emptied before new data was provided. Consider reducing the IQ rate, increasing the Write rate, or increasing the number of samples per Write."
    I already tried changing the registry as mentioned here: Increasing-write-rate, but I still have the problem when I run with an IQ rate above 4 MS/s and 4 samples per symbol, and the CC2530 won't recognize the signal if I set the IQ rate to 4 MS/s. This is the constellation for my custom OQPSK half-sine pulse (4 MS/s, 4 samples/symbol, 1 MS/s symbol rate),
    and my CC2530 will recognize the signal if I set the IQ rate to 8 MS/s and 8 samples per symbol; here is the constellation diagram.
    So, my question is: how do I handle this error?
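    As a rough, non-NI-specific sanity check, you can estimate how much real time each call to niUSRP Write Tx Data has to cover; if your loop cannot generate and write a new block within that window, the Tx buffer empties and you get exactly this underflow. A small Python sketch with purely illustrative numbers:
    def write_budget_ms(iq_rate_hz, samples_per_write):
        """How long (ms) one block of samples lasts on the air; each loop
        iteration must refill the buffer faster than this."""
        return samples_per_write / iq_rate_hz * 1e3

    # Illustrative combinations only (not taken from the post above):
    for iq_rate, n in [(4e6, 4000), (4e6, 40000), (8e6, 80000)]:
        print("IQ rate %.0f MS/s, %d samples/write -> refill within %.2f ms"
              % (iq_rate / 1e6, n, write_budget_ms(iq_rate, n)))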

  • Not able to fetch the data by Virtual Cube

    Hi Experts,
    My requirement is to fetch data from the source system (from a database table) using a virtual cube.
    What I have done is create the virtual cube and the corresponding function module in the source system.
    That function module works fine if the database table in the source system is small, but if the table contains a huge amount of data (millions of records), I am not able to fetch the data.
    Below is the code I have incorporated in my function module.
    DATA:
      l_th_mapping TYPE cl_rsdrv_external_iprov_srv=>tn_th_iobj_fld_mapping,
      l_s_map      TYPE cl_rsdrv_external_iprov_srv=>tn_s_iobj_fld_mapping,
      l_r_srv      TYPE REF TO cl_rsdrv_external_iprov_srv.
    l_s_map-iobjnm = '0PARTNER'.
    l_s_map-fldnm  = 'PARTNER'.
    INSERT l_s_map INTO TABLE l_th_mapping.
    CREATE OBJECT l_r_srv
      EXPORTING
        i_tablnm              = '/SAPSLL/V_BLBP'
        i_th_iobj_fld_mapping = l_th_mapping.
    l_r_srv->open_cursor(
      i_t_characteristics = characteristics[]
      i_t_keyfigures      = keyfigures[]
      i_t_selection       = selection[] ).
    l_r_srv->fetch_pack_data(
      IMPORTING
        e_t_data = data[] ).
    return-type = 'S'.
    In the above function module, the internal table L_TH_MAPPING contains the InfoObjects of the virtual cube and the corresponding fields of the underlying database table.
    The problem I am facing is that in the method FETCH_PACK_DATA, the program initially tries to fetch all the records from the database table into an internal table. If the database table is very large, this logic does not work.
    Could you please help me with how to handle this kind of issue?
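    Conceptually, the fix is to keep the cursor open and pull the data package by package instead of materialising the whole table at once. The following Python/DB-API sketch only illustrates that packaging idea in generic database terms; it is not ABAP, and the table and package size below are hypothetical:
    import sqlite3  # stand-in for any DB-API driver

    PACKAGE_SIZE = 10000  # analogous to I_MAXSIZE / the package size of FETCH_PACK_DATA

    def fetch_in_packages(conn, sql):
        """Yield rows package by package, so only PACKAGE_SIZE rows are held
        in memory at a time instead of millions of records."""
        cur = conn.cursor()
        cur.execute(sql)
        while True:
            package = cur.fetchmany(PACKAGE_SIZE)
            if not package:
                break
            yield package

    # Hypothetical usage against an in-memory table:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE v_blbp (partner TEXT)")
    conn.executemany("INSERT INTO v_blbp VALUES (?)", [("P%05d" % i,) for i in range(25000)])
    for package in fetch_in_packages(conn, "SELECT partner FROM v_blbp"):
        print("got a package of", len(package), "rows")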


  • Reg: fetch the data by using item_id which is returned by an inline view query

    Hi all,
    create table xxc_transactions(type_id number,trx_line_id number ,item_id number,org_id number);
    insert into xxc_transactions values(null,null,null,null);
    create table xxc_items1(item_id number,org_id number,item_no varchar2(10));
    insert into xxc_items1 values(123,12,'book');
    create table xxc_headers(header_id number,order_id number);
    insert into xxc_headers values(null,null);
    create table xxc_lines(header_id number,item_id number,line_id number);
    insert into xxc_lines values(null,null,null);
    create table xxc_types_tl(transaction_id number,NAME varchar2(10));
    insert into xxc_types_tl values(106,'abc');
    create table xxc_quantity(item_id number);
    insert into xxc_quantity values (123);
    create table xxc_quantity_1(item_id number);
    insert into xxc_quantity_1 values (123);
    SELECT union_id.item_id,
           b.org_id,
           e.name,
           fun1(union_id.item_id) item_no
    FROM   xxc_transactions a,
           xxc_items1 b,
           xxc_headers c,
           xxc_lines d,
           xxc_types_tl e,
           (SELECT item_id
            FROM   xxc_quantity
            WHERE  item_id = 123
            UNION
            SELECT item_id
            FROM   xxc_quantity_1
            WHERE  item_id = 123
            UNION
            SELECT item_id
            FROM   xxc_transactions
            WHERE  item_id = 123) union_id
    WHERE  a.type_id = 6
           AND a.item_id  = b.item_id
           AND union_id.item_id = b.item_id
           AND a.org_id = b.org_id
           AND c.header_id = d.header_id
           AND d.line_id = a.trx_line_id
           AND d.item_id = b.item_id
           AND c.order_id = e.transaction_id
           AND b.org_id = 12
    GROUP  BY union_id.item_id,
              b.org_id,
              e.name
    ORDER  BY union_id.item_id;
    create or replace function fun1(v_item in number)
    return varchar2
    is
      v_item_no varchar2(10);
    Begin
       select item_no
         into v_item_no
         from xxc_items1
        where item_id = v_item;
       return v_item_no;
    Exception
      When Others Then
        v_item_no := null;
        return v_item_no;
    END fun1;
    I need to fetch the data using the item_id that is returned by the inline view query (UNION).
    item_id  org_id  name    item_no
    123        12        abc       book
    Version: 11.1.0.7.0  and 11.2.0.1.0
    Message was edited by: Rajesh123 Added test cases script
    Message was edited by: Rajesh123 changed Question as fetch the data by using item_id which is retuned by In line View Query(UNION)

    Hi Master, sorry for the late reply. Can you please help with this?
    create table xxc_transactions(type_id number,trx_line_id number ,item_id number,org_id number);
    insert into xxc_transactions values(null,null,null,null);
    create table xxc_items(item_id number,org_id number,item_no varchar2(10));
    insert into xxc_items values(123,12,'book');
    create table xxc_headers(header_id number,order_id number);
    insert into xxc_headers values(null,null);
    create table xxc_lines(header_id number,item_id number,line_id number);
    insert into xxc_lines values(null,null,null);
    create table xxc_types_tl(transaction_id number,NAME varchar2(10));
    insert into xxc_types_tl values(106,'abc');
    create table xxc_uinon_table(item_id number);
    insert into xxc_uinon_table values(123);
    SELECT   union_id.item_id,
             b.org_id ,
             e.name ,
             fun1(union_id.item_id) item_no   --> to get item_no
             FROM xxc_transactions     a,
             xxc_items             b,
             xxc_headers           c,
             xxc_lines             d,
             xxc_types_tl          e,
             ( SELECT item_id
                 FROM   xxc_uinon_table ) union_id
    WHERE    a.type_id= 6
    AND      a.item_id = b.item_id
    AND      union_id.item_id = b.item_id
    AND      a.org_id = b.org_id
    AND      c.header_id = d.header_id
    AND      d.line_id= a.trx_line_id
    AND      d.item_id= b.item_id
    AND      c.order_id= e.transaction_id ---106
    AND      b.org_id = 12
    GROUP BY union_id.item_id,
             b.org_id ,
             e.name
    ORDER BY union_id.item_id;
    Note: xxc_uinon_table is a combination of UNION's
    select 1 from dual
    union
    select 1 from dual
    union
    select no rows returned  from dual;
    I will get 1 from the above query.
    Thank you in advance.

  • Error when Fetching CATs Data to BI

    Hi,
    I have created a DataSource in the R/3 system (source system) for CATS data and I am extracting the data using a function module. Using transaction RSA3, it fetches all the desired records in the R/3 system.
    In the BI system, when extracting a full load and monitoring it, the following errors are thrown:
    Error 7 when sending an IDoc
    Error when opening an RFC connection    2
    Error when opening an RFC connection
    Errors in source system
    Function Module for fetching CATS Data
    FUNCTION ZLBTIME_GET_CATSREC.
    *"*"Local Interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SRSC_S_IF_SIMPLE-REQUNR
    *"     VALUE(I_DSOURCE) TYPE  SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"     VALUE(I_READ_ONLY) TYPE  SRSC_S_IF_SIMPLE-READONLY OPTIONAL
    *"     VALUE(I_REMOTE_CALL) TYPE  SBIWA_FLAG DEFAULT SBIWA_C_FLAG_OFF
    *"  TABLES
    *"      I_T_SELECT TYPE  SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  ZLSCATS OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER
    * Example: DataSource for table SFLIGHT
    *  TABLES: SFLIGHT.
    * Auxiliary Selection criteria structure
      DATA: L_S_SELECT TYPE SRSC_S_SELECT.
    * Maximum number of lines for DB table
      STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
    * counter
              S_COUNTER_DATAPAKID LIKE SY-TABIX,
    * cursor
              S_CURSOR TYPE CURSOR.
    * Select ranges
    **  RANGES: L_R_CARRID  FOR SFLIGHT-CARRID,
    **          L_R_CONNID  FOR SFLIGHT-CONNID.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls) ?
      IF I_INITFLAG = SBIWA_C_FLAG_ON.
    * Initialization: check input parameters
    *                 buffer input parameters
    *                 prepare data selection
    * Check DataSource validity
        CASE I_DSOURCE.
          WHEN 'ZLTIME_DS_3'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE E009(R3). ENDIF.
    * this is a typical log call. Please write every error message like this
            LOG_WRITE 'E'                  "message type
                      'R3'                 "message class
                      '009'                "message number
                      I_DSOURCE   "message variable 1
                      ' '.                 "message variable 2
            RAISE ERROR_PASSED_TO_MESS_HANDLER.
        ENDCASE.
        APPEND LINES OF I_T_SELECT TO S_S_IF-T_SELECT.
    * Fill parameter buffer for data extraction calls
        S_S_IF-REQUNR    = I_REQUNR.
        S_S_IF-DSOURCE = I_DSOURCE.
        S_S_IF-MAXSIZE   = I_MAXSIZE.
    * Fill field list table for an optimized select statement
    * (in case that there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from being trivial)
        APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.
      ELSE.                 "Initialization mode or data extraction ?
    * Data transfer: First Call      OPEN CURSOR + FETCH
    *                Following Calls FETCH only
    * First data package -> OPEN CURSOR
        IF S_COUNTER_DATAPAKID = 0.
    * Fill range tables BW will only pass down simple selection criteria
    * of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
    **      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'CARRID'.
    **        MOVE-CORRESPONDING L_S_SELECT TO L_R_CARRID.
    **        APPEND L_R_CARRID.
    **      ENDLOOP.
    **      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'CONNID'.
    **        MOVE-CORRESPONDING L_S_SELECT TO L_R_CONNID.
    **        APPEND L_R_CONNID.
    **      ENDLOOP.
    * Determine number of database records to be read per FETCH statement
    * from input parameter I_MAXSIZE. If there is a one to one relation
    * between DataSource table lines and database entries, this is trivial.
    * In other cases, it may be impossible and some estimated value has to
    * be determined.
    **      OPEN CURSOR WITH HOLD S_CURSOR FOR
    **      SELECT (S_S_IF-T_FIELDS) FROM SFLIGHT
    **                               WHERE CARRID  IN L_R_CARRID AND
    **                                     CONNID  IN L_R_CONNID.
            OPEN CURSOR WITH HOLD s_cursor FOR
    *        SELECT * FROM zlvctmbw.
            SELECT a~counter a~pernr a~workdate a~skostl a~lstar a~rproj a~awart a~kokrs a~meinh a~tcurr a~price
                   a~unit a~bukrs a~ersda a~erstm a~ernam a~laeda a~laetm a~status a~refcounter a~reason a~belnr
                   a~catshours a~act1 a~act2 a~task a~lang1 a~narr1 a~lang2 a~narr2 a~cmnt a~clst a~plchl
                   b~ltxa1 b~transfer b~hrkostl b~hrlstar b~hrcostasg b~statkeyfig b~catsquantity b~bemot
                   c~pspnr c~pspid
                   d~kunnr
            FROM   catsdb AS a
            LEFT OUTER JOIN catsco AS b ON a~counter = b~counter
            JOIN   prps AS p ON p~pspnr = a~rproj
            JOIN   proj AS c ON c~pspnr = p~psphi
            JOIN   zltproj AS d ON d~pspnr = c~pspnr.
        ENDIF.                             "First data package ?
    * Fetch records into interface table.
    *   named E_T_'Name of extract structure'.
        FETCH NEXT CURSOR S_CURSOR
                   APPENDING CORRESPONDING FIELDS
                   OF TABLE E_T_DATA
                   PACKAGE SIZE S_S_IF-MAXSIZE.
        IF SY-SUBRC <> 0.
          CLOSE CURSOR S_CURSOR.
          RAISE NO_MORE_DATA.
        ENDIF.
        S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.
      ENDIF.              "Initialization mode or data extraction ?
    ENDFUNCTION.
    When fetching data using a view in the source system, it works fine.
    When testing fetching SFLIGHT data using function module ZRSAX_BIW_GET_DATA_SIMPLE
    (a copy of RSAX_BIW_GET_DATA_SIMPLE), it also works fine.
    Please suggest the cause of the error.
    Regards
    Vishal

    Hello Praveen,
    The connection is fine, as I tried fetching SFLIGHT data using the standard template function module and it works fine; also, in transaction SM58 I do not see any data.
    Regards
    Vishal

  • How to fetch the data from pl/sql table dynamically

    Hi All, my requirement is to compare the data of two database views in PL/SQL, so I bulk collect each view into a PL/SQL table. The issue is that the comparison expects a column name, but in my case the column names are dynamic, so I cannot provide them directly.
    For example, my view t1_vw has the columns stid, c1, c2, c3, c4, and t2_vw has a similar structure.
    my code
    TYPE v1_type IS TABLE OF t1_vw%ROWTYPE;
    l_data v1_type;
    TYPE v1_type1 IS TABLE OF t2_vw%ROWTYPE;
    l_data1 v1_type1;
    test varchar2(1000);
    test1 varchar2(1000);
    temp1 number;
    begin
    SELECT * Bulk collect into l_data
    FROM T1_VW;
    SELECT * Bulk collect into l_data1
    FROM T2_VW;
    select l_data(1).stid into temp1 from dual; -- It is working fine and gives me the value properly
    -- But, in my case, we are reading the column names from array, i am constructing the query dynamically and execute it.
    test :='select l_data(1).stid into temp1 from dual';
    execute immediate test into temp1;
    -- I am getting error as follows:
    Error report:
    ORA-00904: "L_DATA": invalid identifier
    ORA-06512: at "SYSTEM.BULKCOMPARISON", line 93
    ORA-06512: at line 2
    00904. 00000 - "%s: invalid identifier"
    *Cause:   
    *Action
    end;
    Please help me get rid of this issue. Is it possible to construct the query dynamically and fetch the data? If not, is there a better approach to compare the data between the two views?

    The output should display which columns changed, with their old and new values.
    For example, the output should be:
    COLUMNNAME OLD_VALUE NEW_VALUE STID
    C1 20 10 1
    C2 50 40 2
    C3 60 70 2
    C2 80 90 3

    Why not do this with plain SQL? (The dynamic-SQL attempt fails with ORA-00904 because the string passed to EXECUTE IMMEDIATE is parsed by the SQL engine, which cannot see the local PL/SQL collection l_data.)
    create table a (STID number, C1 number, C2 number, C3 number);
    insert into a values (1, 20, 30, 40);
    insert into a values (2, 40, 50, 60);
    insert into a values (3, 90, 80, 100);
    create table b as select * from a where 1 = 0;
    insert into b values (1, 10, 30, 40);
    insert into b values (2, 40, 40, 70);
    insert into b values (3, 90, 90, 100);
    commit;
    And now you can issue this kind of select:
    SELECT stid, c1, c2, c3
      FROM
      ( SELECT a.*,
               1 src1,
               to_number(null) src2
          FROM a
         UNION ALL
        SELECT b.*,
               to_number(null) src1,
               2 src2
          FROM b )
     GROUP BY stid, c1, c2, c3
    HAVING count(src1) <> count(src2)
     ORDER BY stid;
    I would then create a new table a_b_diff, with the same structure as a or b, and insert into it like this:
    create table a_b_diff as select * from a where 1 = 0;
    insert into a_b_diff
    SELECT stid, c1, c2, c3
      FROM
      ( SELECT a.*,
               1 src1,
               to_number(null) src2
          FROM a
         UNION ALL
        SELECT b.*,
               to_number(null) src1,
               2 src2
          FROM b )
     GROUP BY stid, c1, c2, c3
    HAVING count(src1) <> count(src2)
     ORDER BY stid;
    Then, each time there is a difference between a column in a and its equivalent in b (per unique stid), a record will be inserted into this table.
    You can do more by adding the name of the source table in front of each record, to see exactly where the data comes from.
    Best Regards
    Mohamed Houri

  • How to fetch the data from a database table and get the required output

    Hi,
    I have made a project that connects CEP to a database table, but I am having a problem fetching the data from the database.
    In the following code, if the WHERE condition is removed the application runs fine, but I am still not able to fetch the data from the table because it does not show any output.
    Can anyone please suggest how to write the WHERE clause correctly so that I can see the output?
    Following is the config.xml for processor:
    ======================================
    <?xml version="1.0" encoding="UTF-8"?>
    <wlevs:config xmlns:wlevs="http://www.bea.com/ns/wlevs/config/application"
         xmlns:jdbc="http://www.oracle.com/ns/ocep/config/jdbc">
         <processor>
              <name>JDBC_Processor</name>
              <rules>
                   <query id="q1"><![CDATA[
                             SELECT STOCK.SYMBOL as symbol, STOCK.EXCHANGE as exchange
                             FROM ExchangeStream [Now] as datastream, STOCK
                             WHERE datastream.SYMBOL = datastream.SYMBOL ]]></query>
              </rules>
         </processor>
         <jms-adapter>
              <name>JMS_IN_Adapter</name>
              <jndi-provider-url>t3://CHDSEZ135400D:7001</jndi-provider-url>
              <destination-jndi-name>jms.TestKanikaQueue</destination-jndi-name>
              <user>weblogic</user>
              <password>welcome1</password>
         </jms-adapter>
    </wlevs:config>
    Following is the assembly file:
    <?xml version="1.0" encoding="UTF-8"?>
    <beans xmlns="http://www.springframework.org/schema/beans"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:osgi="http://www.springframework.org/schema/osgi"
         xmlns:wlevs="http://www.bea.com/ns/wlevs/spring" xmlns:jdbc="http://www.oracle.com/ns/ocep/jdbc"
         xmlns:spatial="http://www.oracle.com/ns/ocep/spatial"
         xsi:schemaLocation="
              http://www.springframework.org/schema/beans
              http://www.springframework.org/schema/beans/spring-beans.xsd
              http://www.springframework.org/schema/osgi
              http://www.springframework.org/schema/osgi/spring-osgi.xsd
              http://www.bea.com/ns/wlevs/spring
              http://www.bea.com/ns/wlevs/spring/spring-wlevs-v11_1_1_3.xsd
              http://www.oracle.com/ns/ocep/jdbc
              http://www.oracle.com/ns/ocep/jdbc/ocep-jdbc.xsd
              http://www.oracle.com/ns/ocep/spatial
              http://www.oracle.com/ns/ocep/spatial/ocep-spatial.xsd">
         <wlevs:event-type-repository>
              <wlevs:event-type type-name="StockEvent">
                   <wlevs:properties>
                        <wlevs:property name="SYMBOL" type="byte[]" length="16" />
                        <wlevs:property name="EXCHANGE" type="byte[]" length="16" />
                   </wlevs:properties>
              </wlevs:event-type>
              <wlevs:event-type type-name="ExchangeEvent">
                   <wlevs:class>com.bea.wlevs.event.example.JDBC_CEP.ExchangeEvent</wlevs:class>
              </wlevs:event-type>
              <wlevs:event-type type-name="StockExchangeEvent">
                   <wlevs:properties>
                        <wlevs:property name="symbol" type="byte[]" length="16" />
                        <wlevs:property name="price" type="byte[]" length="16" />
                        <wlevs:property name="exchange" type="byte[]" length="16" />
                   </wlevs:properties>
              </wlevs:event-type>
         </wlevs:event-type-repository>
         <bean id="readConverter" class="com.bea.wlevs.adapter.example.JDBC_CEP.Adapter_JDBC" />
         <bean id="outputJDBCBean" class="com.bea.wlevs.bean.example.JDBC_CEP.OutputBean_JDBC">
         </bean>
         <wlevs:adapter id="JMS_IN_Adapter" provider="jms-inbound">
              <wlevs:listener ref="ExchangeStream" />
              <wlevs:instance-property name="converterBean"
                   ref="readConverter" />
         </wlevs:adapter>
         <wlevs:processor id="JDBC_Processor" advertise="true">
              <wlevs:listener ref="OutputChannel" />
              <wlevs:table-source ref="STOCK" />
         </wlevs:processor>
         <wlevs:channel id="ExchangeStream" event-type="ExchangeEvent" advertise="true">
              <wlevs:listener ref="JDBC_Processor" />
         </wlevs:channel>
         <wlevs:channel id="OutputChannel" event-type="StockExchangeEvent"
              advertise="true">
              <wlevs:listener ref="outputJDBCBean" />
         </wlevs:channel>
         <wlevs:table id="STOCK" event-type="StockEvent"
              data-source="StockDs" table-name="STOCK" />
         <wlevs:table id="STOCK_EXCHANGE" event-type="StockExchangeEvent"
              data-source="StockDs" table-name="STOCK_EXCHANGE" />
    </beans>
    ExchangeEvent.java:
    package com.bea.wlevs.event.example.JDBC_CEP;
    public class ExchangeEvent {
         public String SYMBOL;
         public String symbol;
         public String exchange;
         public ExchangeEvent() {
         }
         public String getSYMBOL() {
              return SYMBOL;
         }
         public void setSYMBOL(String sYMBOL) {
              SYMBOL = sYMBOL;
         }
         public String getSymbol() {
              return symbol;
         }
         public void setSymbol(String symbol) {
              this.symbol = symbol;
         }
         public String getExchange() {
              return exchange;
         }
         public void setExchange(String price) {
              this.exchange = price;
         }
    }
    Adapter Class:
    package com.bea.wlevs.adapter.example.JDBC_CEP;
    import com.bea.wlevs.adapter.example.JDBC_CEP.MyLogger;
    import com.bea.wlevs.adapters.jms.api.InboundMessageConverter;
    import java.text.DateFormat;
    import java.util.Date;
    import com.bea.wlevs.adapters.jms.api.MessageConverterException;
    import com.bea.wlevs.event.example.JDBC_CEP.ExchangeEvent;
    import javax.jms.JMSException;
    import javax.jms.Message;
    import javax.jms.TextMessage;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Random;
         public class Adapter_JDBC implements InboundMessageConverter{
         @SuppressWarnings("unchecked")
         public List convert(Message message) throws MessageConverterException, JMSException {
              Random rand = new Random();
              int unique_id = rand.nextInt();
              DateFormat dateFormat;
              dateFormat = DateFormat.getTimeInstance();
              dateFormat.format(new Date());
              MyLogger.info(unique_id + " CEP Start Time is: " + dateFormat.format(new Date()));
              System.out.println("Message from the Queue is :"+ message);
              TextMessage textMessage = (TextMessage) message;
              String stringMessage = textMessage.getText().toString();
              System.out.println("Message after getting converted into String is :"+ stringMessage);
              String[] results = stringMessage.split(",\\s*"); // split on commas
              ExchangeEvent event1 = new ExchangeEvent();
              event1.setSYMBOL(results[0]);
              List events = new ArrayList(2);
              events.add(event1);
              return events;
         }
    }
    Output Bean Class :
    package com.bea.wlevs.bean.example.JDBC_CEP;
    import com.bea.wlevs.ede.api.StreamSink;
    import com.bea.wlevs.event.example.JDBC_CEP.ExchangeEvent;
    import com.bea.core.datasource.DataSourceService;
    public class OutputBean_JDBC implements StreamSink {
         public void onInsertEvent(Object event) {
              if (event instanceof ExchangeEvent) {
                   ExchangeEvent cacheEvent = (ExchangeEvent) event;
                   System.out.println("Symbol is: " + cacheEvent.getSymbol());
                   System.out.println("Exchange is: " + cacheEvent.getExchange());
                   System.out.println(DataSourceService.class.getClass());
              }
         }
    }
    Kindly let me know if you need further info.

    Do you have StockDs configured in your server config.xml?
    I think the query should look more like this:
    SELECT stocks.SYMBOL, stocks.EXCHANGE
    FROM STOCK as stocks, ExchangeStream [Now] as datastream WHERE stocks.SYMBOL = datastream.SYMBOL
    Thanks
    andy

  • How to Fetch the Data from a Cube

    Dear All,
    We created a cube containing dimensions
    Customer, Product, Branch, Activity, Time dimensions
    using Oracle Analytical WorkSpace Manager.
    Once the cube is created,
    how can I see the data in the cube using normal SQL queries? Through the Analytical Workspace Manager tool we are able to see the data, but our requirement is to query the data in the cube using SQL.
    Regards,
    S.Vamsi Krishna

    Hey, I got the solution. It goes like this:
    a cube is nothing but data storage; it holds the data according to the mapping we define.
    To fetch the data from the cube, we have to write a SQL query like the one below:
         SELECT dealer_name,model_name,sales
         FROM TABLE(OLAP_TABLE('MDB.FINAL_AW DURATION SESSION',
         'DIMENSION dealer_name AS varchar2(30) FROM FINALDEAL
         DIMENSION model_name AS varchar2(30) FROM FINALMODEL'));
    We can create a view for the above statement,
    we can apply GROUP BY, ROLLUP, and similar clauses,
    and we can even add WHERE clauses to the above SELECT statement.
    But now my doubt is:
    can we apply any calculations while mapping a level to a dimension?
    Generally we map a level to a dimension as DBUSER.TABLENAME.COLUMNNAME.
    Can we apply a calculation like:
    MIS.PROPKEY020MB.MATURITY_DATE+2
    Please help with the above, and if anything is wrong please let me know.
    Regards,
    S.Vamsi Krishna

  • Report is not fetching the data from Aggregate..

    Hi All,
    I am facing a problem with aggregates.
    For example, when I run the report using transaction RSRT2, the BW report does not fetch the data from the aggregate; instead of going to the aggregate, it scans the whole cube.
    FYI, I checked that the characteristics exactly match the aggregate,
    and it also gives the message:
    Characteristic 0G_CWWPTY is compressed but is not in the aggregate/query
    Can somebody explain this error message? Please let me know the solution as soon as possible.
    Thank you in advance.
    With regards,
    Hari

    Hi
    Deactivate the aggregates, rebuild the indexes, and then activate the aggregates again.
    GTR

  • How to fetch the data from a pl/sql table and varray, with some example

    I want to fetch the data from a PL/SQL table and a varray using a cursor, and then update the data.
    Please provide me with an example.

    PL/SQL Table - please note that the right term is Associative Array.
    Presumably you are referring to the 'often heated' back-and-forth that sometimes goes on in the forums when people refer to ANY PL/SQL type using a term with the word 'table' in it?
    Curious that you then show an example of a nested table!
    type emp_tab is table of employees%rowtype;
    The 'right term' for that is 'nested table'. The following would be an 'associative array' or 'index-by table'
    type emp_tab is table of employees%rowtype INDEX BY PLS_INTEGER;
    Those used to be called 'PL/SQL tables' or 'index-by tables' but 'associative array' is the current term used.
    Associative Arrays
    An associative array (formerly called PL/SQL table or index-by table) is a set of key-value pairs. Each key is a unique index, used to locate the associated value with the syntax variable_name(index).
    The data type of index can be either a string type or PLS_INTEGER.
    Since the Oracle docs often use 'PL/SQL table' or 'index-by table' it isn't unusual for someone asking a question to use those terms also. Technically the types may not be 'tables' but it's clear what they mean when they use the term.
    In PL/SQL the term 'nested table' is still used even though the PL/SQL collection is not really a table. SQL does have nested tables where the data is actually stored in a table. The PL/SQL  'nested table' type can be used as the source/destination of the SQL data from a nested table so that may be why Oracle uses that term for the PL/SQL type.
    The doc that SKP referenced refers to this use:
    Nested Tables
    In the database, a nested table is a column type that stores an unspecified number of rows in no particular order. When you retrieve a nested table value from the database into a PL/SQL nested table variable, PL/SQL gives the rows consecutive indexes, starting at 1.

  • Couldn't fetch the data from the data source...[nQSError: 16023]

    Hi all.
    I think the problem I want to discuss is well known, but I still have not found an answer, whatever I tried...
    I installed OBIEE on Linux (32-bit OEL 5, to be more precise); the complete installation was not a big deal. After that I installed the Administration Tool on my laptop and created the repository. So... my tnsnames.ora on the laptop looks like this:
    TESTDB =
    (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.5)(PORT = 1521))
    (CONNECT_DATA =
    (SERVER = DEDICATED)
    (SERVICE_NAME = testdb)
    And the tnsnames.ora on server, in its turn, looks like this:
    TESTDB =
    (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = localhost.localdomain)(PORT = 1521))
    (CONNECT_DATA =
    (SERVER = DEDICATED)
    (SERVICE_NAME = testdb.localdomain)
    The database worked normally, and I created the repository, transferred it to the server and started it up.
    It started without any errors, but when I tried to fetch data via the Presentation Services I got the error:
    Odbc driver returned an error (SQLExecDirectW).
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred.
    [nQSError: 16023] The ODBC function has returned an error. The database may not be available, or the network may be down. (HY000)
    I discovered that the ODBC entry on my laptop was not named correctly (it should be identical to the tnsnames entry), so I corrected it, saved and replaced the repository on the server and restarted it... and still got the same error.
    Apparently, something is wrong with the data source. So let me put some more information here...
    My user.sh looks like this:
    ORACLE_HOME=/u01/app/ora/product/11.2.0/dbhome_1
    export ORACLE_HOME
    TNS_ADMIN=$ORACLE_HOME/network/admin
    export TNS_ADMIN
    PATH=$ORACLE_HOME/bin:/opt/bin:$PATH
    export PATH
    LD_LIBRARY_PATH=$ORACLE_HOME/lib:$LD_LIBRARY_PATH
    export LD_LIBRARY_PATH
    and my odbc.ini looks like this:
    [ODBC]
    Trace=0
    TraceFile=odbctrace.out
    TraceDll=/u01/OracleBI/odbc/lib/odbctrac.so
    InstallDir=/u01/OracleBI/odbc
    UseCursorLib=0
    IANAAppCodePage=4
    [ODBC Data Sources]
    AnalyticsWeb=Oracle BI Server
    Cluster=Oracle BI Server
    SSL_Sample=Oracle BI Server
    TESTDB=Oracle BI Server
    [TESTDB]
    Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
    Description=Oracle BI Server
    ServerMachine=local
    Repository=SH
    Catalog=
    UID=
    PWD=
    Port=9703
    [AnalyticsWeb]
    Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
    Description=Oracle BI Server
    ServerMachine=local
    Repository=
    Catalog=
    UID=
    PWD=
    Port=9703
    [Cluster]
    Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
    Description=Oracle BI Server
    ServerMachine=local
    Repository=
    FinalTimeOutForContactingCCS=60
    InitialTimeOutForContactingPrimaryCCS=5
    IsClusteredDSN=Yes
    Catalog=SnowFlakeSales
    UID=Administrator
    PWD=
    Port=9703
    PrimaryCCS=
    PrimaryCCSPort=9706
    SecondaryCCS=
    SecondaryCCSPort=9706
    Regional=No
    [SSL_Sample]
    Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
    Description=Oracle BI Server
    ServerMachine=localhost
    Repository=
    Catalog=SnowflakeSales
    UID=
    PWD=
    Port=9703
    SSL=Yes
    SSLCertificateFile=/path/to/ssl/certificate.pem
    SSLPrivateKeyFile=/path/to/ssl/privatekey.pem
    SSLPassphraseFile=/path/to/ssl/passphrase.txt
    SSLCipherList=
    SSLVerifyPeer=No
    SSLCACertificateDir=/path/to/ca/certificate/dir
    SSLCACertificateFile=/path/to/ca/certificate/file.pem
    SSLTrustedPeerDNs=
    SSLCertVerificationDepth=9
    Can anybody point out where the error is? According to the documentation it should work fine. Maybe the driver name is wrong? If so, which driver do I need?
    I can't find it.
    I'm really sorry to bother you, guys :) Let me know if you get any ideas about it (Metalink didn't help).

    OK, several things are wrong here. First, odbc.ini is not meant to be used for Oracle databases; that is not supported on Linux. On Linux you should use OCI (the Oracle native drivers), and nothing should be added to odbc.ini. Second, your user.sh seems to be pointing to your DB installation path. This is not correct: it should point to your Oracle client installation, so you need to install the full Oracle client somewhere. Typically this is done under the same OS account as the one used for OBIEE, whereas the DB normally runs under the oracle account. Once you have the client installed, test it under the OBIEE account by running tnsping and sqlplus against your DB. Also, LD_LIBRARY_PATH should point to $ORACLE_HOME/lib32, not lib, because the lib directory contains the 64-bit libraries and OBIEE uses the 32-bit libraries even on 64-bit OSes. Finally, change your RPD connection to use OCI. Make all those changes and you should be good.
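    As a quick sanity check of the points above (full client install, 32-bit libraries, TNS visibility), something like this small Python snippet, run under the OBIEE OS account, can save some guessing; the expected paths are only examples, not official requirements:
    import os

    # Hypothetical expectations based on the advice above.
    oracle_home = os.environ.get("ORACLE_HOME", "")
    tns_admin = os.environ.get("TNS_ADMIN", os.path.join(oracle_home, "network", "admin"))
    ld_path = os.environ.get("LD_LIBRARY_PATH", "")

    print("ORACLE_HOME        :", oracle_home or "NOT SET")
    print("TNS_ADMIN          :", tns_admin)
    print("tnsnames.ora found :", os.path.isfile(os.path.join(tns_admin, "tnsnames.ora")))
    print("lib32 present      :", os.path.isdir(os.path.join(oracle_home, "lib32")))
    print("lib32 on LD path   :", os.path.join(oracle_home, "lib32") in ld_path.split(":"))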

  • Error "No Data Found" when fetching data from a ResultSet

    I am fetching data from an Access database, and I am getting the data in a ResultSet...
    query.append(" select * from college_admission_form ");
    stmt = con.createStatement();
    rs = stmt.executeQuery(query.toString());
    while (rs.next()) {
        String ad_id = rs.getString(1);
        String admFormId = rs.getString(2); // line 2
        String collegeId = rs.getString(26);
        String temp = rs.getString(2); // the error "No Data Found" is thrown on this line
    }
    but earlier, at "line 2", I did get the data.
    How do I solve it? Please reply fast.

    I am accessing an MS Access DB through a simple web application using the JDBC-ODBC bridge.
    I am getting the error at:
    while (rs.next()) {
        System.out.println("rs.getString(1)=" + rs.getString(1));
        objEmp.setEmpName(rs.getString(2)); // ----- at this line
    }
    The problem I am getting is when I assign/set the rs.getString value to something. If I comment out the problem line, it goes ahead fine.
    Can someone please tell me how I should resolve this problem, or how I should make use of the ResultSet?
    Thanks

  • Getting an error while fetching the data and bind it in the Tree table

    Hi All,
    I am getting the error "A navigation paths parameter object has to be defined" while fetching the data and binding it to the tree table.
    Please find the code below:
    var oModel = new sap.ui.model.odata.ODataModel("../../../XXXX.xsodata/", true);
    var oTable = sap.ui.getCore().byId("table");
    oTable.setModel(oModel);
    oTable.bindRows({
        path: "/Parent",
        parameters: {expand: "Children"}
    });
    Can anyone please give me a suggestion to rectify this?
    Thanks in Advance,
    Aravindh

    Hi All,
    Please see the below code. It works fine for me.
    var oController = sap.ui.controller("member_assignment");
    var oModel = new sap.ui.model.odata.ODataModel("../../../services/XXXX.xsodata/", true);
    var Context = "/PARENT?$expand=ASSIGNEDCHILD&$select=NAME,ID,ASSIGNEDCHILD/NAME,ASSIGNEDCHILD/ID,ASSIGNEDCHILD/PARENT_ID";
    var oTable = sap.ui.getCore().byId("tblProviders");
    oModel.read(Context, null, null, true, onSuccess, onError);
    function onSuccess(oEventdata) {
        var outputJson = {};
        var p = 0;
        var r = {};
        try {
            if (oEventdata.results) {
                r = oEventdata.results;
            }
        } catch (e) {
            //alert('oEventdata.results failed');
        }
        $.each(r, function(i, j) {
            outputJson[p] = {};
            outputJson[p]["NAME"] = j.NAME;
            outputJson[p]["ID"] = j.ID;
            outputJson[p]["PARENT_ID"] = j.ID;
            outputJson[p]["DELETE"] = 0;
            var m = 0;
            if (j.ASSIGNEDCHILD.results.length > 0) {
                $.each(j.ASSIGNEDCHILD.results, function(a, b) {
                    outputJson[p][m] = { NAME: b.NAME,
                                         ID: b.ID,
                                         PARENT_ID: b.PARENT_ID,
                                         DELETE: 1 };
                    m++;
                });
            }
            p++;
        });
        var oPM = new sap.ui.model.json.JSONModel();
        oPM.setData(outputJson);
        oTable.setModel(oPM);
    }
    function onError(oEvent) {
        console.log("Error on Provider Members");
    }
    oTable.bindRows({
        path: "/"
    });
    Regards
    Aravindh

  • Failed to fetch pluggable data source error

    I'm getting this error when creating a report that connects to a DB2 table using JDBC. Report Builder seems to connect to the database fine; however, when I attempt to compile/run the report I get the following error:
    'REP-4102: Failed to fetch pluggable data source.
    java.lang.AbstractMethodError: COM/ibm/db2/jdbc/app/DB2ResultSet.getStatement'
    I'm using the following in my jdbcpds.conf on my local machine.
    <driver name = "ibm-db2"
    subProtocol = "db2"
    connectString = "subProtocol:databaseName"
    class = "COM.ibm.db2.jdbc.app.DB2Driver"
    connection = "oracle.reports.plugin.datasource.jdbcpds.JDBCConnectionHandling"
    loginTimeout = "0">
    </driver>
    Has anyone experienced this issue, or does anyone have insight into what I need to do to resolve it?
    TIA
    Brian

    Dear Oracle Reports Team,
    Can you please help me out? I need your help in this regard urgently.
    Best Regards,
    Ramakrishnan L
    Dear Friends,
    I am encountering the following error while running the reports:
    REP-4102: Failed to fetch pluggable data source.
    java.lang.NumberFormatException: 75,000000
    I am using an Express query in this case.
    The same report runs perfectly when connected to the NT server that hosts the Reports Server, the HTTP server and the RDF file; the database the report uses is on the Solaris server.
    But when I use the Reports Server, HTTP server and RDF file available on the Solaris server, with the database also on that same machine, it does not run and gives the error above.
    What could be the problem?
    Just to add more clarity, below are the URLs I use to display the report.
    NT URL(This works perfectly)
    http://sa560:7778/reports/rwservlet?server=rep_sa560&report=c:\DWHReports\Man_Days_Spent.rdf&express_server="server=ora_ro_tcp:172.20.128.199/sl=-2/st=1/ct=0/st=1/"&destype=cache&desformat=htmlcss&OESHOST=172.20.128.199&SUSR_NAME='oraexp'&PASSWORD='test'
    SOLARIS URL(This does not work and gives the error)
    http://172.20.128.199:7778/reports/rwservlet?server=rep_gosisvr&report=Man_Days_Spent.rdf&express_server="server=ora_ro_tcp:172.20.128.199/sl=-2/st=1/ct=0/st=1/"&destype=cache&desformat=htmlcss&OESHOST=172.20.128.199&USR_NAME='oraexp'&PASSWORD='test'
    Regds,
    Ramakrishnan L
