Entity NamedQuery Caching - Need live results from DB, not cached

I've developed an EJB3 web application which uses Entity Beans and NamedQueries to retrieve my results.
As well as my main EJB3 web application, I also have an external interface (an Oracle Stored Procedure) which updates data in my database tables. The problem is that when the NamedQueries are called they only bring back "cached" result sets and not the live data that exists in the database at that time. If my Stored Procedure updates a value (or if I update a value via Toad for testing), the NamedQuery does not bring back these changes. From what I have read about NamedQueries, it seems that they do actually query the live DB but send back the cached values for performance reasons.
Is there a way to turn off caching of Query results?
If not is there another way to deal with this issue?
P.S. I'm using JDeveloper 10.1.3.4 and Application Server 10.1.3.1.

Anyone?
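For anyone hitting the same stale-cache behaviour, here is a minimal sketch of one possible workaround, assuming TopLink Essentials (the JPA provider bundled with OC4J 10.1.3) and a purely hypothetical entity and named query; the vendor-specific "toplink.refresh" query hint asks TopLink to refresh the cached objects from the database when the query runs instead of serving what is already in the cache.

import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.Query;

public class OrderLookup {
    // "Order.findAll" is a placeholder named query; the hint is the relevant part.
    public List findOrdersFresh(EntityManager em) {
        Query query = em.createNamedQuery("Order.findAll");
        // Vendor-specific hint (TopLink Essentials): refresh the results from the
        // database rather than returning possibly stale cached entities.
        query.setHint("toplink.refresh", "true");
        return query.getResultList();
    }
}

If the tables are mostly updated from outside the application, another option worth investigating is disabling or weakening the cache on the affected descriptors, but a per-query refresh hint keeps the performance benefit for data that only the application touches.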

Similar Messages

  • Toplink named query returns maximum 1413 results

    Hi all.
    As the heading states, I have a Toplink named query that returns 1413 results when it should return 3030. Is this configured somewhere? thanks in advance.

    Go to your toplink map in the application navigator, select the descriptor in the structure pane and then click on the "Queries" tab in the editor pane. You should be able to configure your named query there.
    Hope this helps.
    Anuj
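    If the limit is not obvious in the Queries tab, a hedged alternative is to run the query programmatically and control the row limit yourself. This sketch assumes the TopLink 10.1.3 ReadAllQuery API, where setMaxRows(0) means no limit; the mapped class is passed in because the real entity is not shown in the post.

    import java.util.List;
    import oracle.toplink.queryframework.ReadAllQuery;
    import oracle.toplink.sessions.Session;

    public class UnlimitedRead {
        // Sketch only: reads all rows for the given mapped class with no row limit.
        public List readAll(Session session, Class mappedClass) {
            ReadAllQuery query = new ReadAllQuery(mappedClass);
            query.setMaxRows(0); // 0 = no maximum, so all matching rows come back
            return (List) session.executeQuery(query);
        }
    }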

  • Syntax of a query rule to exclude results from site collection

    Hi,
    I'm trying to edit a query rule to exclude results from a site collection.
    I'm copying the "Local SharePoint Results" and want to add something like -path=http://server/sites/demo/*
    but somehow it doesn't work. Can someone help with the exact syntax (should I write http, quotes, braces {}, etc.)?
    keren tsur

    -SiteId:<guid>
    should work. Or:
    -path:http://server/sites/sitecollection
    Thanks,
    Mikael Svenson
    Search Enthusiast - SharePoint MVP/MCT/MCPD - If you find an answer useful, please up-vote it.
    http://techmikael.blogspot.com/
    Author of Working with FAST Search Server 2010 for SharePoint

  • Named query cache not hit

    Hi,
    I'm using Toplink ORM 10.1.3.
    I have a table called STORE_CONFIG which has a primary key called KEYWORD (a VARCHAR2). The POJO mapped to this table is called StoreConfig.
    In the JDeveloper (10.1.3.1.0) mapping workbench I've defined a named query to query by the PK called "getConfigPropertyByKeyword". The type of the named query is ReadObjectQuery.
    SELECT keyword, key_value
    FROM STORE_CONFIG
    WHERE (keyword = #key)
    Under the options tab I have the following settings:
    Cache Statement: true
    Bind Parameters: true
    Cache Usage: Check Cache by Primary Key
    The application logs show that the same database queries are executed multiple times for the same PK (keyword)! Why is that? Shouldn't it be checking the Object Cache rather than going to the DB?
    I've tried it with "Cache Statement: false" and "Bind Parameters: false" with the same problem.
    If I click the Advanced tab and check "Cache Query Results" then the database is not hit twice for the same record. However it was my understanding that since I am querying by PK that I wouldn't need to set "Cache Query Results".
    Doesn't "Cache Query Results" apply to the Query Cache and not the Object Cache?

    Your issue seems to be that you are using custom SQL for the query, not a TopLink expression. When you use an Expression query, TopLink knows if the query is by primary key and can get a cache hit.
    When you use custom SQL, TopLink does not know that the SQL is by primary key, so does not get a cache hit.
    You could either use an Expression for the query,
    or when using custom SQL you should be able to name your query argument the same as your database field defined as the primary key in your descriptor (case sensitive).
    i.e.
    SELECT keyword, key_value
    FROM STORE_CONFIG
    WHERE (keyword = #KEYWORD)
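    To make the first suggestion concrete, here is a rough sketch of an Expression-based ReadObjectQuery, assuming the TopLink 10.1.3 expression API and the StoreConfig mapping described above (the class is passed in rather than imported). Because the selection criteria is on the mapped primary-key attribute, TopLink can recognise a by-primary-key read and satisfy it from the object cache.

    import java.util.Vector;
    import oracle.toplink.expressions.Expression;
    import oracle.toplink.expressions.ExpressionBuilder;
    import oracle.toplink.queryframework.ReadObjectQuery;
    import oracle.toplink.sessions.Session;

    public class StoreConfigLookup {
        // Sketch only: an Expression query on the PK attribute instead of custom SQL.
        public Object getConfigPropertyByKeyword(Session session, Class storeConfigClass, String keyword) {
            ExpressionBuilder builder = new ExpressionBuilder();
            Expression byKeyword = builder.get("keyword").equal(builder.getParameter("key"));

            ReadObjectQuery query = new ReadObjectQuery(storeConfigClass);
            query.setSelectionCriteria(byKeyword);
            query.addArgument("key");

            // With a primary-key expression, the default "check cache by primary key"
            // behaviour can return the cached object without hitting the database.
            Vector arguments = new Vector();
            arguments.add(keyword);
            return session.executeQuery(query, arguments);
        }
    }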

  • Results from conditions not passed to Jump query

    Dear all,
    I have created a summary query and I have a condition applied on it. The condition is 'KF' < 4. I get the results of the query. I have changed the Calculate Result As to 'Total' to show the total of only the records that are displayed.
    However, I have a jump query on it, and when I go to the jump query it shows all the records even though they don't satisfy the conditions. I want the drill-down report to show the results of only those values displayed in the summary report.
    Is there a setting I can change to get the required result?
    Any helpful reply would be awarded.
    thanks,
    KK

    Hey Roberto,
    Thank you for the reply. Yes, I would also need to have the same condition in the receiver query. However, my condition is on result values in the receiver query. To explain clearly...
    I have the data as below in my cube
    Material   QTY
    123     10
    124     20
    125     30
    126     40
    I have a summary query for which I want a condition where QTY < 40
    When I enter Material 123;124;125;126 I get the result as below
    Material   QTY
    123     10
    124     20
    125     30
    When I drill down I get the below result
    123     1     2
    123     2     4
    123     3     4
    Result          10
    124     1     5
    124     2     10
    124     3     5
    Result          20
    125     1     10
    125     2     10
    125     3     10
    Result          30
    126     1     10
    126     2     20
    126     3     10
    Result          40
    I cannot apply the same condition in the receiver query because I want the condition to be applied to the result rows. I have created one more formula KF with SUMCT(QTY) and applied a condition on that, but it still doesn't work.
    Any suggestions would be highly appreciated.
    thanks,
    KK

  • Can we make one query to get same results from 3 tables

    CREATE TABLE TABLE1 (NODEID VARCHAR2(4));
    CREATE TABLE TABLE2 (NODEID VARCHAR2(4));
    CREATE TABLE TABLE3 (NODEID VARCHAR2(4));
    INSERT INTO TABLE1 VALUE('1004');
    INSERT INTO TABLE1 VALUE('1004');
    INSERT INTO TABLE1 VALUE('1002');
    INSERT INTO TABLE1 VALUE('1002');
    INSERT INTO TABLE1 VALUE('1001');
    INSERT INTO TABLE1 VALUE('1001');
    INSERT INTO TABLE1 VALUE('1006');
    INSERT INTO TABLE1 VALUE('1006');
    INSERT INTO TABLE1 VALUE('1005');
    INSERT INTO TABLE1 VALUE('1005');
    INSERT INTO TABLE2 VALUE('1004');
    INSERT INTO TABLE2 VALUE('1004');
    INSERT INTO TABLE2 VALUE('1004');
    INSERT INTO TABLE2 VALUE('1002');
    INSERT INTO TABLE2 VALUE('1002');
    INSERT INTO TABLE2 VALUE('1002');
    INSERT INTO TABLE2 VALUE('1002');
    INSERT INTO TABLE3 VALUE('1001');
    INSERT INTO TABLE3 VALUE('1001');
    INSERT INTO TABLE3 VALUE('1006');
    INSERT INTO TABLE3 VALUE('1006');
    INSERT INTO TABLE3 VALUE('1005');
    INSERT INTO TABLE3 VALUE('1005');
    INSERT INTO TABLE3 VALUE('1004');
    INSERT INTO TABLE3 VALUE('1004');
    INSERT INTO TABLE3 VALUE('1004');
    INSERT INTO TABLE3 VALUE('1002');
    INSERT INTO TABLE3 VALUE('1002');
    INSERT INTO TABLE3 VALUE('1002');
    INSERT INTO TABLE3 VALUE('1002');
    Select count(*), count(distinct nodeid)
    from table1, table2,table3
    where table1.nodeid=table2.nodeid and table1.nodeid=table3.nodeid;
    Select count(*), count(distinct nodeid)
    from table1, table3
    where table1.nodeid=table2.nodeid;
    Select count(*), count(distinct nodeid)
    from table2, table3
    where table2.nodeid=table3.nodeid;

    Aside from your insert statements not working... (should be as follows)...
    DROP TABLE TABLE1;
    DROP TABLE TABLE2;
    DROP TABLE TABLE3;
    CREATE TABLE TABLE1 (NODEID VARCHAR2(4));
    CREATE TABLE TABLE2 (NODEID VARCHAR2(4));
    CREATE TABLE TABLE3 (NODEID VARCHAR2(4));
    INSERT INTO TABLE1 VALUES('1004');
    INSERT INTO TABLE1 VALUES('1004');
    INSERT INTO TABLE1 VALUES('1002');
    INSERT INTO TABLE1 VALUES('1002');
    INSERT INTO TABLE1 VALUES('1001');
    INSERT INTO TABLE1 VALUES('1001');
    INSERT INTO TABLE1 VALUES('1006');
    INSERT INTO TABLE1 VALUES('1006');
    INSERT INTO TABLE1 VALUES('1005');
    INSERT INTO TABLE1 VALUES('1005');
    INSERT INTO TABLE2 VALUES('1004');
    INSERT INTO TABLE2 VALUES('1004');
    INSERT INTO TABLE2 VALUES('1004');
    INSERT INTO TABLE2 VALUES('1002');
    INSERT INTO TABLE2 VALUES('1002');
    INSERT INTO TABLE2 VALUES('1002');
    INSERT INTO TABLE2 VALUES('1002');
    INSERT INTO TABLE3 VALUES('1001');
    INSERT INTO TABLE3 VALUES('1001');
    INSERT INTO TABLE3 VALUES('1006');
    INSERT INTO TABLE3 VALUES('1006');
    INSERT INTO TABLE3 VALUES('1005');
    INSERT INTO TABLE3 VALUES('1005');
    INSERT INTO TABLE3 VALUES('1004');
    INSERT INTO TABLE3 VALUES('1004');
    INSERT INTO TABLE3 VALUES('1004');
    INSERT INTO TABLE3 VALUES('1002');
    INSERT INTO TABLE3 VALUES('1002');
    INSERT INTO TABLE3 VALUES('1002');
    INSERT INTO TABLE3 VALUES('1002');
    ...and your queries not working...
    SQL> Select count(*), count(distinct nodeid)
      2  from table1, table2,table3
      3  where table1.nodeid=table2.nodeid and table1.nodeid=table3.nodeid;
    Select count(*), count(distinct nodeid)
    ERROR at line 1:
    ORA-00918: column ambiguously defined
    I'm guessing you want:
    SQL> Select count(*), count(distinct table1.nodeid)
      2  from table1, table2,table3
      3  where table1.nodeid=table2.nodeid and table1.nodeid=table3.nodeid;
      COUNT(*) COUNT(DISTINCTTABLE1.NODEID)
            50                            2
    You haven't explained to us what database version you are using or what you want the output to look like when these 3 queries are combined.
    Please read: {message:id=9360002} and learn to post a good question, and use {noformat}{noformat} tags to show your code/data.

  • Can we use Result from another query in Webi using a Bex query universe?

    Hi,
    Can we use Result from another query filter option in Webi to create a report using a Bex Query universe?
    I need to create a report using two universes: one is a Bex Query universe and the other is an Oracle universe. I have two queries, one using the Oracle universe, the other using the Bex Query universe. I need to pass the Oracle data from the Oracle query to the Bex Query query to get the matched data from the SAP Bex query.
    I used 'Result from another query' in the query filter panel for the query using the Bex query universe. But I got an error saying 'A filter contains a wrong value. You cannot run this query. (Error: WIS 00007)'. The data used in the filter on both sides is the same; it is char.
    I have tested by using two queries from the same Bex query universe to see if the 'Result from another query' filter option works, and I got the same error.
    Has anyone run into the same issue and if this is possible and what should be the solution?
    Thanks in advance!

    In that situation:
    Create two queries : Oracle and BW query.
    @ Report:
    As you have to see the result set from both data providers, correct? To achieve this, one must have common dimension objects to merge them at report level and use objects coming from both queries in a single table/report.
    Unless you use Merge Dimensions, you don't get a chance to use both queries' objects in a single table/report (it will give a tooltip saying: You can't drop here -- Incompatible Objects).
    In case you don't have common dimensions, change the object definitions to Detail objects for those required.
    Hope it helps you.
    Thank You!!

  • Treat objects from named query as new ones

    Hello,
    I use TopLink 11.1.1.1.0 and I need to do the following. I want to read objects from the database with a named query and let TopLink treat them like new objects created in Java. Is it possible? The objects belong to one table, but the named query computes them with a select over several different tables, so these objects are not present in the table.
    Why do I need it? It is an import of data from one module of our application to another. Previously it was done in Java: about 500,000 objects were read in one module just to be converted into 20,000 objects in another module. If the database does this conversion (a named query with grouping which reads the 20,000 objects), it is much faster and takes much less memory.
    What I can do is create a new class with the same fields, query these objects and then copy them to the objects I need. But I don't like creating a new class just for this.
    Thank for any help
    Frank

    Thanks for the help. I solved it a different way eventually. I made a stored procedure which inserts data into the SOURCE_ITEMS table. First I call the stored procedure, then I load all the newly created SOURCE_ITEMS with a ReadAllQuery (I have business parameters by which to select them) and do post-processing in Java on them. I used StoredProcedureCall in TopLink, so if an exception occurs and the transaction rolls back, the stored procedure will roll back as well (I tried it, works for sure :).
    StoredProcedureCall call = new StoredProcedureCall();
    call.setProcedureName(SQL_IMPORT_CALL);
    call.addNamedArgumentValue(SQL_IMPORT_PARAM_1, getId());
    call.addNamedArgumentValue(SQL_IMPORT_PARAM_2, getXXX().getId());
    call.addNamedArgumentValue(SQL_IMPORT_PARAM_3, getYYY().getId());
    call.addNamedArgumentValue(SQL_IMPORT_PARAM_4, getZZZ());
    DataModifyQuery query = new DataModifyQuery();
    query.setCall(call);
    query.setShouldBindAllParameters(true);
    // insert of imported SOURCE_ITEMS
    ctx.executeQuery(query);
    Thanks Chris and hope this will help to somebody else as well..
    Frank
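    To make the second half of that approach concrete, here is a hedged sketch of the follow-up ReadAllQuery described above, assuming the TopLink query API; the mapped class and the "importId" attribute are placeholders for the real SOURCE_ITEMS mapping and business parameters.

    import java.util.List;
    import oracle.toplink.expressions.ExpressionBuilder;
    import oracle.toplink.queryframework.ReadAllQuery;
    import oracle.toplink.sessions.Session;

    public class SourceItemImportLoader {
        // Sketch only: re-reads the rows the stored procedure just inserted.
        public List loadImportedItems(Session session, Class sourceItemClass, Object importId) {
            ExpressionBuilder builder = new ExpressionBuilder();
            ReadAllQuery query = new ReadAllQuery(sourceItemClass);
            query.setSelectionCriteria(builder.get("importId").equal(importId));
            // The rows were created by the stored procedure, outside the object cache,
            // so force the results to be refreshed from the database.
            query.refreshIdentityMapResult();
            return (List) session.executeQuery(query);
        }
    }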

  • Use BEx query results from a FM in a Transformation to Infocube

    I am trying to use the Function Module RRW3_GET_QUERY_VIEW_DATA to execute a query and read the results from E_CELL_DATA. The issue I am having is that I get the key figure values but not the characteristic values. Is there a new FM that I could use to read the data from a query? I need to call this FM in a transformation and update a field during data load to the cube.
    Thanks
    Points will be awarded

    Hi,
    this FM is the right one.
    Check it in the debugger: there is an internal table with field name AXES, with a deep structure.
    Sven

  • Named Query or Native Query

    Hello,
    I am using a native query to search for results on my search page. A friend of mine just told me that I had better use a named query or the time to get my search results will be too slow.
    Does a named query give you faster results than a native query?
    Thanks in advance.
    Regards,
    Hemen

    When you define named queries (note that this is a Hibernate example) you can put them in for example XML,
    <query name="findItemsByDescription">
    <![CDATA[from Item item where item.description like :desc]]>
    </query>
    or a SQL query like
    <sql-query name="findItemsByDescription">
    <return alias="item" class="Item"/>
    <![CDATA[select {item.*} from item where description like :desc]]>
    </sql-query>
    You can call these queries by using the same code, for example,
    session.getNamedQuery("findItemsByDescription").setString("desc", description);
    This is what is meant by sharing the same calling API.
    When talking about performance, you have to know something about what goes on under the hood. You have probably programmed something using straightforward JDBC, so you know how queries get passed to the driver and sent to the database. When using HQL or JPA-QL, the queries first have to be parsed into SQL that the database can understand, so there is an extra parsing step in between. Note that for native SQL queries, including stored procedure calls, the persistence framework still takes care of mapping the JDBC result sets to graphs of persistent objects.
    If you want to include a native SQL hint to instruct the database management system's query optimizer, for example, you need to write the SQL yourself; HQL and JPA-QL do not have keywords for this. The disadvantage of putting native SQL in your mapping metadata is lost database portability, because your mappings, and hence your application, will work only for a particular database. But usually this is of minor concern, as you are probably not creating a framework that has to work on every database.
    When you want to get behind the performance of your query, you really have to consult the database and look at the execution plan - a DBA can tell you exactly what is good and what can be optimized.
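    For completeness, here is a small sketch of the same distinction through the plain JPA EntityManager API (an assumption, since the original question does not say which API is in use); the query name, table and columns are illustrative only. Both styles share the same calling pattern, and neither is inherently faster: the named JPQL query is parsed and validated up front and stays portable, while the native query is passed to the database as-is.

    import java.util.List;
    import javax.persistence.EntityManager;

    public class ItemSearch {
        // Named JPQL query, assumed to be declared elsewhere, e.g.
        // @NamedQuery(name = "findItemsByDescription",
        //             query = "SELECT i FROM Item i WHERE i.description LIKE :desc")
        public List searchByNamedQuery(EntityManager em, String description) {
            return em.createNamedQuery("findItemsByDescription")
                     .setParameter("desc", "%" + description + "%")
                     .getResultList();
        }

        // Native SQL: database-specific, but lets you use vendor features such as hints.
        public List searchByNativeQuery(EntityManager em, String description) {
            return em.createNativeQuery(
                        "SELECT item_id, description FROM item WHERE description LIKE ?")
                     .setParameter(1, "%" + description + "%")
                     .getResultList();
        }
    }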

  • WHERE using results from another querie

    Hi,
    Example:
    <cfquery name="QUERY 1" datasource="MyDatabase">
    SELECT Table1ID, A, B, C
    FROM Table1
    WHERE A = #Form.A#
    <cfoutput query="QUERY1">#A#<br></cfoutput>
    It returns:
    3
    4
    <cfquery name="QUERY 2" datasource="MyDatabase">
    SELECT Table2ID, A, Z, Y
    FROM Table2
    WHERE A IN (3,4)
    <cfoutput query="QUERY
    2">#Table2ID#</br></cfoutput>
    It returns:
    a
    b
    c
    d
    e
    f
    But if I make QUERY2 as:
    <cfquery name="QUERY 2" datasource="MyDatabase">
    SELECT Table2ID, A, Z, Y
    FROM Table2
    WHERE A IN (#QUERY1.A#)
    it returns only:
    a
    b
    c
    The values a,b,c are the Table2ID's where A=3
    The values d,e,f are the Table2ID'a where A=4
    How can I make return all values?
    Hope you understand.
    Thanks
    Manel

    Think about what you are doing for a second. You are selecting multiple rows of data with your first query. Then you are using a cfloop to loop through each row returned from that first query to perform another query that uses the value returned in the A column. The problem is that you are executing your second query once for each row returned by your first query, and you really aren't saving that result anywhere, since each iteration of the loop causes your cfselect to overwrite your previous results. You can create a list variable and populate it with the results from your first query, but I have to ask: unless you need the results from your first query for something else, why don't you just join the two tables and use one query?
    Based on what you have, something like this?
    <cfquery name="QUERY 2" datasource="MyDatabase">
    SELECT T2.ID, T2.A, T2.Z, T2.Y
    FROM Table2 T2
    INNER JOIN Table1 T1 ON T2.A = T1.A
    WHERE T1.A = #Form.A#
    </cfquery>
    Phil

  • Named query, nested

    If I define a named query like the following:
    select * from (
         select q.*, rownum r from user04.quote q
         where version = 1
         order by id
    )
    where r between #startIndex and #endIndex
    I can achieve the 'paging' function by specifying the starting row and ending row for the query at runtime.
    However, if I modify the query by supplying an argument in the inner select, like this:
    select * from (
         select q.*, rownum r from user04.quote q
         where version = #id
         order by id
    )
    where r between #startIndex and #endIndex
    my code will fail with a NullPointerException in SQLCall. Here is the trace:
    [5/28/05 19:48:40:816 EDT] 468db180 SystemOut     O 2005.05.28 07:48:40.816--ClientSession(1108210094)--Thread[Servlet.Engine.Transports : 2,5,main]--java.lang.NullPointerException
    java.lang.NullPointerException
         at oracle.toplink.queryframework.SQLCall.translate(SQLCall.java:266)
         at oracle.toplink.internal.queryframework.CallQueryMechanism.executeCall(CallQueryMechanism.java:133)
         at oracle.toplink.internal.queryframework.CallQueryMechanism.executeCall(CallQueryMechanism.java:115)
         at oracle.toplink.internal.queryframework.CallQueryMechanism.executeSelectCall(CallQueryMechanism.java:197)
         at oracle.toplink.internal.queryframework.CallQueryMechanism.selectAllRows(CallQueryMechanism.java:567)
         at oracle.toplink.queryframework.ReadAllQuery.execute(ReadAllQuery.java:447)
    Is this expected behavior of named queries, or a bug? Thanks.
    Haiwei

    Solved the problem by
    1) adding a space (i.e. a blank) after #id
    and 2) removing the blank line at the end of the SQL query.
    It seems that there is a bug in the TopLink code that parses the SQL. My best advice is to not use newlines or tabs; use a hard space as the delimiter instead. The debugging is very time-consuming.
    Please feel free to comment on my findings.
    Haiwei

  • Simple invoking BPEL process from JSP not working

    Hello Everybody,
    I'm trying to invoke a BPEL process from a client JSP, following Tutorial 7, "Invoking the BPEL processes".
    I'm trying to use the same tutorial for a different application.
    Here is my WSDL snippet.
    <types>
    <schema attributeFormDefault="qualified"
    elementFormDefault="qualified"
    targetNamespace="http://www.arcwebservices.com/v2006"
    xmlns="http://www.w3.org/2001/XMLSchema">
              <element name="SpatialQueryRequest" type="s1:SpatialQueryRequestType"/>
              <element name="SpatialQueryResponse" type="s1:SpatialQueryResponseType"/>
              <complexType name="SpatialQueryRequestType">
              <sequence>
                   <element name="username" type="string"/>
                   <element name="password" type="string"/>
              </sequence>
              </complexType>
              <complexType name="SpatialQueryResponseType">
              <sequence>
                   <element name="token" type="string"/>
              </sequence>
              </complexType>
         </schema>
    </types>
    <message name="SpatialQueryRequestMessage">
    <part name="payload" element="s1:SpatialQueryRequest"/>
    </message>
    <message name="SpatialQueryResponseMessage">
    <part name="payload" element="s1:SpatialQueryResponse"/>
    </message>
    <portType name="SpatialQuery">
    <operation name="initiate">
    <input message="tns:SpatialQueryRequestMessage"/>
    </operation>
    </portType>
    <!-- portType implemented by the requester of SpatialQuery BPEL process
    for asynchronous callback purposes
    -->
    <portType name="SpatialQueryCallback">
    <operation name="onResult">
    <input message="tns:SpatialQueryResponseMessage"/>
    </operation>
    </portType>
    The JSP code is:
    <%
    String ssn = request.getParameter("ssn");
    //if(ssn == null)
    // ssn = "123-12-1234";
         String username = "jaweed";
         String password = "ibrahim";
    //String xml = "<ssn xmlns=\"http://services.otn.com\">" + ssn + "</ssn>";
         //String xml = "<UserName xmlns="http://www.arcwebservices.com/v2006">" + username + "</UserName><Password xmlns="http://www.arcwebservices.com/v2006">" + password + "</Password>";
         String xml = "<UserName xmlns=\"http://www.arcwebservices.com/v2006\">" + username + "</UserName><Password xmlns=\"http://www.arcwebservices.com/v2006\">" + password + "</Password>";
    Locator locator = new Locator("default","bpel");
    IDeliveryService deliveryService = (IDeliveryService)locator.lookupService(IDeliveryService.SERVICE_NAME );
    // construct the normalized message and send to Oracle BPEL Process Manager
    NormalizedMessage nm = new NormalizedMessage( );
    nm.addPart("payload", xml );
    NormalizedMessage res = deliveryService.request("SpatialQuery", "initiate", nm);
    Map payload = res.getPayload();
    //out.println( "BPELProcess CreditRatingService executed!<br>" );
    out.println( "Token is " + payload.get("payload") );
    %>
    com.oracle.bpel.client.ServerException: IDeliveryService.request() invoked for one-way operation 'initiate'. This method can only be used to invoke two-way operations which return an output message. Please check the WSDL which defines this operation and use the method IDeliveryService.post() to invoke a one-way operation
    But I'm getting the exception above. I'm in deep trouble, please help me. Urgent!

    Yeah... This is my final project for my master's degree.
    I'm integrating BPEL with GIS (Geographic Information System) web services provided by ESRI (ArcWeb services). The main BPEL process that I built follows this sample example. The problem is that I need to show a working client application within 2 days or else my project is termed incomplete and my graduation will be postponed to next semester. :(
    I tried your advice; as soon as I add Java code to my BPEL process, the application takes longer to execute in the BPEL Console itself.
    <bpelx:exec xmlns:bpelx="http://schemas.oracle.com/bpel/extension" language="java" version="1.4" name="exec-1">
    <![CDATA[setConversationId("output");]]>
    </bpelx:exec>
    So after the thread sleep the webservice is not ready to give its output.
    Just a quick thought: I assumed that if I attach the output of my Invoke (client) to a Reply activity, that will make it a two-way operation.
    Then I could use
    NormalizedMessage res = deliveryService.request("SpatialQuery", "initiate", nm);
    and it will work.
    I assigned the output of the Invoke (client) to the variable in the Reply activity.
    But my reply activity is throwing a NULL pointer exception.
    This is my BPEL code:
    <sequence name="main">
              <receive name="receiveInput" partnerLink="client" portType="tns:InvokeTest" operation="initiate" variable="input" createInstance="yes"/>
              <assign name="assign-1">
                   <copy>
                        <from variable="input" part="payload" query="/nsxml0:InvokeTestRequest/nsxml0:username"></from>
                        <to variable="IsaInput" part="parameters" query="/nsxml0:getToken/nsxml0:username"/>
                   </copy>
                   <copy>
                        <from variable="input" part="payload" query="/nsxml0:InvokeTestRequest/nsxml0:password"></from>
                        <to variable="IsaInput" part="parameters" query="/nsxml0:getToken/nsxml0:password"/>
                   </copy>
              </assign>
              <invoke name="invoke-1" partnerLink="Authentication" portType="nsxml0:IAuthentication" operation="getToken" inputVariable="IsaInput" outputVariable="IsaOutput"/>
              <assign name="assign-2">
                   <copy>
                        <from variable="IsaOutput" part="parameters" query="/nsxml0:getTokenResponse/nsxml0:Result"></from>
                        <to variable="output" part="payload" query="/nsxml0:InvokeTestResponse/nsxml0:token"/>
                   </copy>
              </assign>
              <reply name="reply-2" partnerLink="client" portType="tns:InvokeTest" operation="initiate" variable="output"/>     </sequence>
    I really appreciate your patience and help.
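    For reference, here is a hedged sketch of the one-way alternative that the exception text itself suggests: keep the operation one-way and use IDeliveryService.post() instead of request(). The package names and the post() signature are assumed to mirror the request() call in the JSP above; no response payload comes back, so the token would have to be retrieved separately (for example via the asynchronous callback port type).

    import com.oracle.bpel.client.Locator;
    import com.oracle.bpel.client.NormalizedMessage;
    import com.oracle.bpel.client.delivery.IDeliveryService;

    public class SpatialQueryClient {
        // Sketch only: one-way invocation of the SpatialQuery BPEL process.
        public void initiate(String username, String password) throws Exception {
            String xml = "<UserName xmlns=\"http://www.arcwebservices.com/v2006\">" + username
                       + "</UserName><Password xmlns=\"http://www.arcwebservices.com/v2006\">"
                       + password + "</Password>";

            Locator locator = new Locator("default", "bpel");
            IDeliveryService deliveryService =
                    (IDeliveryService) locator.lookupService(IDeliveryService.SERVICE_NAME);

            NormalizedMessage nm = new NormalizedMessage();
            nm.addPart("payload", xml);

            // post() delivers the message and returns without waiting for an output
            // message, which matches the one-way 'initiate' operation in the WSDL.
            deliveryService.post("SpatialQuery", "initiate", nm);
        }
    }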

  • Need help with named query for 1-to-many entities

    Hi all,
    I'm in desperate need of a query and I can't seem to figure it out.
    Here's the situation:
    I have 2 entities in a one-to-many relationship: one ServerInstance to many AppUrls, related by a join table.
    @Entity
    public class ServerInstance implements Serializable {
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    private Integer id;
    private String instancename;
    private String description;
    private String home;
    private String hostname;
    private String hostname_front;
    private String ipfront;
    private String ipback;
    private String os;
    private String layer;
    @OneToMany (fetch = FetchType.EAGER, cascade = {CascadeType.ALL})
    private List<AppUrl> appUrl = new ArrayList<AppUrl>();
    @Entity
    public class AppUrl implements Serializable {
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    private Integer id;
    private String appurl;
    private String description;
    private String keyword;
    private Integer priority;
    private String curlresult;
    In the underlying database they are mapped by the table T_SERVERINSTANCE_T_APPURL
    What I need is a named query in the ServerInstance entity to find any AppUrl where curlresult = 'down'
    I tried the following:
    @NamedQuery
    (name = "findDownInstances", query = "SELECT i FROM ServerInstance i WHERE i.appUrl.result = 'down'")
    This gave me the following error:
    invalid navigation expression [i.appUrl.result], cannot navigate collection valued association field [appUrl].
    Is there any way to create a query which will do what I want?
    Thanx in advance!

    Can you provide the equivalent SQL (not JPQL!) query that you want, based on tables T_SERVERINSTANCE and T_APPURL, so that there are no misunderstandings?
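    While waiting for that SQL, here is a hedged sketch of what a join-based JPQL named query could look like. Instead of the dotted navigation i.appUrl.result (which JPQL does not allow on a collection-valued association), it joins the collection explicitly; the attribute name curlresult is taken from the AppUrl entity above, and the assumed intent is "instances having at least one AppUrl whose curlresult is 'down'".

    import java.io.Serializable;
    import javax.persistence.Entity;
    import javax.persistence.NamedQuery;

    @NamedQuery(
        name  = "findDownInstances",
        query = "SELECT DISTINCT i FROM ServerInstance i JOIN i.appUrl u "
              + "WHERE u.curlresult = 'down'")
    @Entity
    public class ServerInstance implements Serializable {
        // ... id, instancename, appUrl collection and the other fields as posted above ...
    }

    It could then be executed with em.createNamedQuery("findDownInstances").getResultList(); the DISTINCT keeps an instance from appearing once per matching URL.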

  • Need different rows from single query based on condition

    Hi,
    I have a table with 100 rows that holds employees and their roles.
    I need to write a SQL(not a PL/SQL block) as below
    1. When employee with role 'VP' logs in, the query should return all the 100 rows.
    2. When employee with role 'MGR' logs in, the query should return only those rows whose MGR is the logged in employee.
    3. When employee with role 'SALE_EXEC' logs in, it should return single rows corresponding to this SALE_EXEC.
    My requirement here is to get these outputs from a single query.
    Can anyone please help me with this.
    Thanks,
    Vivek.

    Use VPD (Virtual Private Database).
    New Policy Groups
    When adding the policy to a table, view, or synonym, you can use the DBMS_RLS.ADD_GROUPED_POLICY interface to specify the group to which the policy belongs. To specify which policies will be effective, you add a driving context using the DBMS_RLS.ADD_POLICY_CONTEXT interface. If the driving context returns an unknown policy group, then an error is returned.
    If the driving context is not defined, then all policies are executed. Likewise, if the driving context is NULL, then policies from all policy groups are enforced. In this way, an application accessing the data cannot bypass the security setup module (which sets up application context) to avoid any applicable policies.
    You can apply multiple driving contexts to the same table, view, or synonym, and each of them will be processed individually. In this way, you can configure multiple active sets of policies to be enforced.
    Consider, for example, a hosting company that hosts Benefits and Financial applications, which share some database objects. Both applications are striped for hosting using a SUBSCRIBER policy in the SYS_DEFAULT policy group. Data access is partitioned first by subscriber ID, then by whether the user is accessing the Benefits or Financial applications (determined by a driving context). Suppose that Company A, which uses the hosting services, wants to apply a custom policy which relates only to its own data access. You could add an additional driving context (such as COMPANY A SPECIAL) to ensure that the additional, special policy group is applied for data access for Company A only. You would not apply this under the SUBSCRIBER policy, because the policy relates only to Company A, and it is more efficient to segregate the basic hosting policy from other policies.
    How to Implement Policy Groups
    To create policy groups, the administrator must do two things:
    Set up a driving context to identify the effective policy group.
    Add policies to policy groups as required.
    The following example shows how to perform these tasks.
    Note:
    You need to set up the following data structures for the examples in this section to work:
    DROP USER finance CASCADE;
    CREATE USER finance IDENTIFIED BY welcome2;
    GRANT RESOURCE TO apps;
    DROP TABLE apps.benefit;
    CREATE TABLE apps.benefit (c NUMBER);
    Step 1: Set Up a Driving Context
    Begin by creating a namespace for the driving context. For example:
    CREATE CONTEXT appsctx USING apps.apps_security_init;
    Create the package that administers the driving context. For example:
    CREATE OR REPLACE PACKAGE apps.apps_security_init IS
    PROCEDURE setctx (policy_group VARCHAR2);
    END;
    CREATE OR REPLACE PACKAGE BODY apps.apps_security_init AS
    PROCEDURE setctx ( policy_group varchar2 ) IS
    BEGIN
    -- Do some checking to determine the current application.
    -- You can check the proxy if using the proxy authentication feature.
    -- Then set the context to indicate the current application.
    DBMS_SESSION.SET_CONTEXT('APPSCTX','ACTIVE_APPS', policy_group);
    END;
    END;
    Define the driving context for the table APPS.BENEFIT.
    BEGIN
    DBMS_RLS.ADD_POLICY_CONTEXT('apps','benefit','APPSCTX','ACTIVE_APPS');
    END;
    Step 2: Add a Policy to the Default Policy Group.
    Create a security function to return a predicate to divide the data by company.
    CREATE OR REPLACE FUNCTION by_company (sch varchar2, tab varchar2)
    RETURN VARCHAR2 AS
    BEGIN
    RETURN 'COMPANY = SYS_CONTEXT(''ID'',''MY_COMPANY'')';
    END;
    Because policies in SYS_DEFAULT are always executed (except for SYS, or users with the EXEMPT ACCESS POLICY system privilege), this security policy (named SECURITY_BY_COMPANY), will always be enforced regardless of the application running. This achieves the universal security requirement on the table: namely, that each company should see its own data regardless of the application that is running. The function APPS.APPS_SECURITY_INIT.BY_COMPANY returns the predicate to make sure that users can only see data related to their own company:
    BEGIN
    DBMS_RLS.ADD_GROUPED_POLICY('apps','benefit','SYS_DEFAULT',
    'security_by_company',
    'apps','by_company');
    END;
    Step 3: Add a Policy to the HR Policy Group
    First, create the HR group:
    CREATE OR REPLACE FUNCTION hr.security_policy
    RETURN VARCHAR2
    AS
    BEGIN
    RETURN 'SYS_CONTEXT(''ID'',''TITLE'') = ''MANAGER'' ';
    END;
    The following creates the policy group and adds a policy named HR_SECURITY to the HR policy group. The function HR.SECURITY_POLICY returns the predicate to enforce security on the APPS.BENEFIT table:
    BEGIN
    DBMS_RLS.CREATE_POLICY_GROUP('apps','benefit','HR');
    DBMS_RLS.ADD_GROUPED_POLICY('apps','benefit','HR',
    'hr_security','hr','security_policy');
    END;
    Step 4: Add a Policy to the FINANCE Policy Group
    Create the FINANCE policy:
    CREATE OR REPLACE FUNCTION finance.security_policy
    RETURN VARCHAR2
    AS
    BEGIN
    RETURN ('SYS_CONTEXT(''ID'',''DEPT'') = ''FINANCE'' ');
    END;
    Create a policy group named FINANCE and add the FINANCE policy to the FINANCE group:
    BEGIN
    DBMS_RLS.CREATE_POLICY_GROUP('apps','benefit','FINANCE');
    DBMS_RLS.ADD_GROUPED_POLICY('apps','benefit','FINANCE',
    'finance_security','finance', 'security_policy');
    END;
    As a result, when the database is accessed, the application initializes the driving context after authentication. For example, with the HR application:
    execute apps.apps_security_init.setctx('HR');
    Validating the Application Used to Connect to the Database
    The package implementing the driving context must correctly validate the application that is being used to connect to the database. Although the database always checks the call stack to ensure that the package implementing the driving context sets context attributes, inadequate validation can still occur within the package.
    For example, in applications where database users or enterprise users are known to the database, the user needs the EXECUTE privilege on the package that sets the driving context. Consider a user who knows that:
    The BENEFITS application allows more liberal access than the HR application
    The setctx procedure (which sets the correct policy group within the driving context) does not perform any validation to determine which application is actually connecting. That is, the procedure does not check either the IP address of the incoming connection (for a three-tier system) or the proxy_user attribute of the user session.
    Such a user could pass to the driving context package an argument setting the context to the more liberal BENEFITS policy group, and then access the HR application instead. Because the setctx does no further validation of the application, this user bypasses the normally more restrictive HR security policy.
    By contrast, if you implement proxy authentication with VPD, then you can determine the identity of the middle tier (and the application) that is actually connecting to the database on behalf of a user. In this way, the correct policy will be applied for each application to mediate data access.
    For example, a developer using the proxy authentication feature could determine that the application (the middle tier) connecting to the database is HRAPPSERVER. The package that implements the driving context can thus verify whether the proxy_user in the user session is HRAPPSERVER. If so, then it can set the driving context to use the HR policy group. If proxy_user is not HRAPPSERVER, then it can disallow access.
    In this case, when the following query is executed
    SELECT * FROM APPS.BENEFIT;
    Oracle Database picks up policies from the default policy group (SYS_DEFAULT) and active namespace HR. The query is internally rewritten as follows:
    SELECT * FROM APPS.BENEFIT WHERE COMPANY = SYS_CONTEXT('ID','MY_COMPANY') and SYS_CONTEXT('ID','TITLE') = 'MANAGER';
    How to Add a Policy to a Table, View, or Synonym
    The DBMS_RLS package enables you to administer security policies by using its procedures for adding, enabling, refreshing, or dropping policies, policy groups, or application contexts. You need to specify the table, view, or synonym to which you are adding a policy, as well as the data pertinent to that policy, such as the policy name. Such data also includes names for the policy group and the function implementing the policy. You can also specify the types of statements the policy controls (SELECT, INSERT, UPDATE, DELETE, CREATE INDEX, or ALTER INDEX).
    for more you can refer to
    http://download-west.oracle.com/docs/cd/B19306_01/network.102/b14266/apdvcntx.htm
