Custom SQL mappings

Hi,
Is it possible to model associations between classes that are not foreign-key based, but rather arbitrary SQL? How about class attributes that are a PL/SQL call or a subselect? How is this done, and what kind of mapping would it be? It seems to me that the many-to-one, one-to-one and direct-to-field mapping editors only let me pick columns, not PL/SQL expressions.
Thanks!
Calin

The mapping editor does not provide support for specifying custom SQL or a stored procedure within your mapping. You can do this through code by customizing the selection query on the mapping to use either a SQLCall or a StoredProcedureCall. This is typically configured through a descriptor after-load method, as in our employee example included in the [url=http://www.oracle.com/technology/products/ias/toplink/examples/index.html]Foundation Examples[/url]. A very trivial example of customizing the SQL for a 1:1 mapping would look like:
public static void afterLoadEmployee(ClassDescriptor descriptor) {
    OneToOneMapping mapping =
        (OneToOneMapping) descriptor.getMappingForAttributeName("address");
    mapping.setSelectionSQLString(
        "SELECT * FROM ADDRESS WHERE ADDRESS_ID = #ADDR_ID");
}
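If a stored procedure is needed instead, the same after-load hook can install a StoredProcedureCall as the selection call. A minimal sketch, assuming a hypothetical READ_ADDRESS procedure and that your TopLink version exposes setSelectionCall on the mapping (otherwise wrap the call in a custom selection query):

public static void afterLoadEmployeeWithProcedure(ClassDescriptor descriptor) {
    OneToOneMapping mapping =
        (OneToOneMapping) descriptor.getMappingForAttributeName("address");
    StoredProcedureCall call = new StoredProcedureCall();
    call.setProcedureName("READ_ADDRESS"); // hypothetical procedure name
    // bind the procedure argument to the ADDR_ID foreign key field on the source row
    call.addNamedArgument("P_ADDR_ID", "ADDR_ID");
    mapping.setSelectionCall(call);
}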
Doug

I tried this, but I get a NullPointerException when it is used on a one-to-many mapping (the additional caveat is that I have turned on useBatchReading). There was no setSelectionSQLString, so I used setSQLString. There was one error:
(JUnitTestForToplink) java.lang.NullPointerException
at oracle.toplink.mappings.ForeignReferenceMapping.batchedValueFromRow(ForeignReferenceMapping.java:105)
at oracle.toplink.mappings.ForeignReferenceMapping.valueFromRow(ForeignReferenceMapping.java:891)
at oracle.toplink.mappings.DatabaseMapping.readFromRowIntoObject(DatabaseMapping.java:860)
at oracle.toplink.internal.descriptors.ObjectBuilder.buildAttributesIntoObject(ObjectBuilder.java:164)
at oracle.toplink.internal.descriptors.ObjectBuilder.buildObject(ObjectBuilder.java:322)
at oracle.toplink.internal.descriptors.ObjectBuilder.buildObjectsInto(ObjectBuilder.java:382)
at oracle.toplink.internal.queryframework.DatabaseQueryMechanism.buildObjectsFromRows(DatabaseQueryMechanism.java:146)
at oracle.toplink.queryframework.ReadAllQuery.execute(ReadAllQuery.java:48)
at oracle.toplink.queryframework.DatabaseQuery.execute(DatabaseQuery.java:493)
at oracle.toplink.queryframework.ReadQuery.execute(ReadQuery.java:125)
at oracle.toplink.publicinterface.Session.internalExecuteQuery(Session.java:1958)
at oracle.toplink.threetier.ServerSession.internalExecuteQuery(ServerSession.java:629)
at oracle.toplink.publicinterface.Session.executeQuery(Session.java:1086)
at oracle.toplink.publicinterface.Session.executeQuery(Session.java:1038)

Similar Messages

  • OracleXADataSource and Custom SQL mappings

    Hi.
    In WLS 7 SP4, custom SQL types worked fine with my datasource, which was configured with a connection pool whose driver was OracleConnection. I changed the pool driver to OracleXADataSource because of a new requirement, and set up two new TxDataSources and an extra connection pool for the second DB, as directed by the BEA e-docs. Everything deployed fine but I ran into this problem: Connection.getTypeMap() returns null from the TxDataSource. Creating a new Map and passing it as parameter to setTypeMap() won't work.
    Am I missing some additional configuration in WLS/Oracle DB or can't custom SQL types be used with OracleXADataSource?
    Just in case: the original DB is Oracle 9i, which has all the custom types defined. The new DB is Oracle 8i, which does not use any custom types at all.
    Any hint to this problem will be greatly appreciated.
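    For reference, a minimal sketch of the lookup being described (the JNDI name, SQL type name and mapped Java class here are hypothetical, and javax.naming, javax.sql and java.sql imports are assumed); through the TxDataSource the connection's type map comes back null, and installing a fresh map as shown below reportedly does not take effect:

         Context ctx = new InitialContext();
         DataSource ds = (DataSource) ctx.lookup("jdbc/myTxDataSource"); // hypothetical JNDI name
         Connection conn = ds.getConnection();
         java.util.Map typeMap = conn.getTypeMap();   // returns null when obtained via the TxDataSource
         if (typeMap == null) {
             typeMap = new java.util.HashMap();
         }
         typeMap.put("SCOTT.MY_CUSTOM_TYPE", MyCustomType.class); // hypothetical SQL type / mapped class
         conn.setTypeMap(typeMap);                    // has no effect in this configuration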


  • DB polling using custom SQL in SOA Suite 11g

    Hi,
    We are trying to poll records from a database using custom sql. But all the records in the table are being picked up at the same time and not as per the polling frequency set in the adapter.
    Below is the configuration from the .jca file:
    <endpoint-activation portType="II_ptt" operation="receive">
    <activation-spec className="oracle.tip.adapter.db.DBActivationSpec">
    <property name="DescriptorName" value="II.OrderRequest"/>
    <property name="QueryName" value="IISelect"/>
    <property name="MappingsMetaDataURL" value="II-or-mappings.xml"/>
    <property name="PollingStrategy" value="DeletePollingStrategy"/>
    <property name="PollingInterval" value="60"/>
    <property name="MaxRaiseSize" value="10"/>
    <property name="MaxTransactionSize" value="10"/>
    <property name="SequencingColumn" value="REQUEST_SAK"/>
    <property name="NumberOfThreads" value="1"/>
    <property name="ReturnSingleResultSet" value="false"/>
    <property name="DeleteDetailRows" value="false"/>
    </activation-spec>
    </endpoint-activation>
    Please let us know if anything else needs to be set to enable the polling as per the frequency in the adapter.
    Thanks.

    As the link from Anuj said, you also need to configure 'Distributed Polling' in the wizard. From 11.1.1.3 onward this speed-limit trick no longer works out of the box; you will also have to set usesSkipLocking="false" in your DbAdapter connection pool definition. Skip locking eliminates the locking contention issue, so this primitive kind of load balancing is no longer needed in that case.
    Thanks
    Steve

  • Error  While Polling Database changes using Custom SQL

    Hi,
    I am using the Oracle SOA Suite 11g DbAdapter for polling database changes.
    My source database is DB2; I am using a custom query for polling the changes in the source database.
    I am using the "Update an External Sequencing Table on a Different Database" polling strategy.
    Query section of the Mapping file is as follows.
    <querying xsi:type="query-policy">
    <queries>
    <query name="ReceiveF554102TXEDataSelect" xsi:type="read-all-query">
    <call xsi:type="sql-call">
    <sql><![CDATA[SELECT FILE_NAME, LIBRARY_NAME,
                   USER_NAME, SEQUENCE_NUM,
                   JOURNAL_CODE, ENTRY_TYPE, TIME_STAMP, JOB_NAME, JOB_USER,
                   JOB_NUMBER, PROGRAM_NAME, ARM_NUMBER, ADDRESS_FAMILY, RMT_ADDRESS,
                   REMOTE_PORT, SYSTEM_NAME, REL_RECORD, OTRCLN, IBMCU, IBITM, IBLITM,
                   IBAITM, IBY55ETAFR, IBY55ETATH, IBY55NUM01, IBY55NUM02, IBY55NUM03,
                   IBY55NUM04, IBY55NUM05, IBY55NUM06, IBY55NUM07, IBY55NUM08,
                   IBY55NUM09, IBY55NUM10, IBELM01, IBELM02, IBELM03, IBELM04, IBELM05, IBY55STR01,
                   IBY55STR02, IBY55STR03, IBY55STR04, IBY55STR05, IBY55CHA01,
                   IBY55CHA02, IBY55CHA03, IBY55CHA04, IBY55CHA05, IBY55DAT01, IBY55DAT02,
                   IBY55DAT03, IBY55DAT04, IBY55DAT05, IBUSER, IBPID, IBJOBN, IBUPMT,
                   IBUPMJ FROM PY_JRNMON.F554102T WHERE ((TIME_STAMP > (SELECT LAST_READ_DATE FROM SOALIB.SOASEQHDR WHERE (TABLE_NAME = 'F554102T'))) AND (TIME_STAMP < SYSDATE)) ORDER BY TIME_STAMP ASC]]> </sql>
    </call>
    <reference-class>ReceiveF554102TXEData.F554102T</reference-class>
    <lock-mode>none</lock-mode>
    <container xsi:type="list-container-policy">
    <collection-type>java.util.Vector</collection-type>
    </container>
    </query>
    </queries>
    <delete-query xsi:type="delete-object-query">
    <call xsi:type="sql-call">
    <sql>UPDATE SOALIB.SOASEQHDR SET LAST_READ_DATE = #LAST_READ_DATE WHERE (TABLE_NAME = 'F554102T')</sql>
    </call>
    </delete-query>
    </querying>
    I am getting the following error after defining the custom SQL in my mapping.xml file (after executing the created process):
    Caused by: BINDING.JCA-11622
    Could not create/access the TopLink Session.
    This session is used to connect to the datastore.
    Caused by BINDING.JCA-11626
    Query Not Found Exception.
    Could not find Named Query [ReceiveSampleCustomPollSelect] with ? arguments, belonging to Descriptor [ReceiveSampleCustomPoll.F554102T] for [oracle.tip.adapter.db.DBActivationSpec@2ec733b0] inside of mappings xml file: [ReceiveSampleCustomPoll-or-mappings.xml].
    You may have changed the queryName info in either the _db.jca or toplink-mappings.xml files so that they no longer match.
    Make sure that the queryName in the activation/interactionSpec and in the Mappings.xml file match. If an old version of the descriptor has been loaded by the database adapter, you may need to bounce the app server. If the same descriptor is described in two separate Mappings.xml files, make sure both versions include this named query.
    You may need to configure the connection settings in the deployment descriptor (i.e. DbAdapter.rar#META-INF/weblogic-ra.xml) and restart the server. This exception is considered not retriable, likely due to a modelling mistake. This polling process will shut down, unless the fault is related to processing a particular row, in which case polling will continue but the row will be rejected (faulted).
         at oracle.tip.adapter.db.exceptions.DBResourceException.createNonRetriableException(DBResourceException.java:653)
         at oracle.tip.adapter.db.exceptions.DBResourceException.createEISException(DBResourceException.java:619)
         at oracle.tip.adapter.db.exceptions.DBResourceException.couldNotCreateTopLinkSessionException(DBResourceException.java:291)
         at oracle.tip.adapter.db.DBManagedConnectionFactory.acquireSession(DBManagedConnectionFactory.java:883)
         at oracle.tip.adapter.db.transaction.DBTransaction.getSession(DBTransaction.java:375)
         at oracle.tip.adapter.db.DBConnection.getSession(DBConnection.java:266)
         at oracle.tip.adapter.db.InboundWork.init(InboundWork.java:322)
         at oracle.tip.adapter.db.InboundWork.run(InboundWork.java:526)
         ... 4 more
    Caused by: BINDING.JCA-11626
    Query Not Found Exception.
    Could not find Named Query [ReceiveSampleCustomPollSelect] with ? arguments, belonging to Descriptor [ReceiveSampleCustomPoll.F554102T] for [oracle.tip.adapter.db.DBActivationSpec@2ec733b0] inside of mappings xml file: [ReceiveSampleCustomPoll-or-mappings.xml].
    You may have changed the queryName info in either the _db.jca or toplink-mappings.xml files so that they no longer match.
    Make sure that the queryName in the activation/interactionSpec and in the Mappings.xml file match. If an old version of the descriptor has been loaded by the database adapter, you may need to bounce the app server. If the same descriptor is described in two separate Mappings.xml files, make sure both versions include this named query.
         at oracle.tip.adapter.db.exceptions.DBResourceException.queryNotFoundException(DBResourceException.java:694)
         at oracle.tip.adapter.db.ox.TopLinkXMLProjectInitializer.initializeQuery(TopLinkXMLProjectInitializer.java:1238)
         at oracle.tip.adapter.db.DBManagedConnectionFactory.acquireSession(DBManagedConnectionFactory.java:728)
         ... 8 more
    Help required to fit custom SQL in mapping.xml file for complex queries.
    Thanks,
    Arun Jadhav.

    "Execuite Custom SQL" in DB adapter and you'l get it.
    But in this case you'll have to implement your own polling strategy (if you need one).
    If you want to use one of the predefined polling strategy you should use "Poll for New or Changed records" and import all the tables you use and connect them in wizard.

  • Discoverer Custom SQL Report with Group By,Having

    Hi all,
    I'm working with Discoverer 4.1.41.05 and retrieve data from Oracle HRMS.
    I want to create a report based on a custom SQL Folder in Discoverer Administration to solve the following scenario:
    SELECT EMPLOYEE_NUMBER FROM PER_ALL_PEOPLE_F
    MINUS
    SELECT EMPLOYEE_NUMBER FROM PER_ALL_PEOPLE_F,PER_EVENTS, PER_BOOKINGS
    GROUP BY EMPLOYEE_NUMBER
    HAVING SUM(PER_EVENTS.ATTRIBUTE1) >= 10
    The problem is that when I run the SQL from SQL*Plus it works fine and produces the correct result.
    When I create the custom folder and create a Discoverer report based on it, I retrieve all employees, since the second select statement retrieves no rows!
    Does anyone have any idea how I can solve this problem, or if there is another way to create this scenario?
    Any feedback is much appreciated.
    Thanking you in advance.
    Elena

    Hi Elena,
    I think you are missing some joins, you could try something like:
    SELECT EMPLOYEE_NUMBER FROM PER_ALL_PEOPLE_F p
    WHERE NOT EXISTS
    (SELECT NULL
    FROM PER_EVENTS e, PER_BOOKINGS b
    WHERE b.person_id = p.person_id
    AND e.event_id = b.event_id
    GROUP BY b.person_id
    HAVING SUM(e.ATTRIBUTE1) >= 10)
    Rod West

  • How can I use the Rownum/Customized SQL query in a Mapping?

    Hi,
    * I need to use ROWNUM to populate one of the target fields. How do I create a mapping with ROWNUM?
    * How can I use the DUAL table in an OWB mapping?
    * Can I write a customized SQL query in OWB? How can I achieve this in a mapping?
    Thanks in Advance
    Kishan

    Hi Niels,
    As I'm sure you know, the conundrum is that Reports doesn't know how many total pages there will be in the report until it is all done formatting, which is too late for your needs. So, one classical solution to this problem is to run the report twice, storing the total number of pages in the database using a format trigger, and throwing away the output from the first run when you don't know the total number of pages.
    Alternatively, you could define a report layout so that the number of pages in the output is completely predictable based upon, say, the number of rows in the main query. E.g., set a limit of one, two, ... rows per page, and then you'll know how many pages there will be simply because you can count the rows in a separate query.
    Hope this helps...
    regards,
    Stewart

  • How to specify custom SQL in polling db adapter with logical delete option

    Hi all,
    I am writing a SOA composite app using JDeveloper SOA Suite 11.1.1.4 connecting to a SQL Server db using a polling DB Adapter with the logical delete option to send data to a BPEL process.
    I have requirements which go beyond what is supported in the JDeveloper UI for DB Adapter polling options, namely:
    * update more than one column to mark each row read, and
    * specify different SQL for the logical delete operation based on whether bpel processing of the data polled was successful or not.
    A complicating factor is that the polling involves two tables. Here is my full use-case:
    1) Polling will select data derived from two tables: e.g. 'headers' and 'details' simplified for this example:
    table: headers
    hid - primary key
    name - data label
    status - 'unprocessed', 'processed', or 'error'
    processedDate - null when data is loaded, set to current datetime when row is processed
    table: details
    hid - foreign key pointed at header.hid
    attr - data attribute name
    value - value of data attribute
    2) There is a many:1 relationship between detail and header rows through the hid columns. The db adapter polling SELECT shall return results from an outer join consisting of one header row and the associated detail rows where header.status = 'unprocessed' and header.hid = details.hid. (This is supported by the Jdeveloper UI)
    3) The polled data will be sent to be processed by a bpel process:
    3.1) If the bpel processing succeeds, the logical delete (UPDATE) operation shall set header.status = 'processed', and header.processedDate = 'getdate()'.
    3.2) If bpel processing fails (e.g. hits a data error while processing the selected data) the logical delete (UPDATE) operation shall set header.status = 'failed', header.processedDate = 'getdate()', and header.errorMsg = '{some text returned from bpel}'.
    Several parts of #3 are not supported by the JDeveloper UI: updating multiple columns to mark the row processed, using getdate() to populate a value of one of those column updates, doing different update operations based on the results of the BPEL processing of the data (success or error), and using data obtained from BPEL processing as a value of those column updates (error message).
    I have found examples which describe specifying custom SQL using the polling delete option to create a template and then modifying the TopLink file(s) to specify custom select and update SQL to implement a logical delete (e.g. http://dlimiter.wordpress.com/2009/11/05/advanced-logic-in-oracle-bpel-polling-database-adapter/ and http://myexperienceswithsoa.blogspot.com/2010/06/db-adapter-polling-tricks.html). But none of them match what I've got in my project, in the first case maybe because I'm using a higher version of JDeveloper, and in the second I think because in my case two tables are involved.
    Any suggestions would be appreciated. Thanks, John

    Hi John,
    You've raised a good scenario.
    First of all, let me say that the purpose of the DB polling transaction is to have an option to initiate a process from a DB table/view, not to update multiple fields in a table (or perform other complex manipulation on the table).
    So, when you choose to update a field in a record after reading it, you are "telling" the engine not to poll this record again. Sure, I guess you can find a solution/workaround for it, but I don't think this is the way...
    The question now is what to do?
    You can have another DB adapter where you update the data after finishing the process. In that case, after reading the data (in the polling transaction), update header.status = 'processed', for example, and after processing the selected data update the rest of the fields.
    Hope it makes some sense to you.
    Arik

  • Using information from ItResource for executing custom sql in OIM 11g

    I need to execute a custom SQL query using the connection of an ITResource. How can I get a connection instance to execute the SQL query? Does someone have an idea of how I can do this? Help me ...
    Tks

    HashMap<String, String> itAttribute = new HashMap<String, String>();
    itAttribute.put("IT Resources.Name", "IT-DB01");
    // obtain the tcITResourceInstanceOperationsIntf API handle, e.g. from the OIM 11g platform
    tcITResourceInstanceOperationsIntf itResource =
        Platform.getService(tcITResourceInstanceOperationsIntf.class);
    tcResultSet resItResKey = itResource.findITResourceInstances(itAttribute);
    resItResKey.goToRow(0);
    long itResourceKey = resItResKey.getLongValue("IT Resource.Key");
    tcResultSet resItRes = itResource.getITResourceInstanceParameters(itResourceKey);
    Append the following lines:
              HashMap<String, String> hashMap = new HashMap<String, String>();
              int parameterCount = resItRes.getRowCount();
              for (int i = 0; i < parameterCount; i++) {
                   resItRes.goToRow(i);
                   hashMap.put(
                             resItRes.getStringValue(ZAPConstants.IT_RESOURCES_TYPE_PARAMETER_NAME),
                             resItRes.getStringValue(ZAPConstants.IT_RESOURCES_TYPE_PARAMETER_VALUE));
              }
              return hashMap;
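    With those parameters in hand, a plain JDBC connection can then be opened to run the custom SQL. A minimal sketch, assuming the IT resource defines parameters named "databaseURL", "username" and "password" (the actual parameter names depend on your IT resource type definition; java.sql imports assumed):

         String url = hashMap.get("databaseURL");   // hypothetical parameter names
         String user = hashMap.get("username");
         String password = hashMap.get("password");
         // Class.forName(hashMap.get("driver")); // may be required for older JDBC drivers
         Connection conn = DriverManager.getConnection(url, user, password);
         try {
             PreparedStatement stmt = conn.prepareStatement("SELECT COUNT(*) FROM USR"); // your custom SQL here
             ResultSet rs = stmt.executeQuery();
             while (rs.next()) {
                 System.out.println(rs.getLong(1));
             }
             rs.close();
             stmt.close();
         } finally {
             conn.close();
         }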

  • Batch Reading with Custom SQL

    Hello,
    Queries
    1. Is it possible to use Batch Reading in conjunction with Custom Stored Procs/ SQL?
    2. Is it possible to map an attribute to a SQL expression (like in Hibernate we have formula columns mapped using the formula* property)?
    Background
    1. We use Toplink 11g (11.1.1.0.1) (not EclipseLink) in our application and are controlling mapping using XML files (not annotations).
    2. We are migrating a legacy application, with most of its data retrieval logic present in stored procedures, to Java.
    3. I am effectively a newbie to Toplink.
    Scenario
    1. We have a deep class hierarchy, with ClassA+ at the top having a one-to-many relation with ClassB+, ClassB+ having a one-to-many relation with ClassC+, and so on and so forth.
    2. For each of these classes the data retrieval logic is present in stored procedures (coming from the legacy application) containing not-so-simple queries.
    3. Also, there are quite a few attributes that actually represent computed values (computed and returned by the stored procedure). The logic for computing the values is not simple either.
    4. So to make things easy we configured TopLink to use the stored procedures to retrieve data for objects of ClassA+, ClassB+ and ClassC+.
    5. But since the class hierarchy is deep, we ended up firing too many stored procedure calls to the database.
    6. We thought we could use the Batch Reading feature to help with this, but I have come across documentation that says it won't work if you override TopLink's queries with stored procedures.
    7. I wrote some sample code to determine this, and for the hierarchy shown above it uses the specified custom procedure (I also tried replacing the stored procs with custom SQL, but the behavior is the same) for ClassA+ and ClassB+, but for ClassC+ and below it resorts to its own generated SQL.
    8. This is a problem because the generated SQL contains the names of the computed columns which is not present in the underlying source tables.
    Thanks
    Arvind

    Batch reading is not supported with custom SQL or stored procedures.
    Join fetching is though, so you may wish to investigate that (you need to ensure you return the correct data from the stored procedure).
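    As a rough illustration of the join-fetching alternative (the class and attribute names here are hypothetical, and your custom SQL or stored procedure must return the joined columns for both levels):

         // join-fetch the one-to-many "details" relationship of ClassA in a single query
         ReadAllQuery query = new ReadAllQuery(ClassA.class);
         query.addJoinedAttribute(query.getExpressionBuilder().anyOf("details"));
         List results = (List) session.executeQuery(query);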
    James : http://www.eclipselink.org

  • How to submit custom SQL to the DB in SAP system

    Hi
    In a project in which I participated before, there was a CRM server and a WAS which had an IPC server (a price decision tool). There was a requirement that IPC refer to CRM data, but there was no suitable BAPI in CRM. So, in the UserExit in IPC I wrote programs that submit custom SQL to the CRM DB via JCo.
    In a similar vein, can I submit custom SQL to the DB of an SAP environment (ERP, CRM and so on) from a general WAS (one that doesn't have the IPC)?
    I already know the approach of creating a function module in SAP that executes the SQL and calling it from the WAS. Could you tell me whether other methods exist?
    Best regards,

    What about RFC?
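    For the RFC route, the WAS side would call a custom function module over JCo and let that module run the SQL inside the SAP system. A minimal JCo 3 sketch, assuming a hypothetical destination "SAP_CRM" and a hypothetical function module Z_RUN_QUERY with an import parameter and a result table:

         // obtain a configured RFC destination and the custom function's metadata
         JCoDestination dest = JCoDestinationManager.getDestination("SAP_CRM");   // hypothetical destination
         JCoFunction fn = dest.getRepository().getFunction("Z_RUN_QUERY");        // hypothetical function module
         fn.getImportParameterList().setValue("IV_QUERY", "SELECT ...");          // hypothetical parameter
         fn.execute(dest);
         JCoTable result = fn.getTableParameterList().getTable("ET_RESULT");      // hypothetical table parameter
         for (int i = 0; i < result.getNumRows(); i++) {
             result.setRow(i);
             System.out.println(result.getString(0)); // first column of each result row
         }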

  • Adding custom SQL in ReadAllQuery on hierarchy fails (11.1.1.0.1)

    Hello,
    We are encountering a problem with Toplink 11.1.1.0.1 when adding a custom sql statement within a ReadAllQuery on a parent class. The same code with the same descriptor works fine with 10.1.3.3 and 9.0.4.7.
         ReadAllQuery query = new ReadAllQuery(Person.class);
         ExpressionBuilder b = query.getExpressionBuilder();
         query.addArgument("instId", Integer.class);
         ReportQuery subQuery = new ReportQuery();
         subQuery.setReferenceClass(Person.class); // dummy mapped class...
         SQLCall selectIdsCall = new SQLCall();
         String subSelect = "select inst.ID from PERSON inst where inst.ID = #instId"; // <= re-use bind variable in child query
         selectIdsCall.setSQLString(subSelect);
         subQuery.setCall(selectIdsCall);
         Expression expr = b.get("id").in(subQuery);
         query.setSelectionCriteria(expr);
         Vector params = new Vector(1);
         params.add(new Integer(1));
         List res = (List) session.executeQuery(query, params);
    SQL statements generated by TopLink 11.1.1.0.1:
    SELECT DISTINCT t0.OBJECTTYPE_ID FROM PERSON t0
    WHERE t0.ID IN (select inst.ID from PERSON inst where inst.ID = 1)
    SELECT t0.ID, t0.OBJECTTYPE_ID, t0.GENDER, t0.AGE, t0.FIRSTNAME, t0.LASTNAME, t0.ADDRESS_ID, t0.TITI_ID, t1.ID, t1.BONUS_ID
    FROM PERSON t0, EMPLOYEE t1 WHERE ((t0.ID IN (select inst.ID from PERSON inst where inst.ID = 1)
    AND (t1.ID = t0.ID)) AND (t0.OBJECTTYPE_ID = NULL))
    SQL statements generated by TopLink 10.1.3.3 and 9.0.4.7:
    SELECT DISTINCT t0.OBJECTTYPE_ID FROM PERSON t0
    WHERE t0.ID IN (select inst.ID from PERSON inst where inst.ID = 1)
    SELECT t0.ID, t0.OBJECTTYPE_ID, t0.GENDER, t0.AGE, t0.FIRSTNAME, t0.LASTNAME, t0.ADDRESS_ID, t0.TITI_ID, t1.ID, t1.BONUS_ID
    FROM PERSON t0, EMPLOYEE t1 WHERE ((t0.ID IN (select inst.ID from PERSON inst where inst.ID = 1)
    AND (t1.ID = t0.ID)) AND (t0.OBJECTTYPE_ID = 'EM'))

    It looks like the arguments are shifted. You can see it when using bind variables in 11.1.1.0.1:
    Exception [TOPLINK-4002] (Oracle TopLink - 11g (11.1.1.0.1) (Build 081030)): oracle.toplink.exceptions.DatabaseException
    Internal Exception: java.sql.SQLException: Invalid column index
    Error Code: 17003
    Call: SELECT t0.ID, t0.OBJECTTYPE_ID, t0.GENDER, t0.AGE, t0.FIRSTNAME, t0.LASTNAME, t0.ADDRESS_ID, t0.TITI_ID, t1.ID, t1.BONUS_ID FROM PERSON t0, EMPLOYEE t1 WHERE ((t0.ID IN (select inst.ID from PERSON inst where inst.ID = ?) AND (t1.ID = t0.ID)) AND (t0.OBJECTTYPE_ID = ?))
            bind => [1, null, EM]

    Has anyone already encountered this problem?

  • Can we implement the custom sql query in CR for joining the two tables

    Hi All,
    Is there any way to implement a custom SQL query in CR for joining the two tables?
    My requirement here is that I need to write the SQL logic for joining the two tables...
    Thanks,
    Gana

    In the Database Expert, expand the Create New Connection folder and browse the subfolders to locate your data source.
    Log on to your data source if necessary.
    Under your data source, double-click the Add Command node.
    In the Add Command to Report dialog box, enter an appropriate query/command for the data source you have opened.
    For example:
    SELECT
        Customer.`Customer ID`,
        Customer.`Customer Name`,
        Customer.`Last Year's Sales`,
        Customer.`Region`,
        Customer.`Country`,
        Orders.`Order Amount`,
        Orders.`Customer ID`,
        Orders.`Order Date`
    FROM
        Customer Customer INNER JOIN Orders Orders ON
            Customer.`Customer ID` = Orders.`Customer ID`
    WHERE
        (Customer.`Country` = 'USA' OR
        Customer.`Country` = 'Canada') AND
        Customer.`Last Year's Sales` < 10000.
    ORDER BY
        Customer.`Country` ASC,
        Customer.`Region` ASC
    Note: The use of double or single quotes (and other SQL syntax) is determined by the database driver used by your report. You must, however, manually add the quotes and other elements of the syntax as you create the command.
    Optionally, you can create a parameter for your command by clicking Create and entering information in the Command Parameter dialog box.
    For more information about creating parameters, see To create a parameter for a command object.
    Click OK.
    You are returned to the Report Designer. In the Field Explorer, under Database Fields, a Command table appears listing the database fields you specified.
    Note:
    To construct the virtual table from your Command, the command must be executed once. If the command has parameters, you will be prompted to enter values for each one.
    By default, your command is called Command. You can change its alias by selecting it and pressing F2.

  • How to update flag in multiple tables using custom sql DB adapter

    hi all,
    I have a scenario: I want to update flags in multiple tables in DB2. I have used TopLink "update only" to update all tables after creating relationships between them. But that approach is not working, as it could not detect omissions with DB2 and updates the complete record with blank values in the other columns.
    So, i want to use custom sql now. Can anybody help in resolving the issue or in writing the custom sql.
    Regards
    Richa

    Dear SeánMacGC thanks for reply,
    But "a.changed" is not a field in GNMT_CUSTOMER_MASTER_CHG. what i am doing in this procedure is i am collecting bulck data and validating field by field from GNMT_CUSTOMER_MASTER_CHG with GNMT_CUSTOMER_MASTER table as their structure is same.. if v_name is not same as v_name_chg then i am setting changed flag to "Y" changed is "changed dbms_sql.varchar2_table" and updating GNMT_CUSTOMER_MASTER in bluck where changed flag ='Y'...
    type custRec is record
    n_cust_ref_no dbms_sql.number_table,
    v_name dbms_sql.varchar2_table,
    v_name_chg dbms_sql.varchar2_table,
    rowid rowidArray,
    *changed dbms_sql.varchar2_table*
    i cannot use simple SQL as i need to validate field for each records with GNMT_CUSTOMER_MASTER_CHG and insert into log file as well.....
    to run this procedure:
    execute DO_DC_NAME_UPDATE_OTHER_TAB.DO_NAME_UPDATE_OTHER_TAB;
    Thanks...

  • Execute custom SQL in DB Adapter errors out in Production - urgent please.

    Hi,
    I get this error when running the process in PROD while invoking a delete operation, which I'm doing by selecting the Execute Custom SQL option when defining the DB adapter. It worked fine in the TEST environment but errors out in the PROD environment.
    Any suggestions on why? It's urgent.
    Thanks
    -Prapoorna
    The state of this instance is Faulted
    <messages><input><InvokeDelete_InputVariable><part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="DeleteDataInput_msg"><DeleteDataInput xmlns="http://xmlns.oracle.com/pcbpel/adapter/db/DeleteData"/>
    </part></InvokeDelete_InputVariable></input><fault><remoteFault xmlns="http://schemas.oracle.com/bpel/extension"><part name="code"><code>17410</code>
    </part><part name="summary"><summary>file:/prod/app/bpel/as/bpel/domains/verint/tmp/.bpel_SynchOTLLoad_1.0_0259c5f142631959ec0fb067d23fe98e.tmp/DeleteData.wsdl [ DeleteData_ptt::DeleteData(DeleteDataInput_msg) ] - WSIF JCA Execute of operation 'DeleteData' failed due to: Pure SQL Exception.
    Pure SQL Execute of delete from xxhxc_timeattend_to_otl where report_date >= TRUNC(ADD_MONTHS(sysdate, -1),'MM') failed. Caused by java.sql.SQLException: No more data to read from socket.
    ; nested exception is:
    ORABPEL-11633
    Pure SQL Exception.
    Pure SQL Execute of delete from xxhxc_timeattend_to_otl where report_date >= TRUNC(ADD_MONTHS(sysdate, -1),'MM') failed. Caused by java.sql.SQLException: No more data to read from socket.
    The Pure SQL option is for border use cases only and provides simple yet minimal functionality. Possibly try the "Perform an operation on a table" option instead.
    </summary>
    </part><part name="detail"><detail>No more data to read from socket</detail>
    </part></remoteFault></fault></messages>

    Looks like in production you have much more data than you do in your test environments.
    It could be something wrong with your rollback segments. Is there any other activity happening when this is running?
    What happens when you run this command inside SQLPlus?
    cheers
    James

  • EDI customer specific mappings

    Hello Experts,
    I have a scenario where an old legacy system is planned to be replaced by PI (7.11)-Seeburger.
    The old system, based on Gentran, had around 100 maps pertaining to its 100 customers for distribution of their invoices and ASNs.
    Hence I wanted to find out how this can best be achieved using PI-Seeburger. We thought of the following options:
    1] Create 100 mappings in PI and have conditional interface determination based on the customer. But in this case we will have to hard-code customer DUNS numbers into the interface determination conditions.
    2] Create one base mapping in PI and another in ABAP mappings, with the customer formats maintained in Z-tables of the PI ABAP stack and the ABAP mapping filtering on them. This would lead to a complex ABAP mapping program.
    3] Create one mapping in PI and write conditions in Seeburger BIC mappings. Again, this means writing many conditions in BIC mappings.
    If anyone has worked on such a scenario, please let me know the best suitable solution in this case based on your experience. Thanks in advance.
    Regards
    Rajeev

    >> One Receiver say EDIReceiver
    This is recommended. But then your EDIReceiver system should be able to send invoices to all the 100 customers based on the data in EDI file.
    >>100 interface determinations for 100 maps with conditions based on the partners
    If you have the option/resources/time to functionally check the details for each of these customers individually, it is recommended to group them based on some business rule. Creating 100 maps blindly will create many redundant mappings. Try to make small groups of 4-5 mappings based on business requirements and reduce the number of maps. When thinking about changes to customer-specific mappings, always think two ways.
    e.g. If you have separate maps for 100 customers and let's say they belong to the same region: in future, if due to any legal requirement change any change is required in the maps, it would be tedious and not so good design to change all the maps. Therefore, as I said, group them based on business/functional logic.
    >>Multiple Receiver agreements based on the versions required each having comm channel with X2E mapping for the version required.
    Note that these standard X2E and E2X mappings are part of Seeburger BIC license. If you already have purchased BIC, your estimation for this part should be minimal.
    Regards,
    Prateek Raj Srivastava
