Direct to field mapping to a foreign column

I am using the TopLink Workbench to map an attribute to a column in a foreign table.
Example:
Table Emp {
    EmpID,
    AddressID,   -- foreign key to the Address table
    StreetName
}
Table Address {
    AddressID,   -- primary key
    StreetName
}
Class Emp {
    empId;
    addressId;
    streetName;
}
Class Address {
    addressId;
    streetName;
}
In the workbench I have the descriptors defined and I am trying to map attribute 'Emp->streetName' to the column 'Address->StreetName'.
I added the Address table in Advanced Properties -> Multitable Info of the Emp class and also defined the association between the tables.
I mapped the 'streetName' attribute to the 'Address->StreetName' column using a direct to field mapping.
When I call the getter for streetName on the Emp class I get null, and I don't see any query being executed by TopLink to retrieve the streetName.
I don't want to load the complete Address object just to grab the streetName from the Address table.
Has anybody tried to do something similar? If yes, I would appreciate your hints.
-Rama

TopLink for POJOs does not support lazy loading of primitive type attributes. Lazy loading is only possible with relationships (i.e., one-to-one, one-to-many, many-to-many).
Mapping the same column data to two different classes does entail a number of complications, but regardless, you'll eventually have to perform a join between the EMP and ADDRESS tables to obtain the street name associated with an Employee. If you model your Employee-Address relationship as a one-to-one mapping, with Employee having an attribute of type Address:
class Emp {
    long id;
    Address address;
}
class Address {
    long id;
    String streetName;
}
you can configure the relationship to use joining[1] so that when you read an Emp you'll also read the associated Address in a single statement.
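A plain-Java sketch of that object model (class and field names are illustrative, and the TopLink one-to-one-with-joining configuration is assumed to be done in the Workbench as described), with a convenience accessor that delegates to the joined Address:

```java
// Minimal sketch of the suggested object model; TopLink would populate
// Emp.address via the joined one-to-one mapping.
class Address {
    long id;
    String streetName;

    Address(long id, String streetName) {
        this.id = id;
        this.streetName = streetName;
    }
}

class Emp {
    long id;
    Address address; // filled in by the one-to-one mapping

    Emp(long id, Address address) {
        this.id = id;
        this.address = address;
    }

    // Convenience accessor: callers get the street name without
    // navigating the relationship themselves.
    String getStreetName() {
        return address == null ? null : address.streetName;
    }
}

public class Demo {
    public static void main(String[] args) {
        Emp emp = new Emp(1L, new Address(10L, "Main St"));
        System.out.println(emp.getStreetName());
    }
}
```

With joining enabled, reading an Emp issues a single SELECT over both tables, so the accessor above never triggers a second query.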
--Shaun
[1] http://www.oracle.com/technology/products/ias/toplink/doc/10131/main/_html/oomapcfg002.htm#sthref3924

Similar Messages

  • Lazy loading or indirection for a direct to field mapping string object

    Hi,
    I need to achieve indirection for a String instance variable, currently mapped using a direct to field mapping.
    This field is a text column in SQL Server storing large data which is rarely read.
    Recently we have been noticing that it is causing a lot of I/O on our SAN system.
    Is there a way indirection can be achieved for this field, which is currently mapped as direct to field?
    I have tried using indirection in a transformation mapping, but profiling the db indicates the above-mentioned text column is still part of the select query.
    Thanks
    Nagesh

    Hi Nagesh,
    There is no support for lazy loading of DirectToField mappings in TopLink 10.1.3 POJO persistence. Typically, people who have large CLOBs or BLOBs will put them in a separate table to avoid loading them with their owner. If you can't refactor your data model to accommodate this, things get more complicated. Do you ever update these large structures, or are they read-only in your application? If they are read-only, then a very simple approach is to not map the large data column and instead lazily load the data with a report query.
    Ideally you would upgrade to TopLink JPA in 11g where lazy loading of Basic mappings is supported. :-)
    --Shaun
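    A sketch of the lazy-load-on-demand pattern Shaun describes, in plain Java: the large column is left unmapped, and the getter fetches it once via a pluggable loader. The Supplier here is a stand-in for the real fetch (in TopLink that would be a report query selecting only the text column for this row); all names are illustrative.

```java
import java.util.function.Supplier;

// Lazily load an unmapped large column on first access. The Supplier
// stands in for a report query that selects only the large text column.
class Document {
    private final Supplier<String> largeTextLoader;
    private String largeText;   // not mapped; fetched on demand
    private boolean loaded;

    Document(Supplier<String> largeTextLoader) {
        this.largeTextLoader = largeTextLoader;
    }

    String getLargeText() {
        if (!loaded) {
            largeText = largeTextLoader.get(); // hits the database once
            loaded = true;
        }
        return largeText;
    }
}

public class Demo {
    public static void main(String[] args) {
        final int[] calls = {0};
        Document doc = new Document(() -> { calls[0]++; return "big clob"; });
        doc.getLargeText();
        doc.getLargeText();
        System.out.println(calls[0]); // loader runs only on the first call
    }
}
```

    The `loaded` flag (rather than a null check) lets the pattern also cache a legitimately null column value.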

  • One-to-one-to-one direct to field mapping

    Yes, I didn't misspell it: there IS a one-to-one-to-one relationship in the database I am mapping - don't ask me why!
    To make it simple:
    TABLE A (a,b, att1)
    TABLE B (b,c, att2)
    TABLE C (c, att3)
    My descriptor class is DescA.java, with all attributes (String att1, String att2 and String att3), and it is associated directly with table A.
    I have mapped att1 using direct to field.
    I have mapped att2 using direct to field and multitable.
    But there is no direct relationship between tables A and C, which makes multitable configuration impossible (?).
    So, how can I map C.att3 direct to DescA.att3?
    []'s
    Henrique

    Henrique,
    If you are using the TopLink Workbench you will have to specify the relationship between tables B and C using an After Load method. The Mapping Workbench does not support this scenario yet, but it is allowed by TopLink: use the API ClassDescriptor.addMultipleTableForeignKeyFieldName(String, String) to specify the relationship.
    In the workbench you will see a problem message about the primary keys not matching and no reference being specified. You can safely ignore this warning; or, if it bothers you, the only workaround I can think of is to also specify the mappings to table C in the After Load method.
    Karen

  • Mapping tool creates new columns one at a time

    Hi All,
    I'm using Kodo 3.2.2 with MySql 4.1. I have a fairly large table (over
    2million rows) representing one persistent class. I need to add several
    new fields to the object, which will require a refresh mapping on the
    database to add new columns for the fields.
    It seems that when I run a refresh mapping on the database, Kodo is adding each new field individually in separate ALTER TABLE statements, rather than adding all the columns in one ALTER TABLE statement.
    With MySQL and large tables, this can really increase the time required to refresh the mapping - I've tried with MSSQL, and that database must handle ALTER TABLE commands differently, because it takes only a fraction of the time that MySQL takes.
    Is this a bug (I doubt it), or rather a candidate for feature improvement?
    Thoughts?
    Thanks,
    Brendan

    Thanks Abe for the reply - I've been meaning to look at the process of
    creating SQL scripts to perform the refresh mappings on our databases.
    I'll have a look at this process before we talk contracting!
    Thanks,
    Brendan
    Abe White wrote:
    Unfortunately, we don't have any immediate plans to optimize ALTER TABLE since it's not a runtime operation, and as you say most DBs don't have a problem with it. Also, note that the documentation shows how to get Kodo to create SQL scripts rather than directly modify the DB; you could easily edit the scripts Kodo generates to make them more efficient before piping the SQL to MySQL using its client app. (Note that Kodo 3.3 introduces the ability to generate SQL scripts directly from the mapping tool, rather than having to take an extra step through the schema tool.)
    If getting Kodo to combine ALTER statements is important to you, you could also contact [email protected] about contracting us to expedite this work.

  • How to create Rules with Flex Field mapping in the bpm worklist

    I have created a flex field label and was able to map the flex field attributes.
    But when I try to create rules, I don't see the label or the flex attributes in the task payload.
    Can someone please help in understanding how to create rules with flex field mapping in the BPM worklist?
    I am also searching for any scripts which will take the flex field prompts and directly create a label in the BPM worklist.
    Any pointers or suggestions are highly appreciated.

    Hi,
    SE38 -> Enter program
    Select Variants button and display. In the next screen, enter a variant name, (If not existing , press Create to create new one), else click on Change.
    Now the selection screen will display with a button "Variant Attributes" at the top.
    Click on that button.
    In the next screen, go to the selection variable column of the date field. Press F4 or drop down and select 'D' for date maintenance.
    In the column "Name of Variable (Input Only Using F4)" press F4 or drop down, select whichever kind of date calculation you want and save the variant.
    Now whenever you run the program with this variant, the date will be displayed by default.
    Regards,
    Subramanian

  • R11i: AR to GL field mapping

    Product: FIN_AR
    Date written: 2004-05-13
    R11i: AR to GL FIELD MAPPING
    ============================
    PURPOSE
    Makes it possible to track data transferred from AR to GL.
    Explanation
    This was written with reference to the GL_INTERFACE table.
    GL_INTERFACE.REFERENCE21 is identical to GL_JE_LINES.REFERENCE_1, and the subsequent columns map in the same way:
    GL_INTERFACE.REFERENCE21 = GL_JE_LINES.REFERENCE_1, ...,
    GL_INTERFACE.REFERENCE29 = GL_JE_LINES.REFERENCE_9
    Adjustments
    REFERENCE21     posting_control_id
    REFERENCE22     adjustment_id
    REFERENCE23     line_id
    REFERENCE24     trx_number
    REFERENCE25     adjustment_number
    REFERENCE26     cust_trx_type
    REFERENCE27     bill_to_customer_id
    REFERENCE28     ADJ
    REFERENCE29     source_type
    Transactions
    REFERENCE21     posting_control_id
    REFERENCE22     customer_trx_id
    REFERENCE23     cust_trx_line_gl_dist_id
    REFERENCE24     trx_number
    REFERENCE25     cust.account_number
    REFERENCE26     CUSTOMER
    REFERENCE27     bill_to_customer_id
    REFERENCE28     type(CM/DM/CB/INV)
    REFERENCE29     type||account_class
    REFERENCE30     RA_CUST_TRX_LINE_GL_DIST
    Receipts
    REFERENCE21     posting_control_id
    REFERENCE22     cash_receipt_id||cash_receipt_history_id or cash_receipt_id for MISC
    REFERENCE23     line_id
    REFERENCE24     receipt_number
    REFERENCE25     null for CASH / cash_receipt_history_id for MISC
    REFERENCE26     null
    REFERENCE27     pay_from_customer
    REFERENCE28     MISC / TRADE
    REFERENCE29     MISC_sourcetype or TRADE_source_type
    REFERENCE30     AR_CASH_RECEIPT_HISTORY
    Applications
    REFERENCE21     posting_control_id
    REFERENCE22     cash_receipt_id||receivable_application_id for CASH /
    receivable_application_id for CM
    REFERENCE23     line_id
    REFERENCE24     receipt_number for CASH / trx_number for CM
    REFERENCE25     trx_number if status = 'APP' / NULL for unapplied records
    REFERENCE26     cust_trx_type
    REFERENCE27     pay_from_customer for CASH / bill_to_customer_id for CM
    REFERENCE28     application_type (TRADE or CCURR for CASH / CMAPP for CM)
    REFERENCE29     application_type||source_type
    REFERENCE30     AR_RECEIVABLE_APPLICATIONS
    Bills Receivable
    REFERENCE21     posting_control_id
    REFERENCE22     transaction_history_id
    REFERENCE23     line_id
    REFERENCE24     trx_number
    REFERENCE25     customer_Trx_id
    REFERENCE26     cust_trx_type
    REFERENCE27     drawee_id
    REFERENCE28     cust_trx_type
    REFERENCE29     BR_||source_type
    REFERENCE30     AR_TRANSACTION_HISTORY
    Example
    Reference Documents
    Note 199673.1

    Hi,
    Here my problem is that transactional data is also there.
    There is a direct input program: RFBIBL00.
    <b>Doubt:</b>
    While creating the source structure, should I create one source structure or two source structures (one for header, one for item)?
    1) <b>Problems with two source structures</b>
       If I've created two source structures, then in the assign-files step, to which source structure should I assign the input file?
            If I assign it to both structures, it gives an error while reading the data.
                If I assign it to only the header structure, it also gives an error while reading the data.
    2) <b>Problems with one source structure</b>
       If I've created one source structure, it shows an error while reading.
    So what should I do?
    Please try to solve my problem quickly.
    Thank you very much for your response.

  • How to add a new field in the Field Mapping

    Dear Friends,
    I have two unused fields in the source fields compared to the target fields.
    I have to add the 3 fields which are there in my source fields to the target fields (as per a change in the requirement) in the 5th step, i.e. Maintain Field Mapping and Conversion Rules.
    In the mapping and conversion step, between the target structure and the source fields (by clicking the Source Field button and selecting the fields from the popup menu), I want to add the 3 fields which are available in the source list and map them to the target structure.
    When I tried, it is adding them under the other target field.
    Presently I have the mapping (5th step) as below:
       ZPA30_08                       LSMW
           Fields
               TABNAME                      Table Name
                                   Source:  ZP0008_STRC-PERNR (Personnel Number)
                                   Rule :   Default Settings
                                   Code:    ZPA30_08-TABNAME = 'ZPA30_08'.
               TCODE                        Transaction Code
                                   Rule :   Default Settings
                                   Code:    ZPA30_08-TCODE = 'PA30'.
    Now I want to add 3 fields - Pernr, wage type and Amount - but when I am trying, they are getting saved under TABNAME.
    How can I map them? Please let me know.
    Regards
    syamla

    Hi,
    So you need this new field to have data in old records?
    1.- If you are on BI 7.0 and the logic or data for that new field is in the same dimension, you can use Remodeling to fill it - that is, if you want to load from the master data of another InfoObject in the same dimension.
    2.- If condition "1" is not yours:
    First add the new field, then create a backup cube (both cubes with the new field) and make a full update with all the information in the original cube. The new field will be empty in both cubes.
    Create a UR (update rule) from BackUp_Cube to Original_Cube with all direct mappings, and create logic in the start routine of the UR (modifying the data_package); you can look for the data in the DSO that you often use to load.
    To do that, both cubes have to be DataSources (right-click on the cube -> Additional Functions -> I think it is "Extract DataSource").
    Hope it helps. Regards, Federico

  • LDAP Field Mapping in 4.6C - Using WebAS 6.10+ as an LDAP Gateway

    Dear All,
      We have a need to enable CUP Functionality (we use GRC AC 5.3) for one of our oldest R/3 systems - on 4.6C. All other R/3 backends are on 4.7+ releases, so it's a multiple backend configuration for GRC AC.
      However, LDAP Field Mapping functionality is missing in 4.6C. It was enabled through LDAPMAP in the higher releases only.
      At the same time, I discovered in one SAP HR document a diagram, which shows that indeed 4.7+ can map and post data directly to LDAP, but for 4.6C and below you can use WebAS 6.10+ as an LDAP Gateway. Meaning that 4.6C calls through RFC some functions in the higher release R/3 system to use its functions for Field Mapping and further transfer of user data to the target LDAP server.
      But... I can not find anywhere how to configure 4.7 / 6.0 servers to act as an LDAP Gateway for the older 4.6C server to bypass its limitation - absence of built-in LDAP Field Mapping functionality.
      Advice on how to realise this concept will be highly appreciated.
    Thanks,
    Laziz

    Hi,
    In order to migrate users, groups and passwords you have to use the command ldapaddent, as you did, with this syntax:
    # ldapaddent -D "cn=Directory Manager" -w secret -f /etc/group group
    # ldapaddent -D "cn=Directory Manager" -w secret -f /etc/passwd passwd
    # ldapaddent -D "cn=Directory Manager" -w secret -f /etc/shadow shadow
    Note that you must use passwd instead of the people container.
    I suggest you to check this article from BigAdmin http://www.sun.com/bigadmin/features/articles/nis_ldap_part1.jsp
    G.

  • Example of a custom field mapping?

    OK, I admit it, I am struggling here. I have simplified my example from what I actually have.
    I have a table that models a flat hierarchy:
    ID | START_DATE | END_DATE | CLASSNAME | FIELD1 | FIELD2 | ...
    One of the objects in my hierarchy (CashFlow) has a field that is in fact another object, called DatePeriod, which contains two fields, startDate and endDate.
    I understand that what I am trying to do is embed the DatePeriod object inside of the larger object when it gets persisted.
    I have the following metadata set up:
    <class name="CashFlow" persistence-capable-superclass="InstrumentFlow">
    <extension vendor-name="kodo" key="table" value="INSTRUMENT_FLOW"/>
    <extension vendor-name="kodo" key="pk-column" value="ID"/>
    <extension vendor-name="kodo" key="class-column" value="none"/>
    <extension vendor-name="kodo" key="lock-column" value="none"/>
    <field name="accrualPeriod" embedded="true"/>
    </class>
    and for my DatePeriod object
    <class name="DatePeriod">
    <extension vendor-name="kodo" key="table" value="INSTRUMENT_FLOW"/>
    <extension vendor-name="kodo" key="pk-column" value="ID"/>
    <extension vendor-name="kodo" key="lock-column" value="none"/>
    <extension vendor-name="kodo" key="class-column" value="none"/>
    <field name="startDate">
    <extension vendor-name="kodo" key="data-column" value="START_DATE"/>
    </field>
    <field name="endDate">
    <extension vendor-name="kodo" key="data-column" value="END_DATE"/>
    </field>
    </class>
    Every time I try to fetch a CashFlow object I get an error, as Kodo is trying to select the column 'ACCRUALPERIODX.'
    What am I doing wrong? Do I need to use a custom field mapping? If so, where is the documentation to help me write a custom field mapping?
    A second question would be: what happens if the DatePeriod object is used in a couple of places? I don't want to tie its persistence to the INSTRUMENT_FLOW table.
    All help gratefully received
    Cheers
    Matt

    As you suspect, Kodo 2.x does not support embedded class mappings. Kodo
    3.0 will support embedded mappings.
    In the meantime, you can create a custom mapping, but unfortunately our
    documentation for custom mappings is lacking right now. Given how simple
    your DatePeriod object is, you're probably better off with something
    simpler (and as a bonus, less bound to Kodo):
    Just make your DatePeriod class and the field that holds the DatePeriod
    instance non-persistent. In the class that has the (now non-persistent)
    DatePeriod field, add two new persistent fields for the startDate and
    endDate. Then just use internal logic to construct the DatePeriod from
    the startDate and endDate. You can do this using the
    javax.jdo.InstanceCallbacks methods, or just do the logic in your setter
    and getter methods for the DatePeriod.
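    A plain-Java sketch of that suggestion (class and field names are illustrative): DatePeriod and the field holding it stay non-persistent, the two dates are the persistent fields, and the accessors assemble and decompose the DatePeriod:

```java
import java.util.Date;

// Illustrative value class; kept non-persistent in the JDO metadata.
class DatePeriod {
    final Date startDate;
    final Date endDate;

    DatePeriod(Date startDate, Date endDate) {
        this.startDate = startDate;
        this.endDate = endDate;
    }
}

class CashFlow {
    // Persistent fields, mapped directly to START_DATE / END_DATE.
    private Date startDate;
    private Date endDate;

    // Non-persistent cache, rebuilt on demand from the persistent dates.
    private transient DatePeriod accrualPeriod;

    DatePeriod getAccrualPeriod() {
        if (accrualPeriod == null) {
            accrualPeriod = new DatePeriod(startDate, endDate);
        }
        return accrualPeriod;
    }

    void setAccrualPeriod(DatePeriod period) {
        this.accrualPeriod = period;
        this.startDate = period.startDate; // keep persistent fields in sync
        this.endDate = period.endDate;
    }
}

public class Demo {
    public static void main(String[] args) {
        CashFlow cf = new CashFlow();
        cf.setAccrualPeriod(new DatePeriod(new Date(0L), new Date(86400000L)));
        System.out.println(cf.getAccrualPeriod().startDate.getTime());
    }
}
```

    The same reconstruction could instead be done in javax.jdo.InstanceCallbacks (e.g. jdoPostLoad), as the reply notes; the getter-based version avoids tying the class to JDO at all.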

  • Help needed - Field Mapping

    Hi,
    Detail:
    ========
    I have a class called CustomerSpecificField extending another class, NameValueTimestamp, which contains 3 attributes: name, value, timestamp.
    In my package.jdo file, when I tried to map my fields, I got the following exception. The attributes in the parent class are not recognized at all for mapping.
    JDO Info
    ========
    <class name="CustomerSpecificField">
    <extension vendor-name="kodo" key="jdbc-class-map" value="base">
    <extension vendor-name="kodo" key="table" value="ADB.UP_CUST_SPEC_FIELD"/>
    <extension vendor-name="kodo" key="pk-column" value="ID"/>
    </extension>
    <field name="name">
    <extension vendor-name="kodo" key="jdbc-field-map" value="value">
    <extension vendor-name="kodo" key="column" value="NAME"/>
    </extension>
    </field>
    <field name="timestamp">
    <extension vendor-name="kodo" key="jdbc-field-map" value="value">
    <extension vendor-name="kodo" key="column" value="TIMESTAMP"/>
    </extension>
    </field>
    </class>
    Error:
    =======
    Exception in thread "main" kodo.util.FatalException: java.io.IOException:
    org.xml.sax.SAXException:
    file:/C:/kodo-jdo-3.1.3/bin/com/bo/package.jdo [Location: Line: 65, C:
    37]: Field "timestamp
    " is not declared in "CustomerSpecificField".
    [java.lang.NoSuchFieldException]
    NestedThrowables:
    java.io.IOException: org.xml.sax.SAXException:
    file:/C:/kodo-jdo-3.1.3/bin/com/bo/package.jd
    o [Location: Line: 65, C: 37]: Field "timestamp" is not declared in
    "CustomerSpecificField". [java.l
    ang.NoSuchFieldException]

    I have a class called CustomerSpecificField extending another class NameValueTimestamp which contains 3 attributes name, value, timestamp.
    In JDO, each class can only persist fields that it declares. So you need to make your superclass a persistent class and map it, and reference it with the persistence-capable-superclass attribute from your subclass.
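    Based on the snippet and error above, a hedged sketch of the rearranged metadata: the fields move to the superclass that declares them, and the subclass references it via persistence-capable-superclass. Table and column names are copied from the original post; whether the subclass then needs its own jdbc-class-map extension (flat or vertical) depends on your schema, so treat this as a starting point rather than a definitive mapping.

```xml
<!-- Sketch only: extensions copied from the original snippet. -->
<class name="NameValueTimestamp">
  <extension vendor-name="kodo" key="jdbc-class-map" value="base">
    <extension vendor-name="kodo" key="table" value="ADB.UP_CUST_SPEC_FIELD"/>
    <extension vendor-name="kodo" key="pk-column" value="ID"/>
  </extension>
  <field name="name">
    <extension vendor-name="kodo" key="jdbc-field-map" value="value">
      <extension vendor-name="kodo" key="column" value="NAME"/>
    </extension>
  </field>
  <field name="timestamp">
    <extension vendor-name="kodo" key="jdbc-field-map" value="value">
      <extension vendor-name="kodo" key="column" value="TIMESTAMP"/>
    </extension>
  </field>
</class>
<class name="CustomerSpecificField"
       persistence-capable-superclass="NameValueTimestamp"/>
```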

  • Accessing value of a textfield mapped to an infobus column

    I am attempting to access the text value of a textfield mapped to an Infobus column.
    In this instance, I am trying to get an account type from the appropriate textfield to pass it to a new query.
    I have tried using getText - this returns a string of "oracle.sql.STRUCT@nnnn" (where nnnn is a hexadecimal number).
    I also tried creating a method on the textfield, using code I found here, to retrieve the dataItem, map it to an immediate access control and retrieve the value using getValueAsString().
    This approach also returns "oracle.sql.STRUCT@nnnn"; of course the numbers are different, but as far as I can see I have no way of knowing if this is the same class or not.
    What I am wanting to do is this:
    jpLookup.getRowSetInfo().setQueryCondition("ACCOUNTTYPENAME = '"+tfType.getFieldContents()+"'");
    jpLookup.getRowSetInfo().executeQuery();
    For testing, I took out the internal single quotes so I could see what was being generated through the SQL error dialog. With them in place, I simply get an empty result set, since I don't have any account types of 'oracle.sql.STRUCT@nnnn'.
    So, could someone point me in the right direction?

    TextFieldControl tField = new TextFieldControl();
    ((ImmediateAccess)tField.getDataItem()).getValueAsString();
    Linda
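    A plain-Java sketch of building that query-condition string (method and class names are illustrative; the account type argument stands in for the textfield's contents). Escaping embedded single quotes keeps the SQL literal well-formed:

```java
// Sketch: assembling the query condition string safely.
public class Demo {
    static String buildCondition(String accountType) {
        // Double up embedded single quotes so the literal stays valid SQL.
        String escaped = accountType.replace("'", "''");
        return "ACCOUNTTYPENAME = '" + escaped + "'";
    }

    public static void main(String[] args) {
        System.out.println(buildCondition("Savings"));
    }
}
```

    Where the API supports it, a bind variable is still preferable to string concatenation.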

  • Maintain Field Mapping and Conversion Rules//LSMW

    Hello Friends,
    I want to add new fields in step no. 5 (Maintain Field Mapping and Conversion Rules).
    In detail: I am going to upload the GL balances. For the DR and CR line items the fields are the same, and the system is not accepting the same field value, so I added a 1 to the CR line item fields, as in the example below:
    BSEG-WRBTR (Dr line item)
    BSEG-WRBTR1 (Cr line item)
    But the BSEG-WRBTR1 (Cr line item) field is not displayed in step no. 5 for mapping to the source field.
    Please let me know the solution for this.
    thanks
    swapna.

    Hi,
    I would like to ask few questions.
    1. Are you using batch input recording or using any program for uploading. (through LSMW)
    2. Are all your debit and credit line items the same for every transaction? I believe they should be, because you are uploading the balances.
    You should not have two similar fields; for example, if one is WMBTR, then WMBTR should not appear again - it should be WMBTR1 and WMBTR2. Make sure you have done the field mapping properly; when it is done, all the fields must have been mapped. If any of the fields are not mapped, they will not be uploaded.
    Please see the following LSMW sample guide:
    http://www.scmexpertonline.com/downloads/SCM_LSMW_StepsOnWeb.doc
    Maintain Object Attributes
    Do the recording. Make sure that you do not have two fields with the same name; if you do, double-click on the field name and add a 1 or 2 to differentiate them. Copy those fields and descriptions into an Excel sheet and delete the blank lines; then, in Excel, Data => Text to Columns puts your field names and descriptions into two columns. Copy them, put your cursor on the next sheet, then Edit => Paste Special => Transpose - all the columns become rows. Now your file structure is ready.
    Maintain Source Structures
    Give some unique structure name and description.
    Maintain Source Fields
    Here you add the fields that are used in the first Excel sheet: just copy them, make all the fields type C (Constant) and give a length of 60 for all fields.
    Maintain Structure Relations
    Though the structure relations are already created, go to this step, click Edit, then Create Structure Relation, and accept the message stating that the structure relation has already been created.
    Maintain Field Mapping and Conversion Rules
    Do the field mapping for all the fields; each field will be expanded and you will see five rows against each one. If any row has NOT expanded, there is something wrong in the mapping.
    Maintain Fixed Values, Translations, User-Defined Routines
    There is nothing to be done at this step; you can simply ignore it.
    Specify Files
    Make sure you have saved your Excel file as .txt (before saving, copy the data from sheet 2 to sheet 3, and save sheet 3 as a Text (Tab delimited) file). Select your file, make SURE that you select the "TABULATOR" radio button, and say OK.
    Assign Files
    Go to this step, click the Create Assignment button, accept the message and say OK.
    Read Data
    Remove the two checkboxes and just click the Execute button. See the log; make sure the number of entries (lines) in your Excel file matches it.
    Display Read Data
    Give lines 1 to 999, double-click on one of the lines and see whether the fields are mapped correctly or not.
    Convert Data
    Execute, see the log and match the number of entries.
    Display Converted Data
    Give 1 to 999, double-click on one of the lines and see whether the fields are mapped correctly or not.
    Create Batch Input Session
    Check the Keep Batch Input Sessions checkbox, then execute. If you select that checkbox, the session will still be there after execution and you can analyze what happened.
    Run Batch Input Session (takes you to SM35)
    Go to SM35, select the batch and click the Process (execute) button; make sure you have checked the first three checkboxes on the right-hand side and FOREGROUND (because you want to watch what it is creating). Say OK and keep pressing ENTER on your keyboard to move the session along.
    If you follow these steps along with the guide, you should be successful. There may be small differences between the file and what I have explained, but ultimately the purpose is the same. Hope this is useful; let me know in case you have any issues.
    Regards, Ravi

  • SAP GRC 10 - Field Mapping

    Gurus,
    We have defined a field mapping for user type, but when the request got closed it still provisioned user type "dialog", even though I selected user type "service" in the request.
    We also defined a field mapping for user type as an AC custom field, with system field name USTYP and table IS_LOGONDATA, at Governance, Risk and Compliance > Access Control > Maintain Mapping for Actions and Connector Groups.
    Please let me know if I am missing any configuration for field mapping.
    Thanks,
    Jagat

    Hello Aman,
    Thanks for the reply.
    As per my knowledge there is no direct way or configuration in GRC 10 for provisioning non-dialog users, so that's why we created a custom field and mapped it to the SU01 field in SAP.
    FYI, we did all the configuration related to field mapping, but by default it still takes the user type as dialog. Do you know of any SAP Note related to field mapping released by SAP for GRC 10?
    Thanks,
    Jagat

  • TopLink : attribute-mapping direct-xml-type-mapping

    hi
    Using TopLink Workbench 10g Release 3 (10.1.3.0.0) Build 060118 it is possible to configure a "Direct to XML Type" mapping.
    see http://www.oracle.com/technology/products/ias/toplink/doc/1013/main/_html/relmapun004.htm#CHDFIFEF
    In the TopLink map this results in:
    <opm:attribute-mapping xsi:type="toplink:direct-xml-type-mapping">
    Is there a way to configure this kind of attribute mapping using JDeveloper?
    thanks
    Jan Vervecken

    Hi Paul,
    The problem you're going to hit trying to do this with a TransformationMapping in 904 (as in my other post) is that during the UnitOfWork commit, when TopLink is checking for changes, a .equals will end up being called on an XMLType instance, which will throw a NullPointerException (in oracle.sql.Datum, I think).
    One way to work around this (depending on the requirements for your app) is to set the isMutable flag on the TransformationMapping to false. This flag indicates that the value in the object's attribute isn't going to be changed, so we don't bother trying to check to see if it's changed. This will allow you to do reads, inserts and deletes with no problems.
    The downside is that if you need to be able to change the XML content in your objects model and do an update of the row, TopLink will never detect a change and never issue an update of the XMLType field.
    Incidentally, if you happen to hit the same issue you had with the DirectToXMLTypeMapping - where you were getting back an instance of java.sql.Opaque instead of the expected oracle.xdb.XMLType from JDBC - you should be able to handle that case in your AttributeTransformer by doing:
    XMLType myXML = XMLType.createXML(myOpaque);
    Document doc = myXML.getDocument();
    Hope this helps
    Matt MacIvor

  • SolMan-HPQC Integration - Field Mapping

    Hi,
    All the installations and configurations are done on both SolMan and QC.
    Blueprint Integration is done and working fine.
    During Defect Integration, when we try to do "Field Mapping" in QC, getting the below error message.
    "The number of required defect fields in Quality Center is different from the number of required support message field in Application; Make sure the number of required fields is equal in both systems before you continue"
    Once "OK" is clicked on the error message, there are no SAP fields listed in either the 'Required Fields' or 'Non-required Fields' columns.
    Any solutions to resolve this issue?
    Thanks,
    Veera

    Hi,
    In the future, please reference the product being used to communicate between SOLMAN and HPQC - I assume this may be SAP TAO? Either way, you may get quicker replies to help resolve your problems this way.
    Regards,
    Mark
