Parent-Child O/R Mapping

Hello
I have two tables in an Oracle database: PARENT and CHILD.
The PARENT table has a column called IDENTIFIER (String)
The CHILD table has a column called PARENT_IDENTIFIER (String)
I want to mark up my Parent as having a collection of Children. To do
this I have put the following definition into the Parent class:
private Collection children;
Now I need to add extensions to the persistence descriptor to describe the
mapping. I've tried several variations of data-column and ref-column, but
most of the time the SQL generated by Kodo refers to a table called
"CHILDX" which does not exist.
This is a simple Parent-Child relationship, and there is no intervening
mapping table.
Any suggestions would be greatly appreciated.
Kind regards, Robin.

<class name="Parent">
    <field name="children">
        <collection element-type="Child"/>
        <extension vendor-name="kodo" key="inverse" value="parent"/>
    </field>
</class>
<class name="Child">
    <field name="parent">
        <extension vendor-name="kodo" key="data-column" value="PARENT_IDENTIFIER"/>
    </field>
</class>
That's all, assuming you're using datastore identity. If you're using
application identity, "data-column" becomes "<parent pk field
name>-data-column".
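For reference, a minimal sketch of the two classes this metadata could describe (plain Java; the field names come from the posts above, and the addChild helper is an illustrative assumption, not Kodo API):

```java
import java.util.ArrayList;
import java.util.Collection;

class Child {
    private Parent parent;          // mapped to CHILD.PARENT_IDENTIFIER

    Parent getParent() { return parent; }
    void setParent(Parent parent) { this.parent = parent; }
}

class Parent {
    private Collection children = new ArrayList();   // inverse of Child.parent

    Collection getChildren() { return children; }

    // With an inverse ("mapped-by") collection, the owning side is the
    // child's parent field, so keep both sides in sync in memory.
    void addChild(Child c) {
        c.setParent(this);
        children.add(c);
    }
}

public class Demo {
    public static void main(String[] args) {
        Parent p = new Parent();
        Child c = new Child();
        p.addChild(c);
        System.out.println(p.getChildren().size());
        System.out.println(c.getParent() == p);
    }
}
```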

Similar Messages

  • How to get a Parent Message Id in Mapping?

    Hi,
I have a scenario posting an IDoc to a file for multiple receivers. I have to get both the Message ID and the Parent Message ID in mapping. I am able to get the Message ID but not the Parent Message ID. I tried to get it from table SXMSPMAST but it does not contain the entry during runtime.
Any suggestions on getting the Parent Message ID while in mapping?
    Thanks,
    Manikandan R

You can create a UDF, say getMsgId, to get the current message ID:
String headerField;
java.util.Map map;
// get the runtime constant map
AbstractTrace trace = container.getTrace();
map = container.getTransformationParameters();
String key = "MessageId";
// read the value of the header field via the key
headerField = (String) map.get(key);
return headerField;
Pass this output to an RFC lookup UDF which calls this function module on the XI ABAP stack:
FUNCTION ZGET_PARENT_MSG_ID.
*"Local Interface:
*"  IMPORTING
*"     VALUE(I_CURRMSGID) TYPE  CHAR40
*"  EXPORTING
*"     VALUE(E_PARENTMSGID) TYPE  CHAR40
  DATA: lv_currmsgid   TYPE sxmspmast-msgguid,
        lv_parentmsgid TYPE sxmspmast-parentmsg.
  lv_currmsgid = i_currmsgid.
  SELECT SINGLE parentmsg FROM sxmspmast INTO lv_parentmsgid
    WHERE msgguid = lv_currmsgid.
  e_parentmsgid = lv_parentmsgid.
ENDFUNCTION.

  • Context Mapping: child-nodes of non-mapped parent nodes

I am somewhat curious about context mapping in Web Dynpro.
Which parent nodes need to be mapped in order to map child nodes? Does a child node have more than one relevant parent node (e.g. a grandparent, two steps above)?
A.1   MAPPED ---->  B.1
  - A.1.1   NOT MAPPED
  - A.1.2   NOT MAPPED
      - A.1.2.1   MAPPED ---->  B.1.2.4
      - A.1.2.2   NOT MAPPED
  - A.1.3   NOT MAPPED
Is it sufficient that only one of these ancestor nodes is mapped (e.g. the direct parent node is not mapped, but that parent's own parent node is) for the child node to be mappable?
Is this assumption true or not?
    The SAP library states:
    "Conversely, child nodes of non-mapped parent nodes cannot be mapped, otherwise this would result in irresolvable conflicts at runtime with respect to the parent-child relation in the context and the mapping relation." (http://help.sap.com/saphelp_nw04/helpdata/de/51/a3384162316532e10000000a1550b0/content.htm)
    This statement is not absolutely clear on this issue.

Let me put it this way:
You have a node vehicle, and it has a child node car. The car node has parameters car1 and car2.
Let the vehicle node have parameters veh1 and veh2.
If you map the vehicle node to any other node (say, in the component controller), you have the option of mapping its children, i.e. veh1, veh2 and the car node.
You can have veh1 mapped and veh2 not mapped:
vehicle node (mapped)
veh1 (may or may not be mapped)
veh2 (may or may not be mapped)
car node (may or may not be mapped)
If instead you try to map only the car node, the vehicle node will also get mapped automatically, but not its child parameters veh1 and veh2:
vehicle node (mapped)
veh1 (may or may not be mapped)
veh2 (may or may not be mapped)
car node (mapped)
car1 (may or may not be mapped)
car2 (may or may not be mapped)
Hope this helps.
Do revert for further clarification.
    Regards
    Noufal

  • External Context Mapping - Pass data from Child to Parent

    Hello,
    I have the following scenario:
    DCParent Component (contains)
    - DCChildComp1    (used DC)
    - DCChildComp2    (used DC)
    - DCChildComp3    (used DC)
    - DCChildComp4    (used DC)
What the user enters in DCChildComp1 then needs to be made available to DCParent and all the other DCChildComp(n) siblings.
I have looked at the posts and blogs on SDN, and all of them seem to deal with passing input-field data from parent to child. Maybe I am missing it.
In my case, I need the data to be passed from DCChildComp1 to DCParent, i.e. child to parent, and then from DCParent to the other DCChildComps.
How should I go about:
a. defining the context nodes and component interface context nodes in parent vs. child vs. siblings, and
b. mapping them externally?
Step-by-step instructions would be helpful.
    Thanks in advance,
    SK.

Thanks for all the help. Although I had already seen all the links and blogs you posted here, I was still confused about how it all came together. I finally got it after reading Bertram Ganz's response in the thread "Context Mapping problem":
"When you map a context in the parent component to an interface context in the used child component, you do not define an external context mapping relation. That's normal context mapping, as the data context resides in the child component."
    I have it working now and I am able to push the changes in the child component's context to the parent.
For those who are interested in how I did it (and for those who know a better way to do it):
    In the child component DC:
    Map Child's View Context to Child's Controller Context
    Map Child's Controller Context to Child's Interface Controller (make sure the inputEnabled is FALSE - as the child is the data producer and the parent is the data consumer, in my case)
    In the parent DC:
    Add child DC as a Used DC
    Add child Component as a Used Component in the Parent Component
    Add Child's Interface Controller as Required Controller in Parent Component
    Map Child's Interface Controller Context to Parent's Controller Context
    Map Parent's Controller Context to Parent's View Context
    No external mapping required per the thread above. Now, any change in the child component's view is visible in the parent component view.
    Thanks again very much for the help.
    - Siva

Mapping issue: how to create data properly

    Hi All,
    I have a multiple IDOCs to FILE scenario, and I'm having issues with mapping data.
    Please take a look at this:
    IDOC structure:
    <IDOCNAME> (0..unbounded)
       <IDOC> (1..1)
          <SOURCESEGMENT1> (0..unbounded)
             <SOURCEFIELD1> (0..1)
             <SOURCESEGMENT2> (0..unbounded)
                <SOURCEFIELD2> (0..1)
    Target structure:
    <Recordset> (1..unbounded)
       <targetsegment> (1..unbounded)
          <targetfield> (1..1)
One Recordset will be created for each incoming IDoc. So, if there are 5 incoming IDocs, then 5 Recordsets will be created.
A targetsegment will be created whenever SOURCEFIELD2 = "1". So, if one IDoc contains 3 occurrences of SOURCEFIELD2 with values 1, 2, 1, then targetsegment needs to be created twice.
The targetfield will have the corresponding value of SOURCEFIELD1 where SOURCEFIELD2 = "1". So, if for SOURCEFIELD2 values 1, 2, 1 we get SOURCEFIELD1 values A, B, C, then A and C should be created.
    My issue is in the targetfield. I am not able to generate corresponding values.
    My actual map is
    IDOCNAME --> Recordset
    if SOURCEFIELD2 (IDOC context) = "1" --> CREATE targetsegment
    For targetfield, how can I generate the right values I need?

The problem is with the context handling at the parent and child levels.
At the parent level, targetsegment is mapped with the source field's context set at the IDoc level, so all the values 1, 2, 1 arrive at the ifWithoutElse input within the same context. After the condition check, if you look at the output (Display Queue) of the ifWithoutElse function (I have a feeling you used this function), you will see the queue has only two values: true, true. The entry for the second value (2) is missing, while the queue at the targetfield level has all three values from SOURCEFIELD1. So there is a queue mismatch.
    At the parent level use:
    SOURCEFIELD2(context at IDoc level) -> Equals (=1)->CreateIf->targetsegment
    SOURCEFIELD1(default context)->targetfield
    This should work.
Though I don't understand one thing: how would the 0..unbounded occurrence at the IDoc-name level work for an IDoc collection? Even if you receive multiple IDocs of the same IDoc type, the multiple occurrence for each IDoc would be at the <IDOC> level, which should be 1..unbounded. For receiver IDoc multimapping this is the case.
I really doubt that multimapping at the <IDocName> level works with IDocs!
    Hope this works.
    Regards,
    Suddha
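The per-position logic recommended above can be checked outside the mapping editor. A small sketch in plain Java (not the XI mapping API; the field names come from the thread) of the intended behaviour, one targetfield value for each position where SOURCEFIELD2 is "1":

```java
import java.util.ArrayList;
import java.util.List;

public class QueueDemo {
    // Mimics SOURCEFIELD2 -> equalsS("1") -> createIf -> targetsegment,
    // with SOURCEFIELD1 feeding targetfield: keep the field1 value at every
    // position where the parallel field2 value is "1".
    static List<String> map(List<String> field1, List<String> field2) {
        List<String> target = new ArrayList<>();
        for (int i = 0; i < field2.size(); i++) {
            if ("1".equals(field2.get(i))) {
                target.add(field1.get(i));
            }
        }
        return target;
    }

    public static void main(String[] args) {
        // The thread's example: values 1, 2, 1 paired with A, B, C
        List<String> f1 = List.of("A", "B", "C");
        List<String> f2 = List.of("1", "2", "1");
        System.out.println(map(f1, f2));
    }
}
```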

  • 1-1 mapping question

    I am trying to post the question in a different way. My previous post on the related subject can be found at Re: 1-1 mapping
Two entities, Parent and Child, where a Parent has either one privately owned Child or no Child.
    In object model, Parent has an attribute child; Child does not have a parent attribute.
    Two tables, Parent and Child, with constraint where a child record cannot exist without its parent record (i.e. Child table has a foreign key to the Parent table).
    Class Parent and Child are mapped to table Parent and Child respectively.
How do we define the 1-1 mapping in the Mapping Workbench so that Parent and Child can be inserted into the tables in one unit of work?
    Any comment or help would be greatly appreciated.
    Haiwei

    You can't do this as you described. You must have and map a relationship from child to parent in order to be able to map the 1-1 from parent to child (since your foreign key is in the child table).
    - Don

  • Owb error(please pass a parent frame)

    Hi all,
While synchronizing a mapping in a process flow I am getting a "this message is not modal, pass a parent frame" error. My mapping has an operating mode of "set based fail over to row based". Any idea why I am getting the error?

Anyone?

  • Inheritance of Attributes / Mapping

    Hello!
I have one abstract class BaseBean which holds common getters and setters (e.g. date of creation, date of last modification, ...) and is used by all of my beans.
My concrete class TestBean extends BaseBean.
If I want to map TestBean with the Workbench, only the attributes of my concrete class can be mapped, not the attributes I need from the abstract class BaseBean.
My question: is it possible to map attributes inherited from base classes?
    regards
    Harald.

    Harald,
    You can view attributes from your parent class in the Mapping Workbench and map them. See "Mapping Inherited Attributes in a Subclass" in the docs at:
    http://otn.oracle.com/docs/products/ias/doc_library/90200doc_otn/toplink.903/b10063/descript.htm#1016101
    Do not use the advanced "Inheritance" settings on your descriptor unless there is a common table for your root class which has rows for multiple subclasses in it.
    Doug

  • External Context Mapping & Normal Mapping in same IF Controller?

    I have a Parent DC that embeds a Child DC.
    The Parent uses data from the Child DC obtained through the Child's Interface Controller, which works fine.
    But there is also some data (value node) that I need to pass to the Child DC through the Interface Controller as External Context Mapping, from the Parent DC.
    I know how to make these work separately but am not sure how to make both Ext. Mapping and Normal mapping in the same Interface Controller (the parent IF controller) and am not sure if it is even possible.
    Is it possible to both pass data (external context mapping) and get data (normal mapping) from the same Interface Controller?
    I guess I am a little confused in how the arrows should point in  the Data Modeler and how to approach this.
    Really appreciate your help,
    SK

    Thanks Satya, Vishal
    My issue is not about navigation but how to transfer data back and forth to the same embedded child component. Here's some clarification.
    1. Parent has declared Usage of Child
    2. Child is available in Data Modeler
    3. Child has data coming from its Interface Controller, which is mapped to the Parent's Comp Controller, which is working fine.
    4. Now there is a different node that exists in both Parent and Child.
    5. The data origin for this node is the Parent and so the Child needs to get it from the Parent using External Context Mapping to its Interface Controller so that it is available inside the Child.
    My question is, is this data transfer scenario possible? If yes, what steps should be taken to make this happen?
    Thanks again,
    SK

  • MacOS 10.9's Maps and GPS

    Recently, I upgraded my late-2012 slimline iMac from MacOS 10.8 "Mountain Lion" to MacOS 10.9 "Mavericks". Also recently, my parents replaced an aging white MacBook with a refurbished late-2013 13-inch MacBook Pro with Retina Display. Now both machines run on MacOS 10.9.
    One new feature for both my parents and myself is Maps. We occasionally use a handheld DeLorme Earthmate PN-40 GPS receiver for travel (finding key points, directions, etc.), as well as recording needed coordinates for recreational trails inspections and maintenance work bees and other activities. DeLorme does not provide GPS interface software for MacOS. We have, however, used Google Earth in the past to load GPX files off of the PN-40's SD card.
    Short of loading Microsoft Windows onto our respective computers, is there a way to extract GPS waypoint coordinates and GPX route files from the SD card to display and manipulate in MacOS 10.9's Maps?
Also: is there a way to manually type GPS coordinates into Maps to display and/or record points of interest for future reference and use in e-mail communiqués, newsletters, reports, etc.?
    Thanks in advance.

You can enter GPS coordinates manually into Maps using the search field; for example, entering the decimal GPS coordinates for the Empire State Building in NYC takes you right to it.
I also tried it with the GPS location of my house, and it went to the correct location. Just remember that Maps is a street navigation tool: it will try to find a nearby street address (if there is one) to associate with the coordinates you enter.
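As for the GPX files themselves: Maps has no import function, but GPX is plain XML, so the waypoint coordinates can be extracted with any XML parser and pasted into Maps' search field. A rough sketch using the JDK's built-in DOM parser (the GPX fragment here is illustrative, not actual PN-40 output):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class GpxDemo {
    public static void main(String[] args) throws Exception {
        // A tiny GPX-style fragment of the kind a handheld unit writes to its SD card
        String gpx = "<gpx><wpt lat=\"40.7484\" lon=\"-73.9857\">"
                   + "<name>Empire State Building</name></wpt></gpx>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(gpx.getBytes(StandardCharsets.UTF_8)));
        NodeList wpts = doc.getElementsByTagName("wpt");
        for (int i = 0; i < wpts.getLength(); i++) {
            Element w = (Element) wpts.item(i);
            String name = w.getElementsByTagName("name").item(0).getTextContent();
            // "lat, lon" in decimal degrees is the form the Maps search field accepts
            System.out.println(name + ": " + w.getAttribute("lat") + ", " + w.getAttribute("lon"));
        }
    }
}
```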

  • Parent Variable Configuration - Passing Connection strings..

I have one parent package and a few child packages. I am using SQL Server 2008 R2.
I want to pass variables containing a connection string and other data from the parent package to the child packages. I created a few variables in the parent package (e.g. ParentConnectionString) and in the child package (e.g. ChildConnectionString). I set the child package variables using Package Configurations > Configuration Type = Parent package variable, and map the ChildConnectionString value to ParentConnectionString.
I have put in a script component that displays the variable values set in the child package from the parent package.
In the child package I have a connection of ConnectionManagerType = OLEDB, with an expression that sets its ConnectionString from ChildConnectionString.
When I run the parent package, it calls the child package during the validation phase. The script component shows that the values are being set in the child package from the parent package, but after that it fails saying "invalid username/password; logon denied".
I suspect that the connection string expressions are evaluated at run time and not at validation time.
If I put the complete connection string in ChildConnectionString it works fine.
Am I doing anything wrong?

It could be that you simply need to set the DelayValidation property to True.
The thing is, when the package starts it validates itself, but at that point the design-time connection string property is used, whereas the "real" connection string only arrives later, at run time. Setting DelayValidation to True stops the package from checking connectivity in advance, so it can start successfully.
Arthur My Blog

  • Nested Table Problem

    Hello,
    I am using Oracle10g and TopLink 10.1. I would like to implement a table descriptor which has nested tables.
    When I try to read "root" table, I get error messages described below.
    Could anybody tell me what's wrong?
    Thanks,
    ***** [3 Tables] *****
    Table: PARENT
    Fields: PARENT_NO
    Table: CHILD
    Fields: CHILD_NO, PARENT_NO
    Table: CHILD_MASTER
    Fields: CHILD_NO, CHILD_NAME, etc.
    ***** [The relation between tables] *****
    Parent<---(OneToMany)--->Child<---(Multitable)--->ChildMaster
***** [Descriptor code] *****
    RelationalDescriptor descriptor = new RelationalDescriptor();
    descriptor.setJavaClass(Parent.class);
    descriptor.setTableName("PARENT");
    descriptor.setPrimaryKeyFieldName("PARENT_NO");
    descriptor.addTableName("CHILD");
    descriptor.addTableName("CHILD_MASTER");
    // OneToMany configuration
    OneToManyMapping oneToMany = new OneToManyMapping();
    oneToMany.setAttributeName("Child");
    oneToMany.setReferenceClass(Child.class);
    oneToMany.setTargetForeignKeyFieldName("PARENT_NO");
    descriptor.addMapping(oneToMany);
    // Multitable configuration
    ExpressionBuilder builder = new ExpressionBuilder();
    Expression exp = builder.getField("CHILD.CHILD_NO").equal("CHILD_MASTER.CHILD_NO");
    descriptor.getDescriptorQueryManager().setMultipleTableJoinExpression(exp);
    // Read
    ReadObjectQuery query = new ReadObjectQuery("PARENT");
    UnitOfWork uow = toplinkSession.acquireUnitOfWork();
    uow.executeQuery(query);
    ***** [Resulting SQL] *****
    SELECT t0.PARENT_NO, t1.PARENT_NO, t2.PARENT_NO,...
    FROM PARENT t0, CHILD t1, CHILD_MASTER t2
    WHERE ((t1.PARENT_NO = t0.PARENT_NO) AND (t1.CHILD_NO = t2.CHILD_NO))
    ***** [Error Message] *****
When I read "Parent", I get the following error. "T2" is the "CHILD_MASTER" table, which has no "PARENT_NO" field; I don't want to select "CHILD_MASTER.PARENT_NO" because it doesn't exist!
    Exception[TOPLINK-4002] (Oracle TopLink - 10g Release 3 (10.1.3.3.0) (Build 070620)): oracle.toplink.exceptions.DatabaseException
    Internal Exception: java.sql.SQLException: ORA-00904: "T2"."PARENT_NO": Invalid Identifier
    Error Code: 904
    Query:ReadAllQuery(Parent)
         at oracle.toplink.exceptions.DatabaseException.sqlException(DatabaseException.java:290)
         at oracle.toplink.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:581)
         at oracle.toplink.internal.databaseaccess.DatabaseAccessor.executeCall(DatabaseAccessor.java:441)

Thank you for your reply. Maybe my explanation was confusing.
> Parent should map to the PARENT table, Child should map to the CHILD table, and ChildMaster should map to the CHILD_MASTER table.
Exactly. What I want to do is "Parent has Child" using OneToMany, and "Child has ChildMaster" using Multitable. I guess it's still unclear...
    Anyway, I appreciate your advice.

SGD 4.3 authenticating with AD (users log in and get different sets of applications)

    Hi SGD Forum users,
    First of all, happy new year and happy holiday to all of you from new SGD user :-).
    We are planing to Demo SGD 4.3 to one of our customer by early next week.
    So, what the customer would like to see with the demo:
1) From a Sun Ray client, user1 launches a Firefox browser and opens the SGD web page.
- Enters username and password (which must authenticate against AD).
- After successful authentication, user1 gets his webtop page.
- On the webtop page, user1 has only two (2) applications to launch: MS Office Word and a full virtual XP desktop (192.168.5.205).
2) From a Sun Ray client, user2 launches a Firefox browser and opens the SGD web page.
- Enters username and password (which must authenticate against AD).
- After successful authentication, user2 gets his webtop page.
- On the webtop page, user2 has only two (2) applications to launch: MS Office Excel and a full virtual XP desktop (192.168.5.206).
3) From a Sun Ray client, the user "manager" launches a Firefox browser and opens the SGD web page.
- Enters username and password (which must authenticate against AD).
- After successful authentication, "manager" gets his webtop page.
- On the webtop page, "manager" has only four (4) applications to launch: MS Office Word, MS Office Excel, MS Office PowerPoint and a full virtual XP desktop (192.168.5.207).
Note: the above users (user1, user2 and manager) launch different MS Office applications and different virtual XP desktop servers.
    Here are my SGD 4.3 demo setup:
    - Install Solaris 10 06/06 OS for Sparc.
    - Install latest patches.
    - Create a local zone.
    - Install SRSS 3.1 and patches in Global zone.
    - Install SGD 4.3 in the local zone.
- My colleague installed 2x MS Server 2003 (AD and DNS servers).
- My colleague installed ESX (VM server) and created 3x virtual XP desktops (192.168.5.205, 192.168.5.206 & 192.168.5.207).
In my SGD Array Manager, I had successfully set "Enabling the Active Directory login authority" as described in the SGD Administrator Guide. I can also log in to the SGD server successfully as user1, user2 and manager (created on the AD server), so my SGD server communicates successfully with the AD server.
When I test logging in as user1, user2 or manager, they all get the same webtop with the same applications. If I am not wrong, this behaviour is due to the LDAP Profile under "o=Tarantella System Objects": if I put any application in the LDAP Profile's Links tab, every user authenticated via AD will be able to launch it.
The customer requirement is that all users authenticate with AD but each launches different applications and a different virtual XP desktop, as mentioned earlier.
Is it possible to perform the SGD demo per the customer requirement? If yes, can you guide me on how to create a different profile for each AD-authenticated user?
    Thanks in advance.
    # Yours Sincerely,
    # Mohamed Ali Bin Abdullah.

    Hi Wai,
Sorry for not including full details of the person object in my previous posting.
    Here are the details of person object:
    General
    - Name: user1
    - Description:
    - Surname: esuria
    - Username: user1
    - Email: [email protected]
    - Locale: Automatic
    - Keyboard Map: Use XPE setting
    - Windows NT Domain: BIA
    - Bandwidth: None
    - Webtop Theme: Standard
    - Inherit parent's webtop content: NO
    - Shared between users(quest): NO
    - May log in to Secure Global Desktop: YES
    - Profile Editing: Use Parent setting
    - Clipboard Access: Use Parent Setting
    - Serial Port Mapping: Use Parent Setting
    Links
o=BIA/cn=MS XP Desktop 192.168.5.205
That's the setting of the user1 person object which I created in my SGD, but when user1 authenticates with AD, user1 still sees the LDAP Profile applications.
What else do I need to set on the SGD and AD server side?
    Thanks in advance.

  • Dynamically Setting a Variable from a Connection String that has been set by a Config File

    Hi Guys
I'm setting up a master/slave (parent/child) dtsx environment, but I'm unable to work out how to dynamically set a variable in the master dtsx from a connection string whose value has been set by a config file. I'm sure it's possible.
Below is what I'm hoping to achieve. I've set up everything apart from the highlighted section.
    Any ideas?

    First, what version of SQL Server are you using?
    You could switch the problem around.  You could set the value of a variable from the config file, then it is easy to use that variable as the connection string source for your connection manager.  At the same time you can use a parent variable
    configuration to map that variable to variables in your child package.
    Russel Loski, MCT, MCSE Data Platform/Business Intelligence. Twitter: @sqlmovers; blog: www.sqlmovers.com

  • FODC0002 [{bea-err}FODC0002a]: Error parsing input XML: Error at line:2 col

    I have an ODSI Physical Service that is based on a Java Function. The Java Function builds a SQL statement and uses JDBC to query for a ResultSet. One of the columns that is queried is a Clob. Sometimes, the data in this column causes an XMLBeans validation exception in ODSI: {err}XQ0027: Validation failed: error: decimal: Invalid decimal value: unexpected char '114'
The issue is not consistently replicable with a particular database record: the records that exhibit it at one point in time work again after a restart of ODSI, and a different set of records then presents the same error.
    As can be seen from the stack trace, it looks like the issue is happening after the database query has returned and while the process is assembling the SOAP response.
    Error at line:2 col:481 Line:2 '=' expected, got char[99]
    at weblogic.xml.babel.scanner.ScannerState.expect(ScannerState.java:241)
    at weblogic.xml.babel.scanner.OpenTag.read(OpenTag.java:60)
    at weblogic.xml.babel.scanner.Scanner.startState(Scanner.java:251)
    at weblogic.xml.babel.scanner.Scanner.scan(Scanner.java:178)
    at weblogic.xml.babel.baseparser.BaseParser.accept(BaseParser.java:533)
    at weblogic.xml.babel.baseparser.BaseParser.accept(BaseParser.java:510)
    at weblogic.xml.babel.baseparser.EndElement.parse(EndElement.java:34)
    at weblogic.xml.babel.baseparser.BaseParser.parseElement(BaseParser.java:457)
    at weblogic.xml.babel.baseparser.BaseParser.parseSome(BaseParser.java:326)
    at weblogic.xml.stax.XMLStreamReaderBase.advance(XMLStreamReaderBase.java:195)
    at weblogic.xml.stax.XMLStreamReaderBase.next(XMLStreamReaderBase.java:237)
    at weblogic.xml.stax.XMLEventReaderBase.parseSome(XMLEventReaderBase.java:189)
    at weblogic.xml.stax.XMLEventReaderBase.nextEvent(XMLEventReaderBase.java:122)
    at weblogic.xml.query.parsers.StAXEventAdaptor.queueNextTokens(StAXEventAdaptor.java:136)
    at weblogic.xml.query.parsers.StAXEventAdaptor.queueNextTokens(StAXEventAdaptor.java:124)
    at weblogic.xml.query.parsers.BufferedParser.fetchNext(BufferedParser.java:79)
    at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:104)
    at weblogic.xml.query.runtime.navigation.ChildPath.fetchNext(ChildPath.java:308)
    at weblogic.xml.query.iterators.GenericIterator.hasNext(GenericIterator.java:133)
    at weblogic.xml.query.schema.BestEffortValidatingIterator$OpenedIterator.hasNext(BestEffortValidatingIterator.java:224)
    at weblogic.xml.query.schema.ValidatingIterator.fetchNext(ValidatingIterator.java:82)
    at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:104)
    at weblogic.xml.query.xdbc.iterators.ItemIterator.fetchNext(ItemIterator.java:86)
    at weblogic.xml.query.iterators.LegacyGenericIterator.next(LegacyGenericIterator.java:109)
    at weblogic.xml.query.schema.BestEffortValidatingIterator.fetchNext(BestEffortValidatingIterator.java:85)
    at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:104)
    at weblogic.xml.query.xdbc.iterators.ItemIterator.fetchNext(ItemIterator.java:86)
    at weblogic.xml.query.iterators.LegacyGenericIterator.next(LegacyGenericIterator.java:109)
    at weblogic.xml.query.runtime.typing.SeqTypeMatching.fetchNext(SeqTypeMatching.java:137)
    at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:104)
    at com.bea.dsp.wrappers.jf.JavaFunctionIterator.fetchNext(JavaFunctionIterator.java:273)
    at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:104)
    at weblogic.xml.query.runtime.querycide.QueryAssassin.fetchNext(QueryAssassin.java:54)
    at weblogic.xml.query.iterators.GenericIterator.peekNext(GenericIterator.java:163)
    at weblogic.xml.query.runtime.qname.InsertNamespaces.fetchNext(InsertNamespaces.java:247)
    at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:104)
    at weblogic.xml.query.runtime.core.ExecutionWrapper.fetchNext(ExecutionWrapper.java:88)
    at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:104)
    at weblogic.xml.query.xdbc.iterators.ItemIterator.fetchNext(ItemIterator.java:86)
    at weblogic.xml.query.iterators.LegacyGenericIterator.hasNext(LegacyGenericIterator.java:130)
    at weblogic.xml.query.xdbc.util.Serializer.serializeItems(Serializer.java:251)
    at com.bea.ld.server.ResultPusher$DSP25CompatibilityPusher.next(ResultPusher.java:236)
    at com.bea.ld.server.ResultPusher.pushResults(ResultPusher.java:112)
    at com.bea.ld.server.XQueryInvocation.execute(XQueryInvocation.java:770)
    at com.bea.ld.EJBRequestHandler.invokeQueryInternal(EJBRequestHandler.java:624)
    at com.bea.ld.EJBRequestHandler.invokeOperationInternal(EJBRequestHandler.java:478)
    at com.bea.ld.EJBRequestHandler.invokeOperation(EJBRequestHandler.java:323)
    at com.bea.ld.ServerWrapperBean.invoke(ServerWrapperBean.java:153)
    at com.bea.ld.ServerWrapperBean.invokeOperation(ServerWrapperBean.java:80)
    at com.bea.ld.ServerWrapper_s9smk0_ELOImpl.invokeOperation(ServerWrapper_s9smk0_ELOImpl.java:63)
    at com.bea.dsp.ws.RoutingHandler$PriviledgedRunner.run(RoutingHandler.java:96)
    at com.bea.dsp.ws.RoutingHandler.handleResponse(RoutingHandler.java:217)
    at weblogic.wsee.handler.HandlerIterator.handleResponse(HandlerIterator.java:287)
    at weblogic.wsee.handler.HandlerIterator.handleResponse(HandlerIterator.java:271)
    at weblogic.wsee.ws.dispatch.server.ServerDispatcher.dispatch(ServerDispatcher.java:176)
    at weblogic.wsee.ws.WsSkel.invoke(WsSkel.java:80)
    at weblogic.wsee.server.servlet.SoapProcessor.handlePost(SoapProcessor.java:66)
    at weblogic.wsee.server.servlet.SoapProcessor.process(SoapProcessor.java:44)
    at weblogic.wsee.server.servlet.BaseWSServlet$AuthorizedInvoke.run(BaseWSServlet.java:285)
    at weblogic.wsee.server.servlet.BaseWSServlet.service(BaseWSServlet.java:169)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:175)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3498)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    at weblogic.security.service.SecurityManager.runAs(Unknown Source)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2180)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2086)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1406)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    <Apr 29, 2011 12:47:01 PM EDT> <Notice> <ODSI> <BEA-000000> <LabOrderDataServices> <Error occurred performing ODSI operation: {ld:LabOrder/logical/LabOrderReport}getLabOrderDetails:1
    weblogic.xml.query.exceptions.XQueryDynamicException: ld:LabOrder/logical/LabOrderReport.ds, line 34, column 6: {err}FODC0002 [{bea-err}FODC0002a]: Error parsing input XML: Error at line:2 col:481 Line:2 '=' expected, got char[99]
    at weblogic.xml.query.iterators.AbstractIterator.reportUserError(AbstractIterator.java:95)
    at weblogic.xml.query.iterators.AbstractIterator.reportUserError(AbstractIterator.java:147)
    at weblogic.xml.query.parsers.Parser.reportParseError(Parser.java:157)
    at weblogic.xml.query.parsers.StAXEventAdaptor.queueNextTokens(StAXEventAdaptor.java:225)
    at weblogic.xml.query.parsers.StAXEventAdaptor.queueNextTokens(StAXEventAdaptor.java:124)
    Truncated. see log file for complete stacktrace
    javax.xml.stream.XMLStreamException: Error at line:2 col:481 Line:2 '=' expected, got char[99]
    at weblogic.xml.stax.XMLStreamReaderBase.advance(XMLStreamReaderBase.java:206)
    at weblogic.xml.stax.XMLStreamReaderBase.next(XMLStreamReaderBase.java:237)
    at weblogic.xml.stax.XMLEventReaderBase.parseSome(XMLEventReaderBase.java:189)
    at weblogic.xml.stax.XMLEventReaderBase.nextEvent(XMLEventReaderBase.java:122)
    at weblogic.xml.query.parsers.StAXEventAdaptor.queueNextTokens(StAXEventAdaptor.java:136)
    Truncated. see log file for complete stacktrace
    Error at line:2 col:481 Line:2 '=' expected, got char[99]
    at weblogic.xml.babel.scanner.ScannerState.expect(ScannerState.java:241)
    at weblogic.xml.babel.scanner.OpenTag.read(OpenTag.java:60)
    at weblogic.xml.babel.scanner.Scanner.startState(Scanner.java:251)
    at weblogic.xml.babel.scanner.Scanner.scan(Scanner.java:178)
    at weblogic.xml.babel.baseparser.BaseParser.accept(BaseParser.java:533)
    Truncated. see log file for complete stacktrace
    >
    Can somebody shed some light on this issue?
    Thanks
    Edited by: user738507 on May 6, 2011 7:21 AM

    Here is the java function:
    /**
     * Iterate through the search results and build out the XmlBean response
     * @param helper A helper class used to simplify common JDBC commands
     * @param doc The XmlBean document to populate
     * @param isCollectionsIncluded True if Collection info should be included in results, False otherwise
     * @param isFullDetailsIncluded True if Result data should be included in results, False otherwise
     * @throws Exception
     */
    private static void addOrders(XmlBeansJDBCHelper helper, LabOrderReportDocument doc,
            boolean isCollectionsIncluded, boolean isFullDetailsIncluded) throws Exception {
        int rows = 0;
        ResultSet rs = helper.getResultSet();
        LabOrders labOrders = doc.getLabOrderReport().addNewLabOrders();
        LabOrder record = null;
        HashMap<Long, Collection> parentCollectionMap = null;
        // initialize variables used to track when child elements of the XML should be created
        long previousRowOrderId = 0;
        long previousRowParentOrderCollectionId = 0;
        long previousRowOrderCollectionId = 0;
        long previousRowResultId = 0;
        boolean isRootCollectionNode = false;
        LabOrder.Collections lastParentOuterCollectionsAdded = null;
        com.idexx.services.lde.laborder.Collection.Collections lastParentInnerCollectionsAdded = null;
        com.idexx.services.lde.laborder.Collection lastCollectionAdded = null;
        Result lastResultAdded = null;

        // Loop through the results and build XmlBean nodes for each row.
        // Since the SQL is joining Orders to Collections (one-to-many) to Results (one-to-many),
        // and returning a flat structure, there will be duplicate Order data on each row when
        // multiple Collections exist on the Order, and duplicate Collection data when multiple
        // Results exist. We can use this fact to determine when to create a new Collection or
        // Result node.
        while (helper.getResultSet().next()) {
            rows++;
            long currentRowParentOrderCollectionId = 0;
            long currentRowOrderCollectionId = 0;
            long currentRowResultId = 0;
            long currentRowResultRemarkId = 0;
            //int rowno = helper.getResultSet().getRow();

            // Get the Order ID
            logDebug("Getting the OrderId.....");
            BigInteger dbOrderId = JDBCHelper.getBigInteger(rs, DataConstants.ORDER_ID);
            logDebug("DONE getting the OrderId.");
            long currentRowOrderId = dbOrderId.longValue();

            // Determine the Order ID, Order Collection ID, and Result ID currently being processed.
            // These will be used to determine whether to start a new LabOrder Bean, Collections Bean, or Results Bean
            if (isCollectionsIncluded || isFullDetailsIncluded) {
                // Get the ParentOrderCollectionID
                logDebug("Getting the Parent Collection Order ID.....");
                BigInteger dbParentOrderCollectionId = JDBCHelper.getBigInteger(rs, DataConstants.PARENT_ORDER_COLLECTION_ID);
                if (dbParentOrderCollectionId != null) {
                    currentRowParentOrderCollectionId = dbParentOrderCollectionId.longValue();
                } else {
                    currentRowParentOrderCollectionId = 0;
                }

                // Get the OrderCollectionID
                logDebug("Getting the Order Collection ID.....");
                BigInteger dbOrderCollectionId = JDBCHelper.getBigInteger(rs, DataConstants.ORDER_COLLECTION_ID);
                if (dbOrderCollectionId != null) {
                    currentRowOrderCollectionId = dbOrderCollectionId.longValue();
                } else {
                    currentRowOrderCollectionId = 0;
                }

                if (isFullDetailsIncluded) {
                    // Get the ResultID
                    logDebug("Getting the Result Id.....");
                    BigInteger dbResultId = JDBCHelper.getBigInteger(rs, DataConstants.RESULT_ID);
                    if (dbResultId != null) {
                        currentRowResultId = dbResultId.longValue();
                    } else {
                        currentRowResultId = 0;
                    }

                    // Get the ResultRemarkID
                    BigInteger dbResultRemarkId = JDBCHelper.getBigInteger(rs, DataConstants.RESULT_REMARK_ID);
                    if (dbResultRemarkId != null) {
                        currentRowResultRemarkId = dbResultRemarkId.longValue();
                    } else {
                        currentRowResultRemarkId = 0;
                    }
                }
            }

            isRootCollectionNode = (currentRowParentOrderCollectionId == 0);
            logDebug("currentRowOrderId: " + currentRowOrderId);
            logDebug("previousRowOrderId: " + previousRowOrderId);
            logDebug("currentRowResultId: " + currentRowResultId);
            logDebug("previousRowResultId: " + previousRowResultId);
            logDebug("currentRowResultRemarkId: " + currentRowResultRemarkId);
            logDebug("previousRowResultRemarkId: N/A");
            logDebug("currentRowParentOrderCollectionId: " + currentRowParentOrderCollectionId);
            logDebug("previousRowParentOrderCollectionId: " + previousRowParentOrderCollectionId);
            logDebug("currentRowOrderCollectionId: " + currentRowOrderCollectionId);
            logDebug("previousRowOrderCollectionId: " + previousRowOrderCollectionId);

            if (currentRowOrderId != previousRowOrderId) {
                parentCollectionMap = new HashMap<Long, Collection>();
                lastParentOuterCollectionsAdded = null;
                lastParentInnerCollectionsAdded = null;
                lastCollectionAdded = null;
                lastResultAdded = null;

                // This is a new Order, generate a new Lab Order bean
                record = addOrder(labOrders, helper, dbOrderId, isFullDetailsIncluded);
                logDebug("Order Added!");

                // If there is Parent Collection data and it should be included, build a Collections element,
                // and populate the first one
                if (!isRootCollectionNode && (isCollectionsIncluded || isFullDetailsIncluded)) {
                    lastParentOuterCollectionsAdded = record.addNewCollections();
                    lastCollectionAdded = addCollection(record, helper, lastParentOuterCollectionsAdded, true);
                    logDebug("Collection Added! Is it null? " + (lastCollectionAdded == null));
                }

                // If there is Collection data and it should be included, build a Collections element,
                // and populate the first one
                if (currentRowOrderCollectionId > 0 && (isCollectionsIncluded || isFullDetailsIncluded)) {
                    if (isRootCollectionNode) {
                        lastParentOuterCollectionsAdded = record.addNewCollections();
                        lastCollectionAdded = addCollection(record, helper, lastParentOuterCollectionsAdded, false);
                        parentCollectionMap.put(new Long(currentRowOrderCollectionId), lastCollectionAdded);
                        logDebug("parent collection added to map: " + currentRowOrderCollectionId);
                    } else {
                        lastParentInnerCollectionsAdded = lastCollectionAdded.addNewCollections();
                        lastCollectionAdded = addCollection(record, helper, lastParentInnerCollectionsAdded, false);
                    }
                    logDebug("Collection Added! Is it null? " + (lastCollectionAdded == null));

                    // If there is Result data and it should be included, build a Results element,
                    // and populate the first one
                    if (currentRowResultId > 0 && isFullDetailsIncluded) {
                        logDebug("Adding result....");
                        lastResultAdded = addResult(record, helper, lastCollectionAdded);
                        logDebug("Result Added!");

                        // If there is Result Remark data and it should be included, build a ResultRemarks element,
                        // and populate the first one
                        if (currentRowResultRemarkId > 0 && isFullDetailsIncluded) {
                            addResultRemark(record, helper, lastResultAdded);
                        }
                    }
                }
                logDebug("DONE getting first Collection and Result.");
            } else if (currentRowParentOrderCollectionId != previousRowParentOrderCollectionId
                    && (isCollectionsIncluded || isFullDetailsIncluded)) {
                // This is a new, top level, Order Collection to be included
                lastParentOuterCollectionsAdded = null;
                lastParentInnerCollectionsAdded = null;
                lastCollectionAdded = null;
                lastResultAdded = null;
                logDebug("Getting next Order Collection...");

                // If there is Parent Collection data and it should be included, build a Collections element,
                // and populate the first one
                if (!isRootCollectionNode) {
                    lastCollectionAdded = (com.idexx.services.lde.laborder.Collection) parentCollectionMap.get(new Long(currentRowParentOrderCollectionId));
                    logDebug("A Collection Added! Is it null? " + (lastCollectionAdded == null));
                }

                // If there is Collection data and it should be included, build a Collections element,
                // and populate the first one
                if (currentRowOrderCollectionId > 0) {
                    if (isRootCollectionNode) {
                        //LabOrder.Collections collections = record.addNewCollections();
                        lastParentOuterCollectionsAdded = record.getCollections();
                        lastCollectionAdded = addCollection(record, helper, lastParentOuterCollectionsAdded, false);
                        parentCollectionMap.put(new Long(currentRowOrderCollectionId), lastCollectionAdded);
                    } else {
                        lastParentInnerCollectionsAdded = lastCollectionAdded.addNewCollections();
                        lastCollectionAdded = addCollection(record, helper, lastParentInnerCollectionsAdded, false);
                    }
                    logDebug("B Collection Added! Is it null? " + (lastCollectionAdded == null));

                    // If there is Result data and it should be included, build a Results element,
                    // and populate the first one
                    if (currentRowResultId > 0 && isFullDetailsIncluded) {
                        lastResultAdded = addResult(record, helper, lastCollectionAdded);

                        // If there is Result Remark data and it should be included, build a ResultRemarks element,
                        // and populate the first one
                        if (currentRowResultRemarkId > 0 && isFullDetailsIncluded) {
                            addResultRemark(record, helper, lastResultAdded);
                        }
                    }
                }
            } else if (currentRowOrderCollectionId != previousRowOrderCollectionId
                    && (isCollectionsIncluded || isFullDetailsIncluded)) {
                // This is a new Order Collection to be included inside of a parent collection
                logDebug("Getting next CHILD Order Collection...");
                logDebug("isRootCollectionNode: " + isRootCollectionNode);
                logDebug("Order ID: " + helper.getBigInteger(DataConstants.ORDER_ID));
                logDebug("Order Collection ID: " + helper.getBigInteger(DataConstants.ORDER_COLLECTION_ID));
                logDebug("Collection ID: " + helper.getBigInteger(DataConstants.COLLECTION_ID));

                if (isRootCollectionNode) {
                    lastCollectionAdded = addCollection(record, helper, lastParentOuterCollectionsAdded, false);
                    parentCollectionMap.put(new Long(currentRowOrderCollectionId), lastCollectionAdded);
                } else {
                    com.idexx.services.lde.laborder.Collection parentCollection = (com.idexx.services.lde.laborder.Collection) parentCollectionMap.get(new Long(currentRowParentOrderCollectionId));
                    if (parentCollection == null) {
                        log(LOG_LEVEL.WARN, "Parent Collection with id: " + currentRowParentOrderCollectionId + " is null for collection id: " + currentRowOrderCollectionId + " but isRootCollectionNode is " + isRootCollectionNode);
                    } else {
                        lastParentInnerCollectionsAdded = parentCollection.getCollections();
                        logDebug("Is lastParentInnerCollectionsAdded null? " + (lastParentInnerCollectionsAdded == null));
                        lastCollectionAdded = addCollection(record, helper, lastParentInnerCollectionsAdded, false);
                    }
                }

                // If there is Result data and it should be included, build a Results element,
                // and populate the first one
                if (currentRowResultId > 0 && isFullDetailsIncluded) {
                    lastResultAdded = addResult(record, helper, lastCollectionAdded);

                    // If there is Result Remark data and it should be included, build a ResultRemarks element,
                    // and populate the first one
                    if (currentRowResultRemarkId > 0 && isFullDetailsIncluded) {
                        addResultRemark(record, helper, lastResultAdded);
                    }
                }
            } else if (currentRowResultId != previousRowResultId && isFullDetailsIncluded) {
                // There is a new Result to be included
                logDebug("Getting next Result...");
                // This is a new result to be included
                lastResultAdded = addResult(record, helper, lastCollectionAdded);

                // If there is Result Remark data and it should be included, build a ResultRemarks element,
                // and populate the first one
                if (currentRowResultRemarkId > 0 && isFullDetailsIncluded) {
                    addResultRemark(record, helper, lastResultAdded);
                }
            } else if (isFullDetailsIncluded) {
                // There is a new Result Remark to include
                logDebug("Getting next Result Remark...");
                // This is a new result remark to be included
                addResultRemark(record, helper, lastResultAdded);
            }

            logDebug("Done building response.");
            previousRowResultId = currentRowResultId;
            previousRowParentOrderCollectionId = currentRowParentOrderCollectionId;
            previousRowOrderCollectionId = currentRowOrderCollectionId;
            previousRowOrderId = currentRowOrderId;
        }
        logDebug("Found " + rows + " rows of data.");
    }
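
    One thing worth noting about the parse failure itself: "'=' expected, got char[99]" (character 99 is a lowercase 'c') is the kind of error a StAX parser raises when raw markup characters appear inside what it thinks is a tag, which can happen if Clob text containing '<', '&', or quotes is embedded into the XML response without escaping. As a hedged illustration only — this helper is hypothetical and not part of the original service — Clob values could be XML-escaped before they go into the XmlBean response:

    ```java
    // Hypothetical sketch: XML-escape Clob text before it is embedded in the
    // XmlBean response, so markup characters in the database value cannot
    // break the downstream StAX parse.
    public class XmlEscapeSketch {

        // Replace the five XML special characters with their predefined entities.
        static String escapeXml(String s) {
            StringBuilder sb = new StringBuilder(s.length());
            for (int i = 0; i < s.length(); i++) {
                char c = s.charAt(i);
                switch (c) {
                    case '<':  sb.append("&lt;");   break;
                    case '>':  sb.append("&gt;");   break;
                    case '&':  sb.append("&amp;");  break;
                    case '"':  sb.append("&quot;"); break;
                    case '\'': sb.append("&apos;"); break;
                    default:   sb.append(c);
                }
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            // prints: result &lt; 5 &amp; flag=&quot;y&quot;
            System.out.println(escapeXml("result < 5 & flag=\"y\""));
        }
    }
    ```

    Since the bad records change after an ODSI restart, it would also be worth dumping the raw Clob bytes for a failing row to confirm whether unescaped markup (or a truncated buffer) is actually present in the data.
    
    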
