Hibernate Table Mappings

Hi, does one have to map all tables appearing in a query? Consider the following SQL query (mustCode and rankCode are values supplied at runtime, shown here as the placeholders :mustCode and :rankCode):
SELECT m.force, m.surname, m.init, a.id,
       NVL(a.qualification_name_not_found, b.qualification_name) qual_name,
       q.qualcode, m.app_cde, r.description AS rank
FROM   locnavy.MEMBERS m, nlrd.QUALIFICATION_ENROLMENT a, nlrd.MASTER_QUALIFICATION b,
       JOB_QUAL_LNK c, RANKS r, JOB_QUALIFICATION q
WHERE  a.id = m.force
AND    r.rank_id = m.rank_cde
AND    a.qualification_id = b.qualification_id (+)
AND    a.qualification_id = c.qualification_id
AND    a.learner_achievement_status_id IN (2,32)
AND    c.job_qualification_id = q.job_qualification_id
AND    q.qualcode != 'XXX'
AND    (m.mustering_cde = :mustCode OR :mustCode IS NULL)
AND    (m.rank_cde = :rankCode OR :rankCode IS NULL)
AND    NVL(m.STATUS,'A') = 'A'
AND    m.app_cde IN ('MC','MG','MP','PE','PF','PV')
Note that ANSI joins are not used, as this was written under Oracle 8i. For the purposes of this query only the MEMBERS object/POJO is used to contain the results; the other tables appearing here are used only to help retrieve the results:
QUALIFICATION_ENROLMENT, MASTER_QUALIFICATION, JOB_QUAL_LNK, RANKS, JOB_QUALIFICATION
In a Hibernate SQL/HQL query, would I need to have each of these tables mapped? Is there a rule that every joined table must be mapped and must appear in hibernate.cfg.xml?
Edited by: jsunsnx on Jul 10, 2009 11:21 AM

dcminter wrote:
"I have two queries because Hibernate does not support UNIONs, so I had to join two queries in a hash table to avoid duplicates"
Huh? You're doing SQL, not HQL. SQL supports unions, so there's no reason to avoid them. I think you're confused.
You might want to take this to the Hibernate forums: http://forum.hibernate.org/
Oops, thanks. No need to go there, that was silly thinking ...
@SuppressWarnings("unchecked")
public List findAllBySQL(String mustCode, String rankCode) {
     log.debug("finding all Member instances using native SQL");
     try {
          String queryString
               = "SELECT m.force, m.surname, m.init, a.id,"
                    + " NVL(a.qualification_name_not_found, b.qualification_name) qual_name,"
                    + " q.qualcode, m.app_cde,"
                    + " r.description AS rank"
               + " FROM locnavy.MEMBERS m,"
                    + " nlrd.QUALIFICATION_ENROLMENT a,"
                    + " nlrd.MASTER_QUALIFICATION b,"
                    + " JOB_QUAL_LNK c,"
                    + " RANKS r,"
                    + " JOB_QUALIFICATION q"
               + " WHERE a.id = m.force"
                    + " AND r.rank_id = m.rank_cde"
                    + " AND a.qualification_id = b.qualification_id (+)"
                    + " AND a.qualification_id = c.qualification_id"
                    + " AND a.learner_achievement_status_id IN (2,32)"
                    + " AND c.job_qualification_id = q.job_qualification_id"
                    + " AND q.qualcode != 'XXX'"
                    + " AND (m.mustering_cde = '" + ":mustcde" + "' OR '"+ ":mustcde" +"' IS NULL)"
                    + " AND (m.rank_cde = '" + ":rankcde"+ "' OR '" + ":mustcde"+ "' IS NULL)"
                    + " AND NVL(m.STATUS,'A') = 'A' AND m.app_cde IN ('MC','MG','MP','PE','PF','PV')"
               + "UNION"
               + " SELECT m.force, m.surname, m.init, a.id,"
                    + " NVL(a.qualification_name_not_found, b.milqual_qualification_name) qual_name,"
                    + " q.qualcode, m.app_cde,"
                    + " r.description AS rank"
               + " FROM locnavy.MEMBERS m,"
                    + " nlrd.MILQUAL_ENROLMENT a,"
                    + " nlrd.MASTER_MILQUAL_QUALIFICATION b,"
                    + " JOB_QUAL_LNK c,"
                    + " RANKS r,"
                    + " JOB_QUALIFICATION q"
               + " WHERE a.id = m.force"
                    + " AND r.rank_id = m.rank_cde"
                    + " AND a.qualification_id = b.qualification_id (+)"
                    + " AND a.qualification_id = c.qualification_id"
                    + " AND a.learner_achievement_status_id IN (2,32)"
                    + " AND c.job_qualification_id = q.job_qualification_id"
                    + " AND q.qualcode != 'XXX'"
                    + " AND (m.mustering_cde = '" + ":mustcde" + "' OR '"+ ":mustcde" +"' IS NULL)"
                    + " AND (m.rank_cde = '" + ":rankcde"+ "' OR '" + ":mustcde"+ "' IS NULL)"
                    + " AND NVL(m.STATUS,'A') = 'A' AND m.app_cde IN ('MC','MG','MP','PE','PF','PV')";     
          Query queryObject = getSession().createSQLQuery(queryString);               
          queryObject.setParameter("mustcde", mustCode);
          queryObject.setParameter("rankcde", rankCode);                               
          return queryObject.list();
     } catch (RuntimeException re) {
          log.error("find all by SQL failed", re);
          throw re;
     }
}
Thanks, guys ...
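To close the loop on the original question: in a native SQL query only the classes you ask Hibernate to return as entities need mappings; tables used purely for joining and filtering do not have to appear in hibernate.cfg.xml. A minimal sketch of that, assuming a mapped Member class and the Hibernate 3.x SQLQuery API (the rank_desc alias is made up for this example):
// Sketch only -- not code from the original post.
// Assumes org.hibernate.SQLQuery and org.hibernate.Hibernate are imported.
SQLQuery q = getSession().createSQLQuery(
          "SELECT {m.*}, r.description AS rank_desc"
        + " FROM locnavy.MEMBERS m, RANKS r"
        + " WHERE r.rank_id = m.rank_cde");
q.addEntity("m", Member.class);              // mapped POJO -> hydrated Member entity
q.addScalar("rank_desc", Hibernate.STRING);  // column from an unmapped table -> plain String
List rows = q.list();                        // each row is an Object[]{ Member, String }
The same applies to the UNION query above: as long as everything outside MEMBERS comes back as scalar columns, MEMBERS is the only table that needs a mapping.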

Similar Messages

  • CRM Tables , Tables Mappings , T-codes , Function modules

    Hi All,
    I am new to CRM. I want to know the tables, table mappings, transaction
    codes, and function modules used. Can anyone share a link for, or list, all of the items mentioned above?
    It would be very helpful if I could get this information.
    cheers,
    Chandra.

    hi,
       CRM tables:
        https://forums.sdn.sap.com/click.jspa?searchID=2400935&messageID=3134573
        https://forums.sdn.sap.com/click.jspa?searchID=2400935&messageID=3128631
        https://forums.sdn.sap.com/click.jspa?searchID=2400935&messageID=3269744
    CRM FM:
       https://forums.sdn.sap.com/click.jspa?searchID=2400963&messageID=3239545
    Reward points if useful..
    Regards
    Nilesh

  • Migrating the 6.5 Universe table mappings to XIR3?

    I am trying to find where the universe table mappings exist in XIR3. This was a feature in Supervisor 6.5, under universe properties.
    Any suggestions on where I can find this?
    And how do you go about migrating it?

    It's in Designer now, Tools>Manage security

  • Association Table Mappings

    Is there any way to use one-to-many mappings via an association table?
    More thoroughly, I have a table of groups, a table of users and an association table matching groups and users. I have foreign keys mapping between the association table and each of the group and user tables.
    Beyond the fact that I'm not sure how to make TopLink realize that a value in the association table is either a group or a user, I also have inheritance between a generic "authorized party" class and both the group and user classes.
    The setup is complex to allow for users to be in multiple groups and to simplify the mapping between groups, users and permissions that they might have (via a similar association table, but one solution ought to be good enough for both problems).
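    For illustration only: the usual shape of an association-table (join table) mapping, written here as JPA annotations rather than the TopLink 9.0.4 descriptors this project uses, so treat the association table name and its columns as assumptions rather than the actual schema:
    import java.util.HashSet;
    import java.util.Set;
    import javax.persistence.*;

    // Hypothetical join-table mapping between groups and users.
    @Entity
    @Table(name = "APPMGMT_GROUPS")
    public class Group {
        @Id
        private Long id;

        @ManyToMany
        @JoinTable(name = "APPMGMT_GROUP_USERS",                       // assumed association table
                   joinColumns = @JoinColumn(name = "GROUP_ID"),       // assumed FK to the group row
                   inverseJoinColumns = @JoinColumn(name = "USER_ID")) // assumed FK to the user row
        private Set<User> users = new HashSet<User>();
    }

    @Entity
    @Table(name = "APPMGMT_USERS")
    class User {                                                       // in its own file in practice
        @Id
        private Long id;
    }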

    You were right about the many-to-many relationship, or at least it appears to have worked. I haven't been able to test it yet because of some difficulties with inheritance. Is it possible to use an abstract class as the root of an inheritance tree? I am currently getting the following errors when I try to run a test case against my project.
    ==================
    Exception [TOPLINK-108] (Oracle9iAS TopLink - Release 2 (9.0.4.0) (Build 030612)): oracle.toplink.exceptions.DescriptorException
    Exception Description: Cannot find value in class indicator mapping in parent descriptor [null].
    Descriptor: Descriptor(edu.cornell.finsys.datamodel.appmgmt.AuthorizedParty --> [DatabaseTable(APPMGMT.APPMGMT_AUTH_PARTIES), DatabaseTable(APPMGMT.APPMGMT_GROUPS), DatabaseTable(APPMGMT.APPMGMT_USERS)])
    Exception [TOPLINK-41] (Oracle9iAS TopLink - Release 2 (9.0.4.0) (Build 030612)): oracle.toplink.exceptions.DescriptorException
    Exception Description: A non-read-only mapping must be defined for the sequence number field.
    Descriptor: Descriptor(edu.cornell.finsys.datamodel.appmgmt.Group --> [DatabaseTable(APPMGMT.APPMGMT_AUTH_PARTIES), DatabaseTable(APPMGMT.APPMGMT_GROUPS), DatabaseTable(APPMGMT.APPMGMT_USERS)])
    Exception [TOPLINK-41] (Oracle9iAS TopLink - Release 2 (9.0.4.0) (Build 030612)): oracle.toplink.exceptions.DescriptorException
    Exception Description: A non-read-only mapping must be defined for the sequence number field.
    Descriptor: Descriptor(edu.cornell.finsys.datamodel.appmgmt.AuthorizedParty --> [DatabaseTable(APPMGMT.APPMGMT_AUTH_PARTIES), DatabaseTable(APPMGMT.APPMGMT_GROUPS), DatabaseTable(APPMGMT.APPMGMT_USERS)])
    ==================
    The AuthorizedParty class is abstract and it is the root of the tree. The user and group classes are the only two leaves of the tree. What I'm guessing is that TopLink is complaining that I have not provided a type mapping to indicate the root class, but I don't want there to be a type for it, as it is abstract and should never be instantiated anyway. The last two errors make no sense to me, as there is a writable field for the authorized party's sequence number, and I would think that the user class should use the authorized party's sequence number. The other point of confusion is that the other leaf (the group class) does not appear to be exhibiting the same problem as the user class.
    Any ideas? Thanks in advance.
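    As a rough illustration of the "class indicator" that the TOPLINK-108 error refers to, here is the same concept in JPA annotation form (not the TopLink 9.0.4 API; the column name and indicator values are assumptions). An abstract root is fine as long as the root declares the indicator column and each concrete leaf declares its own value; the abstract root itself never gets a value:
    import javax.persistence.*;

    // Assumed indicator column PARTY_TYPE with values 'U' and 'G'.
    @Entity
    @Inheritance(strategy = InheritanceType.JOINED)   // one table per class, joined on the id
    @DiscriminatorColumn(name = "PARTY_TYPE")         // the "class indicator" field
    public abstract class AuthorizedParty {
        @Id @GeneratedValue
        private Long id;                              // writable identifier mapping on the root
    }

    @Entity
    @DiscriminatorValue("U")
    class User extends AuthorizedParty { }            // in its own file in practice

    @Entity
    @DiscriminatorValue("G")
    class Group extends AuthorizedParty { }           // in its own file in practice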

  • Fact vs. Dimension table mappings

    What is the difference between how Dimensions and Facts behave in the Mapping Editor?
    I have no problems loading my data into a Dimension, but with the fact table the mapping gets really complex (I have about 5 dimensions and the fact table only contains one measure in addition to the foreign keys).
    I'm guessing there is an easy way to map data to a fact table when the dimensions are defined (otherwise I can't see the need to separate dimensions and facts in the mapping editor). Can anyone help me out with this?
    thanks,
    Ulf

    Ulf,
    Quite often the mapping of the fact is indeed most complex. Common models use lookups to the dimension keys and an aggregation across the keys. In OWB you can use the key lookup operator as well as the aggregator to build the dataflow you need.
    Mark.

  • Portal events are not getting loaded into the Analytics database tables

    Analytics database ASFACT tables (ASFACT_PAGEVIEWS,ASFACT_PORLETVIEW) are not getting populated with data.
    Possible diagnosis/workarounds tried:
    -Checked the analytics configuration in configuration manager, Enable Analytics Communication option checked
    -Registered Portal Events during analytics installation
    -Verified that UDP events are sent out from the portal: Test: OK
    -Reinstalled Interaction analytics component
    Any inputs highly appreciated.
    Cheers,
    Sandeep
    In collector.log, found the exception:
    08 Jul 2010 07:12:54,613 ERROR PageViewHandler - could not retrieve user: com.plumtree.analytics.collector.exception.DimensionManagerException: Could not insert dimension in the database
    com.plumtree.analytics.collector.exception.DimensionManagerException: Could not insert dimension in the database
    at com.plumtree.analytics.collector.cache.DimensionManager.insertDB(DimensionManager.java:271)
    at com.plumtree.analytics.collector.cache.DimensionManager.manageDBImage(DimensionManager.java:139)
    at com.plumtree.analytics.collector.cache.DimensionManager.handleNewDimension(DimensionManager.java:85)
    at com.plumtree.analytics.collector.eventhandler.BaseEventHandler.insertDimension(BaseEventHandler.java:63)
    at com.plumtree.analytics.collector.eventhandler.BaseEventHandler.getUser(BaseEventHandler.java:198)
    at com.plumtree.analytics.collector.eventhandler.PageViewHandler.handle(PageViewHandler.java:71)
    at com.plumtree.analytics.collector.DataResolver.handleEvent(DataResolver.java:165)
    at com.plumtree.analytics.collector.DataResolver.run(DataResolver.java:126)
    Caused by: org.hibernate.MappingException: Unknown entity: com.plumtree.analytics.core.persist.BaseCustomEventDimension$$BeanGeneratorByCGLIB$$6a0493c4
    at org.hibernate.impl.SessionFactoryImpl.getEntityPersister(SessionFactoryImpl.java:569)
    at org.hibernate.impl.SessionImpl.getEntityPersister(SessionImpl.java:1086)
    at org.hibernate.event.def.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:83)
    at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.saveWithGeneratedOrRequestedId(DefaultSaveOrUpdateEventListener.java:184)
    at org.hibernate.event.def.DefaultSaveEventListener.saveWithGeneratedOrRequestedId(DefaultSaveEventListener.java:33)
    at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.entityIsTransient(DefaultSaveOrUpdateEventListener.java:173)
    at org.hibernate.event.def.DefaultSaveEventListener.performSaveOrUpdate(DefaultSaveEventListener.java:27)
    at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.onSaveOrUpdate(DefaultSaveOrUpdateEventListener.java:69)
    at org.hibernate.impl.SessionImpl.save(SessionImpl.java:481)
    at org.hibernate.impl.SessionImpl.save(SessionImpl.java:476)
    at com.plumtree.analytics.collector.cache.DimensionManager.insertDB(DimensionManager.java:266)
    ... 7 more
    In analyticsui.log, found the exception below:
    08 Jul 2010 06:50:25,910 ERROR Configuration - Could not compile the mapping document
    org.hibernate.MappingException: duplicate import: com.plumtree.analytics.core.persist.BaseCustomEventFact$$BeanGeneratorByCGLIB$$6a896b0d
    at org.hibernate.cfg.Mappings.addImport(Mappings.java:105)
    at org.hibernate.cfg.HbmBinder.bindPersistentClassCommonValues(HbmBinder.java:541)
    at org.hibernate.cfg.HbmBinder.bindClass(HbmBinder.java:488)
    at org.hibernate.cfg.HbmBinder.bindRootClass(HbmBinder.java:234)
    at org.hibernate.cfg.HbmBinder.bindRoot(HbmBinder.java:152)
    at org.hibernate.cfg.Configuration.add(Configuration.java:362)
    at org.hibernate.cfg.Configuration.addXML(Configuration.java:317)
    at com.plumtree.analytics.core.HibernateUtil.loadEventMappings(HibernateUtil.java:796)
    at com.plumtree.analytics.core.HibernateUtil.loadEventMappings(HibernateUtil.java:652)
    at com.plumtree.analytics.core.HibernateUtil.refreshCustomEvents(HibernateUtil.java:496)
    at com.plumtree.analytics.ui.common.AnalyticsInitServlet.init(AnalyticsInitServlet.java:104)
    at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1161)
    at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:981)
    at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4045)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4351)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:525)
    at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:920)
    at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:883)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:492)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1138)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:311)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:719)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
    at org.apache.catalina.core.StandardService.start(StandardService.java:516)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:566)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at com.plumtree.container.Bootstrap.start(Bootstrap.java:531)
    at com.plumtree.container.Bootstrap.main(Bootstrap.java:254)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at org.tanukisoftware.wrapper.WrapperStartStopApp.run(WrapperStartStopApp.java:238)
    at java.lang.Thread.run(Thread.java:595)
    08 Jul 2010 06:50:25,915 ERROR Configuration - Could not configure datastore from XML
    org.hibernate.MappingException: duplicate import: com.plumtree.analytics.core.persist.BaseCustomEventFact$$BeanGeneratorByCGLIB$$6a896b0d
    at org.hibernate.cfg.Mappings.addImport(Mappings.java:105)
    at org.hibernate.cfg.HbmBinder.bindPersistentClassCommonValues(HbmBinder.java:541)
    at org.hibernate.cfg.HbmBinder.bindClass(HbmBinder.java:488)
    at org.hibernate.cfg.HbmBinder.bindRootClass(HbmBinder.java:234)
    at org.hibernate.cfg.HbmBinder.bindRoot(HbmBinder.java:152)
    at org.hibernate.cfg.Configuration.add(Configuration.java:362)
    at org.hibernate.cfg.Configuration.addXML(Configuration.java:317)
    at com.plumtree.analytics.core.HibernateUtil.loadEventMappings(HibernateUtil.java:796)
    at com.plumtree.analytics.core.HibernateUtil.loadEventMappings(HibernateUtil.java:652)
    at com.plumtree.analytics.core.HibernateUtil.refreshCustomEvents(HibernateUtil.java:496)
    at com.plumtree.analytics.ui.common.AnalyticsInitServlet.init(AnalyticsInitServlet.java:104)
    at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1161)
    at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:981)
    at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4045)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4351)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:525)
    at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:920)
    at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:883)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:492)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1138)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:311)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:719)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
    at org.apache.catalina.core.StandardService.start(StandardService.java:516)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:566)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at com.plumtree.container.Bootstrap.start(Bootstrap.java:531)
    at com.plumtree.container.Bootstrap.main(Bootstrap.java:254)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at org.tanukisoftware.wrapper.WrapperStartStopApp.run(WrapperStartStopApp.java:238)
    at java.lang.Thread.run(Thread.java:595)
    wrapper_collector.log
    INFO | jvm 1 | 2009/11/10 17:25:22 | at com.plumtree.analytics.collector.eventhandler.PortletViewHandler.handle(PortletViewHandler.java:46)
    INFO | jvm 1 | 2009/11/10 17:25:22 | at com.plumtree.analytics.collector.DataResolver.handleEvent(DataResolver.java:165)
    INFO | jvm 1 | 2009/11/10 17:25:22 | at com.plumtree.analytics.collector.DataResolver.run(DataResolver.java:126)
    INFO | jvm 1 | 2009/11/10 17:25:22 | Caused by: java.sql.SQLException: [plumtree][Oracle JDBC Driver][Oracle]ORA-00001: unique constraint (ANALYTICSDBUSER.IX_USERBYUSERID) violated
    INFO | jvm 1 | 2009/11/10 17:25:22 |
    INFO | jvm 1 | 2009/11/10 17:25:22 | at com.plumtree.jdbc.base.BaseExceptions.createException(Unknown Source)

    Key words from the error messages suggest that a reinstallation of Analytics is needed to resolve this. The Analytics database is failing to get updated with the correct event mappings, and this is why no data is being inserted:
    "Could not insert dimension in the database"
    "ERROR Configuration - Could not configure datastore from XML
    org.hibernate.MappingException: duplicate import: com.plumtree.analytics.core.persist.BaseCustomEventFact$$BeanGeneratorByCGLIB$$6a896b0d"
    "ORA-00001: unique constraint (ANALYTICSDBUSER.IX_USERBYUSERID) violated"
    "ERROR Configuration - Could not compile the mapping document"

  • How to create a new user in OWB that can only view other user's mappings

    Hello Everyone,
    I am new to OWB and want to know whether it is possible to create different users who can only view other users' mappings without making any changes. Is it also possible for each user to have his or her own password?
    Thank you for your help!
    Jennifer

    A good starting point would be this blog posting:
    https://blogs.oracle.com/warehousebuilder/entry/build_up_multi-role_development_environment_in_owb
    I seem to recall as well from the OWB Admin class that existing objects in the repository would need to have the security updated if the additional restricted roles are created after tables/mappings/etc. so that they reflect the new security model.
    Also, for individual logins to OWB, each developer would require a separate database login.

  • Missing Table Link CE4E003_ACCT to COEP to CE4E003

    Hi folks! I am having difficulty with table mappings. I need to link four tables to extract some data. First, I selected the Profitability Segment No. from table CE4E003_ACCT to link with COEP; then the last 10 digits of the Object Number on COEP are the link to table CE4E003, where those 10 digits are the Profitability Segment Number of table CE4E003. Here is the missing link: I need to join to table COEJ using the Profitability Segment Number retrieved from table CE4E003, but the Profitability Segment Number in table COEJ is not populated; only the Object Number field is populated. Is there any table that holds both the Profitability Segment Number and its corresponding Object Number, so that I can use the Object Number to link to table COEJ? Thanks a lot.

    Hi,
    Please check the thread Re: tables for information on relations.
    -Vikram

  • VL74 - Output - tables/data sent to configured program

    Hi,
    I am currently working on a label that will be printed during execution of VL74 (Output from Handling Units). What tables/structures will be sent to my label/program during the execution?
    I can't debug, since the config and samples have not yet been created. However, I would like to build the field and table mappings now, since I have a large number of fields going through to the label. As such, I need to know what kind of fields/info will be coming into the program.
    Regards,
    Stanley

    Since I do not fully know the type of label you need, I would say you can start with tables VEKP and VEPO.

  • Inserting XML String into Table with help of Stored Proc

    I will be getting an XML string from Java, which I have to insert into Table A. The XML string is as follows:
    <?xml version = '1.0'?>
    <TableA>
    <mappings Record="3">
    <MESSAGEID>1</MESSAGEID>
    <MESSAGE>This is available at your address!</MESSAGE>
    </mappings>
    <mappings Record="3">
    <MESSAGEID>2</MESSAGEID>
    <MESSAGE>This isn’t available at your address.</MESSAGE>
    </mappings>
    </TableA>
    Table Structure
    MESSAGEID     VARCHAR2(15 BYTE)
    MESSAGE       VARCHAR2(500 BYTE)
    This is the stored procedure I have written to insert data into TableA; V_MESSAGE is the input parameter carrying the XML string.
    create or replace procedure AP_DBI_PS_MESSAGE_INSERT
    (
       V_MESSAGE IN VARCHAR2
    )
    AS
       charString   varchar2(80);
       finalStr     varchar2(4000) := null;
       rowsp        integer;
       v_FileHandle UTL_FILE.FILE_TYPE;
    begin
       -- the name of the table as specified in our DTD
       xmlgen.setRowsetTag('TableA');
       -- the name of the data set as specified in our DTD
       xmlgen.setRowTag('mappings');
       -- for getting the output on the screen
       dbms_output.enable(1000000);
       -- open the XML document in read-only mode
       -- (note: UTL_FILE.FOPEN expects a directory, a file name and an open mode;
       --  passing only the message string, as below, will not compile)
       v_FileHandle := utl_file.fopen(V_MESSAGE);
       --v_FileHandle := V_MESSAGE;
       loop
          begin
             utl_file.get_line(v_FileHandle, charString);
          exception
             when no_data_found then
                utl_file.fclose(v_FileHandle);
                exit;
          end;
          dbms_output.put_line(charString);
          if finalStr is not null then
             finalStr := finalStr || charString;
          else
             finalStr := charString;
          end if;
       end loop;
       -- insert the accumulated XML data into the table
       rowsp := xmlgen.insertXML('ONE.TableA', finalStr);
       dbms_output.put_line('INSERT DONE ' || TO_CHAR(rowsp));
       xmlgen.resetOptions;
    end;
    Please help.
    Edited by: 846857 on Jul 18, 2011 10:55 PM

    with t as (select xmltype('<TableA >
                               <mappings Record="3">
                               <MessageId>1</MessageId>
                               <Message> This bundle is available at your address!</Message>
                               </mappings>
                               <mappings Record="3">
                               <MessageId>2</MessageId>
                               <Message>This isn’t available at your address. </Message>
                               </mappings>
                               </TableA  >') col FROM dual)
      --End Of sample data creation with subquery factoring.
      --You can use the query from here with your table and column name.
    select EXTRACTVALUE(X1.column_value,'/mappings/MessageId') MESSAGEID
          ,EXTRACTVALUE(X1.column_value,'/mappings/Message') MESSAGE
    from t,table(XMLSEQUENCE(extract(t.COL,'/TableA/mappings'))) X1;
    The above code works, as I get the result:
    MESSAGEID     MESSAGE
    1             This bundle is available at your address!
    2             This isn’t available at your address.
    Now I want to insert the result into Table A. How do I proceed? Please help.
    Edited by: 846857 on Jul 19, 2011 12:15 AM

  • Set xml element for table

    Hi guys,
    I am new to InDesign scripting using AppleScript.
    My code is shown below; my problem is that I can't set the XML element for a table.
    tell application "Adobe InDesign CS6"
            tell active document
                set rootele to associated XML element of story 1
                if rootele = nothing then
                    set rootele to XML element 1
                end if
                tell story 1
                    select text 1
                    set markup tag of selection to "table" -----------------> this is my problem please help me.
                              end tell
            end tell
        end tell
    Is this code right or not? Please help me.

    The metadata about the table mappings is also stored in the compiled XML Schema document. If you look up the schema document located at /sys/schemas/.... , you will see that the element with SQLInline=false will have an xdb:defaultTable attribute which provides the name of the table used to store the element.
    - Ravi

  • Extract is picking up rows twice because both an explicit TABLE and a TABLE

    Our extract parameter file has this following(full file uploaded):
    TABLE TK_APP_DATA.PAYMENT_INFO , COLSEXCEPT (ENCRYPTED_VALUE,HASH_VALUE)
    #add_scn()
    TABLE TK_APP_DATA.*
    #add_scn()
    When I look at the change data inserted by the replicat on the target, and at the extract trail file, I can see that a row which is updated is actually extracted twice and duplicated. I also see that the table resolves twice in the EXTRACT rpt file:
    Using the following key columns for source table TK_APP_DATA.PAYMENT_INFO : INSTRUMENT_ID, FIELD_HANDLE.
    TABLEWildcard resolved (entry SCECOMM.*):
    TABLE SCECOMM.PAYMENT_INSTRUMENT_FIELD, TOKENS ( TK-SCN = @GETENV ( "TRANSACTION" , "CSN" ) ) ;
    Using the following key columns for source table TK_APP_DATA.PAYMENT_INFO : INSTRUMENT_ID, FIELD_HANDLE.
    TABLEWildcard resolved (entry SCECOMM.*):
    TABLE SCECOMM.APPLICATION_PARAMETER, TOKENS ( TK-SCN = @GETENV ( "TRANSACTION" , "CSN" ) ) ;
    Does anyone know how I can make it so that only one TABLE statement is used to resolve it? Or, conversely, is there a way to filter a table out of a wildcard statement? I would prefer to avoid having to create a separate extract for just one table (the schema has over 350 tables in it).
    I tried using a TABLEEXCLUDE but that excludes the table entirely. I also tried using WILDCARDRESOLVE IMMEDIATE on the specific table mappings, but can't because we also use DDL replication.
    Any help would be greatly appreciated.
    #anh

    This is unfortunately what I have right now as my stop-gap. We will SQL-generate the list of tables with every deployment, which with our agile development is like every other week, haha. I do have a requirement to replicate dynamically (the source side changes extremely frequently by adding and dropping tables), but even after talking to Oracle support this is not possible. If you have a custom mapping for a table and later use a wildcard which would resolve that table, you will always get duplicate rows extracted.
    I talked to Oracle support (we raised this as a bug/feature request) and raised this with our Oracle engagement manager, and asked them to add better functionality for filtering and wildcard selection in TABLE and MAP statements, with an option to map a table only once. It's in their queue; however, I'm not sure how high a priority it will be for them (I'm guessing not very high =) ).

  • BLOB Vs XMLTYPE, which is better!!?

    Hi all, we currently store our XMLs as a BLOB in a relational table; now we need to do wildcard searches on the XML and are thinking of changing the column to XMLTYPE for better performance. We know there are options for wildcard searching on the BLOB itself, but we are worried about the performance hit. It also seems difficult to use XMLTYPE with Hibernate. We would be transacting millions of XMLs per week.
    Can you please suggest which would be the best datatype for the column, staying with BLOB or going to XMLTYPE, and also which type of indexing performs better for XMLType: Oracle Text or an unstructured XMLIndex?
    Thanks!!

    It's definitely possible to map an SQLXML object using Hibernate. Check out the section on Hibernate UserType mappings in the manual here:
    http://docs.jboss.org/hibernate/orm/4.1/manual/en-US/html/ch06.html#types-custom
    I've done this myself; it basically involves implementing the UserType interface and then adding a typedef to your hibernate-mapping file.
    I agree with Anon that you'll get far better performance for searching (as well as using far less space with XML binary storage) if you switch to XMLType. His blog has some good performance tips, by the way (thank you, Anon). However, a warning: you say that you are transacting millions of XMLs a week; if you are replacing them, then you should be fine, but I've experienced problems updating pieces of documents that are large (where the XML being updated is too big for the O/R storage option, but too small for the CLOB storage option). YMMV.
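    For reference, a minimal sketch of such a UserType, assuming Hibernate 3-style method signatures (later Hibernate versions add a SessionImplementor argument to nullSafeGet/nullSafeSet) and mapping the XMLType column onto a plain String property; a production version would more likely go through java.sql.SQLXML or oracle.xdb.XMLType:
    import java.io.Serializable;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Types;
    import org.hibernate.HibernateException;
    import org.hibernate.usertype.UserType;

    // Sketch only: treats the XMLType column as a String. Register it with a
    // <typedef> in the hibernate-mapping file and reference it from the property.
    public class XmlTypeAsStringUserType implements UserType {
        public int[] sqlTypes() { return new int[] { Types.SQLXML }; } // some setups use Types.CLOB instead
        public Class returnedClass() { return String.class; }
        public boolean equals(Object x, Object y) { return x == null ? y == null : x.equals(y); }
        public int hashCode(Object x) { return x == null ? 0 : x.hashCode(); }
        public Object nullSafeGet(ResultSet rs, String[] names, Object owner)
                throws HibernateException, SQLException {
            return rs.getString(names[0]);            // simplest possible read of the XML column
        }
        public void nullSafeSet(PreparedStatement st, Object value, int index)
                throws HibernateException, SQLException {
            st.setString(index, (String) value);      // simplest possible write of the XML column
        }
        public Object deepCopy(Object value) { return value; }   // String is immutable
        public boolean isMutable() { return false; }
        public Serializable disassemble(Object value) { return (Serializable) value; }
        public Object assemble(Serializable cached, Object owner) { return cached; }
        public Object replace(Object original, Object target, Object owner) { return original; }
    }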

  • Servlet - java.lang.OutOfMemoryError: Java heap space

    Hi,
    Does anyone know how to debug this error?
    regards,
    Manmohan
    java.lang.OutOfMemoryError: Java heap space
    2006-05-12 11:38:16,485 INFO [STDOUT] SATS ERROR: DispatcherServlet system exception: Servlet execution threw an exception
    2006-05-12 11:38:16,485 ERROR [org.apache.catalina.core.ContainerBase.[jboss.web].[localhost].[drugtest]] System Exception!
    javax.servlet.ServletException: Servlet execution threw an exception
         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:275)
         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
         at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:672)
         at org.apache.catalina.core.ApplicationDispatcher.processRequest(ApplicationDispatcher.java:463)
         at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:359)
         at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:301)
         at com.auriga.drugtest.web.servlets.DispatcherServlet.redirect(DispatcherServlet.java:51)
         at com.auriga.drugtest.web.servlets.DispatcherServlet.service(DispatcherServlet.java:31)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:810)
         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:252)
         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
         at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:81)
         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:202)
         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
         at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
         at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:178)
         at org.jboss.web.tomcat.security.CustomPrincipalValve.invoke(CustomPrincipalValve.java:39)
         at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:153)
         at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:59)
         at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:126)
         at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:105)
         at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:107)
         at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:148)
         at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:856)
         at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.processConnection(Http11Protocol.java:744)
         at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:527)
         at org.apache.tomcat.util.net.MasterSlaveWorkerThread.run(MasterSlaveWorkerThread.java:112)
         at java.lang.Thread.run(Thread.java:595)

    Where do I change/set the memory size in the config file? I am using Tomcat 6.0; when I start to browse my application I get this error. The error does not occur on all the systems we work on, only on one system. What should I do? If there is a memory leak, how do I find it? Our application uses a very large amount of memory and we cannot change/reduce the code. This is the error:
    type Exception report
    message
    description The server encountered an internal error () that prevented it from fulfilling this request.
    exception
    org.apache.jasper.JasperException: javax.servlet.ServletException: java.lang.NoClassDefFoundError: org/apache/commons/collections/SequencedHashMap
    org.apache.jasper.servlet.JspServletWrapper.handleJspException(JspServletWrapper.java:522)
    org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:398)
    org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:337)
    org.apache.jasper.servlet.JspServlet.service(JspServlet.java:266)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
    root cause
    javax.servlet.ServletException: java.lang.NoClassDefFoundError: org/apache/commons/collections/SequencedHashMap
    org.apache.jasper.runtime.PageContextImpl.doHandlePageException(PageContextImpl.java:850)
    org.apache.jasper.runtime.PageContextImpl.handlePageException(PageContextImpl.java:779)
    org.apache.jsp.include.do_005flogin_jsp._jspService(do_005flogin_jsp.java:436)
    org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
    org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:374)
    org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:337)
    org.apache.jasper.servlet.JspServlet.service(JspServlet.java:266)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
    root cause
    java.lang.NoClassDefFoundError: org/apache/commons/collections/SequencedHashMap
    org.hibernate.mapping.Table.<init>(Table.java:33)
    org.hibernate.cfg.Mappings.addTable(Mappings.java:165)
    org.hibernate.cfg.HbmBinder.bindRootPersistentClassCommonValues(HbmBinder.java:290)
    org.hibernate.cfg.HbmBinder.bindRootClass(HbmBinder.java:273)
    org.hibernate.cfg.HbmBinder.bindRoot(HbmBinder.java:144)
    org.hibernate.cfg.Configuration.add(Configuration.java:669)
    org.hibernate.cfg.Configuration.addInputStream(Configuration.java:504)
    org.hibernate.cfg.Configuration.addResource(Configuration.java:566)
    org.hibernate.cfg.Configuration.parseMappingElement(Configuration.java:1587)
    org.hibernate.cfg.Configuration.parseSessionFactory(Configuration.java:1555)
    org.hibernate.cfg.Configuration.doConfigure(Configuration.java:1534)
    org.hibernate.cfg.Configuration.doConfigure(Configuration.java:1508)
    org.hibernate.cfg.Configuration.configure(Configuration.java:1428)
    org.hibernate.cfg.Configuration.configure(Configuration.java:1414)
    esq.connector.ConnectorFactory.prepareConnection(Unknown Source)
    esq.connector.ConnectorFactory.<init>(Unknown Source)
    esq.connector.ConnectorFactory.getConnector(Unknown Source)
    esq.connector.ConnectorFactory.getUserConnector(Unknown Source)
    esq.common.session.UserSession.authenticate(Unknown Source)
    esq.modules.user.LogonManager.doLogin(Unknown Source)
    org.apache.jsp.include.do_005flogin_jsp._jspService(do_005flogin_jsp.java:404)
    org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
    org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:374)
    org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:337)
    org.apache.jasper.servlet.JspServlet.service(JspServlet.java:266)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
    note The full stack trace of the root cause is available in the Apache Tomcat/6.0.16 logs.

  • Refresh/Update data in a materialized view

    Hi,
    I have a question about the data in a materialized view and how it is refreshed. My mat view has all my dimension IDs and my (for my specialized needs) aggregated measures from my fact table. I used the mat view wizard to create my view, which works perfectly. But now I wonder whether I have to create some sort of mapping(?) or trigger to refresh the data in the mat view, or is the data refreshed automatically when I start my fact table mappings? I use OWB 11gR2.
    thx

    MVs have properties for refresh - you can refresh on a schedule, when dependent data is committed, or manually.
    Cheers
    David
