Import of DDL with data modeler

Hello,
I have started using the Data Modeler in SQL Developer. I created a relational model with constraints defining ranges and value lists, then generated the corresponding DDL:
-- Generated by Oracle SQL Developer Data Modeler Version: 2.0.0 Build: 584
-- at: 2010-03-08 19:26:07
-- site: Oracle Database 10g
-- type: Oracle Database 10g
CREATE TABLE ACTLIST
(
TIME_ACT INTEGER NOT NULL ,
NON_CTL_AREA_NAME VARCHAR2 (12) NOT NULL ,
AIRSPACE_ENV_NAME VARCHAR2 (20) NOT NULL
) LOGGING ;

ALTER TABLE ACTLIST
ADD CONSTRAINT TIME_ACT_CHCK
CHECK ( TIME_ACT BETWEEN 1 AND 2000) ;

CREATE TABLE AIRCR_MO
(
AIRCRAFT_NAME VARCHAR2 (4) NOT NULL ,
WAKE_TURBULENCE_CAT VARCHAR2 (1) NOT NULL ,
TAS INTEGER NOT NULL
) LOGGING ;

ALTER TABLE AIRCR_MO
ADD CONSTRAINT WAKE_TURBULENCE_CAT_CHCK
CHECK ( DESCRIPTION IN ( 'L', 'M', 'H' )) ;

ALTER TABLE AIRCR_MO
ADD CONSTRAINT TAS_CHCK
CHECK ( TAS BETWEEN 0 AND 999) ;
Now, when I import this DDL, the constraints defining ranges and value lists are lost. Is this normal? How can I preserve this information? Domains are indeed created for each column, but their ranges and value lists are empty!
Thank you

Hello,
these definitions are generated as check constraints:
ALTER TABLE AIRCR_MO
ADD CONSTRAINT WAKE_TURBULENCE_CAT_CHCK
CHECK ( DESCRIPTION IN ( 'L', 'M', 'H' ))
and, in the current implementation, they are imported back as check constraints. We have an enhancement request logged for that - to import them as a list of ranges and/or a list of values.
Philip

Similar Messages

  • SQL Developer with Data Modeler, No Design Menu Option

    Using SQL Developer 3.0.04 Build MAIN-04.34 with Data Modeler. I have created a logical design and wish to forward engineer this to a relational model.
    I cannot find the Design menu option ">> Engineer to Relational Model" which is available under the Design menu in the stand-alone version of SQL Developer Data Modeler.
    Have checked the following locations:
    Tools -> Data Modeler
    View -> Data Modeler
    File -> Data Modeler
    Right Click on my Logical Design
    I have ensured that all my entities have the "Engineer To" property set to a valid relational model.
    Is this a bug or am I missing a menu option / configuration setting?
    Thanks in advance for any help
    John

    Hi John,
    you can find the ">> Engineer to Relational Model" button among the other buttons on the logical diagram toolbar.
    Philip

  • Import table - problems with date

    Hi,
    I want to import a table from a text file that has a date column. The format in the file is 12.01.2000 13:59:12.
    During import, the preview shows the column value as 2000-12-01 13:59:12, so I added the format YYYY-MM-DD HH24:MI:SS. After the import, all rows with dates from 1999 or earlier have failed.
    Does anybody know why and how can I solve this problem?
    Thanks
    chrissy

    Hi Dimitri,
    yes, all rows with dates from 2000 or later are correct. The wizard shows the YYYY-DD-MM HH24:MI format when it recognizes the column as a date. I managed to get the column recognized as VARCHAR instead, changed it to DATE and set the format to DD.MM.YYYY HH24:MI, and then it imported fine.
    Thanks for your help,
    chrissy
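    For anyone hitting the same issue, one quick check before importing (not from the thread; the literal below is just the sample value quoted above) is to confirm the format mask against a single value from the file:
    -- the file stores dates as DD.MM.YYYY HH24:MI:SS, so that is the mask the import needs;
    -- a mask that does not match the file, such as YYYY-MM-DD HH24:MI:SS, rejects or misreads the rows
    SELECT TO_DATE('12.01.2000 13:59:12', 'DD.MM.YYYY HH24:MI:SS') AS parsed_date
    FROM dual;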

  • Can't create connection with Data Modeler 2.0.0 570

    I have installed SQL Developer Data Modeler on my Windows Vista Home Premium system and am unable to create a connection to import a Data Dictionary. When I select TNS from the connection type dropdown list , I get a message below that states: 'Warning ORACLE_HOME environment variable is not set'.
    This is odd, since I use SQL Developer daily on this same system with no issues. I have checked with regedit and the ORACLE_HOME environment variable is indeed set.
    Any ideas?

    I ran into a similar issue where Data Modeler was finding the TNSNAMES.ORA file for the older Oracle 8i client (which I have to keep installed because some older legacy software depends on it) rather than the 10g client. My solution was to reuse a suggestion for a similar problem with SQL Developer and create a DOS "cmd" file in the "<data modeler install dir>\datamodeler" directory that sets the ORACLE_HOME environment variable to my 10g client directory and then runs "datamodeler.exe". My file is named "datamodeler.cmd" and contains the following two lines:
    SET ORACLE_HOME=C:\oracle\product\10.2.0\client_1
    start datamodeler.exe
    HTH.
    Ed. H.

  • Export import  in sql developer data modeling tool

    I tried the SQL Developer data modeling tool, but I have a problem.
    I select File > Import > Data Dictionary, then create a connection to the DB, select some tables, and get the ER diagram successfully.
    Then I go to File > Export > To Data Modeling Design and save it to an XML file.
    But when I give this file to another developer and he imports it via File > Import > Data Modeling Design, the diagram is not displayed.
    Is it a bug or am I doing something wrong?

    OK, it was my fault.
    It's not just the XML:
    there is a folder with the same name beside the XML file, and it should be included in the exchange.

  • Help Needed with Data-modeling to build an application on

    Hi, would anyone be able to help me create a data model? I'm really stuck with this one. Basically, I've been asked to build a survey application in Oracle APEX that used to be Excel-based. The information I was given is an Excel sheet that looks like this (the column headings first, then each person's row):
    NAME
    E-MAIL
    TSSA
    ORACLE
    HP
    IBM
    MS
    SAP
    INTERGRAPH
    CISCO
    Relationship
    Contracting
    Performance
    Architecture
    Supplier Feedback
    comments
    Jxxxxxx yyyyyyf
    [email protected]
    Yes
    Yes
    Yes
    Yes
    x
    requested to be added
    nnnitha iiiiiah
    [email protected]
    Yes
    Yes
    Yes
    x
    x
    Knnnn kkkikot
    [email protected]
    Yes
    x
    x
    is not payed
    Gggrt Louuuue
    [email protected]
    Yes
    Yes
    Yes
    Yes
    Yes
    Yes
    Yes
    Yes
    x
    x
    x
    x
    jeiiiha ad
    [email protected]
    Yes
    x
    to meet with
    John Rat
    [email protected]
    Yes
    x
    x
    Where it says Yes, those are the vendors that the person has to assess, and where there is an X, those are the topics the vendors have to be rated on. For example, the first person on the list, Jxxxxxx yyyyyyf, will assess TSSA, ORACLE, HP, IBM, MS and SAP on the topic of Architecture, while the second, nnnitha iiiiiah, will rate TSSA, ORACLE and INTERGRAPH on the topics of Relationship and Performance. Any idea how I could model this to get my table structures right, so that features like completion status can be shown to the user through APEX (which needs a correct data model)? I have tried normalization but didn't get anywhere because there are so many variations. Any ideas on how you would go about modeling this would be greatly appreciated. Thank you.
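    One possible normalized shape, purely as a sketch (this is not from the thread, and every table and column name below is made up for illustration): keep people, vendors and topics as reference tables, turn each Yes cell into a row saying "this person assesses this vendor", and store the actual ratings in a table keyed by assessment and topic.
    CREATE TABLE person (
        person_id  NUMBER PRIMARY KEY,
        name       VARCHAR2(100) NOT NULL,
        email      VARCHAR2(100) NOT NULL
    );
    CREATE TABLE vendor (
        vendor_id   NUMBER PRIMARY KEY,
        vendor_name VARCHAR2(30) NOT NULL      -- TSSA, ORACLE, HP, IBM, ...
    );
    CREATE TABLE topic (
        topic_id   NUMBER PRIMARY KEY,
        topic_name VARCHAR2(30) NOT NULL       -- Relationship, Contracting, Performance, ...
    );
    -- one row per "Yes" cell: this person assesses this vendor
    CREATE TABLE assessment (
        assessment_id NUMBER PRIMARY KEY,
        person_id     NUMBER NOT NULL REFERENCES person,
        vendor_id     NUMBER NOT NULL REFERENCES vendor
    );
    -- one row per topic the assessment must cover (the "X" cells, repeated per vendor)
    CREATE TABLE assessment_rating (
        assessment_id NUMBER NOT NULL REFERENCES assessment,
        topic_id      NUMBER NOT NULL REFERENCES topic,
        rating        NUMBER,                  -- NULL until answered; drives completion status
        PRIMARY KEY (assessment_id, topic_id)
    );
    Completion status in APEX could then be derived by comparing rated rows to the total expected rows per person.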

    Not really an APEX specific question..  Maybe you should try posting this in the data modeler forum : SQL Developer Data Modeler
    Thank you,
    Tony Miller
    LuvMuffin Software

  • Problem with Data Model and Analysis View

    I created an analysis in BI Publisher and then created a data model using this object.
    When I try to generate XML with a number of rows, BI Publisher returns an empty XML (only the DATA_DS tags, but no data). To bypass this problem I built an XML file by hand, which let me create and design reports, but when I try to view the reports I get the message "No Data Found".
    So I checked the analysis and everything appears to be fine; in the Results tab it shows me a complete table with the data I wanted to use.
    When I try to reproduce the error by creating the XML for the data model, I find these two errors in the logs:
    [root@server ~]# [2013-07-17T16:37:22.844-04:00] [bi_server1] [WARNING] [] [oracle.xdo] [tid: 2361] [userId: <anonymous>] [ecid: ad7bb40a72b553c0:-3e5f91c5:13ecd037992:-8000-00000000000e2b34,0] [APP: bipublisher#11.1.1] Incomplete xslt._XDONFSEPARATORS: decimal separator: null, grouping separator: null
    [2013-07-17T16:37:26.828-04:00] [bi_server1] [WARNING] [] [oracle.xdo] [tid: 2361] [userId: <anonymous>] [ecid: ad7bb40a72b553c0:-3e5f91c5:13ecd037992:-8000-00000000000e2b3a,0] [APP: bipublisher#11.1.1] Incomplete xslt._XDONFSEPARATORS: decimal separator: null, grouping separator: null
    [2013-07-17T16:37:26.865-04:00] [bi_server1] [WARNING] [] [oracle.xdo] [tid: 2361] [userId: <anonymous>] [ecid: ad7bb40a72b553c0:-3e5f91c5:13ecd037992:-8000-00000000000e2b3a,0] [APP: bipublisher#11.1.1] oracle.xdo.servlet.CreateException: Path: /FOLDER/MODEL.xdm is not pointing to a report. Actual type: ReportItem, sub-type: DataModel[[
            at oracle.xdo.servlet.ReportException.fillInStackTrace(ReportException.java:124)
            at java.lang.Throwable.<init>(Throwable.java:196)
            at java.lang.Exception.<init>(Exception.java:41)
            at oracle.xdo.servlet.ReportException.<init>(ReportException.java:36)
            at oracle.xdo.servlet.CreateException.<init>(CreateException.java:18)
            at oracle.xdo.servlet.ReportRepository.getReport(ReportRepository.java:104)
            at oracle.xdo.servlet.ReportRepository.getReport(ReportRepository.java:128)
            at oracle.xdo.servlet.dataengine.DataProcessorFactory.getDataModelPath(DataProcessorFactory.java:207)
            at oracle.xdo.servlet.dataengine.DataProcessorFactory.isSemanticLayerDataModel(DataProcessorFactory.java:99)
            at oracle.xdo.servlet.dataengine.DataProcessorFactory.isSemanticLayerDataModel(DataProcessorFactory.java:78)
            at oracle.xdo.servlet.ReportModelContextImpl.getReportXMLData(ReportModelContextImpl.java:157)
            at oracle.xdo.servlet.CoreProcessor.process(CoreProcessor.java:346)
            at oracle.xdo.servlet.CoreProcessor.generateDocument(CoreProcessor.java:101)
            at oracle.xdo.servlet.ReportImpl.renderBodyHTTP(ReportImpl.java:1074)
            at oracle.xdo.servlet.ReportImpl.renderReportBodyHTTP(ReportImpl.java:639)
            at oracle.xdo.servlet.XDOServlet.writeReport(XDOServlet.java:492)
            at oracle.xdo.servlet.XDOServlet.writeReport(XDOServlet.java:462)
            at oracle.xdo.servlet.XDOServlet.doGet(XDOServlet.java:280)
            at oracle.xdo.servlet.XDOServlet.doPost(XDOServlet.java:313)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
            at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
            at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
            at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
            at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
            at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
            at oracle.xdo.servlet.metadata.track.MostRecentFilter.doFilter(MostRecentFilter.java:64)
            at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
            at oracle.xdo.servlet.security.SecurityFilter.doFilter(SecurityFilter.java:125)
            at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
            at oracle.xdo.servlet.init.InitCheckingFilter.doFilter(InitCheckingFilter.java:63)
            at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
            at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
            at java.security.AccessController.doPrivileged(Native Method)
            at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:315)
            at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:442)
            at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
            at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
            at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
            at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
            at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:139)
            at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
            at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
            at java.security.AccessController.doPrivileged(Native Method)
            at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:315)
            at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:442)
            at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
            at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
            at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
            at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
            at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
            at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
            at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
            at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
            at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
            at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
            at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
            at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
            at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
            at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
            at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    And when I try to view the report that uses the analysis, I get these two warnings in the logs:
    [2013-07-17T16:58:01.615-04:00] [bi_server1] [WARNING] [] [oracle.xdo] [tid: 57] [userId: <anonymous>] [ecid: ad7bb40a72b553c0:-3e5f91c5:13ecd037992:-8000-00000000000e2d7c,0] [APP: bipublisher#11.1.1] Incomplete xslt._XDONFSEPARATORS: decimal separator: null, grouping separator: null
    [2013-07-17T16:58:02.034-04:00] [bi_server1] [WARNING] [] [oracle.xdo] [tid: 57] [userId: <anonymous>] [ecid: ad7bb40a72b553c0:-3e5f91c5:13ecd037992:-8000-00000000000e2d84,0] [APP: bipublisher#11.1.1] Incomplete xslt._XDONFSEPARATORS: decimal separator: null, grouping separator: null
    As I understand it, there is a reference to a null value, but I can't find which column of the analysis has the problem.
    Any ideas on how to solve or debug this?
    Thanks

    I followed your instructions and it works fine: I can create the XML and the reports show data.
    So I now know there is no problem with the data or with the BI Publisher installation, but I still don't know what is wrong with the analysis view that fails.
    Any idea how to debug it?
    Thanks.

  • File with pls extension opens with data modeler instead of sql developer

    Hello,
    I have both SQL Developer and Data Modeler installed as separate installations. If I click on a file with the .pls extension in the Windows file navigator, SQL Developer should open, since the .pls extension is associated with SQL Developer.
    If I have neither SQLD nor DM open, SQLD opens -> ok
    If I have only SQLD open, the file is opened in SQLD -> ok
    If I have only DM open, the file is opened in DM -> not ok
    If I have both SQLD and DM open, the last started application opens the file -> sometimes ok
    Is it possible to make .pls files always open in SQL Developer?
    Joop

    This is quite an old problem which I have seen with JDeveloper and SQLDeveloper so it may be framework related. I don't currently have both installed to check it out, but as it has been around a long time I doubt it has gone away.
    SQL files would open in either JDeveloper or SQL Developer following the pattern Joop described. Java files would only ever open in JDeveloper.    IIRC there was never a SQL file association for JDeveloper and SQLDeveloper was always the default.

  • Working With Data Models !!!!

    Hi,
    I am working with TableViews. Can someone point me to some good material I can go through on Data Models and how to work with them?

    Hi Emmanuel,
    many people have given you points for your help, but you haven't done the same in return, except once.
    I will replace this text with the answer to your question (as far as I can) if you reward points (or at least reply explaining why the offered help didn't work; please be more responsive) on the following threads:
    How to create a new System !!!!!!!
    How to use Visual Administrator !!!!
    How to use Logger !!!!
    Console Output !!!!  Need Help Badly !!!!
    How to use onRowSelection !!!!
    Best regards
    Detlev

  • Need help with Data Model for Private Messaging

    Sad to say, but it looks like I just really screwed up the design of my Private Messaging (PM) module...  *sigh*
    What looked good on paper doesn't seem to be practical in application.
    I am hoping some of you Oracle gurus can help me come up with a better design!!
    Here is my current design...
    member -||-----0<- private_msg_recipient ->0------||- private_msg
    MEMBER table
    - id
    - email
    - username
    - first_name
    PRIVATE_MSG_RECIPIENT table
    - id
    - member_id_to
    - message_id
    - flag
    - created_on
    - updated_on
    - read_on
    - deleted_on
    - purged_on
    PRIVATE_MSG table
    - id
    - member_id_from
    - subject
    - body
    - flag
    - sent_on
    - updated_on
    - sender_deleted_on
    - sender_purged_on
    ***Short explanation of how the application currently works...
    - Sender creates a PM and sends it to a Recipient.
    - The PM appears in the Sender's "Sent" folder in my website
    - The PM also appears in the Recipient's "Incoming" folder.
    - If the Recipient deletes the PM, I set "deleted_on" and my code moves the PM from Recipient's "Inbox" to the "Trash" folder.  (Record doesn't actually move!)
    - If the Recipient "permanently deletes" the PM from his/her "Trash", I set "purged_on" and my code removes the PM from the Recipient's Message Center.  (Record still in database!)
    - If the Sender deletes the PM, I set "sender_deleted_on" and my code moves the PM from the Sender's "Sent" folder to the "Trash" folder.  (Record doesn't actually move!)
    - If the Sender "permanently deletes" the PM from his/her "Trash", I set "sender_purged_on" and my code removes the PM from the Sender's Message Center.  (Record still in database!)
    Here are my problems...
    1.) I can't store PM's forever.
    2.) Because of my design, the Sender really owns the PM, and if I add code to REMOVE the PM from the database once it has a "sender_purged_on" value, then that would in essence remove the PM from the Recipient's Inbox as well!!
    In order to remove a PM from the database, I would have to make sure that *both* the Recipient has a "purged_on" value and the Sender has a "sender_purged_on" value.  (Lots of application logic for something which should be simple?!)
    I am wondering if I need to change my Data Model to something that gives me autonomy when it comes to the Sender and/or the Recipient deleting the PM for good...
    On the other hand, I believe I did a good job of normalizing the data. And my current Data Model is the most efficient when it comes to saving storage space and not having dups.
    Maybe I do indeed just need to write application logic - or a cron job - which checks that *both* the Sender and Recipient have deleted the PM before it actually flushes it out of my database to free up space?!
    Of course, if one party sits on their PM's forever, then I can never clear things out of my database to free up space...
    What should I do??
    Some expert advice would be welcome!!
    Sincerely,
    Debbie

    rp0428,
    I think I am starting to see my evil ways and where I went wrong... 
    > Unfortunately his design is just as denormalized as yours
    I see that now.  My bad!!
    > the last two columns have NOTHING to do with the message itself so do NOT belong in a normalized table.
    > And his design:
    >
    > Same comment - those last two columns also have NOTHING to do with the message itself.
    Right.
    > The message table should just have columns directly related to the message. It is a list of unique messages: no more, no less.
    Right.
    > Mark gave you hints to the proper normalized design using an INTERSECT table.
    > that table might list: sender, recipient, sender_delete_flag, recipient_delete_flag.
    > As mark suggested you could also have one or two DATEs related to when the delete flags were set. I would just make the columns DATE fields.
    >
    > Once both date columns have a value you can delete the message (or delete all messages older than 30+ days).
    >
    > When both flags are set you can delete the message itself that references the sender and the message sent.
    Okay, how does this revised design look...
    MEMBER --||-----0<-- PM_DISTRIBUTION -->0-------||-- PRIVATE_MSG
    MEMBER table
    - id
    - email
    - username
    - first_name
    and so on...
    PM_DISTRIBUTION table (Maybe you can think of a better name??)
    - id
    - private_msg_id
    - sender_id
    - recipient_id
    - sender_flag
    - sender_deleted_on
    - sender_purged_on
    - recipient_flag
    - recipient_read_on
    - recipient_deleted_on
    - recipient_purged_on
    PRIVATE_MSG
    - id
    - subject
    - body
    - sent_on
    Is that what you were describing to me?
    Quickly reflecting on this new design...
    1.) It should now be in 3rd Normal Form, right?
    2.) It should allow the Sender and Recipient to freely and independently "delete" or "purge" a PM with no impact on the other party, right?
    Here are a few Potential Issues that I see, though...
    a.) What is to stop there from being TWO SENDERS of a PM?
    In retrospect, that is why I originally stuck "member_id_from" in the PRIVATE_MSG table!!  The logic being, that a PM only ever has *one* Sender.
    I guess I would have to add either Application Logic, or Database Logic, or both to ensure that a given PM never has more than one Sender, right?
    b.) If the design above is what you were hinting at, and if it is thus "correct", then is there any conflict with my Business Rule: "Any given User shall only be allowed 100 Messages between his/her Incoming, Sent and Trash folders."
    Because the Sender is no longer "tightly bound" to the PRIVATE_MSG, in my scenario above...
    Debbie could send 100 PM's, hit her quota, then turn around and delete and purge all 100 Sent PM's and that should in no way impact the 100 PM's sitting in other Users' Inboxes, right??
    I think this works like I want...
    Sincerely,
    Debbie
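    For what it's worth, the revised design above maps roughly onto DDL like the sketch below (the column names come from the listing; the data types, lengths and the trailing unique constraint are assumptions):
    CREATE TABLE member (
        id         NUMBER PRIMARY KEY,
        email      VARCHAR2(100) NOT NULL,
        username   VARCHAR2(30)  NOT NULL,
        first_name VARCHAR2(30)
    );
    CREATE TABLE private_msg (
        id      NUMBER PRIMARY KEY,
        subject VARCHAR2(200) NOT NULL,
        body    CLOB,
        sent_on DATE DEFAULT SYSDATE NOT NULL
    );
    -- intersect table: one row per delivery, carrying each party's state independently
    CREATE TABLE pm_distribution (
        id                   NUMBER PRIMARY KEY,
        private_msg_id       NUMBER NOT NULL REFERENCES private_msg,
        sender_id            NUMBER NOT NULL REFERENCES member,
        recipient_id         NUMBER NOT NULL REFERENCES member,
        sender_flag          VARCHAR2(1),
        sender_deleted_on    DATE,
        sender_purged_on     DATE,
        recipient_flag       VARCHAR2(1),
        recipient_read_on    DATE,
        recipient_deleted_on DATE,
        recipient_purged_on  DATE,
        -- assumption: at most one delivery of a given message to a given recipient
        CONSTRAINT pm_distribution_uk UNIQUE (private_msg_id, recipient_id)
    );
    -- Note: nothing here stops two rows of the same message carrying different sender_ids;
    -- that is the "two senders" concern in point a.) and still needs application or trigger logic.
    A cleanup job could then delete any private_msg whose distribution rows all have both purged dates set.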

  • RuleAuthor : error importing XML Schemas into Data Model

    Hi,
    I have problems importing an XML Schema into my Data Model.
    I'm following these steps:
    1) Click Definitions tab;
    2) Click XMLFact;
    3) Click Create
    4) I enter the path for the schema and the directory to store the JAXB-generated classes. In this directory every user has all permissions (777).
    In the next step, when I click on "Add Schemas", I get this error:
    java.io.IOException: Not enough space
        at java.lang.UNIXProcess.forkAndExec(Native Method)
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:53)
        at java.lang.ProcessImpl.start(ProcessImpl.java:65)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:451)
        at java.lang.Runtime.exec(Runtime.java:591)
        at java.lang.Runtime.exec(Runtime.java:429)
        at java.lang.Runtime.exec(Runtime.java:326)
        at oracle.rules.sdk.datamodel.impl.DataModelUtil.compileJavaFile(DataModelUtil.java:479)
        at oracle.rules.sdk.datamodel.DataModelManager.addXMLSchemaPath(DataModelManager.java:984)
        at oracle.rules.sdk.mapper.RuleObjectHelper.addSchemapath(RuleObjectHelper.java:2759)
        at oracle.rules.ra.uix.mvc.SchemaSelectorEH.addSchema(SchemaSelectorEH.java:138)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at oracle.rules.ra.uix.mvc.BeanEH.genericHandleEvent(BeanEH.java:869)
        at oracle.rules.ra.uix.mvc.BeanEH.handleEvent(BeanEH.java:838)
        at oracle.cabo.servlet.event.TableEventHandler.handleEvent(Unknown Source)
        at oracle.cabo.servlet.event.TableEventHandler.handleEvent(Unknown Source)
        at oracle.cabo.servlet.event.BasePageFlowEngine.handleRequest(Unknown Source)
        at oracle.cabo.servlet.AbstractPageBroker.handleRequest(Unknown Source)
        at oracle.cabo.servlet.ui.BaseUIPageBroker.handleRequest(Unknown Source)
        at oracle.cabo.servlet.PageBrokerHandler.handleRequest(Unknown Source)
        at oracle.cabo.servlet.UIXServlet.doGet(Unknown Source)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:743)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:856)
        at com.evermind.server.http.ServletRequestDispatcher.invoke(ServletRequestDispatcher.java:711)
        at com.evermind.server.http.ServletRequestDispatcher.forwardInternal(ServletRequestDispatcher.java:368)
        at com.evermind.server.http.HttpRequestHandler.doProcessRequest(HttpRequestHandler.java:866)
        at com.evermind.server.http.HttpRequestHandler.processRequest(HttpRequestHandler.java:448)
        at com.evermind.server.http.AJPRequestHandler.run(AJPRequestHandler.java:302)
        at com.evermind.server.http.AJPRequestHandler.run(AJPRequestHandler.java:190)
        at oracle.oc4j.network.ServerSocketReadHandler$SafeRunnable.run(ServerSocketReadHandler.java:260)
        at com.evermind.util.ReleasableResourcePooledExecutor$MyWorker.run(ReleasableResourcePooledExecutor.java:303)
        at java.lang.Thread.run(Thread.java:595)
    I cannot find the solution!
    Can someone help me?
    Thanks.

    Do you still have enough disk space available on your file system to store the different xml-facts the RuleAuthor will create for you?

  • How to import whole database (with data) from remote server?

    I am using Oracle SQL Developer 2.1. Our main database server is in the USA; we use it from Bangladesh. We also keep a local copy here that is updated from time to time.
    Is there any way to copy or migrate the whole database, with data, from the USA server to our local server using SQL Developer 2.1?
    We use Windows Server 2003 Service Pack 2 on the local server.

    I replied to you in Import Data wizard not found and stand by my suggestions.
    Regards,
    K.
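    Not mentioned in either thread, but as one very simple illustration of pulling remote data into a local schema over a database link (the link name, credentials, TNS alias and table are all made up; a full migration would more usually be done with Data Pump rather than SQL Developer):
    CREATE DATABASE LINK usa_db
        CONNECT TO remote_user IDENTIFIED BY remote_password
        USING 'USA_TNS_ALIAS';
    -- copy one table's structure and data from the remote server over the link
    CREATE TABLE emp_local AS
        SELECT * FROM emp@usa_db;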

  • Importing CSV file with Data Merge Fails

    Specs
    See pasted text from CSV at http://pastebin.com/mymhugpN
    I am using InDesign CS6 (8.0.1)
    I created the CSV by downloading it from a Google Spreadsheet as a CSV. I confirmed in the Terminal that the character encoding is UTF-8 using the file command.
    Problem detailed
    I am trying to import a CSV file (utf-8) with Data Merge via the Select Data Source... command with Show Import Options checked. When viewing the Data Source Import Options dialog, I set the following options—Delimiter:Comma, Encoding:Unicode, Platform:Macintosh. I leave Preserve Spaces in Data Source unchecked. It fails to import any variables and produces no error message. I have tried other CSV files as well (created TextEdit, Espresso, etc.) and it seems that InDesign will not import any files if Unicode is specified as the encoding, no matter which other options are specified.
    Can anyone else confirm this?
    Importing as ASCII works, but obviously does not display my content correctly.

    Mike is having some trouble posting in this thread (and I am too), but he sent me a PM with what he wanted to say:
    OK. I think I might have a positive answer for you.
    I was getting lost in the upper ASCII characters you showed. In your test file I never could see any--a case of not seeing the trees for the forest.
    Your quote marks are getting dropped in your test file. Now, this may or may not affect other factors but it does in some further testing. I believe ID has an issue with dropping quote marks even in a plain ASCII file if the marks are at the beginning of a sentence and the file is tab delimited. Call it a bug.
    Because of all the commas and quote marks in your simple file, I think you should be exporting from Google Docs' spreadsheet as a tab-delimited file. This exported file has to be opened in a text editor capable of saving it out as a UTF-16 BE (Big Endian) type of file.
    Also, I think you are going to have to use proper quote marks throughout, or change them in the exported tab-delimited file. Best to have a correct source, though.
    Here is your sample ZIPped up. I think it works properly. But then again, I think I might be bleary-eyed by now.
    http://www.wenzloffandsons.com/temp/merge_psalms_utf-16.zip
    Take care, Mike

  • Problem with data modeler

    Sorry if this is the wrong place to post a question about Oracle Data Modeler; I cannot find a dedicated forum for it.
    I reverse engineered the DB schema via the modeler, but for every non-unique T-tree index the resulting schema shows a unique constraint. Has anyone had the same problem? Is there a URL where I can post a bug report? thx

    Oracle TimesTen in-memory DB:
    Command> indexes nasdaqreport;
    Indexes on table C3.NASDAQREPORT:
    NASDAQREPORT: unique hash index on columns:
    REPORTID
    NASDAQREPORT_IDX_CONTROLNO: non-unique T-tree index on columns:
    CONTROLNUMBER
    NASDAQREPORT_IDX_MATCHING: non-unique T-tree index on columns:
    CUSIP
    CPID
    RPID
    NASDAQREPORT_IDX_MATCHREPORTID: non-unique T-tree index on columns:
    MATCHINGREPORTID
    NASDAQREPORT_IDX_REFNO: non-unique T-tree index on columns:
    REFERENCENUMBER
    NASDAQREPORT_IDX_TRADEID: non-unique T-tree index on columns:
    TRADEID
    NASDAQREPORT_IDX_TRANID: non-unique T-tree index on columns:
    TRANID
    7 indexes found.
    All indexes are non-unique indexes.
    But after I reverse engineered the schema, I saw they have become unique indexes:
    http://picsorlinks.com/view/pic/id/32548

  • How to import flat file with date in filename on a regular basis

    Hi,
    Using OWB 11gR1
    I have a file that will be delivered to an FTP each night with the date in the filename having the form YYYYMMDD-FILE.txt (ex: 20100326-FILE.txt) that I want to import to an external table.
    Now I've set up the import to the external table but am only able to import files that I specify the name for exactly. I've tried pointing to filenames such as "*-FILE.txt" and "%-FILE.txt" but that only results in errors.
    It must be possible to automatically import files with different filenames but the same structure, isn't it? If anyone could help me solve this, it'd be greatly appreciated.
    Thank you in advance.

    Hi
    For dynamic files you can:
    1. Alter the DDL of the external table so that it points to the file with the changing name
    2. Copy the file to a fixed name before using the external table/maps
    3. Use the preprocessor to cat/pipe these files for the external table. See the post here http://blogs.oracle.com/warehousebuilder/2009/06/file_staging_using_external_table_preprocessor.html - it shows using gunzip, but it could simply run 'cat' on a bunch of files to standard output (a rough sketch follows below)
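    As a rough illustration of option 3 (this is not from the linked post; the directory objects, script name and column list are all assumptions), the external table could look something like:
    CREATE OR REPLACE DIRECTORY data_dir AS '/ftp/incoming';
    CREATE OR REPLACE DIRECTORY exec_dir AS '/u01/app/scripts';   -- EXECUTE privilege needed on this directory
    -- cat_files.sh ignores the location file name it is passed and simply runs:
    --   /bin/cat /ftp/incoming/*-FILE.txt
    CREATE TABLE daily_file_ext (
        col1 VARCHAR2(100),
        col2 VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY data_dir
        ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            PREPROCESSOR exec_dir:'cat_files.sh'
            FIELDS TERMINATED BY ','
            MISSING FIELD VALUES ARE NULL
        )
        LOCATION ('dummy.txt')   -- must exist, but the preprocessor output supplies the actual rows
    )
    REJECT LIMIT UNLIMITED;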
    Cheers
    David
