SQL Dev 1.5.4+: Scripting DDL and data dumps?

In SQL Dev 1.5.4, can I script a DDL and data dump? If not, what about 2.0? If not 2.0, has anyone requested this functionality so I can vote for it? I find it frustrating that, while doing a Database Export, I can't even pre-declare (i.e. save) the set of objects I want to dump; sometimes you want to dump selectively, and it's a pain to hunt and peck to select just the objects you want. Easy if you want to dump everything but 2-3 objects in a large schema. Not so easy if you only need, say, 20 out of 100 objects dumped (e.g. for domain or configuration tables--some subset of the whole schema).
I'm really enjoying SQL Developer 1.5.4, by the way. Despite its flaws, I'm pretty happy with it. Looking forward to 2.0 and beyond. Good work, SQL Dev team.
Thanks very much.
Dana

They're all command line tools, so they can all be wrapped up in a batch or shell script. Bummer you can't access them... Hope you find a better solution.

Thanks K. I should be getting Oracle 10g Express Edition on my desktop soon--critical because we don't have full access to the Development instance. It's like putting changes through a straw over to the DBAs. I'm not sure why Development is locked down to developers, but that's the way it is.
Any chance that Oracle 10g Express Edition comes with scriptable Data Pump binaries? I'll still need authorization, but maybe that's one way to go. I hate trying to write my own Data Pump in Python or any other language. It seems a bit absurd to me, but I suppose there are reasons.
Dana
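
For reference, once export binaries are in reach, the wrapper script K. mentions is about as small as scripting gets. A minimal sketch, assuming the classic exp utility is on the PATH; the connect string, schema, and file names below are placeholders:

    rem dump_schema.bat - sketch only; credentials, schema, and paths are made up
    rem Classic export of one schema:
    exp scott/tiger@XE OWNER=scott FILE=scott.dmp LOG=scott_exp.log

    rem Data Pump equivalent, where expdp is available (it writes server-side,
    rem by default to the DATA_PUMP_DIR directory object):
    expdp scott/tiger@XE SCHEMAS=scott DUMPFILE=scott.dmp LOGFILE=scott_expdp.log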

Similar Messages

  • BUG: Export DDL and Data fails for mixed case table/column names

    Hi there,
    I have found a bug in SQL Developer. See details below.
    Description:
    When "Export DDL and Data) function is used on a table/columns not named in UPPERCASE, sql generated by SQL Developer is invalid.
    Steps to reproduce:
    - open SQL Developer, connect to DB
    - make a table named "lowerCase" (in double quotes, so it won't be automatically changed to capital letters)
    - you may also add some columns, for example "lowerCol1", "UpCol2", ALLUPCOL3
    - add some data rows to the table
    - choose Tools -> Export DDL and Data
    - check exporting of tables and data, on "filter" tabs choose your "lowerCase" table
    - press "Apply"
    Error:
Generated SQL contains invalid INSERTs: mixed-case table and column names are referenced without the obligatory double quotes, which yields an error when the generated script is executed (see below; the relevant line is the INSERT)
    -- DDL for Table lowerCase
    CREATE TABLE "DBO_HT"."lowerCase"
    (    "lowerCol1" VARCHAR2(100),
         "UpCol2" VARCHAR2(100),
         "ALLUPCOL3" VARCHAR2(100)
    );
    -- DATA FOR TABLE lowerCase
    -- FILTER = none used
    -- INSERTING into lowerCase
    Insert into lowerCase (lowerCol1,UpCol2,ALLUPCOL3) values ('lc','uc','auc');
    -- END DATA FOR TABLE lowerCase
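    For reference, an INSERT that would actually execute against that table has to quote the mixed-case identifiers the same way the DDL does:
    Insert into "DBO_HT"."lowerCase" ("lowerCol1","UpCol2","ALLUPCOL3") values ('lc','uc','auc');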
    Remarks
    SQL Developer: version 1.2.1, build MAIN-32.13
    Oracle DBs: 9.2 & Express
    OS: Windows 2000 Professional
    If you need any more details/testing, let me know. I'd really appreciate a quick patch for this issue...
    Alternatively, do you know of any other simple way of copying a single database (it's called a schema in Oracle, right?) from one computer to another? Possibly something as simple as detaching->copying->reattaching the mdf (data) files in SQL Server... I thought that this "Export DDL&Data" function would do, but as you can see I couldn't use it.
    I just need a simple solution that works - one operation on the source to dump everything, get the resulting files to the other computer, and one operation to have it running there... I think such a scenario is very basic, yet I just can't achieve it, and I am simply not allowed to spend more time on it (read: our test project fails, my company rejects my "lobbying" and stays with MSSQL :/ )
    Thanks a lot & bye
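
    (A note for anyone landing on the schema-copy question above: the usual one-step-each-side answer is the command-line export/import utilities, assuming you can run them against both databases. A rough sketch, with user names, passwords, and file names as placeholders:

        rem on the source machine:
        exp system/password OWNER=dbo_ht FILE=dbo_ht.dmp LOG=exp.log
        rem copy dbo_ht.dmp to the target machine, then:
        imp system/password FROMUSER=dbo_ht TOUSER=dbo_ht FILE=dbo_ht.dmp LOG=imp.log

    The target user has to exist first; imp then recreates the tables and data under it.)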

    Thanks for your reply.
    ad. 1)
    You're right. I just wanted to give some very short feedback on my experiences with SQL Developer, so I didn't think starting new threads would be necessary, but as I was writing it became much bigger than I initially planned - sorry about that. I will create proper threads as soon as possible. Having an "Edit post" button on this forum would also be useful.
    ad. 2)
    Generally, you're right - in most cases it's true that "switching DBMS is a major commitment" and "you will produce terrible code" if you don't learn the new one.
    However, I think that you miss one part of the market here - the market that I think Express is also targeted at. I'd call it the "fire&forget databases" market; MySQL comes to mind as possibly the most common solution here. It's the rather small systems, possibly web-accessed, whose data-throughput requirements are rather modest; the point is to store data at all, and not necessarily in the fastest way, because given the amount of data involved, even on low-end hardware it will work well enough. What's important here is general ease of use - how easy it is to set up such a system, connect and access data, develop software using it, how much maintenance is needed, how easy that maintenance is, and how easy the most common development tasks are: creating a DB, moving a DB from test to production server, etc. There, "how easy" directly translates to "how much time we need to set it up", which translates to "how much the development will cost".
    Considering current technology, switching the DBMS in such systems is not necessarily a major commitment, and believe me, you will not produce terrible code. In many cases it's as simple as changing a switch in your ORM toolkit: hibernate.dialect = Hibernate.Dialect.OracleDialect vs MySQLDialect vs MsSql2005Dialect
    Therefore, in some parts of the market it's easy to switch DBMS, even on a project-by-project basis. The reason to switch will appear when another DBMS makes life easier => development faster. From that point of view, I can understand my colleagues giving me an embarrassing look and saying "come on, I won't read all these docs just to have the db copied to the test server". And it doesn't mean "they are not willing to learn anything new"; it's just that they feel such a basic task should have a self-explaining solution that doesn't require mastering any special knowledge. And if they get such simple solutions somewhere else, it costs them nothing to change the hibernate dialect.
    I think Oracle did a great job introducing Express to this "fire&forget" market. The installation is a snap, it just works out of the box, nothing serious to configure - the opposite of what I remember from installing and working on Oracle 9 a few years ago. In some places it's still "you need to start SQL*Plus and enter this script", but it's definitely less than before. I also find SQL Developer a great tool; it can do most of what we need to do with the DB, and it's much better and more pleasant to use than the Oracle 9 tools. Still, a few basic things require too much hassle, and I'd say taking your schema to another machine is one of them. So I think that, if you do it well, the "schema copy wizard" you mentioned might be very helpful. If I were to give any general advice for the Express line of DB/tools, I'd say "make things simple" - make it "a DB you can't see".
    That's, IMHO, the way to attract more Express users.

  • SQL Dev 2.1: Ability to save and load visual query models?

    At SQL Dev 2.1, do we now have the ability to save and load visual query models, as with TOAD? Or has that functionality been rolled into the for-pay Data Modeler extension? It seemed strange to hide the functionality deep within SQL Worksheet, then not allow query models to be saved or loaded. From what I recall, load/save was scheduled to be in 2.0.
    I've not yet downloaded and installed 2.1, but as I recall, it was promised to be a significant (e.g. new-feature-rich) release. Has it lived up to that promise? I hope also that it's not so buggy as to not merit an upgrade. At my workplace, folks have different versions of SQL Dev 1.X installed, each kept for the purpose of retaining functionality that got broken in later releases. I would love to be able to advise them to retire their 1.X installs in favor of 2.1.

    Dana, we are working on the query builder. We have not done anything with it in recent releases, as we knew it needed to be completely overhauled and a couple of bug fixes were not going to give us the query builder we needed. Sue will be able to tell you more as we get closer to releasing it as a feature.

    Thanks Barry. I know it's probably impossible to give firm dates, but my colleagues would love to know, ballpark, when the query builder might be ready. Is there a particular year/quarter it's targeted for?

    A lot of the bugs in 2.1 are probably my fault, as we decided to build a framework which allows tasks to be scheduled, as in Eclipse, and pushed to the background. Coupled with this, we tore apart the worksheet to let us fit tasks in, but also to give us the ability to implement some of the enhancement requests that you, our users, have been asking for. While it's not perfect, I believe it's getting much better from both a functionality and a performance standpoint. The worksheet will not change again, and there are no enhancements logged or scheduled which will do that. The issues that have come in as part of 2.1 are being addressed, and we are fixing the highest-priority ones as we speak. One of my priorities is to make these components as rock solid as possible.

    Thanks again. It really is a nice tool to work with, and I'm grateful to have it. I would choose stability for existing features over new ones, but I suppose there always has to be a balance in development. What was really scary in the past was getting bizarre / wrong answers in query results--that's a complete show stopper; e.g. clicking on a row/col intersection in a result set would show strange and false values. New features can wait, but core functionality must be solid. If one can't presume the integrity of query results, it's game over--using a version of SQL Developer that returns inconsistent result sets between runs could be Considered Harmful, or so it is to me.
    Anyway, thanks for responding, and I hope Oracle Corp gets you whatever assistance you need to improve the quality of existing functionality while adding new features, with one not being at the expense of the other. :-) Glad also to see Sue has authored a book on the tool. Hopefully that will drive customer demand and force increased allocation of development resources toward the product. I'd like to think having out-of-the-box tools that are a joy to use might even increase market share.
    Data Modeler I would love to use, but am completely priced out of it. Enterprise Architect from Sparx Systems is the best value there so far as I can tell.
    Dana

  • Exporting Application w/ DDL and data

    I can export an Application.
    I can export the DDL for that application.
    I cannot export the data for that application, unless I am not following the proper procedure.
    Note that I did use "with supporting objects" option.
    Anyone know why ?
    Thanks in advance,
    Anon

    Supporting Objects allow you to add DDL and DML to your application so that you can export it as one file and then import and install it in one step. To get your data, I would suggest using SQL Developer and extracting your data as INSERT statements. Those can then be loaded into your supporting objects (after your DDL).
    Here is a link to a 3.2 tutorial that explains more about Supporting Objects.
    -- Sharon

  • The export file from a calc script - naming and date/time stamp

    Here is a simple calc script to export data.
    2 questions:
    1. Is there an easy way to add a date/time stamp to the name of the data file so that it does not overwrite the older ones every time?
    2. I tried to specify the path so that it writes to another server, as "\\mfldappp011\E:\Exp_AW.txt", but it's not working. How should I specify the path?
    FIX (@RELATIVE("Yeartotal",0),"Actual","Working",&ActualYear)
        DATAEXPORT "File" "," "C:\Exp_AW.txt" "#MI";
    ENDFIX

    Probably easiest to call the MaxL script from a command-line script, then rename the exported file to include the time stamp and copy/move it to the target location.
    Cheers
    John
    http://john-goodwin.blogspot.com/
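
    A rough Windows batch sketch of that approach (the MaxL script name and share name are made up, and the %date%/%time% substring positions depend on regional settings). Note also that a UNC path needs a share name, e.g. \\mfldappp011\exports\..., not a drive letter like E::

        rem run the MaxL script that triggers the export (essmsh is the MaxL shell)
        essmsh export_aw.msh

        rem build a timestamp such as 20120907_1125
        set ts=%date:~-4%%date:~4,2%%date:~7,2%_%time:~0,2%%time:~3,2%

        rem rename and move the export to the remote share
        move C:\Exp_AW.txt \\mfldappp011\exports\Exp_AW_%ts%.txt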

  • Need info on DDL and Data propagation

    Greetings Guys,
    I have a requirement to move data and DDL from a lower environment to production. This would be done on some frequency, say weekly, to deploy the latest code for the application. I would like to know what techniques and tools are available in SQL Server. I am not sure I can set up replication for all the tables in the database, because in the case of restoring the database from a backup, I suspect fixing the replication itself would become a big job. Currently we use MERGE statements for moving data between the environments and the Redgate SQL Compare API for moving DDL. Let me know if there are any other ways to do this.
    Environment:
    SQL SERVER 2008 R2
    WINDOWS 2008 R2
    With regards,
    Gopinath.

    You can also create an SSDT database project and publish the changes using it. See:
    http://www.techrepublic.com/blog/data-center/auto-deploy-and-version-your-sql-server-database-with-ssdt/
    http://blogs.msdn.com/b/ssdt/archive/2013/08/12/optimizing-scripts-for-faster-incremental-deployment.aspx
    http://schottsql.blogspot.in/2012/11/ssdt-publishing-your-project.html
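
    If you end up scripting the weekly deployment, the SSDT publish step can also run from the command line via SqlPackage; roughly (server and database names here are placeholders):

        rem publish a compiled .dacpac to the target database
        SqlPackage.exe /Action:Publish /SourceFile:MyDb.dacpac /TargetServerName:PRODSERVER /TargetDatabaseName:MyDb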
    Visakh

  • Tools - Export DDL (and data) mishandles timestamps

    I used SQLDeveloper to try to do a quick export of some test data. Here is a sample of part of an insert statement created by the export data option:
    to_timestamp('2005-12-31 00:00:00.0','DD-MON-RR HH.MI.SS.FF AM')
    This clearly will not work, as the timestamp was retrieved in a format different from the format used in the to_timestamp() function. I tried to fix it, but I found that the utility switched from format to format. Some insert statements were consistent, some formatted the actual data differently, others used different format strings. It looked as if someone had gone to each developer and told them to write some code to create insert statements for their tables, and each one came back using a different format - some able to code valid SQL statements, others not. Unfortunately this was all done by one click. I cannot fathom how you created such a mess, because not only is it wrong (sometimes), it is inconsistent. I would almost guess someone created a hash table of available formats and picked from it randomly.
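    For what it's worth, a format mask that actually matches the literal shown above would be something like:
        to_timestamp('2005-12-31 00:00:00.0','YYYY-MM-DD HH24:MI:SS.FF')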

    There is actually no error here - we use dbms_metadata to see if there are constraints or ref_constraints on the table, and this is the way it tells us there are none. We should be capturing this message better rather than sending it to the user, and I will fix that in the next release.
    However, this error should not be affecting your export. If it is, or if you see missing information in the generated export file, please reply.

  • Global Script Protect and data uploading

    I need to allow some users to upload data to our database that includes HTML tags. When Global Script Protect is on, all these tags are made safe and the content loses its formatting. When I disable Global Script Protect, it is possible to load an iframe externally to simulate a cross-site scripting attack. How can I prevent the cross-site scripting but still allow users to upload HTML content to the database? I'm using Fusebox 3, if that matters.

    echowebs wrote:
    > Isn't there a more beneficial 'server compliant' way to parse all these things than having to parse them on every page call on my site? Sorry, I am venting b/c I have spent hours on this crazy thing this morning and it is driving me nuts :)
    > thanks, Ian

    Well, not with ColdFusion, since by the time ColdFusion gets the request it is too late for a server option. You could, of course, easily put such search logic in an Application.cfm|.cfc template that is automatically run on every request. But yes, ColdFusion-based tools will run on every request.
    If you want something 'server compliant' then you need to look at the settings and configuration options of your web server, i.e. IIS or Apache. That is the system that could do something more globally. I do not know what exactly, since I have never had to deal with that level.
    But the reality of HTTP-based systems is that every request is unqualifiedly untrusted, and if you must build a secure system you just have to work with that situation. Every request could include malicious code in the GET, POST, cookies, etc., and if you just process this data without screening it, then trouble can ensue.

  • Script Logic VS Data Transformation File

    Hi all,
       I'm new to SAP BPC. I have knowledge of SAP BW.
    I can see the conversion file, which we refer to in the data transformation file, and which we can use for mapping and converting external data into internal data.
    How is the data transformation file different from script logic? Do we refer to script logic in the data transformation file for each required dimension?
    Can any of you give me clarity on where script logic and the data transformation file fit in the BPC data management flow?
    I will really appreciate all your help!!!
    Thanks
    Ben.

    Nilanjan,
    I have another quick question...
    Suppose my BPC application has 5 dimensions. For 4 of those dimensions I'm getting data directly from SAP BW; assume that for 1 dimension I need to extract by doing a lookup against a different table, which also resides in BW.
    How do I populate data for DIM 5?
    I got your point that the transformation file is purely for field mapping. If I want to populate DIM5 from script logic, what do I need to map in the transformation file? I hope you got my point.
    My question is how to populate a dimension in BPC using a lookup approach.
    Thanks,
    Ben.

  • Training on Calc Scripts, Reporting Scripts, MaxL and Data Loading

    Hi All
    I am new to this forum. I am looking for someone who can train me on topics like Calc Scripts, Reporting Scripts, MaxL and Data Loading.
    I am willing to pay for your time. Please let me know.
    Thanks

    Hi Friend,
    As you seem to be new to Essbase, you should first learn what Essbase and OLAP are and the difference between dense and sparse, then use the "Essbase Tech Ref" for further reference.
    After that, go through https://blogs.oracle.com/HyperionPlanning/ and start exploring calc scripts, MaxL, etc.
    And all this for you free, free, free..........
    Thanks,
    Avneet

  • Oracle SQL template to create re-usable DDL/DML Scripts for Oracle database

    Hi,
    I have a requirement to put together an Oracle SQL template for creating re-usable DDL/DML scripts for Oracle databases.
    Only the Oracle DBA will be running the scripts, so permissions are not an issue.
    The workflow for any DDL is as follows:-
    1) New Table
    a. Check if the table exists from the system/admin views.
    b. If table exists then give message "Table Exists"
    c. If table does not exist then execute DDL code
    2) Add Column
    a. Check if Column exists for a given table from system/admin views
    b. If column exists in the specified table,
    b1. backup table.
    b2. alter table to make changes to the column
    b3. verify the data, or execute a DML script to convert the backed-up data to the new structure.
    c. If Column does not exist
    c1. backup table
    c2. alter table to add column
    c3. execute dml to populate column with default value.
    The DML scripts are for populating base tables with data required for business operations.
    3) Add new row
    a. check if row exists by comparing old values of each column with new values to be added for the new record.
    b. If exists, give message row exists
    c. If not exists, add new record.
    4) Update existing record (We have createtime columns in these tables so changes can be tracked)
    a. check if row exists using primary key.
    b. If exists,
    b1. deactivate the record using the "active" column of the table
    b2. Add new record with the changes required.
    c. If does not exist, add new record with the changes required.
    Could you please help with some ideas which can get this done accurately?
    I have tried several ways, but I am not able to put together something that fulfills all requirements.
    Thank you,

    First let me address your question. (This is the easy part.)
    1. The existence of tables can be found in DBA_TABLES. Query it and then use conditional logic and EXECUTE IMMEDIATE to process the DDL.
    2. The existence of table columns is found in DBA_TAB_COLUMNS. Query it and then conditionally execute your DDL. You can copy the "before picture" of the table using that same dba view, or even better, use DBMS_METADATA.
    As for your DML scripts, they should be restartable, reversible, and re-runnable. They should "fail gracefully" on error and be written in such a way that they can run twice in a row without creating duplicate changes.
    3. Adding appropriate constraints can prevent invalid duplicate rows. Also, you can usually add to the where clause so that the DML does only what it needs to do without even relying on the constraint (but the constraint is there as a safeguard). Look up the MERGE statement to learn how to do an UPSERT (update/insert), which will let you conditionally "deactivate" (update) or insert a record. Anything that you cannot do in SQL can be done with simple procedural code.
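    A minimal PL/SQL sketch of those two points (the owner, table, and column names are all made up):

        -- conditional DDL: create the table only if the data dictionary says it is missing
        DECLARE
          l_count NUMBER;
        BEGIN
          SELECT COUNT(*) INTO l_count
          FROM dba_tables
          WHERE owner = 'APP' AND table_name = 'MYTAB';
          IF l_count = 0 THEN
            EXECUTE IMMEDIATE 'CREATE TABLE app.mytab (id NUMBER PRIMARY KEY, name VARCHAR2(100))';
          ELSE
            DBMS_OUTPUT.PUT_LINE('Table Exists');
          END IF;
        END;
        /

        -- idempotent row change: MERGE updates the row if present, inserts it if not
        MERGE INTO app.mytab t
        USING (SELECT 3 AS id, 'Smith' AS name FROM dual) s
        ON (t.id = s.id)
        WHEN MATCHED THEN UPDATE SET t.name = s.name
        WHEN NOT MATCHED THEN INSERT (id, name) VALUES (s.id, s.name);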
    Now, to the heart of the matter...
    You think I did not understand your requirements?
    Please be respectful of people's comments. Many of us are professionals with decades of experience working with databases and Oracle technology. We volunteer our valuable time and knowledge here for free. It is extremely common for someone to post what they feel is an easy SQL or PL/SQL question without stating the real goal--the business objective. Experienced people will spot that the "wrong question" has been asked, and then cut to the chase.
    We have some good questions for you. Not questions we need answers to, but questions you need to ask yourself and your team. You need to reexamine this post and deduce what those questions are. But I'll give you some hints: Why do you need to do what you are asking? And will the construct you are asking for even solve the root cause of your problems?
    Then ponder the following quotations about asking the right question:
    Good questions outrank easy answers.
    — Paul Samuelson
    The only interesting answers are those which destroy the questions.
    — Susan Sontag
    The scientific mind does not so much provide the right answers as ask the right questions.
    — Claude Levi-Strauss
    You can tell whether a man is clever by his answers. You can tell whether a man is wise by his questions.
    — Naguib Mahfouz
    One hears only those questions for which one is able to find answers.
    — Friedrich Nietzsche
    Be patient towards all that is unresolved in your heart and try to love the questions themselves.
    — Rainer Maria Rilke
    What people think of as the moment of discovery is really the discovery of the question.
    — Jonas Salk
    Judge a man by his questions rather than his answers.
    — Voltaire
    The ability to ask the right question is more than half the battle of finding the answer.
    — Thomas J. Watson

  • Rerunning script (DML and DDL) without ora-00001 and ora-00955 errors

    What is the best way to write a DML and DDL script so that it can be run multiple times without these ORA errors:
    ORA-00955: name is already used by an existing object
    ORA-00001: unique constraint (JILL.SYS_C00160247) violated
    I have just joined a product development company using SQL Server as their primary database. They have just completed a port to Oracle.
    Their product release upgrades (given to clients) include SQL scripts with database changes (structure and data). They require that the client be able to rerun the scripts more than once with no errors. In SQL Server, they accomplish it this way.
    For DDL:
    if not exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[MyTab]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
    begin
        CREATE TABLE [dbo].[MyTab] (
            [ID] int IDENTITY(1,1),
            [InvID] uniqueidentifier not null
        )
    end
    For DML:
    IF NOT EXISTS (SELECT 1 FROM [dbo].[mytab] WHERE [Name] = 'Smith' and [ID] = 3)
    BEGIN
        INSERT INTO [dbo].[mytab] ([ID], [Name])
        VALUES (3, 'Smith')
    END
    I am tasked with duplicating this logic on the Oracle side. The only way I can think of so far is using PL/SQL and checking for existence before every insert and create statement. The other options I thought of cannot be used in this case:
    - "whenever sqlerror continue" - gives the same response for all errors. True errors should stop the script, so this is too risky.
    - "log errors into ... reject limit unlimited" on the insert - I thought this was my best solution until I found out that it doesn't support LOBs.
    Do you know of any more elegant (and more efficient) solution, other than PL/SQL cursors, to check for existence before running each insert/create?
    Any suggestions would be greatly appreciated.

    Selecting from USER_TABLES will tell you whether the table exists:
    select table_name from user_tables;
    There are also ALL_TABLES/DBA_TABLES for objects outside your own schema:
    http://download.oracle.com/docs/cd/B14117_01/server.101/b10755/statviews_1190.htm#i1592091
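
    Two other common Oracle idioms keep a script rerunnable without per-row cursors, sketched here with made-up names: attempt the DDL and re-raise everything except ORA-00955, and make the INSERT itself conditional so ORA-00001 never fires.

        -- DDL: ignore only "name is already used by an existing object"
        BEGIN
          EXECUTE IMMEDIATE 'CREATE TABLE mytab (id NUMBER PRIMARY KEY, name VARCHAR2(50))';
        EXCEPTION
          WHEN OTHERS THEN
            IF SQLCODE != -955 THEN
              RAISE;  -- a real error; stop the script
            END IF;
        END;
        /

        -- DML: WHERE NOT EXISTS mirrors the SQL Server IF NOT EXISTS pattern
        INSERT INTO mytab (id, name)
        SELECT 3, 'Smith' FROM dual
        WHERE NOT EXISTS (SELECT 1 FROM mytab WHERE id = 3);

    A MERGE statement gives the same rerun safety when the update case is also needed.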

  • Can SQL Dev compare the differences between 2 SQL scripts?

    I need to compare the structure of a table that has been changed by someone, but I don't know which table has been modified. I found that SQL Dev can export metadata into a SQL file and also has version control. I'm not sure whether we can compare these 2 SQL scripts under version control to detect the change in table structure.
    Or, if there is any other solution, please help.
    Regards,
    Sutthisak S.

    If you open the script in sqldev, you get a History tab next to the Worksheet. Selecting a revision will compare against that.
    To compare against another file, select File - Compare With - Other File.
    If the internal compare doesn't convince you, there are free alternatives out there, like WinMerge, or my personal favourite (but commercial) Beyond Compare.
    Have fun,
    K.

  • Help needed with Express and SQL Dev

    Recently I was asked to make a SQL Server application work with Oracle. I have never worked with Oracle products, and no one in my small shop has either. I downloaded Oracle 10g Express onto a virtual machine on my dev server and Oracle SQL Developer locally. I have no idea how to connect to the Express DB on the prod server using SQL Developer. I have tried the Basic and TNS connection types, and all the errors are very cryptic. Any help is appreciated. Do I need to install anything client-side to connect to the server?
    I figured it out. I had to use the Basic connection type and I DL'd the J2EE.
    Edited by: jt_stand on Sep 25, 2008 10:51 AM

    Had to use the Basic connection type and get the right combination of options. I can now connect to my remote Express server with my local SQL Developer program.
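
    (For anyone else stuck at the same point: with the Basic type, a default XE install usually answers to hostname = the server's name, port = 1521, SID = XE, and SQL Developer's built-in JDBC thin driver needs no extra client install. The equivalent thin URL, with a placeholder host name:

        jdbc:oracle:thin:@myhost:1521:XE
    )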

  • DDL and INSERT scripts of EMPLOYEES, DEPARTMENTS, LOCATIONS, etc.

    Hi all,
    Where can I get the DDL and INSERT scripts for the tables EMPLOYEES, DEPARTMENTS, LOCATIONS, JOB_HISTORY, JOBS, and REGIONS, and for the EMP and DEPT tables?
    I could find only descriptions of the tables in http://download.oracle.com/docs/cd/B28359_01/server.111/b28328/scripts.htm#insertedID3, not the complete DDL and INSERT scripts. Please help.

    Hi,
    The scripts (and other files) are on the companion CD. You can download it from www.oracle.com.
    Bartek
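
    If you have a full database installation handy, the same scripts also ship under ORACLE_HOME; a quick pointer (paths as in typical installs - the demo schemas are only present if they were installed):

        -- from SQL*Plus, as a suitably privileged user:
        @?/rdbms/admin/utlsampl.sql                    -- creates SCOTT with EMP, DEPT, BONUS, SALGRADE
        @?/demo/schema/human_resources/hr_main.sql     -- creates the HR schema (EMPLOYEES, DEPARTMENTS, ...)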
