Metadata of tables in SQL file

I want to get a SQL file which will have the script to create the table structures in the database.
How can I achieve this with EXPDP/exp, or any other option?
Thanks,

The best way to do so would be the DBMS_METADATA.GET_DDL function, which can extract the DDL of the objects for you. If you want to use Data Pump, that's also possible.
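With DBMS_METADATA, a quick sketch (assuming the SCOTT.EMP table used in the demo below, from a SQL*Plus session):
SQL> set long 20000 pagesize 0
SQL> spool create_emp.sql
SQL> select dbms_metadata.get_ddl('TABLE', 'EMP', 'SCOTT') from dual;
SQL> spool off
And here is the Data Pump route: impdp's SQLFILE parameter writes the DDL from a dump file into a script without importing anything: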
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
SQL> create directory dir1 as '/u01/app/oracle/';
Directory created.
SQL> grant all on directory dir1 to public;
Grant succeeded.
SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
[oracle@edhdr2p0-orcl ~]$ expdp aman/aman directory=dir1 dumpfile=expdp.dmp tables=scott.emp
Export: Release 11.2.0.1.0 - Production on Wed Aug 11 14:34:45 2010
Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
Starting "AMAN"."SYS_EXPORT_TABLE_01":  aman/******** directory=dir1 dumpfile=expdp.dmp tables=scott.emp
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "SCOTT"."EMP"                               8.570 KB      14 rows
Master table "AMAN"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
Dump file set for AMAN.SYS_EXPORT_TABLE_01 is:
  /u01/app/oracle/expdp.dmp
Job "AMAN"."SYS_EXPORT_TABLE_01" successfully completed at 14:35:12
[oracle@edhdr2p0-orcl ~]$ impdp aman/aman directory=dir1 dumpfile=expdp.dmp tables=scott.emp sqlfile=mysql.sql
Import: Release 11.2.0.1.0 - Production on Wed Aug 11 14:36:03 2010
Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
Master table "AMAN"."SYS_SQL_FILE_TABLE_01" successfully loaded/unloaded
Starting "AMAN"."SYS_SQL_FILE_TABLE_01":  aman/******** directory=dir1 dumpfile=expdp.dmp tables=scott.emp sqlfile=mysql.sql
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "AMAN"."SYS_SQL_FILE_TABLE_01" successfully completed at 14:36:07
[oracle@edhdr2p0-orcl ~]$ more /u01/app/oracle/mysql.sql
-- CONNECT AMAN
ALTER SESSION SET EVENTS '10150 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '10904 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '25475 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '10407 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '10851 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '22830 TRACE NAME CONTEXT FOREVER, LEVEL 192 ';
-- new object type path: TABLE_EXPORT/TABLE/TABLE
CREATE TABLE "SCOTT"."EMP"
   (    "EMPNO" NUMBER(4,0),
        "ENAME" VARCHAR2(10 BYTE),
        "JOB" VARCHAR2(9 BYTE),
        "MGR" NUMBER(4,0),
        "HIREDATE" DATE,
        "SAL" NUMBER(7,2),
        "COMM" NUMBER(7,2),
        "DEPTNO" NUMBER(2,0)
   ) SEGMENT CREATION IMMEDIATE
  PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
  STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
  PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
  TABLESPACE "USERS" ;
HTH
Aman....

Similar Messages

  • Data Pump import to a sql file error: ORA-31655 no data or metadata objects

    Hello,
    I'm using Data Pump to export/import data; one requirement is to import the data to a SQL file. The OS is Windows.
    I made the following export:
    expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY tables=user.tablename
    and it works; I can see the file TABLESDUMP.DMP in the directory path.
    Then when I tried to import it to a SQL file:
    impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql
    the log shows:
    ORA-31655 no data or metadata objects selected for job
    and the SQL file is created empty in the directory path.
    I'm not a DBA; I'm a Java developer. Can you help me?
    Thanks

    Hi, I added to the command line:
    expdp system/system directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY schemas=ko1 tables=KO1QT01 logfile=capture.log
    The log on the console screen was in Spanish (translated below); no log file was created in the directory path.
    Export: Release 10.2.0.1.0 - Production on Tuesday, 26 January, 2010 12:59:14
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    UDE-00010: multiple job modes requested, schema and tables.
    This is why I used tables=user.tablename instead. Is this right?
    Thanks
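    For what it's worth, the likely cause of the original ORA-31655: the dump was created with content=DATA_ONLY, so it contains no metadata, and a SQLFILE import extracts only DDL, leaving nothing to select. A sketch of the fix, keeping the same placeholder names:
    expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp content=METADATA_ONLY tables=user.tablename
    impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql
    (content=ALL, the default, also works; the SQLFILE step simply ignores the row data.) And yes, since SCHEMAS and TABLES are mutually exclusive job modes, tables=user.tablename is the right way to export a single table.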

  • Load XML File into temporary tables using sql loader

    Hi All,
    I have an XML file as below. I need to insert the contents into a temporary staging table using SQL*Loader. Please advise how I should do that.
    For example, Portfolios should go into a separate table, and all the tags inside it should be populated in the columns of that table.
    Family should go into a separate table, and all the tags inside it should be populated in the columns of that table.
    Similarly Offer, Products, etc.
    <ABSProductCatalog xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <ProductSalesHierachy>
        <Portfolios>
          <Portfolio productCode="P1">
            <Attribute name="CatalogProductName" value="Access" />
            <Attribute name="Status" value="Active" />
          </Portfolio>
          <Portfolio productCode="P2">
            <Attribute name="CatalogProductName" value="Data" />
            <Attribute name="Status" value="Active" />
          </Portfolio>
          <Portfolio productCode="P3">
            <Attribute name="CatalogProductName" value="Voice" />
            <Attribute name="Status" value="Active" />
          </Portfolio>
          <Portfolio productCode="P4">
            <Attribute name="CatalogProductName" value="Wireless" />
            <Attribute name="Status" value="Active" />
          </Portfolio>
        </Portfolios>
        <Families>
          <Family productCode="F1">
            <Attribute name="CatalogProductName" value="Internet Access Services" />
            <Attribute name="Status" value="Active" />
            <ParentHierarchy>
              <Item productCode="P1" modelType="Portfolio" />
            </ParentHierarchy>
          </Family>
          <Family productCode="F2">
            <Attribute name="CatalogProductName" value="Local Access Services" />
            <Attribute name="Status" value="Active" />
            <ParentHierarchy>
              <Item productCode="P2" modelType="Portfolio" />
            </ParentHierarchy>
          </Family>
        </Families>
        <SubFamilies>
          <SubFamily productCode="SF1">
            <Attribute name="CatalogProductName" value="Business Internet service" />
            <Attribute name="Status" value="Active" />
            <ParentHierarchy>
              <Item productCode="F1" modelType="Family" />
            </ParentHierarchy>
          </SubFamily>
        </SubFamilies>
        <ProductRefs>
          <ProductRef productCode="WSP1" modelType="Wireline Sales Product">
            <ActiveFlag>Y</ActiveFlag>
            <ProductHierarchy>
              <SalesHierarchy family="F1" subFamily="SF1" portfolio="P1" primary="Y" />
              <SalesHierarchy family="F2" portfolio="P2" primary="N" />
              <FinancialHierarchy quotaBucket="Voice" strategicProdCategory="Local Voice" />
            </ProductHierarchy>
          </ProductRef>
          <ProductRef productCode="MSP2" modelType="Handset">
            <ActiveFlag>Y</ActiveFlag>
            <ProductHierarchy>
              <SalesHierarchy portfolio="P4" primary="Y" />
            </ProductHierarchy>
          </ProductRef>
        </ProductRefs>
      </ProductSalesHierachy>
      <Offers>
        <Offer productCode="ABN">
          <OfferName>ABN</OfferName>
          <OfferDescription>ABN Description</OfferDescription>
          <Segments>
            <Segment>SCG</Segment>
            <Segment>PCG</Segment>
          </Segments>
          <OfferUpdateDate>2009-11-20</OfferUpdateDate>
          <ActiveFlag>Y</ActiveFlag>
        </Offer>
        <Offer productCode="OneNet">
          <OfferName>OneNet</OfferName>
          <OfferDescription>OneNet Description</OfferDescription>
          <Segments>
            <Segment>SCG</Segment>
            <Segment>PCG</Segment>
            <Segment>PCG2</Segment>
          </Segments>
          <OfferUpdateDate>2009-11-20</OfferUpdateDate>
          <ActiveFlag>Y</ActiveFlag>
        </Offer>
      </Offers>
      <Products>
        <Product productCode="WSP1" modelType="Wireline Sales Product">
          <ProductName>AT&T High Speed Internet</ProductName>
          <ProductDescription>High Speed Internet</ProductDescription>
          <LegacyCoProdIndicator>SBC</LegacyCoProdIndicator>
          <RevenueCBLCode>1234B</RevenueCBLCode>
          <VolumeCBLCode>4567A</VolumeCBLCode>
          <SAARTServiceIDCode>S1234</SAARTServiceIDCode>
          <MarginPercentRequired>Y</MarginPercentRequired>
          <PercentIntl>%234</PercentIntl>
          <UOM>Each</UOM>
          <PriceType>OneTime</PriceType>
          <ProductStatus>Active</ProductStatus>
          <Compensable>Y</Compensable>
          <Jurisdiction>Everywhere</Jurisdiction>
          <ActiveFlag>Y</ActiveFlag>
          <Availabilities>
            <Availability>SE</Availability>
            <Availability>E</Availability>
          </Availabilities>
          <Segments>
            <Segment>SCG</Segment>
            <Segment>PCG</Segment>
          </Segments>
          <VDIndicator>Voice</VDIndicator>
          <PSOCCode>PSOC 1</PSOCCode>
          <USBilled>Y</USBilled>
          <MOWBilled>N</MOWBilled>
          <ProductStartDate>2009-11-20</ProductStartDate>
          <ProductUpdateDate>2009-11-20</ProductUpdateDate>
          <ProductEndDate>2010-11-20</ProductEndDate>
          <AliasNames>
            <AliasName>AT&T HSI</AliasName>
            <AliasName>AT&T Fast Internet</AliasName>
          </AliasNames>
          <OfferTypes>
            <OfferType productCode="ABN" endDate="2009-11-20" />
            <OfferType productCode="OneNet" />
          </OfferTypes>
          <DynamicAttributes>
            <DynamicAttribute dataType="String" defaultValue="2.5 Mbps" name="Speed">
              <AttrValue>1.5 Mbps</AttrValue>
              <AttrValue>2.5 Mbps</AttrValue>
              <AttrValue>3.5 Mbps</AttrValue>
            </DynamicAttribute>
            <DynamicAttribute dataType="String" name="TransportType">
              <AttrValue>T1</AttrValue>
            </DynamicAttribute>
          </DynamicAttributes>
        </Product>
        <Product productCode="MSP2" modelType="Handset">
          <ProductName>Blackberry Bold</ProductName>
          <ProductDescription>Blackberry Bold Phone</ProductDescription>
          <LegacyCoProdIndicator />
          <RevenueCBLCode />
          <VolumeCBLCode />
          <SAARTServiceIDCode />
          <MarginPercentRequired />
          <PercentIntl />
          <UOM>Each</UOM>
          <PriceType />
          <ProductStatus>Active</ProductStatus>
          <Compensable />
          <Jurisdiction />
          <ActiveFlag>Y</ActiveFlag>
          <Availabilities>
            <Availability />
          </Availabilities>
          <Segments>
            <Segment>SCG</Segment>
            <Segment>PCG</Segment>
          </Segments>
          <VDIndicator>Voice</VDIndicator>
          <PSOCCode />
          <USBilled />
          <MOWBilled />
          <ProductStartDate>2009-11-20</ProductStartDate>
          <ProductUpdateDate>2009-11-20</ProductUpdateDate>
          <AliasNames>
            <AliasName />
          </AliasNames>
          <OfferTypes>
            <OfferType productCode="ABN" />
          </OfferTypes>
          <DynamicAttributes>
            <DynamicAttribute dataType="String" name="StlmntContractType">
              <AttrValue />
            </DynamicAttribute>
            <DynamicAttribute dataType="String" name="BMG 2 year price">
              <AttrValue>20</AttrValue>
            </DynamicAttribute>
            <DynamicAttribute dataType="String" name="MSRP">
              <AttrValue>40</AttrValue>
            </DynamicAttribute>
            <DynamicAttribute dataType="String" name="BMGAvailableType">
              <AttrValue />
            </DynamicAttribute>
            <DynamicAttribute dataType="String" name="ProductId">
              <AttrValue>123456</AttrValue>
            </DynamicAttribute>
            <DynamicAttribute dataType="String" name="modelSource">
              <AttrValue>product</AttrValue>
            </DynamicAttribute>
          </DynamicAttributes>
        </Product>
      </Products>
      <CatalogChanged>Y</CatalogChanged>
    </ABSProductCatalog>

    Two options come to mind; others exist.
    #1 - {thread:id=474031}, which is basically storing the XML in an object-relational structure for parsing.
    #2 - Dump the XML into either an XMLType-based table or column and use SQL (with XMLTable) to create a view that parses the data. This would be the same as the view shown in the above post.
    Don't use SQL*Loader to parse the XML. I was trying to find a post from mdrake about that but couldn't. In short, SQL*Loader was not built as an XML parser, so don't try to use it that way.
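    For illustration, a minimal sketch of option #2, assuming the document is loaded into a hypothetical XMLType table xml_stage(doc XMLTYPE); the Portfolio rows can then be shredded with XMLTable:
    SELECT p.product_code, p.name, p.status
    FROM   xml_stage x,
           XMLTABLE('/ABSProductCatalog/ProductSalesHierachy/Portfolios/Portfolio'
                    PASSING x.doc
                    COLUMNS
                      product_code VARCHAR2(10) PATH '@productCode',
                      name         VARCHAR2(50) PATH 'Attribute[@name="CatalogProductName"]/@value',
                      status       VARCHAR2(20) PATH 'Attribute[@name="Status"]/@value') p;
    Each repeating element (Family, Offer, Product, and so on) would get its own XMLTable query or view feeding its staging table.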

  • How to extract data from multiple flat files to load into corresponding tables in SQL Server 2008 R2?

    Hi,
    I have to implement the following scenario in SSIS but don't know how, since I have never worked with SSIS before. Please help me.
    I have 20 different text files in a single folder and 20 corresponding tables in a SQL Server 2008 R2 database. I need to extract the data from each text file and load it into the corresponding table. Please guide me on how many ways I can do this and which is the best way to implement this job. I also have to automate it. A few files are in the same format (with the same column names and datatypes); the others are not.
    1. Do I need to create 20 different projects?
    2. Or can I implement this in only one project by having 20 packages?
    3. Or can I do this in one project with only one package?
    Thanks in advance.

    As I said, I don't know how to use the Object data type; I just gave it a shot as below. I know the following code has errors; can you please correct it for me?
    ' Requires "Imports System.IO" at the top of the script class.
    Public Sub Main()
        ' Scan each file for the line starting with "DATE" and capture its value.
        Dim s1 As StreamReader
        Dim date1(2) As String      ' one slot per file; was a plain Object, but it is indexed below
        Dim rline As String
        Dim Filelist(1) As String
        Dim FileName As String
        Dim i As Integer
        i = 1
        Filelist(0) = "XYZ"
        Filelist(1) = "123"
        For Each FileName In Filelist
            s1 = File.OpenText(FileName)
            rline = s1.ReadLine
            While Not rline Is Nothing
                If Left(rline, 4) = "DATE" Then
                    date1(i) = Mid(rline, 7, 8)
                    i = i + 1
                    Exit While
                End If
                rline = s1.ReadLine
            End While
            s1.Close()
        Next
        Dts.Variables("date").Value = date1(1)
        Dts.Variables("date1").Value = date1(2)
        Dts.TaskResult = ScriptResults.Success
    End Sub

  • How to transfer the tables from one file group to another file group in SQL 2008?

    Hello all,
    I have a few issues regarding the transfer of tables from one filegroup to another in SQL 2008, and also: how can we back up and restore a particular database at the filegroup level?
    Let's say I have tables stored in different filegroups, such as:
    Tables              File group
    Dimension tables    PRIMARY
    Fact tables         FG1, FG2, ...
    zzz_tables          DEFAULT_FG
    dim.table1          DEFAULT_FG
    dim.table2          DEFAULT_FG
    Here all I want is to transfer dim.table1 and dim.table2 from DEFAULT_FG to the PRIMARY filegroup. Is there a simple method to transfer them from one FG to another? I have tried somewhat but couldn't find the exact way, so if someone has a better idea please share your knowledge; that would be really appreciated.
    Secondly, after moving dim.table1 and dim.table2 from DEFAULT_FG to PRIMARY, I want to back up and restore the database containing only PRIMARY and FG1, FG2..., not DEFAULT_FG. Is that possible or not?
    Hope to hear from someone who knows a better approach for this kind of task. Your help will be much appreciated.
    Regards,
    Anil Maharjan

    Well, my full day of research on this topic paid off: I finally got the solution, and it makes us feel really happy when all our research and hard work doesn't go to waste.
    I got what I was looking for, and confirmed that I am able to transfer the tables from DEFAULT_FG to another FG even for tables that have no clustered index.
    With the help of the link below I finally got my solution; Roberto's stored procedure simply works for this.
    Really, thanks to him for his great post, and thanks to all for your responses and your valuable time.
    http://gallery.technet.microsoft.com/scriptcenter/c1da9334-2885-468c-a374-775da60f256f
    Regards,
    Anil Maharjan
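    For reference, a hedged T-SQL sketch of the technique such scripts are built on: a table moves to another filegroup when its clustered index is rebuilt there, and a heap can be moved by creating a clustered index on the target filegroup (dropping it afterwards if a heap is wanted). The names below (dbo.table1, its id column, MyDb) are hypothetical:
    CREATE CLUSTERED INDEX cix_table1 ON dbo.table1 (id)
        WITH (DROP_EXISTING = ON)  -- for a heap, use a plain CREATE instead
        ON [PRIMARY];              -- the target filegroup
    For the second question, a filegroup-level backup that skips DEFAULT_FG would look like this:
    BACKUP DATABASE MyDb
        FILEGROUP = 'PRIMARY', FILEGROUP = 'FG1', FILEGROUP = 'FG2'
        TO DISK = 'C:\backup\MyDb_fg.bak';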

  • In PL-SQL archive data from a table to a file

    I am currently developing a VB app where I need to archive data from a table to a file. I was hoping to do this with a stored procedure. I will also need to be able to retrieve the data from the file for future use if necessary. What file types are available? Thanks in advance for any suggestions.

    What about exporting in an Oracle binary format? The export files cannot be modified. Is there a way to use the export and import utilities from PL/SQL?
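    In releases with Data Pump (10g onward), this can be driven from PL/SQL through the DBMS_DATAPUMP API. A minimal export sketch, assuming a directory object DIR1 and a hypothetical table MYTABLE in the current schema:
    DECLARE
      h NUMBER;
    BEGIN
      h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'TABLE');
      DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'archive.dmp', directory => 'DIR1');
      DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'NAME_EXPR', value => 'IN (''MYTABLE'')');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.DETACH(h);  -- the job finishes in the background
    END;
    /
    Retrieval works the same way with operation => 'IMPORT'. The dump file is an Oracle binary format and is not meant to be edited by hand.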

  • Help with creating a sql file that will capture any database table changes.

    We are in the process of creating DROP/CREATE table scripts and using exp/imp to load data into the tables (the data is in flat files).
    Our client is a bit curious to work with. They make alterations to their database (change the layout, change datatypes, drop tables) without our knowledge. This has created a lot of issues for us.
    Is there a way we can create a SQL script which captures any table changes on the database, so that when the client tries to execute the imp batch file, the SQL file first checks whether any changes were made? If changes were made, it should stop execution and give an error message.
    Any help/suggestions would be highly appreciated.
    Thanks,

    Just to clarify...
    1. DDL commands are like CREATE, DROP, ALTER. (These are different than DML commands - INSERT, UPDATE, DELETE).
    2. The DDL trigger is created at the database level, not on each table. You only need one DDL trigger.
    3. You can choose the DDL commands for which you want the trigger to fire (probably, you'll want CREATE, DROP, ALTER, at a minimum).
    4. The DDL trigger only fires when one of these DDL commands is run.
    Whether you have 50 tables or 50,000 tables is not significant to performance in this context.
    What's significant is how often you'll be executing the DDL commands on which the trigger is set to fire, and whether those commands execute in acceptable time with the trigger in place.
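    A minimal sketch of such a trigger, assuming a hypothetical audit table ddl_audit (the ora_* event attribute functions are supplied by Oracle):
    CREATE TABLE ddl_audit (
      event_time  DATE,
      username    VARCHAR2(30),
      ddl_event   VARCHAR2(30),
      object_type VARCHAR2(30),
      object_name VARCHAR2(128)
    );
    CREATE OR REPLACE TRIGGER trg_capture_ddl
    AFTER CREATE OR ALTER OR DROP ON DATABASE
    BEGIN
      INSERT INTO ddl_audit
      VALUES (SYSDATE, ora_login_user, ora_sysevent, ora_dict_obj_type, ora_dict_obj_name);
    END;
    /
    The imp batch file could then query ddl_audit (or compare it against a saved baseline) before running, and abort with an error if any unexpected change has been recorded.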

  • How do I create a new table from a *.sql file in JSP

    <html>
    <%@ page language="java" import="java.sql.*" %>
    <%@ page errorPage="ExceptionHandler.jsp" %>
    <%
    Class.forName("com.mysql.jdbc.Driver").newInstance();
    String dbUrl = "jdbc:mysql://localhost/test?user=root";
    Connection conn = DriverManager.getConnection(dbUrl);
    Statement stmt = conn.createStatement();
    stmt.executeUpdate("\\. addresses.sql");
    %>
    How would I create a table from the addresses.sql file?

    stmt.executeUpdate("\\. addresses.sql");
    This, of course, will not work. You might even execute it the same way you do from the command line with some Runtime.exec() jugglery, but I would suggest you work with my first suggestion unless someone else comes up with something better.

  • Insert data file name into table from sql loader

    Hi All,
    I have a requirement to insert the data file name dynamically into a table using SQL*Loader.
    Example:
    sqlldr userid=username/password@host_string control=test_ctl.ctl data=test_data.dat
    test_ctl.ctl
    LOAD DATA
    FIELDS TERMINATED BY ','
    INTO TABLE test
    (empid number,
    ename varchar2(20),
    file_name varchar2(20) ---------- This should be the data file name which can be dynamic (coming from parameter)
    test_data.dat
    1,test
    2,hello
    3,world
    4,end
    Please help..
    Thanks in advance.
    Regards
    Anuj

    you'll probably have to write your control file on the fly, using a .bat or .sh file
    rem ===== file : test.bat ========
    rem
    rem ============== in pseudo speak =============
    rem
    rem
    echo LOAD DATA > test.ctl
    echo FIELDS TERMINATED BY ',' >> test.ctl
    echo INTO TABLE test >> test.ctl
    echo (empid number, >> test.ctl
    echo ename varchar2(20), >> test.ctl
    echo file_name constant '%1' >> test.ctl
    echo ) >> test.ctl
    rem
    rem
    rem
    sqlldr userid=username/password@host_string control=test.ctl data=test_data.dat
    rem =============== end of file test.bat =======================
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_field_list.htm#i1008664

  • Compare SQL FileTables for file content management vs Documentum or Alfresco

    Hello - Documentum and Alfresco are two examples of document content management (file repository, workflows, notifications, etc). I am interested in the file content management side, where the system provides a file repository and some extra features like check in / check out and versioning. Has anyone tried to replace a file repository with SQL FileTables?

    You can use SQL FileTables for storing files as well as doing modifications, etc. To get versioning, you can enable change management or implement audit triggers on them.
    Here are some helpful links
    http://visakhm.blogspot.in/2012/07/working-with-filetables-in-sql-2012.html
    http://visakhm.blogspot.in/2012/07/triggers-on-filetables-in-sql-2012.html
    http://visakhm.blogspot.in/2013/09/implementing-transaction-over-file.html
    Check in / check out will not be available by default; you may need to implement it using some flags. My question is: why can't you go for version control software like TFS, then? Why reinvent the wheel?
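    For completeness, a hedged sketch of creating a FileTable (SQL Server 2012 onward; assumes FILESTREAM is already enabled at the instance and database level, and the names are hypothetical):
    CREATE TABLE dbo.DocStore AS FILETABLE
    WITH (
        FILETABLE_DIRECTORY = 'DocStore',
        FILETABLE_COLLATE_FILENAME = database_default
    );
    Files dropped into the exposed Windows share then appear as rows in dbo.DocStore, and check-in/check-out flags would live in a companion table of your own keyed on the FileTable's stream_id column.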
    Visakh

  • How to convert from SQL Server table to Flat file (txt file)

    I need to ask how to convert from a SQL Server table to a flat file (txt file).

    Hi
    1. Import/Export wizard
    2. Bcp utility
    3. SSIS 
    1. Import/Export Wizard
    The first and most manual technique is the import wizard. This is great for ad-hoc, just-get-it-in tasks.
    In SSMS right click the database you want to import into.  Scroll to Tasks and select Import Data…
    For the data source we want our zips.txt file. Browse for it and select it. You should notice the wizard tries to fill in the blanks for you. One key thing with this file I picked is that there are quote (") qualifiers, so we need to make sure we add " into the text qualifier field. The wizard will not do this for you.
    Go through the remaining pages to view everything.  No further changes should be needed though
    Hit next after checking the pages out and select your destination.  This in our case will be DBA.dbo.zips.
    Following the destination step, go into the edit mappings section to ensure we look good on the types and counts.
    Hit next and then finish.  Once completed you will see the count of rows transferred and the success or failure rate
    Import wizard completed and you have the data!
    bcp utility
    Method two is bcp with a format file http://msdn.microsoft.com/en-us/library/ms162802.aspx
    This is probably going to win for speed on most occasions but is limited to the formatting of the file being imported.  For this file it actually works well with a small format file to show the contents and mappings to SQL Server.
    To create a format file, all we really need is the type and the count of columns for the most basic files. In our case the qualifier makes it a bit difficult, but there is a trick to ignoring it: throw a field into the format file that references the qualifier but is ignored in the import process.
    Given that, our format file in this case would appear like this:
    9.0
    9
    1 SQLCHAR 0 0 "\"" 0 dummy1 ""
    2 SQLCHAR 0 50 "\",\"" 1 Field1 ""
    3 SQLCHAR 0 50 "\",\"" 2 Field2 ""
    4 SQLCHAR 0 50 "\",\"" 3 Field3 ""
    5 SQLCHAR 0 50 "\"," 4 Field4 ""
    6 SQLCHAR 0 50 "," 5 Field5 ""
    7 SQLCHAR 0 50 "," 6 Field6 ""
    8 SQLCHAR 0 50 "," 7 Field7 ""
    9 SQLCHAR 0 50 "\r\n" 8 Field8 ""
    The bcp call would be as follows:
    C:\Program Files\Microsoft SQL Server\90\Tools\Binn>bcp DBA..zips in "C:\zips.txt" -f "c:\zip_format_file.txt" -S LKFW0133 -T
    Given a successful run you should see this in command prompt after executing the statement
    Starting copy...
    1000 rows sent to SQL Server. Total sent: 1000
    1000 rows sent to SQL Server. Total sent: 2000
    1000 rows sent to SQL Server. Total sent: 3000
    1000 rows sent to SQL Server. Total sent: 4000
    1000 rows sent to SQL Server. Total sent: 5000
    1000 rows sent to SQL Server. Total sent: 6000
    1000 rows sent to SQL Server. Total sent: 7000
    1000 rows sent to SQL Server. Total sent: 8000
    1000 rows sent to SQL Server. Total sent: 9000
    1000 rows sent to SQL Server. Total sent: 10000
    1000 rows sent to SQL Server. Total sent: 11000
    1000 rows sent to SQL Server. Total sent: 12000
    1000 rows sent to SQL Server. Total sent: 13000
    1000 rows sent to SQL Server. Total sent: 14000
    1000 rows sent to SQL Server. Total sent: 15000
    1000 rows sent to SQL Server. Total sent: 16000
    1000 rows sent to SQL Server. Total sent: 17000
    1000 rows sent to SQL Server. Total sent: 18000
    1000 rows sent to SQL Server. Total sent: 19000
    1000 rows sent to SQL Server. Total sent: 20000
    1000 rows sent to SQL Server. Total sent: 21000
    1000 rows sent to SQL Server. Total sent: 22000
    1000 rows sent to SQL Server. Total sent: 23000
    1000 rows sent to SQL Server. Total sent: 24000
    1000 rows sent to SQL Server. Total sent: 25000
    1000 rows sent to SQL Server. Total sent: 26000
    1000 rows sent to SQL Server. Total sent: 27000
    1000 rows sent to SQL Server. Total sent: 28000
    1000 rows sent to SQL Server. Total sent: 29000
    bcp import completed!
    BULK INSERT
    Next, we have BULK INSERT given the same format file from bcp
    CREATE TABLE zips (
    Col1 nvarchar(50),
    Col2 nvarchar(50),
    Col3 nvarchar(50),
    Col4 nvarchar(50),
    Col5 nvarchar(50),
    Col6 nvarchar(50),
    Col7 nvarchar(50),
    Col8 nvarchar(50)
    )
    GO
    INSERT INTO zips
    SELECT *
    FROM OPENROWSET(BULK 'C:\Documents and Settings\tkrueger\My Documents\blog\cenzus\zipcodes\zips.txt',
    FORMATFILE='C:\Documents and Settings\tkrueger\My Documents\blog\zip_format_file.txt'
    ) as t1 ;
    GO
    That was simple enough given the work on the format file that we already did.  Bulk insert isn’t as fast as bcp but gives you some freedom from within TSQL and SSMS to add functionality to the import.
    SSIS
    Next is my favorite playground in SSIS
    We can do many methods in SSIS to get data from point A, to point B.  I’ll show you data flow task and the SSIS version of BULK INSERT
    First create a new integrated services project.
    Create a new flat file connection by right clicking the connection managers area.  This will be used in both methods
    Bulk insert
    You can use format file here as well which is beneficial to moving methods around.  This essentially is calling the same processes with format file usage.  Drag over a bulk insert task and double click it to go into the editor.
    Fill in the information starting with connection.  This will populate much as the wizard did.
    Example of format file usage
    Or specify your own details
    Execute this and again, we have some data
    Data Flow method
    Bring over a data flow task and double click it to go into the data flow tab.
    Bring over a flat file source and SQL Server destination.  Edit the flat file source to use the connection manager “The file” we already created.  Connect the two once they are there
    Double click the SQL Server Destination task to open the editor.  Enter in the connection manager information and select the table to import into.
    Go into the mappings and connect the dots, per se.
    A typical type-conversion issue is Unicode to non-Unicode.
    We fix this with a data conversion or explicit conversion in the editor. Data conversion tasks are usually the route I take. Drag over a data conversion task and place it between the connection from the flat file source to the SQL Server destination.
    New look in the mappings
    And after execution…
    SqlBulkCopy Method
    Since we're in the SSIS package, we can use that awesome "script task" to show SqlBulkCopy. Not only fast but also handy for those really "unique" file formats we receive so often.
    Bring over a script task into the control flow
    Double click the task and go to the script page.  Click the Design script to open up the code behind
    Ahsan Kabir

  • SQL table to a file

    Ok, just need some ideas really. Trying to figure out a good way to get all the information from a SQL table into a file, organised properly, so that I can read it all back into arrays or back into another table.
    Right now I have no clue how to go about it. FileInputStream seems to be my only thought, but I wouldn't know how to distinguish between each row or even each column so I can read it back into an array.
    Any ideas would be helpful. Not looking for code or anything, although I wouldn't mind some.

    > CSV format? Sorry, totally ignorant of many things!
    Google will help.
    > But I get your idea of saving each row to a file and separating each column with a comma. Then I'd suppose I'd have to read that file into 1 string and split it twice, so I get each column into its own array.
    Or read the file line by line and split each line. If you have a lot of data you can't read the entire file into memory.
    > Can I save each row to the same file?
    Sure.
    > Like keeping the file open and looping through the table (rs.next()), and then saving each row to the file on each iteration? I know I should look at the API, but that just confuses me more.
    Then work through Sun's tutorials. They're very good.

  • Invalid data format on EXPORTING table to SQL-FLAT FILE (Insert type)

    Hi there!
    Writing from Slovenia, Europe. Working with Oracle9i (Standard Edition) on Windows XP with SQL Developer 1.0.0.015.
    First: SQL Developer is a good tool, with some minor errors.
    1.) Declare and Insert data EXAMPLE
    drop table tst_date;
    create table tst_date (fld_date date);
    insert into tst_date values (sysdate);
    insert into tst_date values (sysdate);
    2.) Retrieving the date with SQL*Plus
    SQL> select to_char(fld_date,'DD.MM.YYYY HH24:MI:SS') FROM TST_DATE;
    23.10.2006 11:25:23
    23.10.2006 11:25:25
    As you can see, the time data is correct.
    When I export the data to SQL insert type, I get this result in the TST_DATE.SQL file:
    -- INSERTING into TST_DATE
    Insert into "TST_DATE" ("FLD_DATE") values (to_date('2006-10-23','DD.MM.RR'));
    Insert into "TST_DATE" ("FLD_DATE") values (to_date('2006-10-23','DD.MM.RR'));
    As you can see, I lost the time data.
    Question: how can I set the proper date format in SQL Developer before I export data to a flat file?
    Best regards, Iztok from SLOVENIA

    > A DATE-Field is a DATE-Field and not a DATE-TIME-Field. The export tool identifies a DATE-Field and exports the data in date format.
    This is not true. Oracle DATE fields include a time element.
    To the original poster - I believe this is a bug in the current version.
    See this thread for possible workarounds: Bad Export format --- BUG ???
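    Until the export bug is fixed, one workaround sketch is to generate the INSERT statements yourself, with the full timestamp, and spool the query's output to TST_DATE.SQL from SQL*Plus:
    select 'Insert into TST_DATE (FLD_DATE) values (to_date('''
           || to_char(fld_date, 'DD.MM.YYYY HH24:MI:SS')
           || ''',''DD.MM.YYYY HH24:MI:SS''));'
    from tst_date;
    This preserves the time portion that the Insert-type export drops.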

  • SQL Loader to Load Multiple Tables from Multiple Files

    Hi
    I wish to create a control file to load multiple tables from multiple files, viz. Emp.dat into the emp table and Dept.dat into the dept table, and so on.
    How could I do it?
    Can I create a control file like this:
    OPTIONS(DIRECT=TRUE,
    SKIP_UNUSABLE_INDEXES=TRUE,
    SKIP_INDEX_MAINTENANCE=TRUE)
    UNRECOVERABLE
    LOAD DATA
    INFILE 'EMP.dat'
    INFILE 'DEPT.dat'
    INTO TABLE emp TRUNCATE
    FIELDS TERMINATED BY "|" OPTIONALLY ENCLOSED BY '"'
    (empno,
    ename,
    deptno)
    INTO TABLE dept TRUNCATE
    FIELDS TERMINATED BY "|" OPTIONALLY ENCLOSED BY '"'
    (deptno,
    dname,
    dloc)
    Appreciate a quick reply.
    mailto:[email protected]

    Which operating system? ("Command Prompt" sounds like Windows)
    UNIX/Linux: a shell script with multiple calls to sqlldr run in the background with "&" (and possibly nohup)
    Windows: A batch file using "start" to launch multiple copies of sqlldr.
    http://www.pctools.com/forum/showthread.php?42285-background-a-process-in-batch-%28W2K%29
    http://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/start.mspx?mfr=true
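    A sketch of that per-file approach: one minimal control file per data file, launched as separate sqlldr runs that can execute in parallel (credentials below are placeholders). Note that combining several INFILE clauses with several INTO TABLE clauses in one control file does not pair file 1 with table 1; every record from every file passes through all INTO TABLE clauses, which is why separate runs are the simpler route.
    -- emp.ctl
    LOAD DATA
    INFILE 'EMP.dat'
    INTO TABLE emp TRUNCATE
    FIELDS TERMINATED BY "|" OPTIONALLY ENCLOSED BY '"'
    (empno, ename, deptno)
    -- dept.ctl
    LOAD DATA
    INFILE 'DEPT.dat'
    INTO TABLE dept TRUNCATE
    FIELDS TERMINATED BY "|" OPTIONALLY ENCLOSED BY '"'
    (deptno, dname, dloc)
    sqlldr userid=scott/tiger control=emp.ctl &
    sqlldr userid=scott/tiger control=dept.ctl &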

  • How to execute a script(.sql) file from a PL\SQL procedure

    I would like to know how to execute a .sql file from a stored procedure, with the result updating a table. My .sql file contains SELECT statements.

    Hi!
    just go through the following piece of code -
    SQL> ed
    Wrote file afiedt.buf
      1  declare
      2    str varchar2(200);
      3  begin
      4    str := '@C:\RND\Oracle\Misc\b.sql';
      5    execute immediate(str);
      6* end;
    SQL> /
    declare
    ERROR at line 1:
    ORA-00900: invalid SQL statement
    ORA-06512: at line 5
    ORA-00900: invalid SQL statement
    Cause: The statement is not recognized as a valid SQL statement. This error can occur if the Procedural Option is not installed and a SQL statement is issued that requires this option (for example, a CREATE PROCEDURE statement). You can determine if the Procedural Option is installed by starting SQL*Plus. If the PL/SQL banner is not displayed, then the option is not installed.
    Action: Correct the syntax or install the Procedural Option.
    Regards.
    Satyaki De.
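    A script file cannot be run through EXECUTE IMMEDIATE because '@file' is a SQL*Plus directive, not SQL. One hedged alternative sketch: read the file from PL/SQL with UTL_FILE and execute each line, assuming a directory object DIR1 pointing at the script's folder and one complete statement per line (no SQL*Plus commands, no trailing semicolons):
    declare
      f    utl_file.file_type;
      stmt varchar2(4000);
    begin
      f := utl_file.fopen('DIR1', 'b.sql', 'R');
      loop
        begin
          utl_file.get_line(f, stmt);
        exception
          when no_data_found then exit;  -- end of file
        end;
        execute immediate stmt;  -- run each statement in turn
      end loop;
      utl_file.fclose(f);
    end;
    /
    Since EXECUTE IMMEDIATE of a bare SELECT discards its rows, statements meant to update a table should be written as INSERT ... SELECT (or similar) in the script.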
