Export as SQL INSERT generates dud SQL

When I attempted to export data from a table as SQL INSERT statements, it generated the statements with single quotes around the table name:
SQL> Insert into 'JOHN_SMITH' values ('162142','89')
  2  /
Insert into 'JOHN_SMITH' values ('162142','89')
ERROR at line 1:
ORA-00903: invalid table name
SQL> Fortunately my text editor can do the necessary global replace :)
SQL> Insert into "JOHN_SMITH" values ('162142','89')
  2  /
1 row created.
SQL> I am using Raptor #919
Cheers, APC

Wanted to clarify the table name in the insert: in the EA4 release we will not wrap the table name in quotes at all. If you want to preserve multi-byte characters or mixed-case table names, you need to wrap the table name in double quotes yourself. We just removed the single quotes we had incorrectly put in. This makes it consistent with creating and modifying a table, i.e. you provide the quotes if you want them.
-- Sharon
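To illustrate the convention Sharon describes, here is a minimal hand-written example (not tool output; the mixed-case table name is hypothetical). Unquoted identifiers are folded to upper case, while double quotes preserve the exact name:
Insert into John_Smith values ('162142','89');    -- resolved as table JOHN_SMITH
Insert into "John_Smith" values ('162142','89');  -- targets a table literally named John_Smith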

Similar Messages

  • Export sql insert generates to_date('1987-04-19','DD-MON-RR')

    Export as SQL INSERT generates buggy dates:
    Scott --> Tables --> EMP --> export --> sql inserts
    Columns --> ALL
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7788,'SCOTT','ANALYST',7566,to_date('1987-04-19','DD-MON-RR'),3000,null,20);
    The to_date is invalid: the 'DD-MON-RR' format mask does not match the 'YYYY-MM-DD' date literal (a corrected form is shown after the EMP.sql listing below).
    This bug happens only when selecting ALL columns.
    cheers
    Laurent

    Hi Sue,
    I have version 1.0.0.15.57 Linux x86 (downloaded last week).
    Is there a way to produce trace files or additional info so that we can debug this one?
    $ env
    PATH=/bin:/usr/bin
    LANG=en_US.UTF-8
    HOME=/home/lsc
    DISPLAY=:0.0
    _=/usr/bin/env
    $ cat /etc/SuSE-release
    SUSE Linux Enterprise Server 10 (i586)
    VERSION = 10
    $ /home/lsc/sqldeveloper/sqldeveloper
    Oracle SQL Developer 1.0
    Copyright (c) 2005 Oracle Corporation.  All Rights Reserved.
    Working directory is /home/lsc/sqldeveloper/jdev/bin
    Assert: Initializing.. [email protected]
    Assert: Unknown Node:8: USER
    Assert: Unknown Node:8: SHARED QUERIES
    Assert: Unknown Node:8: TABLE EDITORS
    Assert: Unknown Node:8: VIEWS
    Assert: Unknown Node:8: MVIEWS
    Assert: Unknown Node:8: SYNONYM
    Assert: Unknown Node:8: SEQ
    Assert: Unknown Node:8: Recycle Bin
    Assert: Unknown Node:8: DB Link
    Assert: Unknown Node:8: MVIEW LOG
    Assert: Unknown Node:8: PLSQL
    Assert: Unknown Node:8: TRigger
    Assert: Unknown Node:8: INDEX
    Assert: SQLView inited
    Assert: Folder:Unsupported node in report.xml:#text
    Assert: Folder:Unsupported node in report.xml:#text
    Assert: Folder:Unsupported node in report.xml:#text
    Assert: Folder:Unsupported node in report.xml:#text
    Assert: Folder:Unsupported node in report.xml:#text
    Assert: Folder:Unsupported node in report.xml:#text
    Assert: Folder:Unsupported node in report.xml:#text
    Assert: Folder:Unsupported node in report.xml:#text
    expand "connections"
    expand "LSC02"
    expand "Tables"
    select "EMP"
    right-click EMP
    choose EXPORT --> SQL INSERT
    Format=INSERT Output=File TABLE=EMP File=/home/lsc/EMP.sql Columns=ALL
    Click apply
    $ cat EMP.sql
    -- INSERTING into EMP
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7369,'SMITH','CLERK',7902,to_date('1980-12-17','DD-MON-RR'),800,null,20);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7499,'ALLEN','SALESMAN',7698,to_date('1981-02-20','DD-MON-RR'),1600,300,30);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7521,'WARD','SALESMAN',7698,to_date('1981-02-22','DD-MON-RR'),1250,500,30);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7566,'JONES','MANAGER',7839,to_date('1981-04-02','DD-MON-RR'),2975,null,20);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7654,'MARTIN','SALESMAN',7698,to_date('1981-09-28','DD-MON-RR'),1250,1400,30);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7698,'BLAKE','MANAGER',7839,to_date('1981-05-01','DD-MON-RR'),2850,null,30);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7782,'CLARK','MANAGER',7839,to_date('1981-06-09','DD-MON-RR'),2450,null,10);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7788,'SCOTT','ANALYST',7566,to_date('1987-04-19','DD-MON-RR'),3800,null,20);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7839,'KING','PRESIDENT',null,to_date('1981-11-17','DD-MON-RR'),5000,null,10);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7844,'TURNER','SALESMAN',7698,to_date('1981-09-08','DD-MON-RR'),1500,0,30);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7876,'ADAMS','CLERK',7788,to_date('1987-05-23','DD-MON-RR'),1100,null,20);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7900,'JAMES','CLERK',7698,to_date('1981-12-03','DD-MON-RR'),950,null,30);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7902,'FORD','ANALYST',7566,to_date('1981-12-03','DD-MON-RR'),3000,null,20);
    Insert into "EMP" ("EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO") values (7934,'MILLER','CLERK',7782,to_date('1982-01-23','DD-MON-RR'),1300,null,10);

  • Urgent Help !!!  Export data into insert format (Oracle Sql developer)

    Hi all,
    Please help: when I try to export an MS Access table which has over 400,000 rows to insert format using Oracle SQL Developer 1.5.5, the exported file xxx.sql is empty after the export has finished.
    Is it because of too many rows? Or what tool or function should I use for exporting a table with many rows?
    It used to export successfully with
    Insert into table( ) values ();
    Insert into table( ) values ();
    Insert into table( ) values ();
    Insert into table( ) values ();
    when I tried to export a table with over 10,000 rows.
    Regard,
    Tun

    Another option is to export your file as Formatted text (space delimited). This will create a fixed-format file. You can either create an external table to access the file or use SQL*Loader to load it. In your control file or access parameters you will specify the positions of the fields you are interested in. Your control file will look something like this (a sketch of the external-table option follows it):
    OPTIONS (BINDSIZE=50000,ERRORS=50,ROWS=200,READSIZE=65536)
    LOAD DATA
    CHARACTERSET US7ASCII
    INFILE '/home/FIXED_FORMAT.dat' "FIX 58"
    CONCATENATE 1
    INTO TABLE "EMP2"
    APPEND
    REENABLE DISABLED_CONSTRAINTS
    (
    "ID" POSITION(1:2) INTEGER(2) ,
    "REGION" POSITION(3:3) INTEGER EXTERNAL(1) ,
    "DEPT" POSITION(4:6) INTEGER EXTERNAL(3) ,
    "HIRE_DATE" POSITION(7:14) DATE(8) "mmddyyyy" ,
    "SALARY" POSITION(15:19) DECIMAL(9,2) ,
    "NAME" POSITION(20:34) CHAR(15)
    )
    SQLDeveloper does not currently provide an option to import fixed format files.
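    As a rough sketch of the external table alternative mentioned above (the directory path, table name, and 58-byte fixed record length are assumptions taken from the control file; adjust them to your own file):
    CREATE OR REPLACE DIRECTORY data_dir AS '/home';
    CREATE TABLE emp2_ext (
      id        NUMBER,
      region    NUMBER,
      dept      NUMBER,
      hire_date DATE,
      salary    NUMBER,
      name      VARCHAR2(15)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS FIXED 58
        FIELDS (
          id        POSITION(1:2)   INTEGER EXTERNAL,
          region    POSITION(3:3)   INTEGER EXTERNAL,
          dept      POSITION(4:6)   INTEGER EXTERNAL,
          hire_date POSITION(7:14)  CHAR(8) DATE_FORMAT DATE MASK "mmddyyyy",
          salary    POSITION(15:19) DECIMAL EXTERNAL,
          name      POSITION(20:34) CHAR(15)
        )
      )
      LOCATION ('FIXED_FORMAT.dat')
    )
    REJECT LIMIT UNLIMITED;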

  • Export the sql output to an excel file

    Hi,
    I would like to know how we can export a SQL query output from Oracle 9i to an Excel file using Java code...
    Thanks in advance..
    Naveen

    Naveen,
    You can access Microsoft Excel files via JDBC using the "JdbcOdbc" driver that comes with the JDK.
    Hence it is a simple matter of using JDBC to both extract from Oracle and insert into Excel.
    You will find many resources on the Internet explaining how to do this.
    Please note that I am certain that this is not the only way to achieve this.
    Good Luck,
    Avi.
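    A different, SQL-only alternative (not the JDBC approach Avi describes, and only a sketch, using the demo EMP table as a placeholder for your own query) is to spool a comma-separated file from SQL*Plus, which Excel can open directly:
    set pagesize 0 feedback off heading off trimspool on
    spool emp.csv
    select empno||','||ename||','||sal from emp;
    spool off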

  • FIM multivalue attribute export to SQL - error 0x80230808

    Hi,
    Running FIM Synchronization Service v 4.0.3531.2 - Update1.
    This instance is a simple setup: import from AD, export to SQL. I'm trying to export two multivalue attributes; one is a normal multivalue string, the second is a reference attribute (member). My anchor is the GUID from AD in string format, fitting perfectly into the uniqueidentifier SQL datatype.
    Since I wanted the member values in the multivalue table also stored in a uniqueidentifier column (for further linking), I had to create two columns, one as "uniqueidentifier" and the second as "text". FIM configuration went smoothly; I defined a member multivalue reference attribute and selected the uniqueidentifier column as "String attribute column", and the other multivalue string attribute had to be linked as "Large string attribute column", which I pointed to the text datatype column.
    Synchronization completed without errors, and the export step properly exported all multivalue string attributes to the text column, BUT when it came to exporting the GUID reference attribute to the uniqueidentifier column it exported only the first value, showing a "dn-attribute-failure" error. From the FIM GUI it was only showing the error number 0x80230808.
    I did a SQL trace to see what's going on. I re-ran the export and saw FIM trying to delete all the values for this attribute multiple times with:
    DELETE from [tblAD_Multivalue]  WHERE [objectGUID] = N'{B011B424-5B2F-43A9-84C5-8605A570487B}' AND [attributeName] = N'member'
    followed by doing cursor magic with the first value that was already added:
    exec sp_cursor 180150007,4,0,N'tblAD_Multivalue',@objectGUID='B011B424-5B2F-43A9-84C5-8605A570487B',@attributeName='member',@guidValue='2E52A484-C7F6-49C0-AAC8-0A30C732A385'
    After repeating the above more than 10 times it added an export_error_detail:
    update [mms_connectorspace] set [export_error_detail] = N'<export-status>
    <cd-error>
    <error-code>0x80230808</error-code><error-literal>[Modify] Failed operation</error-literal>
    </cd-error>
    </export-status>
    ',[count_export_error_retries] = 0,[is_export_error] = 1,[initial_export_error_date] = '2011-02-12 21:36:08.995',[last_export_error_date] = '2011-02-12 21:36:08.995',[export_error_code] = -2145189885,[unapplied_export_batch_number] = 1,[unapplied_export_sequencer_number] = 2162572,[original_export_batch_number] = 1,[original_export_sequencer_number] = 2162572,[current_export_batch_number] = 4,[current_export_sequence_number] = 2216213 where ([object_id] = '6F5C98E3-38FF-4F32-95F6-B5A315B71D7A')
    I tried manually adding one of the following values directly to SQL and it worked, so I'm not really sure what's wrong here.
    Any ideas?
    Piotr

    Hi Markus,
    yes, I know all about the architecture to allow multi-value attributes to be exported to SQL.
    Everything works when I have just one string value column in my tblAD_Multivalue. The problem is that my anchor in the primary table is a GUID in a uniqueidentifier SQL column type (in the FIM MV it's a string), so when I'm exporting multivalue reference attributes to SQL I would like the GUID values also to go to a uniqueidentifier type column. That's why I set up the secondary column in the multi-value table with the text datatype, so FIM recognizes it as a "large string column". I think I did everything within the guidelines, so I was puzzled when I saw that error.
    Consider this simple example: you're exporting users and groups from AD, your anchor data type is uniqueidentifier, and you need to export the member attribute and the proxyAddresses attribute. What would you do to allow future SQL joins to calculate membership?
    Hope this explains the scenario a bit more; I would appreciate any suggestions.
    Piotr

  • Export Query SQL multi-linee in XML format and Import XML in SAP

    Good morning
    Does someone know if it is possible, after exporting a SQL query on OITT and ITT1 with thousands of rows to XML (with the values already changed through the query),
    to then import the XML into SAP without using a loop, with the following VB.NET code?
    oDistinta = DirectCast(m_oCompany.GetBusinessObjectFromXML(xmlFile, 0), SAPbobsCOM.ProductTrees)
                    Dim errCode As Integer = 0
                    Dim strErrore As String = ""
                    Dim iErrore As Integer = 0
                    oDistinta.Browser.ReadXml(xmlFile, 0)
                    errCode = oDistinta.Update
                    m_oCompany.GetLastError(iErrore, strErrore)
                    'on error
                    If iErrore <> 0 Then
    To keep the example simple, I post it with 2 ProductTrees items. The 2 items already exist in OITT and ITT1.
    When I update them, the 1st item is modified but the 2nd does not change.
    Here is the XML
    <?xml version="1.0" encoding="UTF-16"?>
    <BOM>
      <BO>
        <AdmInfo>
          <Object>66</Object>
          <Version>2</Version>
        </AdmInfo>
        <ProductTrees>
          <row>
            <TreeCode>V9998</TreeCode>
            <TreeType>iProductionTree</TreeType>
            <Quantity>1000.000000</Quantity>
            <U_SHI_MZDA>0</U_SHI_MZDA>
            <U_SHI_PUMA>0</U_SHI_PUMA>
            <U_SHI_PUUO>0</U_SHI_PUUO>
          </row>
        </ProductTrees>
        <ProductTrees_Lines>
          <row>
            <ItemCode>1047</ItemCode>
            <Quantity>5.000000</Quantity>
            <Warehouse>Mc</Warehouse>
            <Price>0.000000</Price>
            <Currency>
            </Currency>
            <IssueMethod>im_Backflush</IssueMethod>
            <Comment>
            </Comment>
            <ParentItem>V9998</ParentItem>
            <PriceList>2</PriceList>
            <DistributionRule>
            </DistributionRule>
            <U_IDE_POS>10</U_IDE_POS>
            <U_IDE_FASE>0</U_IDE_FASE>
            <U_IDE_OPER>10</U_IDE_OPER>
            <U_SHI_UndInv>N</U_SHI_UndInv>
            <U_SHI_Pri1>
            </U_SHI_Pri1>
            <U_SHI_Pri2>
            </U_SHI_Pri2>
            <U_SHI_Pri3>
            </U_SHI_Pri3>
            <U_SHI_Prgm>
            </U_SHI_Prgm>
            <U_SHI_Tim2>0.000000</U_SHI_Tim2>
            <U_SHI_Tim3>0.000000</U_SHI_Tim3>
            <U_SHI_DexNACQ>
            </U_SHI_DexNACQ>
            <U_SHI_BP>
            </U_SHI_BP>
            <U_SHI_PrcKit>0.000000</U_SHI_PrcKit>
            <U_SHI_ItmNACQ>
            </U_SHI_ItmNACQ>
            <U_SHI_LineOk>N</U_SHI_LineOk>
          </row>
        </ProductTrees_Lines>
        <ProductTrees>
          <row>
            <TreeCode>V9999</TreeCode>
            <TreeType>iProductionTree</TreeType>
            <Quantity>1000.000000</Quantity>
            <U_SHI_MZDA>0</U_SHI_MZDA>
            <U_SHI_PUMA>0</U_SHI_PUMA>
            <U_SHI_PUUO>0</U_SHI_PUUO>
          </row>
        </ProductTrees>
        <ProductTrees_Lines>
          <row>
            <ItemCode>1015</ItemCode>
            <Quantity>1000.000000</Quantity>
            <Warehouse>Mc</Warehouse>
            <Price>0.000000</Price>
            <Currency>
            </Currency>
            <IssueMethod>im_Backflush</IssueMethod>
            <Comment>
            </Comment>
            <ParentItem>V9999</ParentItem>
            <PriceList>5</PriceList>
            <DistributionRule>
            </DistributionRule>
            <U_IDE_POS>30</U_IDE_POS>
            <U_IDE_FASE>0</U_IDE_FASE>
            <U_IDE_OPER>30</U_IDE_OPER>
            <U_SHI_UndInv>N</U_SHI_UndInv>
            <U_SHI_Pri1>
            </U_SHI_Pri1>
            <U_SHI_Pri2>
            </U_SHI_Pri2>
            <U_SHI_Pri3>
            </U_SHI_Pri3>
            <U_SHI_Prgm>
            </U_SHI_Prgm>
            <U_SHI_Tim2>0.000000</U_SHI_Tim2>
            <U_SHI_Tim3>0.000000</U_SHI_Tim3>
            <U_SHI_DexNACQ>
            </U_SHI_DexNACQ>
            <U_SHI_BP>
            </U_SHI_BP>
            <U_SHI_PrcKit>0.000000</U_SHI_PrcKit>
            <U_SHI_ItmNACQ>
            </U_SHI_ItmNACQ>
            <U_SHI_LineOk>N</U_SHI_LineOk>
          </row>
        </ProductTrees_Lines>
      </BO>
    </BOM>
    Does someone know the reason?
    Thanks in advance.
    Regards

    Hi Gabriele,
    The issue is that the ReadXml method takes an index parameter which is the number of the object that you want to read from the XML data. In your code you are specifying index 0 which is the first product tree in your file so when you call the Update method this is the only record that will be updated. It is not possible to do a mass update using the DI API in this way, you must always load each object separately. You can use the GetXMLelementCount method of the Company object to find out how many product tree objects are in your file and then use the ReadXML method to load each one by incrementing the index parameter in a loop.
    Kind Regards,
    Owen

  • When I export my file to generate a .pdf, the text box literally has a box around it! How do I make that line disappear?

    When I export my file to generate a .pdf, the text box literally has a box around it! How do I make that line disappear?

    Sounds like you have a stroke on it. Select the frame and set the stroke to none.

  • SqlDeveloper -  Import data of backup sql insert generated of Tool Export

    Hi, sorry for my bad English.
    My database was deleted, and the only backup is an export that was generated in SQL Developer 1.5 prior to the deletion.
    But this export has:
    Insert into ORA_ASPNET_APPLICATIONS (ApplicationName, LOWEREDAPPLICATIONNAME, application, DESCRIPTION) values ('Returns', 'return', '[B @ 600f44fe', null);
    instead of
    Insert into ORA_ASPNET_APPLICATIONS (ApplicationName, LOWEREDAPPLICATIONNAME, application, DESCRIPTION) values ('Returns', 'return', 'C8AD09AAE28D4961B74BA2321054212B', null);
    How do I decode that value back into the normal value?
    I've tried RAWTOHEX and it is not the same value.

    I'm afraid that's a bugged export rather than a coded value...
    In the future, better to make periodic backups with EXPDP.
    Sorry,
    K.

  • Some Questions on Import/Export using SQL*Plus

    Could somebody help me by answering the following queries? I couldn't find the answers in this forum or the User Guide.
    1. Can we create an application or page export without using the web front-end? E.g. Call some pl/sql, get a file on db server...
    2. Can we change the application id within an application/page export? It looks like it is only referenced a few times in the script (set package variable, wwv_flow_api.remove_flow, wwv_flow_audit.remove_audit_trail, wwv_flow_api.create_flow, thereafter uses package variable)
    3. Can we change the workspace within an application/page export? It looks the workspace is set with wwv_flow_api.set_security_group_id(p_security_group_id=>719307052588490). Is a 'security group' a synonym for 'workspace'?
    4. Supposing I wanted to transfer only the shared components of an application from one DB to another. Could I achieve this by first exporting every page of the target app individually, then import source app over target app, then re-import each individually exported page over target app?
    Thanks!

    1. Yes. Search this forum for command line export. The command line export utility is on the Studio at http://apex.oracle.com/studio
    2. I wouldn't do that. The application id is probably used (indirectly) to generate foreign keys (template ids, offsets and such).
    3. Same answer as (2).
    4. There is no way to export/import only the shared components.

  • Report Script Export to sql

    Hi there again. I tested the report script to export data to SQL; for example, on the Demo - Basic db, the text file is generated like this:
    New York Stereo Sales 1000.0 950.0
    New York Stereo Cost of Goods Sold 580.0 551.0
    New York Stereo Margin 420.0 399.0
    However, when I import it into SQL, I still get everything in one big column. Any idea?
    <PAGE(Scenario)
    <COLUMN(Year)
    <ROW(Market, Product, Accounts)
    <CHILD East
    <DESC Product
    { DECIMAL 1
    WIDTH 9
    SUPBRACKET
    SUPCOMMA
    MISSINGTEXT " "
    UNDERSCORECHAR " "
    SUPHEADING
    NOINDENTGEN
    SUPFEED
    ROWREPEAT }
    Budget
    Jan Feb
    <DESC Accounts
    !

    Don, I haven't followed all of your other threads, so I'm not sure what tool you're using to import the .txt file into your relational database.
    However, the problem is probably that your import tool isn't recognizing where one field ends and another begins. Pretty much every import tool has a method for specifying the column widths or for using a delimiter; for example, specify tab delimiters and use {TABDELIMIT} in your report script.
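    If the target is an Oracle table and you go the tab-delimited route, a minimal SQL*Loader control file for such a file might look like this (the file path, table, and column names are hypothetical; adjust them to your schema):
    LOAD DATA
    INFILE '/home/report.txt'
    APPEND
    INTO TABLE SALES_IMPORT
    FIELDS TERMINATED BY X'09' TRAILING NULLCOLS
    (
      MARKET   CHAR,
      PRODUCT  CHAR,
      ACCOUNTS CHAR,
      JAN      DECIMAL EXTERNAL,
      FEB      DECIMAL EXTERNAL
    )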

  • Data and Cleansing export TO SQL table with Melissa Data appended fails

    I am using Data Quality Services with Melissa Data Address Check as reference data. Everything works fine until I take the option to export Data and Cleansing Info, which will give me my cleansed data plus additional data points such as geocodes from Melissa. When I do, it fails with the error below.
    (Failed to create a new table geocode in database DQS_STAGING_DATA. Check whether the table already exists and have the database administrator make sure the DQS Service has CREATE TABLE rights in the destination database and can INSERT to the destination table.)
    This error makes no sense, as the table does not exist and I do have proper rights. I can export Data and Cleansing data if Melissa Data is not involved; when I dig further it seems to be complaining about column header lengths.
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_CBSADivisionCod' is too long. Maximum length is 128.;
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_DeliveryPointCo' is too long. Maximum length is 128.;
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_ResponseRecordI' is too long. Maximum length is 128.;
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_DeliveryPointCh' is too long. Maximum length is 128.;
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_CBSADivisionLev' is too long. Maximum length is 128.;
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_CongressionalDi' is too long. Maximum length is 128.;
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_CBSADivisionTit' is too long. Maximum length is 128.;
    I can see no option to control these column headers in DQS. Has anyone else experienced this? Does anyone know of a workaround?
    I have already reported this to Melissa Data and they agreed the problem was the column header length, but said they also had no control over that.

    Hello,
    You can create an SR with an outbound filter. All objects that match the filter will be provisioned to the SQL CS (if you do not define a filter, all objects will be provisioned).
    Or you can create MV extension rules.
    Regards,
    Sylvain

  • Data export through sql

    All experts,
    I want to do a backup through SQL, such as:
    SQL> host exp.exe userid=system full=y file=d:\backup\sysdate||'.bk'
    How can I do this?
    Thanks in advance.

    Firstly, you can't run export like that. Secondly, export is a logical backup not a "real" backup.
    For Windows, fill in the blanks (at least work it out instead of getting it completely handed to you) -
    The problem with this method is having to store the password in the file, so be careful. (A lighter-weight variation is sketched after the script below.)
    >>>>>>>>>>>>>>>>>>>>>>>
    Batch file -
    <<<<<<<<<<<<<<<<<<<<<<<
    SET ORACLE_SID=<ORA_SID>
    SET ORACLE_HOME=<ORA_HOME>
    REM backup controlfile to trace and generate the exp.parfile
    sqlplus /NOLOG @<some_path>\backupctlfiles.sql %ORACLE_SID%
    REM run an export using the generated filename in the previous step
    exp parfile=<some_path>\exp.parfile
    <<<<<<<<<<<<<<<<<<
    backupctlfiles.sql file -
    >>>>>>>>>>>>>>>>>>
    define dbpassw = '<export user password>'
    connect <username>/&dbpassw
    alter database backup controlfile to trace;
    set feedback off trimspool on trimout on pages 0 verify off
    spool <some_path>\exp.parfile
    select 'full=y ' || chr(10) ||
    'recordlength=65535' || chr(10) ||
    'direct=y '|| chr(10) ||
    'file=<export_file_path>\&1\export_full.&1..'||to_char(sysdate,'YYYYMMDDHH24MMSS')||'.dmp'||chr(10) ||
    'log=<export_file_path>\&1\export_full.&1..'||to_char(sysdate,'YYYYMMDDHH24MMSS')||'.log'||chr(10) ||
    'consistent=y'|| chr(10)||
    'userid=<username>/&dbpassw'
    from dual;
    spool off
    exit;
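    As that lighter-weight variation on the same idea, you can also build the dated file name inside SQL*Plus and substitute it into the HOST command (a sketch only; the password placeholder is hypothetical and the same password-exposure caveat applies):
    set verify off
    column fname new_value fname
    select 'd:\backup\full_'||to_char(sysdate,'YYYYMMDDHH24MISS')||'.dmp' fname from dual;
    host exp userid=system/<password> full=y file=&fname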

  • Export all sql reports in a region to csv or pdf

    Hello there,
    I have a region on a particular page in apex and that region has about 12 different sql query reports.
    I have enabled the csv option for each of them, so there are 12 links to export each report. However, I would like to have only 1 link which will save the contents of all the reports to csv or pdf. The column headings are different for some of these reports, but I wanted to know if something like this is possible.
    Thanks in advance for reading this.

    You might try wrapping the regions in html regions that essentially give you the ability to specify a valign=top for the report regions.
    So, your regions look like this:
    30 Report Start - contains region source = (div)<table width="100%" cellspacing="0" cellpadding="0"><tr><td valign="top">
    40 Approved Tests Report (Column 2) Conditional
    45 Report 2 Start - contains </td><td valign="top">
    60 Unapproved Tests Report (Column 2) Conditional
    65 Report 3 Start - contains </td><td valign="top">
    80 Approved Count by Build Report (Column 2) Conditional
    85 Report End - contains "</td></tr></table>(/div)"
    Replace () with the angle brackets
    Maybe there is a more elegant solution with templates or page level CSS or something..
    But that should work.
    Edited by: Bob37 on Sep 17, 2010 3:33 PM

  • Oracle 10g DB Export for SQL Server 2008

    I'm having a hard time exporting my Primavera P6 Project Management database for SQL Server 2008. In fact I need to do this so I can export to a future program in Visual Studio 2010, but continue with the Oracle database enabled.
    Thank you.
    Edited by: 898987 on 25/11/2011 01:59

    Hi,
    Thanks for the translation.
    You say you are having a hard time exporting your Primavera P6 Project Management database for SQL Server 2008. Could you give details of exactly what you are using for the export and what problems you are having?
    Is this an export from an Oracle database or a SQL*Server 2008 database ?
    Regards,
    Mike

  • EXPORT catexp.sql 9.2.0.4.0

    Hi
    I installed catexp.sql with the SYS user as SYSDBA.
    It was successful, but when I tried to run "exp" I got the following errors:
    (The export options were OWNER=wps, GRANTS=y, ROWS=y, COMPRESS=y.)
    The database runs on a Sun server.
    :>showrev
    Hostname: wps-sun
    Hostid: 8323de56
    Release: 5.8
    Kernel architecture: sun4u
    Application architecture: sparc
    Hardware provider: Sun_Microsystems
    Domain:
    Kernel version: SunOS 5.8 Generic 108528-13 December 2001
    Connected to: Oracle9i Enterprise Edition Release 9.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.4.0 - Production
    Export done in WE8ISO8859P1 character set and AL16UTF16 NCHAR character set
    About to export specified users ...
    . exporting pre-schema procedural objects and actions
    . exporting foreign function library names for user WPS
    . exporting PUBLIC type synonyms
    . exporting private type synonyms
    . exporting object type definitions for user WPS
    About to export WPS's objects ...
    . exporting database links
    . exporting sequence numbers
    . exporting cluster definitions
    EXP-00056: ORACLE error 31600 encountered
    ORA-31600: invalid input value EMIT_SCHEMA for parameter NAME in function SET_TRANSFORM_PARAM
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105
    ORA-06512: at "SYS.DBMS_METADATA_INT", line 3926
    ORA-06512: at "SYS.DBMS_METADATA_INT", line 4050
    ORA-06512: at "SYS.DBMS_METADATA", line 836
    ORA-06512: at line 1
    EXP-00000: Export terminated unsuccessfully
    Could you help me with this problem? What can I do?
    Thank you in advance !
    Laszlo

    If you have Metalink access, have a look at Note:232120.1 or Doc:498457.994.
    If not, try running the catpatch.sql script in the $ORACLE_HOME/rdbms/admin directory, as sysdba and with no users connected.
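    A rough sketch of running it (for a 9.2 patch set the catalog scripts are normally run with the database opened via STARTUP MIGRATE; check the patch set readme for your exact release before running anything):
    sqlplus /nolog
    SQL> connect / as sysdba
    SQL> shutdown immediate
    SQL> startup migrate
    SQL> @?/rdbms/admin/catpatch.sql
    SQL> shutdown immediate
    SQL> startup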
