Oracle export file as source

Hi,
I'm new to this group and to OWB as well. I am re-building in OWB a small sample application that I have previously built in another ETL tool.
One of the source files to be processed in this application is an Oracle export file. I have tried, and looked through the documentation, but it seems Oracle export format files are not supported as source files in OWB. This seems (to me) a strange omission in an Oracle ETL product, as imp/exp is an accepted and much-used way to exchange data.
So the question is: am I right (export files not supported), or should I look harder? And if I'm right: how can I use the export file in OWB? My guess is creating an "external process".
Thanks,
Gerton

Gerton,
You cannot directly read an export file; the format has not been published. Even through OWB you cannot read it... part of the reason, I guess, is to prevent a potential security hole (you would be able to read any data from any export file if you had OWB).
What you can do: use an external process to load the export file (imp on the command line), run a mapping against the imported schema, and use a transformation with dynamic SQL to drop the schema again. Not an elegant solution... but it could work.
Thanks,
Mark.
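A rough sketch of that workaround (the user, password, schema and file names are placeholders, not from the thread). First, the external process loads the export file into a staging schema from the command line:
    imp staging_user/staging_pw@target file=source_export.dmp fromuser=src_owner touser=staging_user log=load.log
Then, after the OWB mapping has run against the staging schema, a post-mapping transformation can drop the staged tables again with dynamic SQL:
    begin
      for t in (select table_name from user_tables) loop
        execute immediate 'drop table "' || t.table_name || '" cascade constraints';
      end loop;
    end;
    /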

Similar Messages

  • How to read an exported file

    How can I read an Oracle export file?
    I exported with the exp command, but how do I convert it to ASCII?
    Thanks.

    What kind of analysis? An Excel spreadsheet?
    One very easy way to create a disk file of a table is to use a tool like SQL Developer or Toad and save it as a comma delimited file.
    You can also use UTL_FILE and write it to disk on the server.
    Or you can spool a select statement to a file using SQL Plus.
    exp is not the tool to use for this.
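    For instance, a minimal SQL*Plus spool sketch (the table and column names are just placeholders):
    set pagesize 0 linesize 1000 trimspool on feedback off
    spool emp.csv
    select empno || ',' || ename || ',' || sal from emp;
    spool off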

  • Oracle SQL Developer  error - "The file-based source procedure Name is not present in the database."

    I recently started working with Oracle SQL Developer. I have 'select' privileges on the QA schema, and when I tried to execute a procedure in QA it gave the error 'The file-based source <procedure Name> is not present in the database. Was it compiled?'
    instead of 'you do not have sufficient privileges to execute this procedure'. I did some research on the internet but with no luck. What configuration changes need to be made for it to work? Please guide me.

    Sounds like you do not have the correct privileges.  What should have been granted to you by the QA user, or some other user with appropriate privileges, is...
    grant EXECUTE on "QA"."<PROCEDURE_OF_INTEREST>" to "<YOUR_USERID>" ;
    Note that a grant of execute on a procedure has nothing to do with grants of select on some or all of QA's tables and views.
    So, as Vadim suggests, from your connection node in SQL Developer's Connections view, if you expand the Other Users node, then expand QA and look in the Procedures node, do you see the procedure of interest?  If not, you cannot expect to be able to execute it from your userid's connection. And even if you do see it, you may have some other privilege that permits viewing but not executing, like...
    The role SELECT_CATALOG_ROLE
    The system privilege SELECT ANY DICTIONARY
    And even if you do not see it there, then having certain other privileges granted to you could permit executing it in general, like...
    The role EXECUTE_CATALOG_ROLE
    The system privilege EXECUTE ANY PROCEDURE
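    As a quick check (a sketch; substitute your own owner and procedure name), you can ask the data dictionary what your userid can actually see and execute:
    select owner, object_name, object_type
    from   all_objects
    where  owner = 'QA' and object_name = '<PROCEDURE_OF_INTEREST>';
    select grantee, privilege
    from   all_tab_privs
    where  table_schema = 'QA' and table_name = '<PROCEDURE_OF_INTEREST>';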
    Also, note that the 3.0 release is a bit dated nowadays. Upgrading to 4.0.3 production or even the 4.1 EA2 (early adopter) release will, in general, give you a better experience.
    Best wishes,
    Gary

  • IMP-00008: unrecognized statement in the export file, oracle 11gr2 on redhat 5

    I am using Oracle 11g R2 on Red Hat 5 Linux to import (imp) a dmp file containing a table with a BLOB data type. I got the following error, with non-ASCII binary on the screen, and the import failed at that table:
    IMP-00008: unrecognized statement in the export file: (a lot of non-ascii characters followed)
    How did that happen and how do we handle it?
    -Henry

    Hello,
    IMP-00008 may be due to several reasons.
    The dump may be corrupted, you may also have hit a bug, and so on...
    Is it the only error you got or do you have other error message (for instance IMP-00032) ?
    Also, I don't know why you use EXP/IMP in 11.2; the original Export/Import is not recommended. You should use Data Pump, which is much more powerful.
    Please find below a link about Data Pump:
    http://www.oracle-base.com/articles/10g/oracle-data-pump-10g.php
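    For reference, a minimal Data Pump round trip looks something like this (the directory object, credentials and file names are placeholders):
    expdp scott/tiger schemas=scott directory=dp_dir dumpfile=scott.dmp logfile=scott_exp.log
    impdp scott/tiger directory=dp_dir dumpfile=scott.dmp logfile=scott_imp.log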
    Hope this helps.
    Best Regards,
    Jean-Valentin Lubiez

  • ORA-31619: invalid dump file "D:\oracle-export\mitest-19-05-2009.dmp"

    Hi,
    I have taken a logical backup (export) on Linux and am importing it on Windows 2003 Server.
    export command:
    $expdp miqa/miqa schemas=miqa directory=backup_dir CONTENT=all dumpfile=miqa-`date +%d-%m-%Y`.dmp CONTENT=all logfile=miqa-`date +%d-%m-%Y`.log'
    When I import on Windows 2003 Server, these are the steps I ran:
    grant create any directory to rnddb;
    create or replace directory backup_dir as 'd:\export';
    GRANT READ, WRITE ON DIRECTORY backup_dir TO rnddb;
    c:\>impdp rnddb/rnddb directory=backup_dir dumpfile=mitest-19-05-2009.dmp SCHEMAS=mitest REMAP_SCHEMA=mitest:rnddb CONTENT=all logfile=mitest-19-05-2009.log
    I am getting the following error.
    Import: Release 10.1.0.2.0 - Production on Wednesday, 17 June, 2009 14:45
    Copyright (c) 2003, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options
    ORA-39001: invalid argument value
    ORA-39000: bad dump file specification
    ORA-31619: invalid dump file "D:\oracle-export\mitest-19-05-2009.dmp"
    Please could you help me find where I made a mistake?
    Thanks
    Settu Gopal

    ORA-31619: invalid dump file "string"
    Cause: Either the file was not generated by Export or it was corrupted.
    Action: If the file was indeed generated by Export, report this as an Import bug and submit the export file to Oracle Customer Support.
    How was the dump file transferred between Linux and Windows server?
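    If it went over FTP, the transfer must be done in binary mode; an ASCII-mode transfer corrupts a dump file. For example (host details omitted):
    ftp> binary
    ftp> get mitest-19-05-2009.dmp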

  • PE 9 - Why is my exported file larger than the source?

    Hello
    I am new to Adobe Premiere Elements 9. I have a Digital Video Recorder connected to my television. It produces MPEG-2 TS video files. I want to import these into PE9 and just cut out the ads.
    I tried with a source file that is 1 GB in size. I imported it and removed the ads, but when I export it using Share > Computer and select settings that mimic the source I still end up with a 3.7 GB file. I thought it should be smaller than 1 GB because the video is now about 5 minutes shorter.
    In my Export settings I chose:
    Multiplexer: TS
    Audio: Left it untouched
    Video:
    Basic Video Settings:
    Quality = 4
    Everything else set to Automatic (based on source)
    Render at Maximum Depth left unchecked
    Bitrate Settings:
    Encoding: VBR, 1 Pass
    Bitrate Level - Custom
    Min, Target and Max all set to 15
    GOP Settings:
    M Frames: 3
    N Frames: 12
    Is my DVR better at encoding/compressing video than PE9, or are my export settings wrong? Shouldn't the exported file be less than 1 GB?

    As Steve points out, there are two parameters that affect File Size:
    Duration
    Bit-Rate
    With a fixed Duration, the only way to adjust File Size is to lower the Bit-Rate (lower Bit-Rate = smaller File Size, but at the expense of quality).
    Now, some CODECs will allow one to use a lower Bit-Rate and still retain perceived quality better than others, but it is still the Bit-Rate that affects the File Size.
    Some Export/CODEC options will list the Bit-Rate directly, in MB/sec., where others will instead have a "quality" setting; the Bit-Rate's MB/sec. will not be shown directly, but will relate to that quality setting.
    One major consideration is what is to be done with the resulting output file. If it is to be used for additional editing, then I feel that File Size is not a big issue, as I want the ultimate quality. If one is delivering the output file, then there are many considerations, from the platform of the recipient's computer, to the player being used, to the necessary quality of the file.
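    As a rough illustration (the clip duration was not posted, so ~33 minutes is only an assumption, and the "15" in the bit-rate settings is read as 15 Mbps):
    File Size ≈ Bit-Rate × Duration
    15 Mb/s ≈ 1.9 MB/s ≈ 112 MB per minute
    112 MB/min × 33 min ≈ 3.7 GB
    1 GB over the same ~33 minutes ≈ 0.5 MB/s ≈ 4 Mb/s average
    So bringing the Target/Max bit-rate down toward the source's average bit-rate is what shrinks the exported file, not the shorter duration alone.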
    Good luck,
    Hunt

  • Exporting files to Oracle Financial Analyzer using the Excel "Express Add-In"?

    I want to export data to the Oracle database using the "Excel Add-in Express". At present I have to convert the data to a PRN/TXT file first before loading it into Oracle Financial Analyzer (via the Loading menu). Can I use the Excel Express Add-In facility to export the file (Excel format) directly to the Oracle database (Financial Analyzer) without converting it to a text file?


  • Importing Oracle 10g export file to Oracle 9i

    We have an export file from a 10g database and want to import it into a 9i database. Is this possible?

    Assuming that they have both the 10g and 9i databases, there is no reason why they can't use the 9i export and import utilities to load the data.
    If the dump file came from a 3rd party's database to which they don't have access, then obviously they won't be able to load the file into a 9i database.
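    For example (a sketch, assuming the 9i exp/imp clients can reach the 10g database over SQL*Net; connect strings and schema names are placeholders):
    exp system/manager@tengdb owner=scott file=scott.dmp log=scott_exp.log
    imp system/manager@ninedb fromuser=scott touser=scott file=scott.dmp log=scott_imp.log
    The key point is that both commands are run with the 9i versions of the utilities.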

  • Imp-00008: unrecognized statement in the export file

    Hi All,
    I am trying to import an export from an Oracle 8.1.7 source system into Oracle 11.2 using imp. I am getting the following error many times during the import process:
    imp-00008: unrecognized statement in the export file
    The character set of the source database is WE8DEC and the character set of the target database is WE8MSWIN1252, and I get the following statements at the start of the import:
    import done in US7ASCII character set and AL16UTF16 NCHAR character set
    import server uses WE8MSWIN1252 character set (possible charset conversion)
    export client uses WE8DEC character set (possible charset conversion)
    export server uses WE8DEC NCHAR character set (possible ncharset conversion)
    Is the error imp-00008: unrecognized statement in the export file due to Oracle version compatibility or to a character set compatibility issue?
    I tried to create a new database on the same Oracle server and I can't find WE8DEC in the list of character sets to choose from. Please help me on how to proceed.
    Regards,
    alen.

    934571 wrote:
    Hi Srini,
    Data is getting loaded correctly, but I get several of these imp-00008: unrecognized statement in the export file messages during the import process, so I am not sure what is missing.
    Pl post the complete error message from the import log file.
    1) Is it possible to import a dump exported from Oracle 8 into Oracle 11? Do we need to take any special care during the import?
    Yes - no special requirements are needed.
    2) The Oracle database character set is WE8DEC, but when I try to create a new database in 11 I don't find that character set. Is the character set obsolete now? If so, what character set is a superset of WE8DEC?
    Pl post exact OS and database versions. Ideally you should be using AL32UTF8 for all new databases. WE8DEC is a deprecated character set.
    See section 4.2.1 here - http://docs.oracle.com/cd/E11882_01/install.112/e24186/install.htm#BABFDDEA
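    To confirm what each database is actually using, a quick check you can run on both the source and the target:
    select parameter, value
    from   nls_database_parameters
    where  parameter in ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');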
    Thanks,
    Alen.
    HTH,
    Srini

  • IMP-00010: not a valid export file, header failed verification

    Hi everyone,
    I am new to the Oracle world. I've been working with an export from a Sun platform and am trying to import it on a Windows platform. I've been receiving this error message:
    IMP-00010: not a valid export file, header failed verification
    Please help me, I have to finish this today.
    Thanks and have a nice New Year

    thanks for your answer.
    Sorry, but I don't have all the exact information, because the export is coming from a client and I don't know all the details. I only know that the platform is Sun.
    I received the file on a USB memory stick, directly from the source.
    I've read that if you make an export on a Sun platform and don't transfer it to a Windows platform in binary mode, it will be corrupted.
    Could you confirm that for me?

  • Oracle export from 8i

    I have a newbie question - I searched the web for a long time but I cannot find any answers.
    I understand that all packages, procedures and stored program units are stored in the system tablespace or
    the sysaux tablespace. My requirement is to copy a few schemas from a source database into a target database. Is it enough to exp the schemas and bring them into the target database? Will this bring the stored program units across as well, or will I need to export the sys/system schemas? I exported one schema and did the following in Unix:
    strings expdat.dmp|grep -i "package" but this returns nothing. I also opened the exp file and searched for the packages but to no avail. So I created a parfile (tables=somepackagename) but exp complained that the object was not found.
    Any Ideas ?

    Hello,
    Oracle version in both source and target is 8.1.6.3.0
    I know it's not related to your question, but why do you use a 10-year-old release?
    You may think to migrate someday to a newer release.
    Moreover, from Oracle 10g you may use Data Pump (expdp / impdp) instead of the classical export/import, and with Data Pump you can select exactly the object types you want to export or import by using the EXCLUDE / INCLUDE parameters.
    For instance, to export only procedures, you may use this with Data Pump:
    INCLUDE=PROCEDURE
    Else, about the content of the expdat.dmp file, you may try this:
    imp <user>/<password> file=expdat.dmp log=<logfile>.log full=y SHOW=Y
    It's better than opening the dump file.
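    Put together as a full Data Pump command, a sketch with placeholder credentials, directory object and file names might look like:
    expdp system/<password> schemas=SCOTT include=PROCEDURE,FUNCTION,PACKAGE directory=dp_dir dumpfile=scott_code.dmp logfile=scott_code.log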
    Hope this helps.
    Best regards,
    Jean-Valentin
    Edited by: Lubiez Jean-Valentin on Jun 27, 2010 12:06 PM

  • Exporting files in Unicode

    Hi,
    I am using UTL_FILE.FOPEN_NCHAR and UTL_FILE.PUT_LINE_NCHAR to write files in Unicode, but instead the files come out in UTF8 format. My database character set is UTF8, so it might be happening due to the character set.
    Can I export files in Unicode rather than UTF8?
    Thanks
    Zubair

    I think you must reverse the parameters: it's destination first, and the source is optional.
    Regards
    Etbin
    Looking at 11g documentation: http://docs.oracle.com/cd/E11882_01/appdev.112/e25788/u_file.htm#autoId0
    FOPEN_NCHAR Function
    This function opens a file in national character set mode for input or output, with the maximum line size specified. You can have a maximum of 50 files open simultaneously. With this function, you can read or write a text file in Unicode instead of in the database character set.
    Even though the contents of an NVARCHAR2 buffer may be AL16UTF16 or UTF8 (depending on the national character set of the database), the contents of the file are always read and written in UTF8. UTL_FILE converts between UTF8 and AL16UTF16 as necessary.
    I'm not sure what the last statement means (no database at hand to try it; besides, the usual procedure is not to do it row by row but to obtain a CLOB which gets written out in 32767-byte chunks).
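    For what it's worth, a minimal sketch of the NCHAR variant (the directory object name is a placeholder, and per the documentation quoted above the file contents will still be written in UTF8):
    declare
      f utl_file.file_type;
    begin
      f := utl_file.fopen_nchar('EXPORT_DIR', 'unicode_out.txt', 'w', 32767);
      utl_file.put_line_nchar(f, n'some unicode text');
      utl_file.fclose(f);
    end;
    /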
    Edited by: Etbin on 26.2.2012 9:13

  • Schema Import failed :IMP-00008: unrecognized statement in the export file:

    Hi ,
    I was running a database schema import from a database DMP file.
    I encountered the following error in the log file that was getting generated :
    IMP-00008: unrecognized statement in the export file:
    All the tables got copied except for one. Now what I have thought of is to import that one table again after taking an export from the prod schema.
    But what about the functions, triggers and procedures? How do I bring all of them into the schema?
    Is re-importing the whole schema the only option?
    Thanks
    Kk

    Hello,
    In this order:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/exp_imp.htm#i1023988
    Table Objects: Order of Import
    Table objects are imported as they are read from the export file. The export file contains objects in the following order:
    1. Type definitions
    2. Table definitions
    3. Table data
    4. Table indexes
    5. Integrity constraints, views, procedures, and triggers
    6. Bitmap, function-based, and domain indexes
    The order of import is as follows: new tables are created, data is imported and indexes are built, triggers are imported, integrity constraints are enabled on the new tables, and any bitmap, function-based, and/or domain indexes are built. This sequence prevents data from being rejected due to the order in which tables are imported. This sequence also prevents redundant triggers from firing twice on the same data (once when it is originally inserted and again during the import).
    For example, if the emp table has a referential integrity constraint on the dept table and the emp table is imported first, all emp rows that reference departments that have not yet been imported into dept would be rejected if the constraints were enabled.
    When data is imported into existing tables, however, the order of import can still produce referential integrity failures. In the situation just given, if the emp table already existed and referential integrity constraints were in force, many rows could be rejected.
    A similar situation occurs when a referential integrity constraint on a table references itself. For example, if scott's manager in the emp table is drake, and drake's row has not yet been loaded, scott's row will fail, even though it would be valid at the end of the import.
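    One way to handle your case with the original imp, sketched with placeholder names: the TABLES parameter pulls back just the missing table, and a second pass with ROWS=N brings in the code objects (procedures, functions, triggers) without reloading any data:
    imp system/<password> file=expdat.dmp log=one_table.log fromuser=PROD_SCHEMA touser=PROD_SCHEMA tables=MISSING_TABLE
    imp system/<password> file=expdat.dmp log=code_only.log fromuser=PROD_SCHEMA touser=PROD_SCHEMA rows=n ignore=y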

  • Oracle export error while using exp.

    Hi folks,
    Any idea why I am getting this error when I try to create an export file?
    Error:
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
    With the Partitioning, Oracle Label Security, OLAP and Data Mining Scoring Engine options
    Segmentation Fault (core dumped)
    Steps:
    I was trying to get an export file from the database test/test_pass@env1 with the export command below.
    Encountered the above error.
    exp test/test_pass@env1 FILE=another_test_dmp.dmp OWNER=another_test TRIGGERS=n GRANTS=y ROWS=y COMPRESS=y
    Log file created with 0 bytes.
    A core file was created (huge in volume).
    another_test_dmp.dmp file created with 0 bytes.
    I'd appreciate your help with this.
    Thank you.
    Edited by: TechMahi.com on May 6, 2010 5:28 AM

    Hi,
    You should look at your core dump to see more details of the error. If you want, you could post it here to see if we can help you.
    You can find it by looking at the parameter CORE_DUMP_DEST.
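    For example, from SQL*Plus you can find that directory with either of these:
    show parameter core_dump_dest
    select value from v$parameter where name = 'core_dump_dest';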
    Regards,
    Mario Alcaide
    http://marioalcaide.wordpress.com

  • Recovery from bad export file - IMP0009 error

    Hi all,
    I have a large export that is split into 2 GB chunks using the FILESIZE option at export time... the command string is something like this:
    exp userid=xxx/xxx filesize=2048M file=bu${DATE}_01.dmp,bu${DATE}_02.dmp,bu${DATE}_03.dmp full=y log=bu${DATE}.log
    The export log showed no errors, but on import I'm getting IMP-00009: abnormal end of export file.
    Typically, there's a rather large and important table that's split between the first and second files which I really need to get at. If I import at the moment, it rolls back the first 1.1 million odd rows before it fails, and it won't load the second file at all; it says something about the sequence of the file being incorrect.
    OS is AIX 4.3.3
    Oracle is 8.1.7.0.0
    Any ideas how I can recover the table that is split across files? I've tried catting the files together to form one large one (and importing on an OS without file system size limits)... no dice, although perhaps I can alter it to fool it into thinking it's one file?
    Thanks, in desperation!
    Adam

    Username:
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Export file created by EXPORT:V08.01.07 via conventional path
    import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    export client uses US7ASCII character set (possible charset conversion)
    export server uses US7ASCII NCHAR character set (possible ncharset conversion)
    IMP-00046: using FILESIZE value from export file of 2147483648
    . importing C45966's objects into C45966
    "CREATE TABLE "CUST_LEDGER_BALANCE" ("HOUSE" CHAR(6) NOT NULL ENABLE, "CUST""
    " CHAR(2) NOT NULL ENABLE, "SERVICE_GROUP" NUMBER(2, 0) NOT NULL ENABLE, "LE"
    "DGER_SEQ" NUMBER(10, 0) NOT NULL ENABLE, "LDGRDATE" DATE NOT NULL ENABLE, ""
    "STMT_DATE" DATE NOT NULL ENABLE, "LDTYPE" NUMBER(2, 0) NOT NULL ENABLE, "AM"
    "OUNT" NUMBER(12, 0) NOT NULL ENABLE, "DECIMAL_CNT" NUMBER(1, 0) NOT NULL EN"
    "ABLE, "SUMCODE" CHAR(3) NOT NULL ENABLE, "LEDGER_STATUS" NUMBER(1, 0) NOT N"
    "ULL ENABLE, "PROG" NUMBER(3, 0) NOT NULL ENABLE, "PRINT_FLAG" NUMBER(1, 0) "
    "NOT NULL ENABLE, "CYCLE_DATE" DATE NOT NULL ENABLE, "DUE_DATE" DATE NOT NUL"
    "L ENABLE, "MULTIMONTH_CNT" NUMBER(2, 0), "MULTIMONTH_AMT" NUMBER(12, 0)) P"
    "CTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 LOGGING STORAGE(INITIAL 239796"
    "224 NEXT 4194304 MINEXTENTS 1 MAXEXTENTS 240 PCTINCREASE 0 FREELISTS 1 FREE"
    "LIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "TBL_CUST_LED_C45966""
    . . skipping table "CUST_LEDGER_BALANCE"
    IMP-00009: abnormal end of export file
    Import terminated successfully with warnings.
    They were transferred in binary mode. I get the same result if I specify a 2048M filesize in the import string. It only seems to affect the table that is broken across the two files; the others seem OK.
    Thanks!
    Adam
