EXPORT USING PIPE AND COMPRESS

Product: ORACLE SERVER
Date written: 2004-03-24
EXPORT USING PIPE AND COMPRESS
==============================
During an export, errors can occur because the dump file size exceeds 2 GB.
To work around this, you can compress the dump (using the Unix compress
command) while sending it to disk or tape, as follows.
$mknod /tmp/exp_pipe p          
=> Creates a named pipe called /tmp/exp_pipe; the p indicates a pipe.
(Remove it afterwards with rm /tmp/exp_pipe. If you skip the rm and keep
rerunning the compress and exp steps below against the same pipe, the
resulting file can cause problems at import time, so be careful.)
$compress < /tmp/exp_pipe > /dev/rmt/tx4 &     
=> Starts, as a background process, the job that compresses whatever passes
through the pipe and writes it to the tape device /dev/rmt/tx4.
To write to disk instead of tape, give a file name such as export.dmp.Z.
(The < and > are redirection operators; they must not be omitted and should
be written exactly as shown.)
$exp system/manager file=/tmp/exp_pipe full=y <other options>
=> Give the pipe created above as the dump file name.
A dump file taken this way can be imported as follows.
$mknod /tmp/imp_pipe p               
=> Create /tmp/imp_pipe for the import.
$uncompress < /dev/rmt/tx4 > /tmp/imp_pipe &     
=> Starts, as a background process, the job that reads from the tape device
/dev/rmt/tx4, uncompresses the data, and writes it to the pipe.
If the export was written to disk, give the compressed file name instead of
the tape device name.
$imp system/manager file=/tmp/imp_pipe <other options>
=> Give the pipe created above as the file name.
EXAMPLE)
Compress an export dump file that exceeds 2 GB and send it to tape as a
backup of the entire database, then import only the scott user's data.
mknod /tmp/exp_pipe p
compress < /tmp/exp_pipe > /dev/rmt/tx4 &
exp system/manager file=/tmp/exp_pipe full=y buffer=100000 volsize=0
Note: volsize=0 means that no limit is placed on the amount of data written
to the tape (no limit).
mknod /tmp/imp_pipe p
uncompress < /dev/rmt/tx4 > /tmp/imp_pipe &
imp system/manager file=/tmp/imp_pipe fromuser=scott ignore=y
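For convenience, the export half can be wrapped in a small shell script; a
minimal sketch based on the commands above (the credentials and tape device
are taken from the example, and the log file name is illustrative):
#!/bin/ksh
# Export the full database through a compressing pipe to tape.
PIPE=/tmp/exp_pipe
rm -f $PIPE                          # remove any stale pipe from an earlier run
mknod $PIPE p                        # create the named pipe
compress < $PIPE > /dev/rmt/tx4 &    # background compressor feeding the tape
exp system/manager full=y file=$PIPE buffer=100000 volsize=0 log=exp_full.log
wait                                 # let the compressor drain the pipe before exiting
rm -f $PIPE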

Similar Messages

  • What is the best gzip ratio when using a pipe to export

    Hello,
    I am trying to use Data Pump to export a core application schema (10g DB, 10.2.0.4).
    I use the 'estimate' parameter of expdp before the real export, and it reports that
    the dump file will be 235 GB.
    The only issue is that my largest file system is 34 GB, so it seems I have to use a
    pipe to zip while exporting...
    The action plan is as below:
    1. make a Unix pipe
    $ mknod /faj29/appl/oracle/exports/scott1 p
    2. direct the pipe to .dmp
    nohup gzip -5c < scott1 > /faj29/appl/oracle/exports/ODS.dmp.gz & <=== question is here
    3. create a par file
    userid=system/password@dbname
    SCHEMAS=xxx
    DUMPFILE=scott1
    LOGFILE=dpump_emrgods:exp_xxx.log
    COMPRESSION=NONE
    CONTENT=ALL
    ATTACH=xxx.EXPORT.job
    nohup expdp PARFILE=exp_xxx.par &
    My question is, based on the information I provided (34 GB file system vs. 235 GB dump file),
    which compression level should I use?
    gzip -5c or gzip -9c?
    -9c seems very time consuming, which is why I'd rather not use it, but I don't know the
    compression ratio it achieves, so does anyone know?
    Or do we have a better way to do this task?
    (It seems I cannot use a parallel export through a pipe; if yes, how?)
    thanks a lot
    Jerry

    Hello,
    I was wrong; the pipe won't work with expdp at all.
    It is unlike the older EXP tool, which you could tell to write to a named pipe so that
    the data written to the named pipe could be compressed, all in one step.
    So can I use the 'compression' parameter?
    It seems to compress only the metadata, not the dump file data...?
    (But I want to compress my expected 235 GB dump file to fit one 34 GB file system.)
    Any idea?
    regards,
    Jerry
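    A common workaround, used in several threads below, is to fall back to the legacy exp
    client, which (unlike expdp) can write to a named pipe. A sketch under that assumption,
    reusing the paths from the plan above (the owner value is a placeholder; gzip -1 trades
    some compression ratio for speed, and the actual ratio is data dependent, so test on a
    subset first):
    $ mknod /faj29/appl/oracle/exports/scott1 p
    $ nohup gzip -1c < /faj29/appl/oracle/exports/scott1 > /faj29/appl/oracle/exports/ODS.dmp.gz &
    $ exp system/password owner=xxx file=/faj29/appl/oracle/exports/scott1 log=exp_xxx.log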

  • Compression & Exporting

    Hi guys-
    Hope I'm not missing something really obvious here. Couldn't find much in the search. A while back I asked a question in the forum about how to fit my 2+ hr project on a DVD using iDVD to burn it. I got a few replies with tips on what to change the bit rate to and to use Dolby encoding. Even knowing the details, I never could figure out how to change them and how to export it. QT compression? What settings? Use Compressor? MPEG-2 or 3? I'm lost. Thanks in advance!
    -Jack

    The problem with using iDVD to do this is you have less control in building your compression settings.
    Now -- they TELL me iDVD can hold 2 hours now... but I'm still not sure as I've never tried it.
    Your BEST bet would be to use DVDSP to burn the compression you made in COMPRESSOR... go back to your original post and take a look at the settings they suggested to you.
    HOWEVER -- if you REALLY need to do it in iDVD --- Inside FCP... File - Export - Quicktime Movie... use current settings and don't worry about checking SELF-Contained movie... leave it unchecked.
    Drag THAT file onto iDVD. Check the preferences in iDVD and see if you can't change the compression settings to make it a 2 hour movie on a single disc.
    Good luck, -- you might want to post this in the iDVD forum.
    CaptM

  • Urgent: Error: Pipe 'Compress' was closed

    I am trying to extract data from SAP and put it in a Unix flat file. The amount of data selected will be huge and the Unix file will have to be compressed. To do this, I am using the filter 'compress' when I open the dataset for output. I transfer the records inside the SELECT/ENDSELECT statement, so each record is read from SAP and transferred to the flat file. After 1000 records, it exits the SELECT/ENDSELECT to commit work. After committing, it goes back to finish the selection.
    The code is breaking at COMMIT WORK.
    The error says: Pipes can only be opened using OPEN DATASET ... FILTER ... within a 'Unit of Work'.
    A 'rollout' closes the pipe automatically.
    My question is, do I need to commit work since I am selecting and transferring each record one at a time? If yes, how do I compress and commit after 1000 records without getting this error?
    FORM F_WRITE_AUFK.  " <- Form
      CLEAR: GV_ERROR, GV_RECCNT, GV_T_AMT, GV_T_QTY,
             GV_T_DOCS, GV_PREVIOUS_REC_ID, GV_PASSCNT.
    *-----Open in compress mode -------------------------------------*
    DATA: GC_COMMIT TYPE I VALUE 1000.
    OPEN DATASET LV_FILE FOR OUTPUT IN TEXT MODE ENCODING DEFAULT FILTER 'compress'.
      IF GV_ERROR = 'X'.
        EXIT.
      ENDIF.
      IF GV_ERROR = 'X'.
        EXIT.
      ELSE.
        DO.                                           " Loop to commit work
          SELECT    AUFNR AUART AUTYP ERNAM AENAM AEDAT KTEXT
                     FROM AUFK  INTO gs_AUFK
                     WHERE AUFNR IN S_AUFNR
                     AND AUFNR > GV_PREVIOUS_REC_ID
                     ORDER BY AUFNR.                 " Select into a work area
            GV_PREVIOUS_REC_ID = GS_AUFK-AUFNR.      
            ADD 1 TO GV_PASSCNT.
            PERFORM F_TRANSFER_DATASET USING P_F_AUFK GS_AUFK CHANGING GV_ERROR.
            IF GV_PASSCNT = GC_COMMIT.                " Exit when ready to commit work.
              EXIT.
            ENDIF.
          ENDSELECT.
          IF GV_PASSCNT = GC_COMMIT.
           COMMIT WORK.                              
            CLEAR GV_PASSCNT.
    *------Commit work and go back to the select/endselect to get more records
          ELSE.
            EXIT.
          ENDIF.
        ENDDO.
      ENDIF.
      IF GV_ERROR = 'X'.
        EXIT.
      ELSE.
        PERFORM F_CLOSEDATASET USING   P_F_AUFK CHANGING  GV_ERROR.
      ENDIF.
    ENDFORM.

    COMMIT is an SQL command and I don't think it will have any effect on your dataset. I think this should work:
    FORM f_write_aufk.
      CLEAR: gv_error, gv_reccnt, gv_t_amt, gv_t_qty,
      gv_t_docs, gv_previous_rec_id, gv_passcnt.
    *--------Open in compress mode ----------------------------------------*
      DATA: gc_commit TYPE i VALUE 1000.
      OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT FILTER
              'compress'.
      IF sy-subrc = 0.
        SELECT aufnr auart autyp ernam aenam aedat ktext
        FROM aufk INTO gs_aufk
        WHERE aufnr IN s_aufnr.
          PERFORM f_transfer_dataset
            USING p_f_aufk gs_aufk CHANGING gv_error.
        ENDSELECT.
        PERFORM f_closedataset USING p_f_aufk CHANGING gv_error.
      ENDIF.
    ENDFORM.                    "F_WRITE_AUFK
    Rob

  • High Quality HD footage Need to compress/export to fit onto a DVD!

    So, basically I have 58 min. worth of HQ, HD footage (with burnt in time code) that I need to fit onto a playable DVD (that will be burnt using DVD Studio Pro)
    Normally, I'd just export using QuickTime movie - however after doing so, my file size was 89.52 GB, which obviously is huge.
    What is the best way to go about compressing or exporting this so that it will fit onto a DVD or dual-layer DVD?
    I need to send this to the client, and the quality really doesn't have to be amazing, as this is just for them to view and choose which footage they want to keep, etc. The quality needs to be decent enough to view, essentially.
    I used FCP and DVD Studio Pro -- but really just basic stuff for the most part. I am not super familiar with Compressor and all that lingo, so the simplest way to go about this would be best, ha.
    The raw footage was shot using a SONY HDCAM 1080i 59.97 (16x9) - if this helps.
    Digitized the footage using ProRes in FCP... so yeah, I don't know what other info might be needed, sorry.

    #42 - Quick and dirty way to author a DVD
    Shane's Stock Answer #42 - David Roth Weiss' Secret Quick and Dirty Way to Author a DVD:
    The absolute simplest way to make a DVD using FCP and DVDSP is as follows:
    1. Export a QT movie, either a reference file or self contained using current settings.
    2. Open DVDSP, select the "graphical" tab and you will see two little monitors, one blue, one green.
    3. Select the left blue one and hit delete.
    4. Now, select the green one, right click on it and select the top option "first play".
    5. Now drag your QT from the browser and drop it on top of the green monitor.
    6. Now, for a DVD from an HD source, look to the right side and select the "general tab" in the track editor, and see the Display Mode, and select "16:9 pan-scan."
    7. Hit the little black and yellow burn icon at the top of the page and put a DVD in when prompted. DVDSP will encode and burn your new DVD.
    THATS ALL!!!
    NOW... if you want a GOOD LOOKING DVD, instead of taking your REF movie into DVD SP, take it into Compressor and choose the BEST QUALITY ENCODE (2-pass VBR) that matches your show timing. Then take THAT result into DVD SP and follow the rest of the steps. Except you can choose "16:9 LETTERBOX" instead of PAN & SCAN if you want to see the entire image.
    Shane

  • Scripting for GarageBand: import; compress; export

    Hi there -
    I'm not sure where the best place to post this is, so forgive me if this was not the best guess. I'm doing research on language and have a workflow that involves collecting thousands of recordings of individual words:
         I record someone for half an hour;
         splice that single audio file into hundreds or thousands of single-word chunks;
         export each splice as a separate audio file;
         (would like to) use automator / script to automatically rename the resultant files with an alphanumeric prefix and numerical postfix;
         import each of these files into GarageBand (or StudioOne) individually;
         compress the file to reduce dynamic range;
         export the finalized file into a designated folder.
    I have some very basic background in programming (primarily MatLab, but very basic dabbling in C), but I know nothing about scripting for Macs. In the long term, I'm interested in learning how to do this, but in the short term I'm looking at a lot of work that needs to get processed and just need a quick fix for automating, so that I don't have to (continue to) do this manually for each file!
    In case it is of interest to anyone who may respond: the ultimate goal of the process is to attain as constant a volume across all the files as possible: I record many different people, in different settings, over the course of months or years, and was told that compression will help standardize the playback volume. The reason for compressing each file individually is that even within a single recording session, the individual's voice may fluctuate notably as they move closer to and further from the mic; even a single loud noise during recording would act as the reference point for compression, even though it was an anomaly.
    Finally, to clarify the questions involved here:
    How can I automate the last 4 or so (or more?!) points from the above workflow? Or, any suggestions on where to find someone who could do this?
    Alternatively: suggestions on standardizing playback volume that do not involve the above process (NB: the standardization has to be as professional as possible, so I don't want to compromise the result for a simpler workflow).
    Where can I start to learn more about scripting on the Mac so I can tackle such issues on my own, in the future?
    Kind regards,
    Rax Adam

    Yes, the Keynote presentation slides/project is set for 1920x1080, the reason is I'm outputting via a high lumens projector that natively projects at 1920x1080p.
    Even though a Macbook Pro does not have a screen that size, the Macbook can send out even larger than 1920x1080, but I am limiting the output to the native screen res of the projector.

  • Export import advice

    Hi,
    We are facing a few space issues on the servers. Now, as part of the schema refresh, we have no option but to use import and export using the NETWORK LINK. Just want to confirm whether it supports BLOB and LONG with NETWORK LINK.
    Thanks

    Vishnu wrote:
    Hi,
    We are facing a few space issues on the servers. Now, as part of the schema refresh, we have no option but to use import and export using the NETWORK LINK. Just want to confirm whether it supports BLOB and LONG with NETWORK LINK.
    Thanks
    I assume you want to use the traditional export/import tool. If you are in a Unix/Linux environment, what about using a compressed export through Unix pipes?
    Example: http://www.tc.umn.edu/~hause011/code/exp-imp-db.ksh

  • IMPORT & EXPORT from/to compressed files directly...

    For newbies, a tip to import (export) directly to (from) compressed files using pipes. It's for Unix-based systems only, as I am not aware of any pipe-type functionality in Windows. The biggest advantage is that you can save lots of space, since an uncompressed dump is often 5 times the size of the compressed file or more (a compressed file of 20 GB may expand to around 100 GB). As a newbie I faced this problem, so I thought about writing a post.
    Let's talk about export first. The method is: create a pipe, write to the pipe (i.e., the file in the exp command is the pipe we created), and at the same time read the contents of the pipe, compress it (in the background) and redirect it to a file. Here is the script that achieves this:
    export ORACLE_SID=MYDB
    rm -f ?/myexport.pipe
    mkfifo ?/myexport.pipe
    cat ?/myexport.pipe |compress > ?/myexport.dmp.Z &
    sleep 5
    exp file=?/myexport.pipe full=Y log=myexport.log
    The same way for import: we create a pipe, zcat from the dmp.Z file, redirect it to the pipe and then read from the pipe:
    export ORACLE_SID=MYDB
    rm -f ?/myimport.pipe
    mkfifo ?/myimport.pipe
    zcat ?/myexport.dmp.Z > ?/myimport.pipe &
    sleep 5
    imp file=myimport.pipe full=Y show=Y log=?/myimport.log
    In case there is any issue with the script, do let me know :)
    Experts, please have a look... is it fine? (Actually I don't have Oracle installed on my laptop (though I have Fedora 6), so I couldn't test the scripts.)
    I posted the same on my blog too. just for bookmark ;)
    Sidhu
    http://amardeepsidhu.blogspot.com

    Actually, only the compression runs in the background; the rest proceeds as normal. Instead of giving a normal file, we just use the pipe as the file.
    nice article about named pipes
    Sidhu
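    One caveat with the scripts above: inside Oracle tools the ? shorthand expands to ORACLE_HOME, but the shell treats ? as a single-character wildcard, so shell lines like rm -f ?/myexport.pipe will not resolve as intended. A sketch of the export side with an explicit directory (the path is a placeholder):
    export ORACLE_SID=MYDB
    DIR=/u01/app/oracle/exports      # placeholder; any writable directory works
    rm -f $DIR/myexport.pipe
    mkfifo $DIR/myexport.pipe
    cat $DIR/myexport.pipe | compress > $DIR/myexport.dmp.Z &
    sleep 5
    exp file=$DIR/myexport.pipe full=Y log=$DIR/myexport.log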

  • Expdp using Unix pipe command for max compression

    I think the answer to this is NO, but I thought I'd ask the question anyway, in case some of the Oracle geniuses on this site have figured out a way around this.
    We have been using the existing exp/imp commands for years, because we can pipe the output through a pre-created pipe file that compresses the data as it is written. Since we have moved to 10g and now the v11 software, we are starting to want to utilize the parallelism afforded by expdp, but don't want to deal with the space required for the non-fully-compressed exports that occur with expdp.
    So the question is, has anyone figured out a way to use expdp with 'on the fly' compression like you could do with the old exp utility on Unix boxes?
    Any help would be appreciated....

    Unfortunately, no -- the parallel nature of the file operations prevents compression through a pipe. However, if you buy the Compression option in 11g, you can compress on the fly.
    See this white paper.
    http://www.oracle.com/technology/products/database/utilities/pdf/datapump11g2007_quickstart.pdf
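    For reference, a minimal sketch of that 11g on-the-fly compression (this assumes the Compression option mentioned above is licensed and that a directory object DUMP_DIR already exists; all names are placeholders):
    $ expdp system/password SCHEMAS=scott DIRECTORY=dump_dir \
        DUMPFILE=scott_%U.dmp PARALLEL=4 COMPRESSION=ALL LOGFILE=exp_scott.log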

  • Export on Red Hat 7.1 Oracle 8.1.7.2

    I've been trying to export my Oracle databases on our Linux machines. I have 2 Oracle DBs on a Linux machine with 2 CPUs, 2 GB RAM and 4 GB swap. I can run a full export at night when no one else is on the system, but when I try to run an export during the day it just stops, even though only 5 or so people are on the system at the time testing the programs. I've tried asking Oracle but they haven't responded to me in over a week. I run the following command to export, which, I would like to emphasize, does work at night.
    cat /archive/exp_prod.dmp | compress > /archive/exp_full.dmp.Z &
    $ORACLE_HOME/bin/exp sys/'password' buffer=5000000 direct=y log=/archive/exp_full.out file=/archive/exp_prod.dmp full=y
    As you can see, I pipe the export through a file to compress it. Once it stops it never starts again. I've let it go for over 24 hrs and nothing. It doesn't always stop at the same table. After it stops, the compress process is still taking 98% of CPU time, but no more information is being sent through the pipe file, and the export never gets past the table it froze on. Since it has worked before with no one else on the system, I believe it is a memory issue or a Linux parameter not set correctly. What can I look at to help solve this problem?

    I could not get back to this discussion group before; I apologize for that. As I mentioned in the previous message, my problem is a linking one.
    I have followed the steps mentioned in several discussions in this group, namely:
    I have installed the compatibility libraries:
    compat-glibc
    compat-egcs
    compat-libs
    Set LD_ASSUME_KERNEL=2.2.5
    Run the environment script /usr/i386-glibc21-linux/bin/i386-glibc21-linux-env.sh
    The runInstaller program runs OK, as mentioned previously, but it gives me errors during linking.
    After copying the necessary files and all that initial installation work, the linking phase runs smoothly; it even links correctly several Oracle tools like Oracle interMedia, Oracle Net8 client, and so on.
    When the Oracle 8i server links, the problems start.
    All errors are like this:
    -> Error invoking target install of makefile
    And the affected makefiles are numerous:
    /OraHome1/network/lib/ins_net_server.mk
    /OraHome1/network/lib/ins_oemagent.mk
    /OraHome1/ldap/lib/ins_ldap.mk
    /OraHome1/odg/lib/ins_odg.mk
    /OraHome1/ctx/lib/ins_ctx.mk
    Yes, the famous ctx is also giving me problems.
    Moreover: the mentioned target .c files referred to in the makefile scripts are all in their places!
    I do not know what is happening in this installation.
    Any clue?
    Best regards,
    Pedro
    Note:
    I also tried to check the ld and gcc linker and compiler versions to make sure they are the right ones, as mentioned in another thread... same thing!

  • Regarding Export

    Hi,
    When I am doing an export of tables from the database, the tables are several GB in size and there is a space problem on my system.
    Is there any way to zip the export file while creating the dump file?
    Kindly provide a solution for this.
    Thanks in advance.

    As stated above, you can use Data Pump for export/import. You can use a Unix pipe for export/import as well.
    exp.ksh
    ========
    rm -f /tmp/exppipe
    /usr/sbin/mknod /tmp/exppipe p
    compress < /tmp/exppipe > /hca200/appl/oracle/test/dump/exp_database.dmp.Z &
    exp file=/tmp/exppipe parfile=exp.par
    imp.ksh
    ========
    rm -f /tmp/imppipe
    /usr/sbin/mknod /tmp/imppipe p
    uncompress < /hca200/appl/oracle/test/dump/exp_database.dmp.Z > /tmp/imppipe &
    imp parfile=imp.par file=/tmp/imppipe
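    The parameter files referenced above are not shown; a minimal illustrative pair (the contents are placeholders, and any valid exp/imp parameters can go here):
    exp.par
    ========
    userid=system/manager
    full=y
    log=/tmp/exp_database.log
    imp.par
    ========
    userid=system/manager
    full=y
    ignore=y
    log=/tmp/imp_database.log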

  • HOW TO TAKE AN EXPORT SPLIT ACROSS MULTIPLE FILES

    Product: ORACLE SERVER
    Date written: 2002-04-19
    HOW TO TAKE AN EXPORT SPLIT ACROSS MULTIPLE FILES
    =================================================
    PURPOSE
    This note explains how to take an export across several export files, to work around the restriction that export cannot create a dump file larger than 2 GB.
    Explanation
    !!! IMPORTANT: this procedure must be run in the KORN SHELL (ksh). !!!
    If you are not using ksh, typing ksh at the Unix prompt switches to it.
    Using a Unix pipe and the split command, the export is written to disk in 1024m (about 1 GB) pieces as it runs.
    Every other parameter provided by export and import can still be used.
    Export command:
    echo|exp file=>(split -b 1024m - expdmp-) userid=scott/tiger tables=X
    Example> echo|exp file=>(split -b 1024m - expdmp-) userid=scott/tiger log=scott.log
    When the export is split this way, the files are named expdmp-aa, expdmp-ab, ..., expdmp-az, expdmp-ba, and so on.
    Import command:
    echo|imp file=<(cat expdmp-*) userid=scott/tiger tables=X
    Example> echo|imp file=<(cat expdmp-*) userid=scott/tiger ignore=y commit=y
    split and compress can also be used together.
    Export command:
    echo|exp file=>(compress|split -b 1024m - expdmp-) userid=scott/tiger tables=X
    Import command:
    echo|imp file=<(cat expdmp-*|uncompress) userid=scott/tiger tables=X
    *** The above was tested on Solaris. The shell commands may differ depending
    on the OS.
    Besides this, without the split command, a 3-step procedure using a Unix pipe
    and compress can also work around the export file size limit.
    Export command:
    1) Make the pipe
    mknod /tmp/exp_pipe p
    2) Compress in background
    compress < /tmp/exp_pipe > export.dmp.Z &
    -or-
    cat /tmp/exp_pipe | compress > output.Z &
    -or-
    cat /tmp/exp_pipe > output.file &
    3) Export to the pipe
    exp file=/tmp/exp_pipe userid=scott/tiger tables=X
    Import command:
    1) Make the pipe
    mknod /tmp/imp_pipe p
    2) uncompress in background
    uncompress < export.dmp.Z > /tmp/imp_pipe &
    -or-
    cat output_file > /tmp/imp_pipe &
    3) Import thru the pipe
    imp file=/tmp/imp_pipe userid=scott/tiger tables=X
    Reference Document
    <Note:1057099.6>

  • Datapump Export Wipro Question

    Hi,
    we have a schema that is 50 GB in size, but the dump file can be only 5 GB.
    Now the question is,
    how can we export this 50 GB schema using a 5 GB dump file?
    Is this possible?
    Please give me the solution for this.

    Create a compressed export on the fly. Depending on the type of data, you can probably export up to 10 gigabytes to a single file. This example uses gzip, which offers the best compression I know of, but you can also substitute zip, compress or whatever.
    # create a named pipe
    mknod exp.pipe p
    # read the pipe - output to zip file in the background
    gzip < exp.pipe > scott.exp.gz &
    # feed the pipe
    exp userid=scott/tiger file=exp.pipe ...
    Refer to the link:
    http://www.orafaq.com/wiki/Import_Export_FAQ#Can_one_export_to_multiple_files.3F.2F_Can_one_beat_the_Unix_2_Gig_limit.3F
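    The matching import side works the same way in reverse; a sketch (the file names follow the example above):
    # create a named pipe
    mknod imp.pipe p
    # read the gzip file - output to the pipe in the background
    gunzip < scott.exp.gz > imp.pipe &
    # feed imp from the pipe
    imp userid=scott/tiger file=imp.pipe ...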

  • Export large dump file in small drive

    Hello Everybody,
    I want to take an export of a 100 GB database (the dump file will come to approx. 50 to 60 GB) to a drive having 40 GB of space. Is it possible?
    Thanks in Advance
    Regards
    Hamid

    No version, no platform... Why? Too difficult? Strain on your fingers?
    The answer is platform and version dependent!
    Anyway: on 9i and before: Winblows: set the compression attribute of the directory or drive you plan to export to, and make sure this attribute is inherited.
    Unix: export to a pipe and have compress read from that pipe.
    Scripts are floating around on this forum and all over the Internet (also on Metalink); everyone should be able to find them with little effort.
    On 10g: expdp can generate compressed exports.
    Sybrand Bakker
    Senior Oracle DBA

  • How to export?

    Hello,
    I have a scenario,
    Server X: file system size = 4 GB
    Server Y: database on the other system = 10 GB
    I want to export my 10 GB database from server Y to server X, which does not seem to have adequate space.
    Someone told me this can be achieved using the 'mknod' command in Linux?
    Does anyone have an idea about that?
    I am searching the net for it, but thought of asking you guys.
    Thanks
    aps

    To create the compressed export file you do something similar to
    the following:
    % mknod /tmp/exp_pipe p # Make the pipe
    % compress < /tmp/exp_pipe > export.dmp.Z & # Background compress
    % exp file=/tmp/exp_pipe <other options> # Export to the pipe
    To read the compressed export file you do something similar to
    the following:
    % mknod /tmp/imp_pipe p # Make the pipe
    % uncompress < export.dmp.Z > /tmp/imp_pipe & # Background uncompress
    % imp file=/tmp/imp_pipe <other options> # Import from the pipe
    But I don't know how much it will compress a 10 GB DB; you have to test it.
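    Since the scenario involves two servers, one more variation may help: if server X is reachable from server Y over ssh, the compressed stream can be sent across directly, so no dump file ever lands on Y (the host name, credentials and target path below are placeholders, and this assumes the legacy exp client):
    % mknod /tmp/exp_pipe p # Make the pipe
    % compress < /tmp/exp_pipe | ssh serverX 'cat > /backup/exp_y.dmp.Z' & # Compress and stream to X
    % exp system/manager full=y file=/tmp/exp_pipe # Export to the pipe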
