Import issue - split and compressed dump

Hi,
I received a 15 GB export dump from the source site. It was split and compressed into four pieces:
1. xaa 4 GB
2. xab 4 GB
3. xac 4 GB
4. xad 3 GB
I have to import these dump files on a Unix server here. I found a document on importing a split and compressed dump, and I followed the steps below:
1. Copy all four files into one directory.
2. Run the commands:
rm -f import_pipe
mknod import_pipe p
chmod 666 import_pipe
(import_pipe is created in the current directory)
nohup cat xaa xab xac xad | uncompress - > import_pipe &
(A process number like 23901 is returned. Do we need to wait for the background process to complete before giving the import command?)
3. Then run:
imp userid=<connection string> file=import_pipe full=yes ignore=yes log=dumplog.txt
This fails with an IMP-00009 error (abnormal end of export file).
Please help me resolve this issue.
Thanks in advance.

Please post details of the OS and database versions of the source and target. You will have to contact the source site to determine how these files were created. It is quite possible that they were created using the FILESIZE parameter (http://docs.oracle.com/cd/E11882_01/server.112/e22490/original_export.htm#autoId25), in which case the import process can read from these multiple files without you having to further manipulate them.
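If that is the case, the import command would be along these lines (a minimal sketch, assuming the pieces were written with exp ... FILESIZE=4G; the file names are the ones from your post):
imp userid=<connection string> file=xaa,xab,xac,xad filesize=4G full=yes ignore=yes log=dumplog.txt
If instead the four files are a single compressed dump that was split after compression, the named-pipe approach is the right one, but two details matter: the whole cat/uncompress pipeline must run in the background, and imp must read from the pipe while that pipeline is still writing (a FIFO needs a writer and a reader at the same time, so do not wait for the background job to finish first). A sketch, assuming the dump was compressed with Unix compress (substitute gzip -dc if it was gzipped):
nohup sh -c 'cat xaa xab xac xad | uncompress -c > import_pipe' &
imp userid=<connection string> file=import_pipe full=yes ignore=yes log=dumplog.txt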
HTH
Srini

Similar Messages

  • Split and Compress files

    Is it possible to compress and split files via the Terminal, without third-party software?
    I know I can compress, but I can't find anywhere how to split the result into parts of a certain size.
    Furthermore, is there any app to help users with Terminal commands?

    There are plenty of compression tools on the Mac: gzip and bzip2, for example.
    Lately I like bzip2. This works a little easier on Linux, especially if
    the original file is split into a lot of smaller ones.
    $ gzip TestFile
    $ ll
    total 120944
    -rw-r--r--@ 1 andya  501  61921000 May 31 22:01 TestFile.gz
    $ split -b 10m TestFile.gz
    $ openssl dgst -sha256 TestFile.gz xa*
    SHA256(TestFile.gz)= cd041d79b4af1a54b524602363a18e67201c4acb03675dfebcae9109d8707367
    SHA256(xaa)= a3d803049aee16cbbfd679668164707eb9053488fb2ec5720f282a711ee8c451
    SHA256(xab)= 0a79e26c77cb47ec09f5cf68cfa45ea8f52f5157cad07c0ac187eaf0ae59ff79
    SHA256(xac)= 0f556e8e93dcb41cb3ab20454ab46c016d6596316d75316d810f45e7c2b3682e
    SHA256(xad)= abc3db83737346a8af6ac7ba9552c4b71cf45865f7b9faded54f1683b2afd077
    SHA256(xae)= 3afbad7b68a1d1c703865422e40cbd68ca512a652f985a0714258b7d936ad0f6
    SHA256(xaf)= 11879853fcfbe6df6fb718e1166d4dcae7e0e6ebd92be6c32c104c0a28f0439a
    Keep the hashes of the smaller files, in case you get an error on the far end of the transfer.
    That way you only need to resend the small file that's corrupt.
    Put the TestFile back together:
    $ cat xa* > ScratchFile
    $ openssl dgst -sha256 TestFile.gz ScratchFile
    SHA256(TestFile.gz)= cd041d79b4af1a54b524602363a18e67201c4acb03675dfebcae9109d8707367
    SHA256(ScratchFile)= cd041d79b4af1a54b524602363a18e67201c4acb03675dfebcae9109d8707367
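    To make the resend-only-what-failed workflow mechanical, you can ship a checksum manifest along with the chunks. A minimal sketch, assuming shasum is available (it ships with macOS) and the default split names used above:
    # sender: record a hash per chunk after splitting
    $ shasum -a 256 xa* > chunks.sha256
    # receiver: verify every chunk; any line reported FAILED names a chunk to re-request
    $ shasum -a 256 -c chunks.sha256
    # once all chunks pass, reassemble and decompress
    $ cat xa* > TestFile.gz
    $ gunzip TestFile.gz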

  • Import issues, Audio/video Unsynced

    Having major import issues, audio and video won't play in sync. Working with MP4 and MXF on the most recent version of Premiere CC. Everything plays fine in QuickTime/VLC but goes haywire in Premiere. I have been through tech support twice, updated drivers, cleared the cache, and Premiere still has trouble playing back video/audio in sync. Any idea what's going on? Our company recently switched to Adobe for the supposed "ease" with which it handles different codecs, and so far we have had nothing but issues. Sorry if this is already addressed in one of the forums; I am at my wits' end now, having lost multiple days trying to troubleshoot Premiere.

    Hi Adobe4,
    Having major import issues, audio and video won't play in sync. Working with MP4 and MXF on the most recent version of Premiere CC.
    Sorry for this issue. Tell me, where did these files originate? From a video camera? Do they have a constant frame rate or a variable frame rate?
    Thanks,
    Kevin

  • Compressed dump file while export on Windows!

    Hi All
    Could someone suggest how to create a compressed dump file while exporting in a Windows environment with Oracle 9.2.0.6? Please specify the exact syntax for how to proceed with this.
    Thanks
    Bala.

    I don't think the exp tool can compress the export file it is creating (the COMPRESS parameter has nothing to do with the export file; it affects the way the database objects are going to be created in the target database).
    If you run the export under Unix, there is a possibility to use Unix pipes to compress the export file during the export, using Unix commands (compress or gzip, for example). I don't know how to do something similar under Windows, and I have some doubts about this possibility.
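    For reference, the Unix pipe approach mentioned above looks roughly like this (a minimal sketch; gzip, the scott/tiger credentials, and the file names are placeholders, not a tested recipe for 9.2.0.6):
    mknod exp_pipe p
    gzip -c < exp_pipe > expdat.dmp.gz &
    exp userid=scott/tiger file=exp_pipe full=yes log=exp.log
    wait
    rm -f exp_pipe
    The background gzip compresses whatever exp writes into the pipe, so the full-size dump never has to exist on disk.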

  • Issue regarding Planning layout is not getting rendered and is dumping at : CL_RSDRC_TREX_QUERY_LAYER ~ _GET_PARTPROVS_WITH_TREX_PART in SAP TPM IP

    Gurus,
    I am facing an issue with SAP TPM IP (HANA).
    I have 3 InfoProviders:
    a Planning InfoCube, Planning DSO1, and Planning DSO2. I created a MultiProvider and added these 3 InfoProviders to it. I created an Aggregation Level on the MultiProvider, and a BEx input-ready query on the Aggregation Level.
    The issue is that the planning layout is not getting rendered and dumps at: CL_RSDRC_TREX_QUERY_LAYER ~ _GET_PARTPROVS_WITH_TREX_PART.
    I tried debugging it and found it is trying to read i_r_pro->n_ts_part. It is populated with only 3 values (i.e. the 2 DSOs and 1 cube), whereas <l_partprov>-partprov refers to the Aggregation Level, hence the READ statement isn't successful and it dumps.
    The CL_RSD_INFOPROV_CACHE->GET() method is trying to populate N_TS_PART. N_TS_PART uses the P_R_INFOPROV table, which seems to be already populated. So I debugged all the methods below it to find out how P_R_INFOPROV is filled, but couldn't find any clue.
    Can anyone help? It would really be appreciated.
    Thanks
    Ashok

    Hello Gregor,
    On launching the planning layout, it throws an error message:
    Planning is not possible RSCRM_IMP_CORE008.
    When I debugged, I got to a point wherein a particular real-time planning DSO is not getting retrieved under the MultiProvider in the class below.
    Class CL_RSCRM_IMP_ACTIONS_SERVICE, method GET_INFOPROV, is not returning the real-time InfoProvider name (i.e. the planning DSO) underlying the MultiProvider.
    I've also tried to run the report you mentioned for the MultiProvider, but the issue still exists.
    Let me know, if you have any pointers on this topic.
    Thanks,
    Jomy

  • IPhone 4 and iPhoto 9 import issue

    I just bought an iPhone 4 to replace my iPhone 3GS a few days ago. I finally had a chance to try its camera this weekend. Much to my surprise, all the photos I imported into iPhoto 9 on my MacBook Pro lost their EXIF and geotag data once they landed in my iPhoto 9 library. I had deleted the photos from my iPhone 4, so I couldn't go back to verify that the information was there. So tonight, I went out for a walk and shot a few more photos with my iPhone 4. Instead of importing them into iPhoto 9, I used Image Capture to import them from my iPhone 4 into my MBP's Pictures folder. Then I checked, and all the EXIF and geotag data survived the transfer. I then imported the photos into iPhoto 9 by dragging them onto its icon in the Dock, and was able to view the EXIF and geotag data from within iPhoto 9.
    So my question is, why can't iPhoto 9 import the EXIF and geotag data in photos directly from the iPhone 4? Is there any way I can fix this via some optional setting, or do I need to wait for Apple to resolve this issue?

    Stanley Horwitz wrote:
    <...>
    So my question is, why can't iPhoto 9 import the EXIF and geotag data in photos directly from the iPhone 4? Is there any way I can fix this via some optional setting, or do I need to wait for Apple to resolve this issue?
    I don't have any problem importing photos from my iPhone 4 with iPhoto '09 (v8.1.2); it just works, so I really don't know what might be going wrong. I import by selecting my iPhone 4 under Devices in iPhoto, then selecting the ones I want to import and clicking "Import Selected". It has never failed to also import the EXIF data, including the geolocation/camera info.

  • Is it possible to import and export dump between oracle 9 and oracle 11?

    Is it possible to import and export dump between oracle 9 and oracle 11?
    Source DB : Oracle 11g (Unix)
    Oracle client : Oracle 9(Windows)
    Export import utility : Oracle 9's
    Destination DB : oracle 9

    I am getting the following error, and the export utility is not responding after this.
    Export: Release 9.2.0.1.0 - Production on Thu Jul 15 14:37:01 2010
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    server uses AL32UTF8 character set (possible charset conversion)

  • Importing issues with Final Cut Pro X. Green flash and twitches while play back

    Hello, I'm having some importing issues with Final Cut Pro X and I can't find any solution whatsoever. Perhaps my import settings are wrong?
    Anyway, I've done a lot of recordings on EyeTV and an HD PVR, with some videos that are about 3 hours long. I exported them as H.264, which turns them into MPEG-4 files with AAC audio. Then I import them into Final Cut Pro X, and I notice it takes a long time to render the videos, and that there are some green screens and twitches while I play through the video.
    Here is a screenshot of the green screen that I'm talking about:
    http://img688.imageshack.us/img688/4537/screenshot20130409at338.png
    Could it be my two video graphics cards causing this kind of issue? Or is there any solution to fix this?
    In the past, I used iMovie 09 and it worked out perfectly. However, the new iMovie 11 is terrible because it automatically optimizes my 3 hour long videos, which makes the import process take about 144 hours. Is there any way to get iMovie 09 back, or am I stuck dealing with these Final Cut Pro X issues?
    More information:
    My Hauppauge HD PVR settings are set to custom: video constant bit rate, average bit rate 13.5 Mbps. The audio is AAC.
    My project settings for video properties are set to custom: 720p HD format, 1280 x 720 resolution, and 30p frame rate. The audio and render properties are set to custom: audio channels stereo, audio sample rate 48 kHz, and render format Apple ProRes 422.
    I export the video as a QuickTime movie. The export setting is H.264.
    The summary of the export is:
    File type: QuickTime movie
    Estimated size: 397 GB
    H.264, width and height: 1280 x 720. Frame rate: 30 fps


  • External Hard Drive and Import Issue

    I'm sorry... I know the answer is here, but I've read and read and just can't understand.
    I have an external hard drive and imported about 30 GB into iPhoto with the setting to NOT copy the files into the iPhoto library. So I understand why they don't show up in iPhoto when the external drive is not connected, but I don't understand why 10 GB of disk space were consumed when I executed the import?
    Any help for a newbie would be gratefully appreciated...

    A. Make sure the drive is formatted Mac OS Extended (Journaled). iPhoto needs to have the Library sitting on a disk formatted Mac OS Extended (Journaled). Users with the Library sitting on disks otherwise formatted regularly report issues including, but not limited to, importing, saving edits, and sharing the photos.
    B. When you run a Referenced Library iPhoto makes aliases that point to your own photos. To convert to a Managed Library you need to replace those with the actual files. AliasHerder is an application that will replace these aliases with the actual files, and so preserve your Library and all your work in it. As always: back up first.
    C. To move the Library:
    1. Quit iPhoto
    2. Copy the iPhoto Library from your Pictures Folder to the External Disk.
    3. Hold down the option (or alt) key while launching iPhoto. From the resulting menu select 'Choose Library' and navigate to the new location. From that point on this will be the default location of your library.
    4. Test the library and when you're sure all is well, trash the one on your internal HD to free up space.

  • New iPhoto 9.5 (902.7) much slower and import issues

    I have a 2007 iMac, and ever since I upgraded to Mavericks/iPhoto there have been issues.
    While it seems that a number of people have noticed 'slowness', and I, like other people, had lots of crashing with items like GarageBand (fixed with the 10.0.1 update), I continue to have problems with iPhoto and subsequently iTunes.
    When I attach my iPhone with the USB cable, it properly starts iPhoto and the iPhone shows up as a device, but then it just sits there with the beach ball, trying to load the pictures.
    I've deleted everything on the iPhone and tried again, but I continually have problems.
    I hooked up my iPad 2 the same way. It only had one picture on it, which took forever to show up, but it did successfully load. However, I tried again with some screenshots that I took, and it hung as well.
    This seems to be connected to iTunes sync as well: even when I have no photos on my iPhone, so iPhoto isn't triggered, the photo sync either takes forever or, quite often, just hangs iTunes, which stops responding.
    iPhoto once complained about library problems, so I tried all the steps except completely recreating the database. Still the same issues (slow and hanging imports).
    This is also happening with my Olympus OM-D EM-1 camera and an SD card reader that I attach, so it seems to be a general issue rather than specific to my iPhone, etc.
    Any suggestions? Recreate the entire iPhoto database?

    As a test launch iPhoto with the Option key held down and create a new, test library.  Connect the iPad or iPhone and try to import some photos  to see if the same problem persists. Does it?
    OT

  • Unicode export:Table-splitting and package splitting

    Hi SAP experts,
    I know there are a lot of threads related to this topic, but I have some new questions, hence this new thread.
    We are in the process of doing a Unicode conversion in our landscape (a CRM 7.0 system based on NW 7.01), and we are running on AIX 6.1 and DB2 9.5. The database size is around 1.5 TB, and hence we want to optimize the export and import in order to reduce the downtime. As part of the process, we have tried table-splitting and parallel export-import to reduce the downtime.
    However, we have some doubts about whether the table-splitting has actually worked in our scenario, as the export ran for nearly 28 hours.
    The steps followed by us:
    1.) Doing the export preparation using SAPINST.
    2.) Doing the table-splitting preparation by creating a table input file with entries in the format <tablename>%<No. of splits> (see the sample input file after these steps). We also used the latest R3ta file and the dbdb6slib.o (belonging to version 7.20, even though our system is on 7.01) with SAPINST.
    3.) Starting the export using SAPINST.
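    For illustration, such an input file is just one line per table in that format. A hypothetical example (PRCD_CLUST%36 matches the split count discussed below; the other entries are arbitrary examples, not from this system):
    PRCD_CLUST%36
    CDCLS%20
    BALDAT%10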
    Some observations and questions:
    1.) After completion of the table-splitting preparation, .WHR files were generated for each of the tables in the DATA directory of the export location. However, how many .WHR files should be created, and on what basis are they created?
    2.) I will take the example of the table PRCD_CLUST (a cluster table) in our environment, which we split. 29 *.WHR files were created for this particular table. The number of splits given for this table was 36, and the table size is around 72 GB. We also noticed that the first 28 .WHR files for this table had lots of records, but the last (29th) .WHR file had only 1 record. The packages/splits for the first 28 splits were created quite fast, but the 29th one took a long time (several hours) to complete, and lots of packages (around 56) of 1 GB each were generated for this 29th split. Also, there was only one R3load running for this 29th split, generating packages one by one.
    3.) Our question here is whether there is any rule of thumb for deciding on the number of splits for a table. Also, during the export, is there anything that needs to be specified in the inputs on the table-splitting screen?
    4.) Also, what exactly is the difference between table-splitting and package-splitting? Are they both effective together?
    If you have any questions or need any clarifications and further inputs, please let me know.
    It would be great if we could get any insights on this whole procedure. We know a lot of things are taken care of by SAPINST itself in the background, but we just want to be certain that we have done the right thing and that this is the way it should work.
    Regards,
    Santosh Bhat

    Hi,
    First of all, please ignore my very first response... I accidentally posted a response to some other thread; sorry for that.
    Now coming to your questions...
    > 1.) Can package splitting and table-splitting be used together? If yes or no, what exactly is the procedure to be followed? As I observed, the packages also have entries for the tables that we decided to split. So does package splitting or table-splitting override the other, and can only one of the two be effective at a time?
    Package splitting and table splitting work together, because each serves a different purpose.
    My way of doing it is...
    When I do the package split, I choose packageLimit 1000 and also split out the tables (which I selected for table split) into separate packages (one package per table). I do this because it helps me track those tables.
    Once the above is done, I follow it up with R3ta and the wheresplitter for those tables.
    That is followed by the manual migration monitor to do the export/import. As mentioned in the previous reply, you need to ensure you sequence your packages properly: large tables are exported first, use sections in the package list file, etc.
    > 2.) If you are well versed with the table splitting procedure, could you describe the exact procedure, maybe in brief?
    Well, I would say run R3ta (it will create multiple SELECT queries) followed by the wheresplitter (which will just split each of the SELECTs into multiple WHR files).
    Best would be to go through some documentation on table splitting and let me know if you have a specific query. Don't miss the role of the hints file.
    > 3.) Also, I have mentioned the version of R3ta and the library file in my original post. Is this likely to be an issue? Also, is there a rule of thumb to decide on the number of splits for a table?
    The rule is to use the executable of the kernel version supported by your system version. I am not well versed with 7.01 and 7.2 support; to give you an example, I should not use a 700 R3ta on a 640 system, although it works.
    > 1.) After completion of the table-splitting preparation, .WHR files were generated for each of the tables in the DATA directory of the export location. However, how many .WHR files should be created, and on what basis are they created?
    If you ask for 10 splits, you will get 10 splits, or in some cases 11, the reason probably being the field used to split the table (the WHERE clause). But I am not 100% sure about it.
    > 2.) I will take the example of the table PRCD_CLUST (a cluster table) in our environment, which we split. 29 *.WHR files were created for this particular table. The number of splits given for this table was 36, and the table size is around 72 GB. We also noticed that the first 28 .WHR files for this table had lots of records, but the last (29th) .WHR file had only 1 record. The packages/splits for the first 28 splits were created quite fast, but the 29th one took a long time (several hours) to complete, and lots of packages (around 56) of 1 GB each were generated for this 29th split. Also, there was only one R3load running for this 29th split, generating packages one by one.
    Not sure why you got 29 splits when you asked for 36; one reason might be that the field (key) used for the split didn't have more than 28 unique values. I don't know how PRCD_CLUST is split; you need to check the hints file for the "key". As an example: suppose my table is split by company code and I have 10 company codes; then even if I ask for 20 splits, I will get only 10 splits (WHRs).
    Yes, the 29th file will always have fewer records. If you open the 29th WHR, you will see that it has the "greater than" clause. The first and the last WHR files have the "less than" and "greater than" clauses, a kind of safety which allows you to prepare the splits even before the downtime has started. These two WHRs ensure that no record gets missed, even though you might have prepared your WHR files a week before the actual migration.
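    To illustrate that point, the generated predicates look conceptually like this (a hypothetical example using a company-code key, as in the reply above; the real column and boundary values come from R3ta and the hints file):
    WHR 1:  WHERE "BUKRS" < '1000'
    WHR 2:  WHERE "BUKRS" >= '1000' AND "BUKRS" < '2000'
    ...
    WHR N:  WHERE "BUKRS" >= '9000'
    The open-ended first and last clauses are what guarantee that rows falling outside the prepared boundaries still land in some split.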
    > 3.) Our question here is whether there is any rule of thumb for deciding on the number of splits for a table. Also, during the export, is there anything that needs to be specified in the inputs on the table-splitting screen?
    I am not aware of any rule of thumb. For a first iteration you might choose something like 10 splits for 50 GB, 20 for 100 GB. If any of the tables overshoots the window, you can then try increasing or decreasing the number of splits for that table. A couple of times the total export/import time improved for me by reducing the splits of some tables (I suppose I was over-splitting those tables).
    Regards,
    Neel

  • Import Issues In oracle 8i

    Hi All,
    When I try to import a dump file into a non-prod box, I notice some errors in import.log: some indexes are being created before the objects they belong to. For example:
    Step 1: Index N1 is imported first on table A, before table A itself has been imported, and that index creation fails saying that the table or view does not exist. Please help me with this issue.
    Oracle Version: 8i
    Thanks Much,
    Napi.

    Please provide the source database version, the export utility version, the target database version, and the import utility version.
    All the versions should be complete, for example 8.1.7.4.
    Also, provide an extract of the logfile with your exact errors.
    Nicolas.
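    In the meantime, a general imp technique for this symptom (a sketch only, not confirmed as the fix for this case; system/manager and expdat.dmp are placeholders) is to import the data without indexes, then have imp write the index DDL to a script to run afterwards:
    imp userid=system/manager file=expdat.dmp full=yes ignore=yes indexes=n log=imp_data.log
    imp userid=system/manager file=expdat.dmp full=yes indexfile=create_indexes.sql log=imp_idx.log
    With INDEXFILE, imp imports nothing; it only writes the CREATE statements to create_indexes.sql, which you can review and run in SQL*Plus once the data is in.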

  • Imported file split 20-30 times / need to multi-select de-flicker

    Hi all - first-time poster and all-round noob.
    First, my setup: I'm running Win7 Home Premium (64-bit) with PRE8 on a custom-built i5/750 + ATI 5850 + 4 GB RAM. I'm using an old (3+ years) Sony DVD digicam, a DC-706 if I remember correctly. My problem is twofold, but I'll start with the most immediate part. When I get the media off the DVDs, it appears as 2-3 VOB files (nothing unusual about that). However, when I drag them from the media window onto the Sceneline, these VOB files are automatically split. In some cases one file can suddenly be split 20-30 times. This is annoying, and I was wondering (a) whether I can stop the splitting, and (b) whether I can merge these clips. I can't locate anything online to help.
    Now the second part of the problem... when I do videos, I always have to select de-flicker in the video options. If I don't, there's an awful jitter in the finished product. I thought this was a field-reversal problem, but switching field dominance doesn't seem to stop it. When I use de-flicker, it goes away. This was the same in Premiere Elements 5. Not sure if it's the cam or what... or is it my TV? Anyhow, it's annoying having to go into clip > video > field options > flicker removal each time... especially when the software automatically splits into 20-30 clips.
    Is there any way to set flicker removal at a global level rather than on a clip-by-clip basis? Also, does anyone have a better explanation of why changing the field order doesn't do anything but flicker removal does? [Note: the jitter isn't just on horizontal lines, but looks exactly like an interlacing problem, which most probably just gets smoothed out by flicker removal.]
    Thanks in advance.

    I've spent a little bit of time trying to understand this... and the following examples might help explain what's (apparently) going on. The experimentation was done by (a) getting a VOB file from the DVD camcorder, and (b) setting up a project with either lower field first (the default; cases A and B) or upper field first (cases C and D). I then changed the combination of parameters between reversing the field or not and turning interlacing of consecutive frames on or off. The final parameter is whether the output is lower field first (cases A and C) or upper field first (cases B and D). I've written an X where jitter was evident and OK where no jitter was evident.
    TEST RESULTS
    (A) PROJECT PRESET - LOWER FIELD FIRST / OUTPUT AVI - LOWER FIELD FIRST
                              REVERSE FIELD
                              ON        OFF
    Interlace   ON            OK        X
                OFF           X         OK
    (B) PROJECT PRESET - LOWER FIELD FIRST / OUTPUT AVI - UPPER FIELD FIRST
                              REVERSE FIELD
                              ON        OFF
    Interlace   ON            OK        X
                OFF           X         OK
    (C) PROJECT PRESET - UPPER FIELD FIRST / OUTPUT AVI - LOWER FIELD FIRST
                              REVERSE FIELD
                              ON        OFF
    Interlace   ON            OK        X
                OFF           OK        X
    (D) PROJECT PRESET - UPPER FIELD FIRST / OUTPUT AVI - UPPER FIELD FIRST
                              REVERSE FIELD
                              ON        OFF
    Interlace   ON            OK        X
                OFF           OK        X
    The end result is that getting a no-jitter output is not trivial. It appears to depend not only on the input field dominance, but also on the output field dominance and on both the reverse-field and interlace-consecutive-frames settings.
    So from this very cursory examination, it would appear that with lower field first on both the import and the export, there should be no need to do anything. Going for upper field first, however, would still require you to reverse the field, irrespective of whether the output is upper field first.
    I hope this helps a bit in understanding some of the issues... but please be aware that this is just for AVI files created on my HDD. I will need to test it on the real thing (i.e. burnt to DVD)... stay tuned.

  • Compressing a 16 minute mini DV video for the web - suggestions on video and audio settings?

    Hello All...
    Back after a brief absence; things look a little bit different.
    I'm trying to take a 16 minute mini DV video and compress it for use on the web. I'm interested in any suggestions you may have on settings for the video and audio tracks. I've tried using Sorenson 3 (15 frames, key frames set to automatic, 320 x 240) for video and IMA 4:1 (mono) for audio. The resulting video looked great, but the file size came in at about 255 MB.
    Thanks!
    PowerMac G5 1.8 Dual   Mac OS X (10.4.3)  

    Thank you for the replies. Everyone was correct about the jack, interface, and phasing problems. I have been unplugging my MOTU audio interface and then using headphones at work. I have not changed any detailed audio output settings in Logic. When I read that the jack might be a problem, I tried switching headphones. This actually helped. I am using Dre Beats headphones, and they seem to be having issues with the Mac jack (the phasing/panning problems). I can use these headphones with other devices, but not the Mac. I have to use iPod earbuds, and the phasing seems fixed. Hopefully this information is helpful to someone else.
    If anyone knows how to correct this issue, please let me know; it's difficult to know what my final mixes are going to sound like, and I have had to keep bouncing everything into iTunes, syncing to my iPod, and then listening on my car radio.

  • Error in phase: ADD_TO_BUFFER in SPAM during import of SAPKB70017 and 18

    Hello
    What is the problem in this case? During the import of SAPKB70017 and SAPKB70018 I get the following error (there is no SLOG file in /usr/sap/trans/log):
    The import was stopped, since an error occurred during the phase
    ADD_TO_BUFFER, which the Support Package Manager is unable to resolve
    without your input.
    After you have corrected the cause of the error, continue with the
    import by choosing Support Package -> Import queue from the initial
    screen of the Support Package Manager.
    The following details help you to analyze the problem:
        -   Error in phase: ADD_TO_BUFFER
        -   Reason for error:
        -   Return code: 0211
        -   Error message:
    Notes on phase ADD_TO_BUFFER
    In this phase the queue is placed in the transport buffer of your
    system. This phase can terminate due to the following reasons:
    o   TP_INTERFACE_FAILURE: The tp interface could not be called. There is
         an RFC error.
    o   CANNOT_ADD_PATCH_TO_BUFFER: A Support Package could not be included
         in the transport buffer. The program tp is trying to open a file in
         the file system. For more information, refer to the SLOG log file in
         the directory /usr/sap/trans/log.
    A prerequisite of the Support Package Manager is that the Change and
    Transport System (CTS) is configured correctly. For more detailed
    information, read the online documentation available from Help -> SAP
    Library -> mySAP Technology Components -> SAP Web Application Server ->
    Change and Transport System .
    A list of the most important SAP Notes for Online Correction Support
    (OCS) is available in SAP Note 97620, which is updated regularly.

    Hi Jan,
    First, check whether your STMS is configured properly. Next, check the transport directory and the transport tool from within STMS (System -> Check in the menu options).
    If these two are OK, then you can check at the OS layer by manually adding the request to the buffer (tp addtobuffer <request> <SID>).
    Also, I am assuming you are doing this in client 000 with a non-standard user that has the proper authorizations.
    I believe this to be an issue of permissions at the OS layer.
    Regards
    Siva
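    For reference, the manual buffer add at the OS level is a one-liner along these lines (a sketch; the request SAPKB70017, the SID PRD, and the profile path are placeholders for your system's values):
    tp addtobuffer SAPKB70017 PRD pf=/usr/sap/trans/bin/TP_DOMAIN_PRD.PFL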
