Export/Import Photoshop preferences, plugins, etc.?

I need to wipe my computer clean and reinstall Windows.
Reinstalling Photoshop is not a big deal, but I have several preferences, plugins, saved layer styles, actions, etc. that I do not want to lose and that will not be retained.
Is there a way I can back all this up, so that when I reinstall Photoshop it's the same as it is now?
I also will want to do this with Dreamweaver, After Effects, and Illustrator.
Thanks!

Courtesy of d-solmedia.com...
1: Give Them A Home:
First, decide where you'll be backing up your settings, be it a folder on your hard drive, a CD/DVD, or an offsite storage service. Wherever it is, pick a consistent place that is kept separate from your main computer should you have a total crash.
2: Archive your actions:
Photoshop is not a place to store actions. When you download a new action, don't just load it in the Actions palette and expect it to stay there. PS retains loaded actions only as long as the preferences that keep them loaded survive. Also, deleting an action in PS does not actually delete the file; it just removes it from the Actions palette. As long as you have your action files safely stored, you're good. But if you load an action and then delete the file, expecting PS to retain it forever, your doom is sealed.
I like to make a "favorite actions" set that I store with my other actions. This way all my commonly used actions are in one set that I can load fast. I keep it loaded in PS all the time, but it's backed up should I have a crash. My other actions are nearby as well, but I load them only occasionally, since all my favorites are in one set. You can do this by making a new action set (folder) within PS, dragging your favorite actions into it from other sets, then saving the favorites set in a safe place.
3: Saving Brushes & More With the Preset Manager:
In PS go to Edit/Preset Manager. A window pops up with a dropdown to select your brushes, swatches, gradients, styles, contours, patterns, custom shapes, or tools. If you have custom settings for any of these, it's as simple as opening the manager, selecting what you want to manage, and saving the set to your backup location.
http://www.prophotoshow.net/blog/wp-content/uploads/2008/11/file-preset-manager.jpg
4: Backing Up Keyboard Shortcuts:
Keyboard shortcuts can save a ton of time, and once you set up custom ones you'll want them to be there. I, for example, set CMD+F to Flatten since I use it so often; if I get on a machine that does not have that setting, I quickly miss it. All you need to do is go up to the menu and choose Edit/Keyboard Shortcuts, click the save icon, and save the current shortcuts to a file. Now you can easily restore them on any machine by simply loading the saved file.
Power User Tricks:
There’s more you can backup than what’s on the surface of PS. Workspaces  for example control the location of your palette’s and menus. Most know that you can save workspace by simply going to Window/Worskpaces. Thing is, there’s not really a built in method of exporting those workspaces for backup. To do this we have to talk about the Power User method.
Outside of Photoshop, go to your Applications (Programs) folder and find Photoshop. We're using CS3 for this, but it should be similar in other versions. Inside the PS folder look for the Presets folder; inside that you'll find Workspaces and all kinds of other settings. You can just copy things from here to your backup location and be good to go.
Photoshop CS4 Notes: (UPDATE)
Most locations are the same in CS4. At the moment it looks like the only difference is the Workspaces location: rather than being inside the folder that contains Photoshop, it's found in the locations below. Location information for other settings in CS4 can be found here.
Mac… Users/[Username]/Library/Preferences/Adobe Photoshop CS4 Settings/WorkSpaces
Vista… Users/[Username]/AppData/Roaming/Adobe/Adobe Photoshop CS4/Adobe Photoshop CS4 Settings/Workspaces
XP… Documents and Settings/[Username]/Application Data/Adobe/Adobe Photoshop CS4/Adobe Photoshop CS4 Settings/Workspaces
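
Since the power-user method is ultimately just copying known folders, the whole backup can be scripted and re-run before every wipe. Here is a minimal sketch in Python; the source paths are examples for a CS4-style install on Windows and should be adapted to your version, platform, and username, and the backup location is a placeholder:

    import os
    import shutil

    # Example locations only -- adjust for your Photoshop version and platform.
    user = os.environ.get("USERNAME", "")
    sources = [
        r"C:\Program Files\Adobe\Adobe Photoshop CS4\Presets",
        r"C:\Users\%s\AppData\Roaming\Adobe\Adobe Photoshop CS4\Adobe Photoshop CS4 Settings" % user,
    ]
    backup_root = r"D:\PhotoshopBackup"  # your chosen backup location

    for src in sources:
        if not os.path.isdir(src):
            print("skipping (not found):", src)
            continue
        dest = os.path.join(backup_root, os.path.basename(src))
        # dirs_exist_ok (Python 3.8+) lets the script refresh an existing backup
        shutil.copytree(src, dest, dirs_exist_ok=True)
        print("backed up", src, "->", dest)

Restoring after the reinstall is the same copy in reverse.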

Similar Messages

  • I have 100 groups in Planning and want to assign roles like interactive, view user, and planner - how do I change the user roles in the export/import folder's .xml file, given that it generates an automatic ID?

    I have 100 groups in Planning, and for those 100 groups I want to set up roles like interactive, view user, and planner. How do I change the user roles in the .xml file in the export/import folder? When I edit it, it generates an automatic ID. How do I do that in the XML file?

    Thanks, John, for your reply.
    I tried what you said: I opened Shared Services and exported the Foundation project. The resulting export/import file set contains files like role.csv, user.csv, and group.csv. When I opened the user file, added some users, and tried to save it in Excel, it showed a message.
    I clicked Yes, saved the .csv file, and imported it from Shared Services, but I got an error like this.
    Am I doing it the right way, John? Please explain clearly.
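
    A side note on the Excel step: Excel frequently mangles these exports (encoding and quoting), so it can be safer to append the new users to user.csv programmatically. A minimal sketch in Python; the column names are hypothetical, so check them against the header of your actual export:

        import csv

        new_users = [
            {"user_name": "jdoe", "role": "planner"},       # hypothetical columns
            {"user_name": "asmith", "role": "view user"},
        ]

        # Read the existing header so appended rows line up with the real columns.
        with open("user.csv", newline="", encoding="utf-8") as f:
            fieldnames = csv.DictReader(f).fieldnames

        with open("user.csv", "a", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=fieldnames)
            for u in new_users:
                writer.writerow({k: u.get(k, "") for k in fieldnames})

    The fixed file can then be imported back through Shared Services as before.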

  • Keeping video proportions during still image export/import

    What I'm doing is exporting a still image to the Mac desktop, then dropping it into Photo-to-Movie where I create pans and zooms on it, then exporting the pan/zoom sequence as a Quicktime movie that I drop back into FCE and splice into the Timeline.
    Trouble is, the image in the pan/zoom sequence always comes back into FCE distorted (elongated), I guess because the exported still image it was made from reverts to square pixels during the export process from FCE.
    So, my question is, in what form should I export the still image from FCE so that it retains the proportions it has in the video, so that when I operate on it in Photo-to-Movie and then bring it back into FCE, it still has the proportions of a video (NTSC) image?
    In other words, I need to retain the video (NTSC) image's proportions throughout the process of export, then creating pan/zooms on it in Photo-to-Movie, and then reimporting it into FCE.
    To accomplish this, I presume I have to either export it from FCE in some special NTSC-video-compatible form, or else convert it in Photoshop to an NTSC-compatible image before I drop it into PTM, create the pan/zoom sequence, and then bring that sequence back into FCE.
    I'd be grateful if anyone could suggest a solution to this problem.
    Tom
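
    One way to attack the distortion Tom describes is to convert the exported still to square pixels before it ever reaches Photo-to-Movie. A sketch in Python with the Pillow library, assuming a 720x480 NTSC DV still (non-square 0.9091 pixels) resized to 640x480 so the frame is true 4:3 in square pixels; the filenames are placeholders:

        from PIL import Image  # pip install Pillow (9.1+ for Image.Resampling)

        src = Image.open("still_from_fce.png")            # hypothetical filename
        assert src.size == (720, 480), "expected an NTSC DV frame"
        # 640x480 is the square-pixel equivalent of a 4:3 NTSC DV frame.
        square = src.resize((640, 480), Image.Resampling.LANCZOS)
        square.save("still_square_pixels.png")

    After Photo-to-Movie renders its pans and zooms on the square-pixel image, FCE can conform the resulting movie back to NTSC without the stretch.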

    Hello Tom Baker 1
    My friend, let me tell you something - I TOTALLY share your frustration and disgust with the poor results of keyframing (within FCE).
    I am not a software engineer, but my perseverance to try and try again, is far above average.
    At the risk of getting on a soapbox (and I can attest to my excellent equipment capability), please believe me: like yourself, I've paid my dues trying to get keyframing to work suitably with stills in FCE.
    I think I can safely say, beyond all doubt, that if you want satisfactory results with pans and zooms (without being a math major), some options for reliability and smoothness are 'Photo to Movie' (just as you're doing) and Fotomagico, http://boinx.com/fotomagico/overview/ (although I've only heard it's pretty good - haven't tried it yet).
    OR
    A recent discovery of mine: Lyric Media Pan Zoom Pro http://www.lyric.com/fcp-plugins/panzoompro/pzp.htm - The really nice thing about this is that it's an FC or FCE plug-in and utilizes FC's keyframe engine. Why is it different from keyframing by itself in FCE? Because, so far as I can see, it sort of fixes the mickey-mousery, herky-jerky nightmare of keyframing within FCE. To me it sure looks like it can take whatever pixel size you throw at it (without a math major having to apply cautionary resizing to every darn still); it just does the job. Yes, it seems to do the resizing for you, and consequently produces smooth, reliable motion on stills.
    Of course the advantage of this is that you're finally using an application (within FCE) to create your pans and zooms right there. No exporting/importing of QT files as with Photo to Movie or iMovie. And the parameters of control are more sophisticated than Photo to Movie's.
    Again, I'm still working with it, and I still need to master it, but it sure beats keyframe.
    One side note:
    I really love the crispness of an iMovie pan or zoom, but we all know by now that its downfall is the dreaded 'JAGGIES'. As confirmed here on the forum, what seems to make Photo to Movie work so well is that, by its very nature of design, it automatically smooths out, and probably reduces or resizes, images so that they will NOT produce unwanted glistening (aliasing). And as once said here, your eye will accept this softening of an image far better than the jaggies or the herky-jerky.
    With all that said, I hope I didn't rant on too much - as I mentioned, there's nothing like 'experience'.
    Peace
    Mike
    (it would be nice to see someone comment on Lyric Media Pan Zoom Pro)
    PS- an excerpt from the documentary I'm working on:
    http://www.youtube.com/watch?v=jmB0_qiONQs

  • GUI Export/Import Dump feature?

    Will SQL Developer ever have a GUI export/import wizard? Right now we use the exp/imp DOS commands to achieve this. It would be nice to have a GUI guide you through the process in SQL Developer.
    Thanks
    Alex

    Sorry I could not respond earlier. This week has been absolutely CRAZY for me.
    What I'd like to see (in addition to what Tony just mentioned above) is a wizard-style interface that lets me export/import an entire database, schema, table, etc. I should be able to select the from and to character sets. I should have all the options available in the current command-line exp/imp utilities (but in a good GUI). I'm pasting all the command-line options currently available with exp/imp below. Some of these options could be permanent settings specified somewhere in the preferences; the things that change with each export or import, like userid, can be options on the wizard.
    In addition, if you add a SQL*Loader control-file interface as well, that would make it a great tool indeed!
    exp
    Keyword Description (Default)
    USERID username/password
    FULL export entire file (N)
    BUFFER size of data buffer
    OWNER list of owner usernames
    FILE output files (EXPDAT.DMP)
    TABLES list of table names
    COMPRESS import into one extent (Y)
    RECORDLENGTH length of IO record
    GRANTS export grants (Y)
    INCTYPE incremental export type
    INDEXES export indexes (Y)
    RECORD track incr. export (Y)
    DIRECT direct path (N)
    TRIGGERS export triggers (Y)
    LOG log file of screen output
    STATISTICS analyze objects (ESTIMATE)
    ROWS export data rows (Y)
    PARFILE parameter filename
    CONSISTENT cross-table consistency(N)
    CONSTRAINTS export constraints (Y)
    OBJECT_CONSISTENT transaction set to read only during object export (N)
    FEEDBACK display progress every x rows (0)
    FILESIZE maximum size of each dump file
    FLASHBACK_SCN SCN used to set session snapshot back to
    FLASHBACK_TIME time used to get the SCN closest to the specified time
    QUERY select clause used to export a subset of a table
    RESUMABLE suspend when a space related error is encountered(N)
    RESUMABLE_NAME text string used to identify resumable statement
    RESUMABLE_TIMEOUT wait time for RESUMABLE
    TTS_FULL_CHECK perform full or partial dependency check for TTS
    TABLESPACES list of tablespaces to export
    TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
    TEMPLATE template name which invokes iAS mode export
    imp
    Keyword Description (Default)
    USERID username/password
    BUFFER size of data buffer
    FILE input files (EXPDAT.DMP)
    SHOW just list file contents (N)
    IGNORE ignore create errors (N)
    GRANTS import grants (Y)
    INDEXES import indexes (Y)
    ROWS import data rows (Y)
    LOG log file of screen output
    FULL import entire file (N)
    FROMUSER list of owner usernames
    TOUSER list of usernames
    TABLES list of table names
    RECORDLENGTH length of IO record
    INCTYPE incremental import type
    COMMIT commit array insert (N)
    PARFILE parameter filename
    CONSTRAINTS import constraints (Y)
    DESTROY overwrite tablespace data file (N)
    INDEXFILE write table/index info to specified file
    SKIP_UNUSABLE_INDEXES skip maintenance of unusable indexes (N)
    FEEDBACK display progress every x rows(0)
    TOID_NOVALIDATE skip validation of specified type ids
    FILESIZE maximum size of each dump file
    STATISTICS import precomputed statistics (always)
    RESUMABLE suspend when a space related error is encountered(N)
    RESUMABLE_NAME text string used to identify resumable statement
    RESUMABLE_TIMEOUT wait time for RESUMABLE
    COMPILE compile procedures, packages, and functions (Y)
    STREAMS_CONFIGURATION import streams general metadata (Y)
    STREAMS_INSTANTIATION import streams instantiation metadata (N)
    The following keywords only apply to transportable tablespaces
    TRANSPORT_TABLESPACE import transportable tablespace metadata (N)
    TABLESPACES tablespaces to be transported into database
    DATAFILES datafiles to be transported into database
    TTS_OWNERS users that own data in the transportable tablespace set
    Thanks for the quick response,
    Regards
    Alex

  • Problem with EXPORT IMPORT PROCESS in ApEx 3.1

    Hi all:
    I'm having a problem with the EXPORT IMPORT PROCESS in ApEx 3.1
    When I export an application and then try to import it again, I get this error message:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful. ORA-06550: line 16, column 28: PLS-00103: Encountered the symbol "牃慥整㈰㈯⼴〲㐰〠㨷㐵㈺′䵐" when expecting one of the following: ( - + case mod new not null <an identifier> <a double-quoted delimited-identifier> <a bind variable> avg count current exists max min prior sql stddev sum variance execute forall merge time timestamp in
    As a workaround, I checked the exported file and found this:
    wwv_flow_api.create_flow
    p_documentation_banner=> '牃慥整⠤㈰㈯⼴〲㠰〠㨷㠵㈺′äµ
    When I replace it with this:
    p_documentation_banner=> ' ',
    I can import the application without the error.
    Does somebody know why I have to do this?
    Thank you all.
    Nicolas.
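
    Nicolas's manual fix lends itself to automation: scan the exported file and blank the corrupted p_documentation_banner value before importing. A minimal sketch in Python, assuming the value sits in a single quoted literal as in the excerpt above; the filenames are placeholders:

        import re

        with open("f100.sql", encoding="utf-8", errors="replace") as f:
            text = f.read()

        # Swap the (corrupted) banner literal for a harmless blank, as in the
        # manual workaround above. Assumes no escaped quotes inside the value.
        fixed = re.sub(r"p_documentation_banner\s*=>\s*'[^']*'",
                       "p_documentation_banner=> ' '", text)

        with open("f100_fixed.sql", "w", encoding="utf-8") as f:
            f.write(fixed)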

    Hi,
    This issue seems to have been around for a while:
    Re: Error importing file
    I've had similar issues and made manual changes to the file to get it to install correctly. In my case, I got:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful. ORA-02047: cannot join the distributed transaction in progress
    begin execute immediate 'alter session set nls_numeric_characters='''||wwv_flow_api.g_nls_numeric_chars||'''';end;
    There are several suggestions, if you follow that thread, about character sets or about reviewing some of the line breaks within the PL/SQL code in your processes, etc. Not sure what would work for you.

  • Photoshop CS4 plugins exist multiple times

    Hello Everybody
    I'm running an iMac 27-inch (late 2009) with the newest operating system (OS X 10.10.3) and Adobe Creative Suite 4 (version 1102).
    For a few days now, my Photoshop (and only PS; all the others like ID, BR and AI work fine) has taken about 10 minutes to load. It always hangs at the "scanning for plugins" point on the splash screen.
    I have now spent around 6 hours trying to get rid of this issue, but nothing has helped. I searched through hundreds of discussions and YouTube videos for solutions. I found out that the plugins should be in the /application/Adobe PS version XX/Plugins folder, and that is where they are located. I tried reinstalling the whole suite several times, with and without the cleanup script from Adobe. I tried applications like AppCleaner, and I tried to find the files on my own across the whole Mac OS (also searching with the parameter to show all hidden folders), etc. (By the way, I work as an IT guy, so I know a little about what I'm doing.) But I could not manage to get rid of those crazy plugins (I only use the standard plugins that come with the PS installation).
    So I see that it takes about 10 minutes to get through the "scanning for plugins" point on the splash screen. Also, when I choose to save a file and click on the dropdown for the file type, every option is now listed multiple times. Same in the main menu "Photoshop -> About Plugins".
    Does anybody have any idea what I could do to fix this? I'm kinda stuck.
    Below you'll find the screenshots of the error messages and so on (of course, I'm using it in German).
    Every hint is appreciated. Thanks in advance.
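
    Since every file type is listed multiple times, PS is almost certainly finding the same plug-ins in more than one place. A quick way to hunt for duplicates is to scan the likely plug-in locations for repeated bundle names; a sketch in Python, with the candidate paths as assumptions to adapt to your system:

        import os
        from collections import defaultdict

        # Candidate locations -- assumptions to adjust for your system.
        roots = [
            "/Applications/Adobe Photoshop CS4/Plug-ins",
            os.path.expanduser("~/Library/Application Support/Adobe"),
        ]

        seen = defaultdict(list)
        for root in roots:
            for dirpath, dirs, files in os.walk(root):
                # Mac plug-ins are .plugin bundles (directories), so check both.
                for name in dirs + files:
                    if name.lower().endswith(".plugin"):
                        seen[name.lower()].append(os.path.join(dirpath, name))

        for name, paths in sorted(seen.items()):
            if len(paths) > 1:
                print(name, "->", paths)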

    Crash logs.
    Working with your Operating System’s Tools | Mylenium's Error Code Database
    Mylenium

  • Frame 7.1 won't import Photoshop CS5.1 .jpg file

    I am (reluctantly) in the process of migrating from Windows XP +
    Photoshop CS V8.0 to Windows 7/64bit + Photoshop CS5.1.
    Framemaker 7.1 works well in both XP and W7/64, including generating
    PDFs. I can import .jpg files written by Photoshop CS. Etc.
    ... except...
    If I open a known-good (already imported) .jpg file using Photoshop
    CS5.1 and, without making any changes, save it, Framemaker can no
    longer import it, neither directly nor by reference:
       "The filter encountered an error and could not complete the
        translation".
    It appears the Photoshop CS5.1 writes a different .jpg file format
    than Photoshop CS (rhetorical: how can it be acceptable for CS5.1 to
    change the format of a basic .jpg file? but I digress...).
    Examining the original .jpg file vs. the CS5 .jpg file I find:
    - The new file is smaller (79KB vs. 82KB) - note both are small!
    - The contents (examined using a binary editor) are quite different.
    - Both versions display identically, in both CS and CS5.1 and in
       various viewers.
    I have tried various experiments trying to understand this and have
    had no luck. I can open the CS5 .jpg file in Photoshop CS and save it
    again but Framemaker will not open this version either, same error.
    This is not a RAM size issue (8GB), swap space issue, or anything else
    related to the computer hardware. All SW updates are current, etc.
    I have tried importing into a new, default template file, etc.
    So, I tried saving as .bmp and the first time I tried to import it
    Framemaker died immediately: no stack trace, no error messages, the
    Frame process was just gone. After restarting, the 2nd attempt I got a
    black rectangle of the correct size, but all black = not usable.
    Photoshop CS + Framemaker has worked well for many years, so I know
    this is a CS5 issue.
    Anyone have any ideas on how to make CS5.1 generate a "standard" .jpg
    file?
    Anyone have any ideas on how to import CS5.1 images using
    Framemaker 7.1?
    Please don't advocate upgrading Framemaker to V10... Adobe provides zero
    information on migration (from any version) and the last time we did
    this cost us many man-months of work to "fix" 14K+ pages in 50+
    manuals. We will stay with XP/CS and "eat" the new W7/CS5 system
    before we move to FM10.
    Thanks
    Previously (everything works)
      Windows XP/SP3
      Photoshop CS V8.0
      Framemaker 7.1P116, non-structured
    Today:
      Windows 7/64bit SP1
      Photoshop CS5.1
      Framemaker 7.1P116, non-structured

    AG: IIRC, the FM7.1 JPG filter only reads the older JFIF header format JPG files, while CS5 Photoshop and many other applications write EXIF headers in the JPG (yes, the standard evolved).
    We also had a problem with JPEG 5 (legacy) vs. JPEG 2000 (newer) encoding, not in FM, but in the rendered PDFs at our print shop. The Xerox engine needed an update to fix that.
    Using JPEGs in FM has various considerations, some already mentioned in this thread. But the biggest concern, in my view, is cascading compression damage. JPEG compression is always "lossy", and the artifacts at higher compression, or after successive compressions, are often visually distracting (and may be aggravated if sharpening is also applied). Even at NO compression, there are losses due to the transforms.
    Unless you are shooting RAW, your camera is compressing from frame buffer to JPEG. When you re-save from your photo editor, regardless of edits, it gets re-compressed if saved to JPEG. When you render to PDF, it may get both down-sampled and re-compressed again depending on PDF settings. If you post-process (optimize or reduce file size) in Acrobat, it can get down-sampled and re-compressed again.
    So I shoot JPEG with as little compression as possible (RAW has other considerations). I edit in Photoshop and save as EPS, which is a very inefficient format space-wise, but is UNcompressed*, and is a very efficient format both performance- and stability-wise in FM. I save the EPS at the resolution desired in the PDF (usually 200 dpi). Yes, it does get (and needs to be) re-compressed during Distill, but we've skipped one needless re-compression and avoided another down-sample. We don't further optimize in Acrobat afterwards because the images are already optimal.
    EPS also supports both RGB and CMYK models, whereas JPEG is strictly RGB (or, beware: indexed color). And EPS is generally unmolested by Mr.Bill's GDI, which is crucial for color-managed CMYK.
    * TIFF can also be uncompressed, and CMYK, but can be a performance issue in FM.
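
    If AG's JFIF-vs-EXIF diagnosis is right, it is easy to check which header a given JPG actually carries, and to re-save one in a form FM 7.1 might accept. A sketch in Python; the Pillow re-save relies on Pillow writing a JFIF APP0 header by default when no EXIF data is passed, and the filenames are placeholders:

        from PIL import Image  # pip install Pillow

        def jpeg_header_kind(path):
            """Peek at the first marker segment: APP0/JFIF vs APP1/Exif."""
            with open(path, "rb") as f:
                head = f.read(12)
            if head[2:4] == b"\xff\xe0" and head[6:10] == b"JFIF":
                return "JFIF"
            if head[2:4] == b"\xff\xe1" and head[6:10] == b"Exif":
                return "EXIF"
            return "other"

        print(jpeg_header_kind("from_cs5.jpg"))   # hypothetical filename

        # One re-save (and one more lossy compression step, per the caveats above):
        Image.open("from_cs5.jpg").save("for_framemaker.jpg", quality=95)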
    The Wiki page for JPEG, btw, is presently vandalized:
    "The name "JPEG" stands for Joint Photosaic Experts Group, the name of the committee that created the JPEG standard and also other standards. It was discovered by <likely name of vandal redacted> of the Philippines."

  • Export/Import Process in the UI for Variations Content Translation is Generating CMP Files with No XML

    We have a SharePoint 2010 publishing website that uses variations to deliver content to multiple languages. We are using a third-party translation company to translate publishing pages. The pages are exported using the export/import process in the UI described here: http://blogs.technet.com/b/stefan_gossner/archive/2011/12/02/sharepoint-variations-the-complete-guide-part-16-translation-support.aspx
    Certain sub-sites are extremely content-intensive. They may contain many items in the Pages library as well as lists and other sub-sites. 
    For some sub-sites (not all), the exported CMP file contains no XML files. There should be a Manifest.XML, Requirements.XML, ExportSettings.XML, etc., but there are none. After renaming the CMP file
    to CAB and extracting it, the only files it contains are DAT files.
    The only difference I can see between the sub-sites that generate CMP files with no XML files and those that don't is size. For example, there is one site of 114 MB that produces a CMP file with no XML files; small sites do not have this problem. If size were the problem, though, I would expect the process to generate an error instead of creating a single CMP file that contains only DAT files. However, I do not know exactly what the export/import process in the UI is doing.
    This leads to two questions:
    1. Does anyone know why some CMP files, when renamed to *.CAB and extracted, would not contain the necessary XML files?
    2. If exporting using the UI will not work, can I use PowerShell? I have tried Export-SPWeb, but the Manifest.XML it produces does not contain translatable content. I have not found any parameters for Export-SPWeb that cause the exported CMP to be in the same format as the one produced by the export/import process in the UI.
    As a next step, we could try developing custom code using the Publishing Service, but before doing this, I would like to understand why the Export/Import process in the UI generates a CMP that
    contains no XML files.
    If no one can answer this question, I would appreciate just some general help on understanding exactly what is happening with the Export/Import Process -- that is, the one that runs when you select
    the export or import option in the Site Manager drop down. Understanding what it is actually doing will help us troubleshoot why there are no XML files in certain export CMPs and assist with determining an alternate approach.
    Thanks in advance
    Kim Ryan, SharePoint Consultant kim.ryan@[no spam]pa-tech.com
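
    One quick diagnostic for the missing-manifest problem is to batch-check each exported CMP for XML entries. Python has no built-in CAB reader, so this sketch shells out to the Windows expand tool just to list each cabinet's contents (-D displays the file list without extracting); the export folder is a placeholder:

        import glob
        import subprocess

        for cmp_path in glob.glob(r"C:\exports\*.cmp"):   # hypothetical folder
            # expand -D lists a cabinet's contents without extracting it.
            out = subprocess.run(["expand", "-D", cmp_path],
                                 capture_output=True, text=True).stdout
            status = "ok" if ".xml" in out.lower() else "NO XML FILES"
            print(status, "-", cmp_path)

    If expand balks at the .cmp extension, copy the file to .cab first, as in the manual check above.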

    I wanted to bump this post to see about getting some more responses to your problem. I'm running into the same problem as well. We're running a SharePoint 2010 site and are looking at adding variations now. The two subsites with the most content take a while to generate the .cmp file (one to two minutes of the browser loading bar spinning, waiting on the file). Both files are generated with a lot of .dat files but no .xml files. I was thinking, like you, that it must be a size issue. Not sure though. Did you ever happen to find a solution to this problem?

  • Backing up using Export/Import & ITL, XML file formats

    Using Windows XP, I reset my default iTunes folder to c:\Itunes (too big for My Documents\...), then copied everything from c:\itunes to an external hard drive. (I didn't realize the ITL file doesn't move, and therefore I may not have a good backup of it.)
    Starting with a new computer, I installed iTunes, then did a 'File, Add Folder to Library (with copy option)' from the external hard drive. This restored all my songs, but I lost playlists, play counts, and my ratings.
    I then did an import from Library.xml; this restored the playlists, but still no play counts or ratings.
    My questions are:
    1) Is there a better way to back up from hard drive to hard drive? (Maybe all I really needed was the ITL file as well?)
    2) What is the format of the ITL file (I am a systems programmer and would appreciate a URL with details), and what (hopefully free) utility can edit it?
    3) Can I get a high-level view of the XML file (I know XML)?
    4) Where is there complete documentation on iTunes? (A detailed description of export/import would be appreciated.)
    AMD Sempron, Duron, Athlon, Celeron   Windows XP Pro

    Here's my old standard answer for this topic - hope you can find something useful from it ...
    iTunes Music Library.itl & iTunes Music Library.xml must stay where you found them.
    To move your music files, use iTunes to do the work. Don't manually drag them around yourself.
    • change the library location in iTunes Advanced Preferences tab, the new location does not need to exist; iTunes will create it. The new location should be empty of music files. This just sets the location for any new files brought into iTunes; it does not affect any existing files in the iTunes library.
    • Select iTunes Advanced menu, Consolidate Library command. This copies all iTunes music files to the new library location. Playlist & Ratings data is preserved!
    • After it completes (assuming no errors), quit iTunes and get rid of the original files from your previous location. The xml & itl files must remain in the old location, the library location setting does not control the location of these two critical files.
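
    On question 3, a high-level view of the XML file: iTunes Music Library.xml is an Apple property list, so it can be read with Python's standard plistlib. A small sketch that dumps each track's name, play count, and rating (the key names are the ones iTunes uses in that file):

        import plistlib

        with open("iTunes Music Library.xml", "rb") as f:
            library = plistlib.load(f)

        # "Tracks" maps track IDs to dicts; ratings are stored on a 0-100 scale.
        for track in library["Tracks"].values():
            print(track.get("Name", "?"),
                  "- plays:", track.get("Play Count", 0),
                  "- rating:", track.get("Rating", 0))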

  • Why the White Lines around imported Photoshop Images

    Working in CS6, I am importing native Photoshop images into an Illustrator gradient. Certain imported Photoshop images with transparent backgrounds produce white lines bordering the images. The white lines stand out against the AI gradient. The white lines are not visible on screen; they came out in the commercially printed product. The overall AI document, with PS images placed and linked, was saved to a PDF, and it was then printed from the PDF. The PDF, as viewed on screen, does not show the white lines either. Why and how do the white border lines (the box surrounding the image) appear?

    First I would recommend making sure there are indeed »no pixels« there in Photoshop – applying the Layer Style Stroke can help locate stray, semitransparent pixels.
    Other than that I am afraid your print service provider may not have provided optimal output.
    What print process was employed anyway?
    It looks like flattening effects, but those should simply not be visible in print.
    They happen when exporting PDF/X-1 or PDF/X-3 PDFs, which disallow transparency, so in creating the PDF the PSD files that have transparency are flattened against whatever background and often broken up into smaller parts.
    That way a whole mess of fragmentary images can be created, as one can see when one opens such a PDF in Illustrator.
    One can avoid that by creating PDF/X-4 files, which do allow transparency, but whether those can be processed by the printer is another question.
    But this issue might be more suitable for the Illustrator or PDF fora.

  • Migrate Database- Export/Import

    Hi,
    I need to migrate an Oracle database 9i from Sun Solaris to Linux. The final target database version would be 11g.
    Since this is a 9i database, I see that we only have the option of classic export/import. We have around 15 schemas.
    I have some queries related to it.
    1. If I perform an export with full=y rows=y, does it export SYS and SYSTEM schema objects also?
    2. Can we perform the export in Oracle 9i and use Data Pump import on the target 11g?
    3. What is the best approach: a) perform schema-by-schema exports, or b) perform a full database export with exp / file=xxx.dmp log=xxxx.log full=y?
    Since there is a database version difference, I don't want to touch SYS and SYSTEM schema objects.
    Appreciate your thoughts.
    Regards
    Cherrish Vaidiyan

    Hi,
    Let me try to answer some of these questions you queried for:
    1. If I perform an export with full=y rows=y, does it export SYS and SYSTEM schema objects also?
    Export won't export SYS objects. For example, there are tables in SYS, like obj$, that contain information for other metadata objects, like scott.emp, etc. These are not exported, because when scott.emp is exported, the relevant data from obj$ is essentially exported along with it. When the dumpfile is imported and scott.emp is recreated, the data in sys.obj$ is restored through the create table statement. As far as the SYSTEM schema is concerned, some objects are exported and some are not. There are tables in SYSTEM that contain information about queues, jobs, etc.; these would probably not make any sense on the target system, so those types of tables are excluded from the export job. Other objects make sense to export/import, so those are done. This is all figured out in the internals of export/import. There are other schemas that are not exported; some that I can think of are DMSYS, ORDSYS, etc. This would be for the same reason as SYS.
    2. Can we perform the export in Oracle 9i and use Data Pump import on the target 11g?
    No, the dumpfiles are formatted differently. If you use exp, then you must use imp. If you use expdp, then you must use impdp. You can do exp on 9i and imp on 11g with the dumpfile that was created on 9i.
    3. What is the best approach: a) perform schema-by-schema exports, or b) perform a full database export with exp / file=xxx.dmp log=xxxx.log full=y?
    This is a case-by-case decision. It depends on what you want. If you want the complete database moved, then I would personally think that full=y is what you want. If you just did schema exports, then you would never export the tablespaces, which means you would have to create the tablespaces on the target system before you ran imp. There are other objects that are not exported when a schema-level export is performed but are exported when a full one is performed. This information can be seen in the utilities guide; look at what is exported in user/schema mode vs. full/database mode.
    Since there is a database version difference, I don't want to touch SYS and SYSTEM schema objects.
    This is all handled for you by the internal workings of exp/imp.
    Dean
    Edited by: Dean Gagne on Jul 29, 2009 8:38 AM
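
    For completeness, here is the two-step flow Dean describes, scripted end to end in Python. This is a sketch only: the connect strings, passwords, and file names are placeholders, and it assumes the 9i exp client runs the export while the 11g imp client runs the import:

        import subprocess

        # Step 1: full export on the 9i side with the classic exp client.
        subprocess.run(["exp", "system/manager@nine_i", "FULL=y", "ROWS=y",
                        "FILE=full9i.dmp", "LOG=full9i_exp.log"], check=True)

        # Step 2: import the same dumpfile on 11g with imp (not impdp --
        # classic exp dumpfiles are not Data Pump compatible).
        subprocess.run(["imp", "system/manager@eleven_g", "FULL=y",
                        "FILE=full9i.dmp", "LOG=full9i_imp.log"], check=True)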

  • Export Import to Maintain Test and Production Environments

    We have developed an application using Locally Built Database Providers in Oracle Portal 9.0.2.6. It is installed to 2 schemas, has 5 different providers, over 100 Portal major components (Forms, Reports, and Calendars), and over 200 minor components (LOVs and Links). We have used export/import transport sets with some luck, but it is a struggle because the import procedures are not very robust. Many things (such as missing LOVs, corrupt components, preexisting versions, etc.) can cause an import to fail, and the cleanup necessary to finally achieve a successful import can be very time-consuming.
    Having a robust import mechanism is very important to our strategy for keeping installed (our own and clients') portal instances up-to-date with our latest release. Some of the enhancements that would make it much easier to develop and maintain Portal applications include:
    Within the Portal:
    1. Ability to copy an entire provider within the same portal (rather than one component at a time).
    2. Ability to change the schema to which a Provider is associated.
    3. When copying a component from one provider to another, the dependent items (i.e. LOVs and Links) should be copied to the new provider as well (rather than rebuilding each LOV in each provider and then editing each form to point to the new LOVs).
    Transport Sets:
    4. Should allow for changing provider names, provider schemas, and global component names, and for resetting unique IDs on import (to create a copy rather than overwrite).
    5. Should allow the option to ignore errors and import all components which pass pre-check (rather than failing all components if all items do not pass pre-check).
    How are other Portal Developers dealing with installing and then rolling out new sets of Locally built Database Providers from Development environments to Production? Are there any whitepapers on the best practices for replicating/installing a portal application to a new portal instance and then keeping it updated?
    Oracle, are any of my wish-list items above on the future enhancement lists? Or have others figured out workarounds?
    Thanks,
    Trenton

    There are a couple of references which can be found on Portalstudio.oracle.com that are of some use:
    1. A FAQ for Portal 9.0.2.6 Export/Import http://portalstudio.oracle.com/pls/ops/docs/FOLDER/COMMUNITY/OTN_CONTENT/MAINPAGE/DEPLOY_PERFORM/9026_EXPORT_IMPORT_FAQ_0308.HTM
    2. Migration Instructions by Larry Boussard (BRUSARDL)
    3. Migrating Oracle Portal from Dev Systems to Production Systems by Dheeraj Kataria.
    These are all useful documents for a successful first-time export/import. However, the limitations and lack of robustness I listed in my first post make the process so time-consuming and error-fraught as to not be a practical development strategy.

  • EXPORT / IMPORT  TO/FROM SHARED BUFFER

    Hello all,
    I am facing a problem with the EXPORT/IMPORT to SHARED BUFFER statements.
    In my report program , I export data to the shared memory.
    I then call a transaction to park an accouting document.
    The BTE 2218 gets triggered in the process. Here the IMPORT works fine.
    Later, there is a standard function module which is called IN UPDATE TASK.
    Within this, the IMPORT statement fails.
    It works on one server but not on another.
    Notes :
    The IMPORT works in debugging mode but fails if I simply run the program.
    Another point is that the ID used for identifying the shared memory is built from sy-uname.
    Can the visibility of sy-uname in UPDATE TASK be controlled by settings?
    Any ideas on this ?
    Please don't copy paste the help on SHARED BUFFER etc.
    Thanks in advance.

    Hi Mariano,
    the issue is due to multiple servers being present, where SHARED MEMORY is specific to each application server.
    So when we export data into shared memory in program A, we have to be sure that program B, or the FM called in a background or update task by program A, runs on the same application server.
    Here the problem is that when program A calls program B or the FM in a background or update task, the call is dynamically scheduled across all application servers that have batch work processes, not always the same application server as the calling program A; so program B may run on another application server, which has a different shared memory.
    The solution will be:
    To force program B to run on the same application server as the calling program A, pass sy-host of calling program A to function module JOB_CLOSE in the parameter TARGETSERVER. OR
    Instead of using SHARED MEMORY, use the DATABASE:
            EXPORT itab FROM itab TO DATABASE indx(ar) CLIENT sy-mandt ID job_number. " in program A, where job_number is unique
            IMPORT itab TO itab FROM DATABASE indx(ar) CLIENT sy-mandt ID job_number. " in program B, where job_number is passed from program A to B
            DELETE FROM DATABASE indx(ar) CLIENT sy-mandt ID job_number. " cleanup once program B has read the data
    Regards,
    Vignesh Yeram

  • Export/import to/from database

    hi,
    I do not know what this is for. I read the help but still have no idea.
    I know how to use EXPORT/IMPORT TO/FROM MEMORY ID and also SET/GET, but not EXPORT/IMPORT TO/FROM DATABASE.
    1) The help says it stores a data cluster - what is a data cluster?
    2) Can I have an example of EXPORT/IMPORT TO/FROM DATABASE?
    3) What is the difference between EXPORT/IMPORT TO/FROM MEMORY ID and TO/FROM DATABASE?
    thanks

    Hi,
    1) A data cluster is a set of data objects grouped together for the purpose of storage in a storage medium (like the database, local memory, or shared memory, etc.), which can only be edited using ABAP statements like EXPORT, IMPORT and DELETE.
    2) Suppose you want to export the data in an internal table to the database. Here TABLE_ID is the identifier of your data cluster: you export your data to the SAP table INDX under the ID TABLE_ID, within an area named XY where your data will be stored.
    EXPORT tab = itab
      TO DATABASE indx(XY)
      CLIENT '000'
      ID 'TABLE_ID'.
    You can get this data as follows.
    IMPORT tab = itab
      FROM DATABASE indx(xy)
      CLIENT '000'
      ID 'TABLE_ID'.
    3) The difference is simple: when you use MEMORY ID, the data you export is stored in the application server's MEMORY, whereas in the case of DATABASE it is stored in the database.
    Regards,
    Sesh

  • Steps to be followed in Export/Import

    Can anyone help me out by defining the exact steps to be followed in exporting and importing ODI objects?
    I have exported the master and work repositories. I created a new master repository by importing the zip file, and I can see the architecture neatly imported into the topology. The problem starts when I try to import the work repository: it keeps throwing an integrity constraint error. I export/import folder-wise, as the current work repository is pretty huge!
    It would be of great help if someone could explain the steps to be followed for export/import in ODI.

    Hi there,
    It is typical of ODI to throw those errors, as the documentation is not quite comprehensive about how to do this.
    Here are a few guidelines:
    (i) The topology (master repository) should be exported and imported first, then the models, and finally the projects and the variables. Also, a work repository export may not work if you try to export the whole work repo in one go using File---->Export---->Work Repository!
    So the best thing is File--->Export---->Multiple Export----> then drag all your objects into the box and check the "zip" checkbox.
    Create separate zips for Models and separate ones for Projects, and so on.
    And import the Model zip before the Project zip in your new repository.
    OR
    (ii) Use database schema export and import. This has worked the best for us.
    And I think it is the most flawless way of doing export/import, because you don't lose any objects and don't get any integrity constraint errors, etc.
    Just export the master repository and work repository schemas and then import them into your new database.
