Doubt in export/import

Hi All,
I have an Oracle 9i database.
I have a requirement from one of my users: he wants to restore 2 tables as of 31 Dec 2010.
Initially I thought of using FLASHBACK_SCN, but it seems I can't use this in 9i.
Can anyone please help me find the last SCN for 31 Dec 2010?
Regards,
Chris

With RMAN: duplicate the database until the SCN (or time) you need, then exp/imp the two tables from the duplicate.
With exp/imp: find a dump file taken on the right date and do an imp from it.
If you do not have backups and flashback is off, then you cannot do it.

How do I find out if flashback is turned on? show parameter flash? I did this but couldn't find any parameter. I even tried db_recovery_file_dest but still couldn't find it.
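Note that db_recovery_file_dest and Flashback Database were only introduced in 10g, so you will not find those parameters in 9i; 9i only has Flashback Query, which relies on undo retention and will not reach back to 31 Dec 2010. If you have backups from that date, an RMAN point-in-time duplicate is the usual route. A rough sketch only (connect strings and the auxiliary database name are placeholders):

  RMAN> CONNECT TARGET sys/password@prod
  RMAN> CONNECT AUXILIARY sys/password@aux
  RMAN> RUN {
          SET UNTIL TIME "TO_DATE('31-DEC-2010 23:59:59','DD-MON-YYYY HH24:MI:SS')";
          DUPLICATE TARGET DATABASE TO auxdb;
        }

Then export the two tables from the duplicate with exp and import them into the original database with imp.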

Similar Messages

  • Doubt in export, import and table parameters when creating a Function Module

    Dear fellow ABAPers,
    I have a doubt about defining export, import and table parameters while creating a function module.
    I am calling a function module inside a user exit. In the user exit, SAP fills an internal table called i_lfa1 with all the data the user has entered.
    I want to pass this whole internal table to the function module and then perform some checks on its values.
    After that, the function module fills an error structure with values depending on the checks.
    1)
    How do I pass this internal table to the function module?
    When I am creating the function module in SE37, where do I define this internal table type? Is it an Import or a Table parameter during function module creation?
    2)
    Where do I define the error structure type (which is returned by the function module to the main program)? Is it an Export or a Table parameter during function module creation?
    Please clear my doubt..
    Relevant points will be awarded.
    Regards,
    Tushar.

    Hi Tushar,
    1. How do I pass this internal table to the function module?
       I assume you are creating your own Y/Z function module. Pass it through a TABLES parameter.
    2. When I am creating the function module in SE37, where do I define this internal table type?
       Define it in the TABLES interface, using the same type with which the table is passed in the user-exit function module. If you look at the interface of the user-exit FM, you will see which type to use.
    3. Where do I define the error structure type (which is returned by the function module to the main program)?
       Define it in the TABLES interface as well (not in Export or Import), since what you are returning is an internal table. You can use BDCMSGCOLL, for example, create your own Y/Z structure for the purpose, or use the structure type T100.
    I hope it helps.
    Regards,
    Amit M.
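    For illustration only, the interface and the call could look roughly like this (Z_CHECK_LFA1, the check itself and the message values are made-up placeholders, not the actual user-exit code):
      " FM interface in SE37, both defined under the TABLES tab:
      "   IT_LFA1   LIKE LFA1        (data handed over from the user exit)
      "   ET_ERRORS LIKE BDCMSGCOLL  (messages filled by the checks)
      FUNCTION z_check_lfa1.
        LOOP AT it_lfa1.
          IF it_lfa1-land1 IS INITIAL.   "example check only
            et_errors-msgtyp = 'E'.
            et_errors-msgid  = 'ZLFA1'.
            et_errors-msgnr  = '001'.
            APPEND et_errors.
          ENDIF.
        ENDLOOP.
      ENDFUNCTION.

      " call from the user exit:
      DATA lt_errors TYPE STANDARD TABLE OF bdcmsgcoll.
      CALL FUNCTION 'Z_CHECK_LFA1'
        TABLES
          it_lfa1   = i_lfa1
          et_errors = lt_errors.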

  • Export/import to clone database -- please advise

    Hi All,
    Need help, especially from experts who are strong in export/import.
    can someone advise me how to use export/import to clone a database?
    I know how to clone database using cold backup and hot backup.
    But wish to know the full steps in using export/import to do cloning.
    I'm going to do a full export on my "TEST" database, drop the database, create a database and import using the full export.
    After doing a full database export (full=y), what are the steps to follow? I have a few doubts about it.
    1) export the database (full=y)
    2) drop the database
    3) re-create the database, will be using the same name "TEST".
    4) create the users (IDs created by the DBA, e.g. schema owners), tablespaces and datafiles --> import doesn't do this, right?
    5) do a full import (full=y). Will I encounter any problems in this step? As I understand it, all the Oracle-owned objects (for example SYSTEM, SYS, AUX) already exist in the newly created database, so will import full=y cause any problems?
    6) do I have to re-create the roles and synonyms, and grant role/system/object privileges to IDs not created by Oracle (e.g. the schema owner)?
    In short, I would like to know what EXPORT will and will not export,
    and what IMPORT will and will not import.
    Please advise.
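    For reference, the basic command pair I have in mind is roughly the following (placeholder credentials and file names):
      exp system/<password>@TEST full=y file=full_test.dmp log=full_exp.log
      imp system/<password>@TEST full=y ignore=y file=full_test.dmp log=full_imp.log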

    Hi zetabouy,
    thanks for your input, definitely helpful for me.
    Just to confirm: am I right to say that only a full database-level (full=y) export/import will export/import roles as well as public synonyms? I ask because I have also tried exporting a user from the database and importing that user into a different database, but the public synonyms and roles were not imported.
    one more question:
    For example, we are going to port data from production to the UAT database using export and import for only the schema.
    After exporting from production with the syntax owner=OWNER01,
    is it better to drop all objects belonging to the schema owner in UAT before importing (touser=OWNER01, ignore=y) from the production dump file?
    I'm asking because I'm afraid that if we do not drop all the PL/SQL objects in the UAT database, the updated procedures/functions/triggers exported from production will not get imported, since the UAT database already has objects with the same names (but not the updated code).
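    (For reference, the pair of commands I have in mind is roughly the following, with placeholder credentials and file names:
      exp system/<password>@PROD owner=OWNER01 file=owner01.dmp log=owner01_exp.log
      imp system/<password>@UAT fromuser=OWNER01 touser=OWNER01 ignore=y file=owner01.dmp log=owner01_imp.log )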
    Will profiles be exported and imported too during a full database-level exp/imp?
    please kindly advise.
    thanks
    Message was edited by:
    chew

  • Is time to export-import location dependent

    I have a doubt about whether the export-import process depends on the location from which it is performed. To explain the scenario: an export-import has to be done on a server in Africa because a migration to a new server is required. My question is whether it makes a difference to the time the export-import takes if the SecureCRT session is logged on from Africa itself or from a far-off location, say India, since some of our DBA team is in India.
    So the doubt is: does the export-import process depend on the location from which it is performed?
    I hope my question is clear.
    Please help in resolving the doubt.
    regards

    You need to clarify from an Oracle perspective what the client machine is in this scenario (i.e. the machine that is connecting to the Oracle database). What machine is the export utility running on and what machine is the export utility writing to?
    If the export utility is running on the same machine and writing to a dump file on the same machine regardless of where the DBA is sitting, then where the DBA is sitting is irrelevant. If the export utility runs on a different machine or writes to a dump file on a different machine depending on where the DBA is sitting, then where the DBA is sitting is relevant.
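    For illustration (placeholder credentials, connect strings and paths), compare:
      # exp run on the database server itself; the data and the dump file stay on the server
      exp system/<password> full=y file=/u01/exp/full.dmp log=/u01/exp/full.log
      # exp run from a remote client machine; the data travels over SQL*Net and the dump file is written on the client
      exp system/<password>@AFRICA_DB full=y file=C:\dumps\full.dmp log=C:\dumps\full.log
    In the first case, where the DBA sits only affects the terminal session; in the second, the client machine's location and network throughput matter.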
    Justin

  • R/3 Export & Import during Unicode Conversion via SOCKET Method

    Hi
    We are in the process of upgrading our R/3 Enterprise to ECC 6.0. The size of the database is around 4 TeraBytes.
    Can somebody help us with the necessary documentation or other inputs to handle the R/3 Export & Import using the SOCKET method? FYI, we have already referred to the SAP documents on this. Any particular DOs & DON'Ts would be highly appreciated.
    Thank You
    Sai

    Hi Rahul
    regarding your 'Unicode doubt', some ideas:
    1) The Upgrade Master Guide SAP ERP 6.0 and the Master Guide SAP ERP 6.0 include introductory information. Among other, these guides reference the SAP Service Marketplace-location http://service.sap.com/unicode@sap.
    2) In Unicode@SAP you can find several very comprehensive FAQs.
    Conclusion from the FAQs: first of all, your strategy needs to follow your business model (which we cannot see from here):
    Example. The "Upgrade to mySAP ERP 2005"-FAQ includes interesting remarks in section "DO CUSTOMERS NEED TO CONVERT TO A UNICODE-COMPLIANT ENVIRONMENT?"
    "...The Unicode conversion depends on the customer situation....
    ... - If your organization runs a single code page system prior to the upgrade to mySAP ERP 2005, then the use of Unicode is not mandatory. ..... However, using Unicode is recommended if the system is deployed globally to facilitate interfaces and connections.
    - If your organization uses Multiple Display Multiple Processing (MDMP) .... the use of Unicode is mandatory for the mySAP ERP 2005 upgrade....."
    In the Technical Unicode FAQ you read under "What are the advantages of Unicode ...", that "Proper usage of JAVA is only possible with Unicode systems (for example, ESS/MSS or interfaces to Enterprise Portal). ....
    => Depending on the fact if your systems support global processes, or depending on your use of Java Applications, your strategy might need to look different
    3) In particular in view of your 3rd option, I recommend you to take a look into these FAQs, if not already done.
    Remark: mySAP ERP 2005 is the former name of the application, which is now named SAP ERP 6.0.
    regards, and HTH, Andreas R

  • Export Import single table...

    Gurus...
    I am working on a single table which needs to be exported from prod and imported into test.
    As I understand it, I need to follow the steps below:
    1. Test - export table abc dump as backup
    2. Prod - Export single table abc
    3. Test - Drop table abc cascade constraints
    4. Test - Import abc into test
    Export par file:
    directory=dbpump
    dumpfile=expdp_abc.dmp
    logfile=expdp_abc.log
    content=all
    tables=user.abc
    exclude=statistics,object_grant,tablespace_quota
    flashback_time=systimestamp
    Import par file:
    directory=dbpump
    dumpfile=expdp_abc.dmp
    logfile=impdp_abc.log
    content=all
    tables=user.abc
    table_exists_action=replace
    transform=segment_attributes:N
    my doubts:
    1. Is my process flow correct?
    2. Are the export & import par files correct, or are there missing parameters?
    3. What happens to all objects connected to the table; will they also be imported?
    4. Do I need to lock the user during this process?
    5. Is there a script to check whether all objects connected to the table exist in test after the import?

    Hi,
    Process for table export & import.
    1. Create a database directory in test as well as production:
    --> create or replace directory directory_name as 'physical_path';
    --> grant access on that directory to the user.
    2. Back up the table in the test environment (in case you need the old data in your test env):
    --> create table bkp_table_name as select * from table_name; (the table you want to import)
    3. Take an export backup in the production database:
    --> expdp dumpfile=file_name.dmp logfile=file_name.log directory=directory_name tables=owner.table_name
    4. On the test server, do the following:
    a. Check for dependencies on that table using DBA_DEPENDENCIES.
    b. Truncate the table which you want to import.
    c. Import the table:
    impdp dumpfile=file_name.dmp tables=owner.table_name logfile=file_name_imp.log directory=directory_name table_exists_action=APPEND
    d. Check the dependencies on that table again using DBA_DEPENDENCIES.
    You can also use table_exists_action=replace.
    This will also import all dependent objects of the table.
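    For the dependency check in steps (a) and (d), a rough query could look like this (OWNER and ABC are placeholders for your schema and table names):
      select owner, name, type
        from dba_dependencies
       where referenced_owner = 'OWNER'
         and referenced_name  = 'ABC'
         and referenced_type  = 'TABLE';
    Running it before the export on prod and after the import on test lets you compare the two lists.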
    Regards,
    Edited by: XBOX on Dec 14, 2012 10:15 PM

  • Export-import if database is down

    I had a query about whether we can run the export and import processes if the database is down. The export process is the 'exp' command; would this command run even if the database is down? Likewise, import is run with the 'imp' command; does it require the database to be up?
    I hope my question is clear.
    Please help in resolving the doubt.
    regards

    Hi,
    Export is a logical backup of database objects. Think about it: how could the exp utility extract anything without the database instance being up and running? Both exp and imp connect to the database, so it must be open.
    - Pavan Kumar N

  • Keeping video proportions during still image export/import

    What I'm doing is exporting a still image to the Mac desktop, then dropping it into Photo-to-Movie where I create pans and zooms on it, then exporting the pan/zoom sequence as a Quicktime movie that I drop back into FCE and splice into the Timeline.
    Trouble is, the image in the pan/zoom sequence always comes back into FCE distorted (elongated), I guess because the exported still image that it was made from reverts to the square pixels during the export process from FCE.
    So, my question is, in what form should I export the still image from FCE so that it retains the proportions it has in the video, so that when I operate on it in Photo-to-Movie and then bring it back into FCE, it still has the proportions of a video (NTSC) image?
    In other words, I need to retain the video (NTSC) image's proportions throughout the process of export, then creating pan/zooms on it in Photo-to-Movie, and then reimporting it into FCE.
    To accomplish this, I presume I have to either export it from FCE in some special NTSC-video-compatible form, or else convert it in Photoshop to an NTSC-compatible image before I drop it into PTM, create the pan/zoom sequence, and then bring that sequence back into FCE.
    I'd be grateful if anyone could suggest a solution to this problem.
    Tom

    Hello Tom Baker 1
    My friend let me tell you something - I TOTALLY share your frustrations, and disgust with the poor results of Keyframe (within FCE).
    I am not a software engineer, but my perseverance to try and try again, is far above average.
    At the risk of getting on a soapbox, (and I can attest to my excellent equipment capability), please believe me, (like yourself), I've paid my dues trying to get Keyframing to work suitably with stills in FCE.
    I think I can safely say, beyond all doubt, if you want to get satisfactory results in regard to pans and zooms, (without being a math major), some options for reliability and smoothness are: 'Photo to Movie' (just as you're doing), Fotomagico, http://boinx.com/fotomagico/overview/, (although I've only heard it's pretty good) - haven't tried it yet.
    OR
    A recent discovery of mine: Lyric Media Pan and Zoom http://www.lyric.com/fcp-plugins/panzoompro/pzp.htm - The really nice thing about this is that it's a FC, or FCE Plug In, and utilizes FC's keyframe software engine. Why is it different than Keyframe by itself in FCE? Because (so far), what I can see is that it sort of fixes the mickey-mousery, herky-jerky nightmare of Keyframe within FCE. To me it sure looks like it can take whatever pixel size you throw at it, (and without being a math major having to apply cautionary resizing to every darn still), it just does the job. Yes it seems to do the re-sizing for you, and consequently produces smooth, reliable motion to stills.
    Of course the advantage to this is that you're finally using an application (within FCE) to create your pans and zooms right there. No exporting/importing of QT files as with Photo to Movie or iMovie. And the parameters of control are more sophisticated than Photo to Movie's.
    Again, I'm still working with it, and I still need to master it, but it sure beats keyframe.
    One side note:
    I really love the crispness of an iMovie pan or zoom, but we all know by now that its downfall is the dreaded 'jaggies'. As confirmed here on the forum, what seems to make Photo to Movie work so well is the fact that, by its very nature of design, it automatically smooths out, and probably reduces or resizes, images so that they will NOT produce unwanted glistening (aliasing). And as once said here, your eye will accept this softening of an image far better than the jaggies or the herky-jerky.
    With all that said, I hope I didn't rant on too much - as I mentioned, there's nothing like 'experience'.
    Peace
    Mike
    (it would be nice to see someone comment on Lyric Media Pan Zoom Pro)
    PS- an excerpt from the documentary I'm working on:
    http://www.youtube.com/watch?v=jmB0_qiONQs

  • Memory Limitation on EXPORT & IMPORT Internal Tables?

    Hi All,
    I have a need to export and import internal tables to memory. I do not want to export them to any database tables. Is there a limitation on the amount of memory that can be used for the EXPORT & IMPORT? I will free the memory once I have imported it. The maximum I expect would be 13,000,000 lines.
    Thanks,
    Alex (Arthur Samson)

    You don't have a fixed limitation, but try to keep your table as small as possible.
    Otherwise, if you are familiar with the ABAP OO context, try using Shared Objects instead of EXPORT/IMPORT.
    <a href="http://help.sap.com/saphelp_erp2004/helpdata/en/13/dc853f11ed0617e10000000a114084/frameset.htm">SAP Help On Shared Objects</a>
    Hope this helps,
    Roby.

  • Using set/get parameters or export/import in BSP.

    Hi All,
    Is it possible to use set/get or export/import in BSP?
    We need to set/export some variables from a BADI and get/import them in the BSP application.
    A code snippet would be of great help.
    Thanks,
    Anubhav

    Hi Anubhav,
    You can use the EXPORT / IMPORT statements for your requirement.
    From the BADI, use EXPORT to send the variable data to a unique memory location with an ID, e.g.
    * data declaration required for storing the data in the INDX cluster
      DATA: WA_INDX TYPE INDX.
    * here CNAME is the variable you want to export
      EXPORT PNAME = CNAME TO DATABASE INDX(XY) FROM WA_INDX
             CLIENT SY-MANDT ID 'ZVAR1'.
    and in the BSP application use the IMPORT statement to fetch back the values saved under the same ID:
      IMPORT PNAME = LV_CNAME FROM DATABASE INDX(XY) TO WA_INDX
             CLIENT SY-MANDT ID 'ZVAR1'.
    Delete the entry afterwards to avoid wasting space:
      DELETE FROM DATABASE INDX(XY)
             CLIENT SY-MANDT ID 'ZVAR1'.
    Regards,
    Samson Rodrigues

  • Issue with Memory ID export / import

    Hi Experts,
    We are facing a strange issue with export/import from a memory ID.
    We are exporting a value to a memory ID from a module pool program and trying to import the same value in a SAP workflow method.
    While importing the value from the memory ID, the import fails with sy-subrc = 4.
    Any idea how we can do an export/import from the module pool program to the workflow method?
    Regards,
    Sanjana

    Hi sanjana,
    Please check the link. Here you can find some examples as well.
    http://wiki.scn.sap.com/wiki/display/Snippets/Import+and+Export+to+Cluster+Databases
    P.S.
    http://help.sap.com/saphelp_nw04/helpdata/en/fc/eb3bf8358411d1829f0000e829fbfe/content.htm
    ABAP memory (EXPORT ... TO MEMORY ID) is only visible within the same session, and a workflow method typically runs in a different session (often in the background), which is why your IMPORT returns sy-subrc = 4. Instead, you just need to create a key and an ID to save the data into the INDX table; once it is saved, you can use the same key and ID to import it in the workflow method.
    Again: don't forget to delete the entry using the same key and ID once you have imported it, otherwise it may cause errors.
    Regards,
    Sandeep Katoch

  • Best choice for exporting / importing EUL

    Hi all
    I have been tasked with migrating an EUL from a R11 to a R12 environment. The Discoverer version on both environments is 10.1.2 and the OS is Solaris on oracle db's.
    I am unfortunately not experienced with Discoverer and there seems to be no one available to assist for various reasons. So I have been reading the manual and forum posts and viewing metalink articles.
    I tried exporting the entire EUL via the wizard and then importing it into the new environment, but I was not successful: the system hung for many hours with a white screen and the log file just ended.
    I assumed a memory problem or slow network issues were causing this delay. Someone suggested I export/import the EUL in pieces, and this seemed to be effective, but I got missing-item warnings when trying to open reports. This piecemeal approach also worried me regarding consistency.
    So I decided to try the full import on the server to work around the first problem I experienced. Due to the client's security policies I am not able to open the source EUL and send it to our dev. I was able to get it from their dev R11 system, but I dismissed this because the dev reports were not working and the only reliable EUL is the prod one. I managed to get a prod .eex file from a client resource, but the upload to my server was extremely slow.
    I asked the DBA to assist with the third option of exporting a database dump of the EUL_US schema and importing this into my R12 dev environment. I managed this, but had to export the schema using SYS, which got around a privilege problem when logging in. I have reports that run and my user can see reports, but reports that were not shared to SYSADMIN in the source environment are now prefixed with the version 11 user_id in my Desktop, and the user cannot see her own reports, only the SYSADMIN ones.
    I refreshed the BA's using a shell script I made up which uses the java cmd with parameters.
    After some re-reading I tried selecting all the options in the Validate menu and refreshing in the Discoverer Admin tool.
    If I validate and refresh the BA using the Admin tool, I get the hanging screen and a lot of warnings that items are missing (so much for my java cmd refresh!), and now the report will not open and I see the substitute-missing-item dialogue boxes.
    My question to the forum is which would be the best approach to migrate the entire eul from a R11 instance to a R12 instance in these circumstances?
    Many thanks
    Regards
    Nick

    Hi Srini
    The os and db details are as follows:
    Source:
    eBus 11.5.2
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Target:
    ebus 12.1.2
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production DEV12
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Yes, the DBA initially did an exp for me using EUL_US as the owner, but something strange happened with privileges, and some of the imported tables appeared in the target environment under the APPS schema (21 tables) even though the EUL_US export had contained 48 tables.
    I also had a problem on the database with "eul_us has insufficient privileges on tablespace discoverer" type errors.
    I checked the EUL_US database privileges and was unable to resolve this initial privilege error even though the privileges were granted to EUL_US.
    The DBA then managed to exp as SYSTEM and import with the full=y flag on the import command, which seems to bring the privileges across.
    Then I ran the eul5_id.sql and then made up a list of the business areas and made a sh script to refresh the business areas as follows:
    java -jar eulbuilder.jar -connect sysadmin/oracle1@dev -apps_user -apps_responsibility "System Administrator" -refresh_business_area "ABM Activities" -log refresh.log
    This runs successfully and I can log in, select the business area and grant access to the users. The reports return data.
    Then one of the users said she can't see all her reports. When I opened Desktop I noticed some reports sitting there prefixed with a hash and her version 11 user id.
    So back to the manuals: in the Discoverer Admin help the instructions are to first go to View > Validate > select all options, then go to the business area and click File > Refresh. This gives me a lot of warnings about items that are missing. I assume this is because the item identifiers brought across in the db dump are the version 11 ones and thus not found in the new system.
    Any suggestions?
    Many thanks
    Nick

  • Project duplication:  Copy/Paste vs. Export/Import

    Anyone have any thoughts on project duplication?
    Presently I am copying a project and then pasting it into multiple new locations, then renaming them. Recently I was in a training session where the instructor suggested exporting the project and then importing it, claiming database size savings. Can anyone shed some insight?
    Thanks in advance.

    I can't imagine why an export/import would leave a smaller footprint within the database than a project copy/paste. Not to mention that, if you are talking about a relatively large project, it can take a substantial amount of time to export and import. One of our schedules of about 18k activities takes approximately 1.5 hours to export/import, whereas a project copy/paste only takes about 10 minutes. Note that we do not copy the baseline schedules when replicating projects.

  • Error when exporting/importing

    I am using MDM 7.1
    When I export the schema from my Dev system to import into QA I get the following Error:
    "This repository requires additional steps before transport. See the MDS log for details.
    In the log my issue is that I am trying to export an Assignment that includes an expression that uses "look-ups".
    In my Dev system I removed the expression to confirm if this is the issue, once I no longer have expressions with look-ups then it allows me to export the schema. I then tries to import it to QA (since the expressions are not changing I planned on excluding them from the import as a temporary workaround.
    however I get the same error message when trying to import. It seems that I can not export or import with a system that has an assignment with an expression that uses look-ups.
    Is there some config I am missing?

    Hi Brad,
    assignments/validations are a general problem when it comes to schema exports/imports. What you can do, in case there are not too many assignments, is to delete the assignments and re-create them manually after you have imported the schema.
    Hope this helps a little.
    Regards,
    Erdal

  • Export/Import Parameters disappear when using user-exit

    I am using some import/export parameters in a dynamic action when I create a new record (infotype). I am also using a user exit to avoid modifying BEGDA and ENDDA when I modify the record (IPSYST = 'MOD'). When this user exit is used, the parameters disappear from memory, so the dynamic action does not execute correctly. What can I do to keep using the user exit? Do I need to add anything?

    In the dynamic actions on create, I delimit records on the infotype with export/import parameters defined in the infotype module pool. On delete, I avoid deleting the record if it is not the last one. With the user exit, modification of BEGDA/ENDDA in the infotype is not allowed. If I use the user exit, the dynamic actions which use export/import parameters don't work.
    I have tried to do in the module pool what I do in the user exit, but it is not easy because PSAVE does not contain what I need.
