Export and import table data
Hi All,
There is a table abc with approximately 40 lakh (4 million) records and no partitions (and I have a dump of this data). Due to space constraints I want to compress the data and put it in an archive schema.
My question is:
1) Can we import the data into another schema (say, an archive schema) in compressed mode? If so, could you please let me know the syntax for that.
2) Can we import the data into a temporary table (say, temp) in the same schema?
3) If a BLOB column exists in the table, can we import the data without the BLOB column data (supposing the export dump has the BLOB data)?
Is that possible, or should we export without the BLOB column and import that into the other table?
Please help me out.
Edited by: user11942774 on 4 Dec, 2011 10:33 PM
Thanks Mahir,
Could you please let me know the syntax or with any examples if possible.
Is it possible to drop a column (say, the BLOB column) after the table is in compressed mode?
Edited by: user11942774 on 4 Dec, 2011 10:53 PM
Edited by: user11942774 on 4 Dec, 2011 10:54 PM
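For question 1, a hedged sketch of one common approach with the classic utilities (schema names arch_usr and owner_usr are hypothetical, and this is untested against the poster's system): pre-create the table compressed in the archive schema, load the dump into it, then compact it, since basic table compression only applies to direct-path loads and imp uses conventional inserts.

```
# Sketch only; arch_usr / owner_usr are hypothetical names.
sqlplus / as sysdba <<'EOF'
CREATE TABLE arch_usr.abc COMPRESS AS SELECT * FROM owner_usr.abc WHERE 1 = 0;
EOF
imp system/<password> file=abc.dmp fromuser=owner_usr touser=arch_usr ignore=y
# Conventional-path inserts are not compressed by basic compression, so compact afterwards:
sqlplus / as sysdba <<'EOF'
ALTER TABLE arch_usr.abc MOVE COMPRESS;
EOF
```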
Similar Messages
-
Exporting and importing just table definitions
Hi,
I have this production database that has a huge amount of data in it. I was asked to set up a test database based on the exact same schema as the live database. When I tried to do an export (from live) and import (to test), with the parameters rows=N and compress=y, the target (test database) data file will still grow enormously, presumably because of the huge number of extents already allocated to the table in the live database. My test database of course, has a limited hard-disk space.
Is there a way to export and import the table definitions without having the target database experiencing a huge growth in the size of the tablespace?
Thanks,
Chris.
If an export with compress=n is still creating initial extents that are too large, you can still build with the import file, but it will take a little work.
run imp with indexfile=somefile.sql
when imp is finished, edit somefile.sql by:
1. remove all the REM statements.
2. remove all the storage clauses (tables and indexes)
Make sure your tablespaces have a small (say 1k) default initial extent.
run imp again with rows=n
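The edit in steps 1-2 can be scripted; a sketch with sed (the REM-prefixed DDL line shown is only an illustration of the indexfile layout, which can vary by version, and the imp/sqlplus commands are comments because they need a live Oracle instance):

```shell
# Produce the DDL-only file first:
#   imp scott/tiger file=expdat.dmp indexfile=somefile.sql full=y
# A table line in the indexfile looks roughly like this REM-prefixed DDL:
printf 'REM  CREATE TABLE "T" ("X" NUMBER) STORAGE(INITIAL 6684672) TABLESPACE "USERS" ;\n' > somefile.sql
# Drop the REM prefixes and the STORAGE(...) clauses:
sed -e 's/^REM  *//' -e 's/STORAGE *([^)]*)//g' somefile.sql > somefile_clean.sql
# Then run the cleaned DDL and import the rest without rows:
#   sqlplus scott/tiger @somefile_clean.sql
#   imp scott/tiger file=expdat.dmp rows=n ignore=y full=y
```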
All your tables and indexes will be created with the default tablespace initial extent. -
How to export and import dependent tables from 2 different schemas
I have a setup where schema1 has table 1 and schema 2 has table2. And table 1 from schema1 depends upon table 2 from schema 2.
I would like to export and import these tables only, and not any other tables from these 2 schemas, with all information like grants and constraints.
Also, will the same method work for Oracle 10g R1, R2 and Oracle 11g?
http://download.oracle.com/docs/cd/B12037_01/server.101/b10825/dp_export.htm#i1007514
Looking at this, for table mode it says:
"Also, as in schema exports, cross-schema references are not exported"
Not sure what this means.
As I am interested in only 2 tables, I think I need to use table mode. But if I try to run the export with both table names, it says table mode supports only one schema at a time. I'm not sure how the constraints would get exported in that case.
-Rohit
Worked the first time I tried:
exp file=table2.dmp tables="dbadmin.temp1,scott.emp"
Export: Release 10.2.0.1.0 - Production on Mon Mar 1 16:32:07 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Username: / as sysdba
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Export done in US7ASCII character set and AL16UTF16 NCHAR character set
server uses WE8ISO8859P1 character set (possible charset conversion)
About to export specified tables via Conventional Path ...
Current user changed to DBADMIN
. . exporting table TEMP1 10 rows exported
EXP-00091: Exporting questionable statistics.
Current user changed to SCOTT
. . exporting table EMP 14 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
Export terminated successfully with warnings. -
Export and Import of Configuration Data
Hi All,
I have created a number of custom properties, structures, groups, renderers, layout sets etc and would like to move them from Dev to Test system.
I have found the following in the help doco:
"Export and Import of Configuration Data"
http://help.sap.com/saphelp_nw04/helpdata/en/e1/029c414c79b25fe10000000a1550b0/content.htm
However I am unable to see its functionality in our portal systems (namely the Actions->Export and Actions->Import functions).
Our version of Portal is: 6.0.9.6.0
KM: 6.0.9.3.0 (NW04 SPS09 Patch3)
Therefore we should be able to see it, no?
Are there any special settings to see these buttons?
Cheers,
Vic
Thanks for your replies.
OK, so in other words, it is not available in SP09 and only in SP12+.
So, to "transport" this information I will have to manually reapply these settings in the Test system!
Message was edited by: Victor Yeoh -
Dears,
I need to export two tables from PRD and import them into QAS.
The sizes of the tables are 1 GB and 3 GB.
Please confirm:
1. Do I need any downtime for the export and import of the tables?
2. How much time will this export process take?
Please suggest.
Shivam
I cannot tell you how much time it will take. Are you still using Pentium 3 processors? Are you on 4 quad-core CPUs? Are you on IDE disks? SATA disks? On a fast, big, powerful SAN?
This being said, I would say roughly 10 minutes. But as you understand, this is just a wild guess.
As for downtime, you did not tell us the table names. That said, try to export during a low-usage period.
How row migration gets eliminated by exporting and importing the table
Hi ,
Please let me know how row migration gets eliminated by exporting and importing the table.
Another method is deleting the migrated rows and re-inserting them from a copied table;
I think the concept behind this method is inserting these rows into new blocks.
Please correct me if I am wrong.
Thanks,
Kumar.
Hi!
You can also use the ALTER TABLE MOVE command, or if you are on 9i you can use the DBMS_REDEFINITION package.
If you have a maintenance window, the easiest way would be to use ALTER TABLE MOVE.
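For reference, a minimal sketch of the move approach (table and index names are hypothetical; the move blocks DML, which is why it needs a maintenance window):

```
sqlplus scott/tiger <<'EOF'
ALTER TABLE emp_hist MOVE;        -- rewrites every row into fresh blocks, removing migrated rows
ALTER INDEX emp_hist_pk REBUILD;  -- indexes become UNUSABLE after a move and must be rebuilt
EOF
```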
Regards,
PP -
Regarding Exporting and Importing internal table
Hello Experts,
I have two programs:
1) Main program: it creates batch jobs through JOB_OPEN, SUBMIT and JOB_CLOSE, running the subprogram via SUBMIT.
I am using EXPORT it TO MEMORY ID 'MID' to export the internal table data to SAP memory in the subprogram.
The data is processed in the subprogram and exported to SAP memory. I need this data in the main program (I am using IMPORT to get the data, but it is not working).
I use IMPORT it1 FROM MEMORY ID 'MID' to import the table data in the main program after the job completes (SUBMIT subprogram AND RETURN).
The IMPORT does not get the data into the internal table.
Can you please suggest something to solve this issue.
Thank you.
Regards,
Anand.
Hi,
This is the code I am using:
DO g_f_packets TIMES.
* Start Immediately
IF NOT p_imm IS INITIAL .
g_flg_start = 'X'.
ENDIF.
g_f_jobname = 'KZDO_INHERIT'.
g_f_jobno = g_f_jobno + '001'.
CONCATENATE g_f_jobname g_f_strtdate g_f_jobno INTO g_f_jobname
SEPARATED BY '_'.
CONDENSE g_f_jobname NO-GAPS.
p_psize1 = p_psize1 + p_psize.
p_psize2 = p_psize1 - p_psize + 1.
IF p_psize2 IS INITIAL.
p_psize2 = 1.
ENDIF.
g_f_spname = 'MID'.
g_f_spid = g_f_spid + '001'.
CONDENSE g_f_spid NO-GAPS.
CONCATENATE g_f_spname g_f_spid INTO g_f_spname.
CONDENSE g_f_spname NO-GAPS.
* ... (1) Job creating...
CALL FUNCTION 'JOB_OPEN'
EXPORTING
jobname = g_f_jobname
IMPORTING
jobcount = g_f_jobcount
EXCEPTIONS
cant_create_job = 1
invalid_job_data = 2
jobname_missing = 3
OTHERS = 4.
IF sy-subrc <> 0.
MESSAGE e469(9j) WITH g_f_jobname.
ENDIF.
* (2)Report start under job name
SUBMIT (g_c_prog_kzdo)
WITH p_lgreg EQ p_lgreg
WITH s_grvsy IN s_grvsy
WITH s_prvsy IN s_prvsy
WITH s_prdat IN s_prdat
WITH s_datab IN s_datab
WITH p1 EQ p1
WITH p3 EQ p3
WITH p4 EQ p4
WITH p_mailid EQ g_f_mailid
WITH p_psize EQ p_psize
WITH p_psize1 EQ p_psize1
WITH p_psize2 EQ p_psize2
WITH spid EQ g_f_spid
TO SAP-SPOOL WITHOUT SPOOL DYNPRO
VIA JOB g_f_jobname NUMBER g_f_jobcount AND RETURN.
*(3)Job closed when starts Immediately
IF NOT p_imm IS INITIAL.
IF sy-index LE g_f_nojob.
CALL FUNCTION 'JOB_CLOSE'
EXPORTING
jobcount = g_f_jobcount
jobname = g_f_jobname
strtimmed = g_flg_start
EXCEPTIONS
cant_start_immediate = 1
invalid_startdate = 2
jobname_missing = 3
job_close_failed = 4
job_nosteps = 5
job_notex = 6
lock_failed = 7
OTHERS = 8.
gs_jobsts-jobcount = g_f_jobcount.
gs_jobsts-jobname = g_f_jobname.
gs_jobsts-spname = g_f_spname.
APPEND gs_jobsts to gt_jobsts.
ELSEIF sy-index GT g_f_nojob.
CLEAR g_f_flg.
DO. " Waiting until any job completes
LOOP AT gt_jobsts into gs_jobsts.
CLEAR g_f_status.
CALL FUNCTION 'BP_JOB_STATUS_GET'
EXPORTING
JOBCOUNT = gs_jobsts-jobcount
JOBNAME = gs_jobsts-jobname
IMPORTING
STATUS = g_f_status.
* HAS_CHILD =
* EXCEPTIONS
* JOB_DOESNT_EXIST = 1
* UNKNOWN_ERROR = 2
* PARENT_CHILD_INCONSISTENCY = 3
* OTHERS = 4
g_f_mid = gs_jobsts-spname.
IF g_f_status = 'F'.
IMPORT gt_final FROM MEMORY ID g_f_mid .
FREE MEMORY ID gs_jobsts-spname.
APPEND LINES OF gt_final to gt_final1.
REFRESH gt_prlist.
CALL FUNCTION 'JOB_CLOSE'
EXPORTING
jobcount = g_f_jobcount
jobname = g_f_jobname
strtimmed = g_flg_start
EXCEPTIONS
cant_start_immediate = 1
invalid_startdate = 2
jobname_missing = 3
job_close_failed = 4
job_nosteps = 5
job_notex = 6
lock_failed = 7
OTHERS = 8.
IF sy-subrc = 0.
g_f_flg = 'X'.
gs_jobsts1-jobcount = g_f_jobcount.
gs_jobsts1-jobname = g_f_jobname.
gs_jobsts1-spname = g_f_spname.
APPEND gs_jobsts1 TO gt_jobsts.
DELETE TABLE gt_jobsts FROM gs_jobsts.
EXIT.
ENDIF.
ENDIF.
ENDLOOP.
IF g_f_flg = 'X'.
CLEAR g_f_flg.
EXIT.
ENDIF.
ENDDO.
ENDIF.
ENDIF.
IF sy-subrc <> 0.
MESSAGE e539(scpr) WITH g_f_jobname.
ENDIF.
COMMIT WORK.
ENDDO. -
Shell Script For Export And Import Of Table Records
Hello,
We have production and test instances and for constant testing we need to copy data from production to test or development environment.
At the moment, we manually export and import table records. At times this can be very tedious, as we may need
to do this exercise a couple of times a day.
Is it a good idea to do this exercise using shell script? If so how could I do this? If this is not a good idea what are the best alternatives?
Any input is highly appreciated.
Thanks
Ah, I see: your company prefers stupidity over efficiency. It would be possible to do it in a controlled environment, wouldn't it? Also, the test database would be allowed to select only.
So the non-allowance is just plain stupid.
To the second question: do you use hard-coded passwords in shell scripts?
Don't you think that poses a security risk?
Don't you think that is a bigger risk than a database link, properly set up?
In my book it is!
Sybrand Bakker
Senior Oracle DBA -
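For what it's worth, a sketch of such a refresh script that at least avoids the hard-coded password discussed above (all names are hypothetical, and this is untested against any real instance):

```
#!/bin/sh
# Refresh selected tables from PROD to TEST with the classic utilities.
# The password comes from the environment, never from the script itself.
: "${ORA_PWD:?set ORA_PWD in the environment first}"
TABLES="dbadmin.temp1,scott.emp"     # hypothetical table list
exp "system/${ORA_PWD}@PROD" tables="$TABLES" file=refresh.dmp log=exp_refresh.log
imp "system/${ORA_PWD}@TEST" tables="$TABLES" file=refresh.dmp ignore=y log=imp_refresh.log
```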
Export and import XMLType table
Hi ,
I want to export one table which contains an XMLType column from Oracle 11.2.0.1.0 and import it into version 11.2.0.2.0.
I got the following error when I exported the table with the exp utility:
EXP-00107: Feature (BINARY XML) of column ZZZZ in table XXXX.YYYY is not supported. The table will not be exported.
Then I tried Data Pump export and import. The Data Pump export works; the following is the log:
Export: Release 11.2.0.1.0 - Production on Wed Oct 17 17:53:41 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
;;; Legacy Mode Active due to the following parameters:
;;; Legacy Mode Parameter: "log=<xxxxx>Oct17.log" Location: Command Line, Replaced with: "logfile=T<xxxxx>_Oct17.log"
;;; Legacy Mode has set reuse_dumpfiles=true parameter.
Starting "<xxxxx>"."SYS_EXPORT_TABLE_01": <xxxxx>/******** DUMPFILE=<xxxxx>Oct172.dmp TABLES=<xxxxx>.<xxxxx> logfile=<xxxxx>Oct17.log reusedumpfiles=true
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 13.23 GB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "<xxxxx>"."<xxxxx>" 13.38 GB 223955 rows
Master table "<xxxxx>"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
Dump file set for <xxxxx>.SYS_EXPORT_TABLE_01 is:
E:\ORACLEDB\ADMIN\LOCALORA11G\DPDUMP\<xxxxx>OCT172.DMP
Job "<xxxxx>"."SYS_EXPORT_TABLE_01" successfully completed at 20:30:14
I got an error when I imported the dump using the following command:
impdp sys_dba/***** dumpfile=XYZ_OCT17_2.DMP logfile=import_vmdb_XYZ_Oct17_2.log FROMUSER=XXXX TOUSER=YYYY CONTENT=DATA_ONLY TRANSFORM=oid:n TABLE_EXISTS_ACTION=append
The error is:
KUP-11007: conversion error loading table "CC_DBA"."XXXX"
ORA-01403: no data found
ORA-31693: Table data object "XXX_DBA"."XXXX" failed to load/unload and is being skipped due to error:
Please help me to find a solution for this.
CREATE UNIQUE INDEX "XXXXX"."XXXX_XBRL_XMLINDEX_IX" ON "CCCC"."XXXX_XBRL" (EXTRACTVALUE(SYS_MAKEXML(128,"SYS_NC00014$"),'/xbrl'))
The above index was created by us because we are storing files like:
<?xml version="1.0" encoding="UTF-8"?>
<xbrl xmlns="http://www.xbrl.org/2003/instance" xmlns:AAAAAA="http://www.AAAAAA.COM/AAAAAA" xmlns:ddd4217="http://www.xbrl.org/2003/ddd4217" xmlns:link="http://www.xbrl.org/2003/linkbase" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<link:schemaRef xlink:href="http://www.fsm.AAAAAA.COM/AAAAAA-xbrl/v1/AAAAAA-taxonomy-2009-v2.11.xsd" xlink:type="simple" />
<context id="Company_Current_ForPeriod"> ...
I also tried the Data Pump export with and without DATA_OPTIONS=XML_CLOBS. Both times the export succeeded, but the import got the same KUP-11007 conversion error loading table "tab_owner"."bbbbb_XBRL".
I tried the import in different ways:
1. Create the table first, then import data only (CONTENT=DATA_ONLY)
2. Import the whole table
Both ways, the table DDL is created successfully, but the data import fails. The following is the log from the import:
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "aaaaa"."SYS_IMPORT_TABLE_02" successfully loaded/unloaded
Starting "aaaaa"."SYS_IMPORT_TABLE_02": aaaaa/********@vm_dba TABLES=tab_owner.bbbbb_XBRL dumpfile=bbbbb_XBRL_OCT17_2.DMP logfile=import_vmdb
bbbbb_XBRL_Oct29.log DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
KUP-11007: conversion error loading table "tab_owner"."bbbbb_XBRL"
ORA-01403: no data found
ORA-31693: Table data object "tab_owner"."bbbbb_XBRL" failed to load/unload and is being skipped due to error:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-26062: Can not continue from previous errors.
Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "aaaaa"."SYS_IMPORT_TABLE_02" completed with 1 error(s) at 18:42:26 -
Export and import only table structure
Hi ,
I have two schemas, scott and scott2. The scott schema has tables, indexes and procedures, and the scott2 schema is completely empty.
Now I want the table structures, indexes and procedures from the scott schema in the scott2 schema. No data needed.
What is the command to export the table structures, indexes and procedures from the scott schema and import them into the scott2 schema?
Once this is done, I want the scott schema to have full access to the scott2 schema.
Oracle Database 10g Release 10.2.0.1.0 - 64bit Production
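A hedged sketch of what such a structure-only copy could look like with the classic utilities (untested; rows=n skips the data, schema names are from the question, and the grant step is only an illustration):

```
exp scott/tiger file=scott_ddl.dmp rows=n owner=scott
imp system/<password> file=scott_ddl.dmp fromuser=scott touser=scott2
# "Full access" afterwards is a separate per-object grant step, e.g.:
#   GRANT ALL ON scott2.<table> TO scott;
```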
Please help...
Pravin wrote:
I used rows=n
It is giving me the error below while importing the dump file:
IMP-00003: ORACLE error 604 encountered
ORA-00604: error occurred at recursive SQL level 1
ORA-01013: user requested cancel of current operation ^C
You are getting this error because you hit Ctrl-C during the import, which essentially cancels the import.
IMP-00017: following statement failed with ORACLE error 604:
"CREATE TABLE "INVESTMENT_DETAILS_BK210509" ("EMP_NO" VARCHAR2(15), "INFOTYP"
"E" VARCHAR2(10), "SBSEC" NUMBER(*,0), "SBDIV" NUMBER(*,0), "AMOUNT" NUMBER("
"*,0), "CREATE_DATE" DATE, "MODIFY_DATE" DATE, "FROM_DATE" DATE, "TO_DATE" D"
"ATE) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 STORAGE(INITIAL 6684672"
" FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "DMSTG" LOGG"
"ING NOCOMPRESS"
Srini -
Exporting and Importing a table in Lookout 6.0.2 using Office 2007
I am exporting a table to Excel 2007 from Lookout 6.0.2 so I can make changes and then import the table with the changes back into Lookout. I have done this in the past with Excel 2003 and have had no problems. When I export the table, the file that is created is an Excel 97-2003 workbook file, so after I make changes to the file in Excel 2007, I save it as an Excel 97-2003 workbook, because that is what it was initially and that is the only type of file Lookout sees when trying to do an import. The problem is, when I go to import this file I get an error saying 'cannot read file format'.
Any help would be greatly appreciated.
Thanks!
Jason
Jason Phillips
The Lookout DataTable object can only read worksheet files, but unfortunately Excel 2007 cannot save as a worksheet any more.
I have not found a way to use datatable with excel 2007 yet.
One workaround is to use ODBC. Refer to this article.
http://digital.ni.com/public.nsf/allkb/C8137C6BDA276982862566BA005C5D1F?OpenDocument
Ryan Shi
National Instruments -
Export and Import Of Tables having BLOB and Raw datatype
Hi Gurus,
I had to export one schema in one database and import to another schema in another database.
However, my current database contains RAW and BLOB datatypes. I have exported the whole database with the following command:
exp SYSTEM/manager FULL=y FILE=jbrms_full_19APR2013.dmp log=jbrms_full_19APR2013.log GRANTS=y ROWS=y
My question is whether all the tables with RAW and BLOB columns have been exported properly or not. I have done one more thing: after taking the export, I imported it into a local DB and checked that the number of rows in both environments is the same, as I have not tested with the application to confirm.
I am using Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production.
I am not able to attach the complete log file, but here is the portion for the schema jbrms, which has the BLOB and RAW datatypes.
Please let me know if you find any potential concerns with the export of the BLOB and RAW data.
. about to export JBRMS's tables via Conventional Path ...
. . exporting table FS_FSENTRY 8 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table FS_WS_DEFAULT_FSENTRY 2 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table PM_WS_DEFAULT_BINVAL 60 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table PM_WS_DEFAULT_BUNDLE 751 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table PM_WS_DEFAULT_NAMES 2 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table PM_WS_DEFAULT_REFS 4 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table VERSIONING_FS_FSENTRY 1 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table VERSIONING_PM_BINVAL 300 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table VERSIONING_PM_BUNDLE 11654 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table VERSIONING_PM_NAMES 2 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
. . exporting table VERSIONING_PM_REFS 1370 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
You could see the 'questionable statistics' warning for a couple of reasons. I don't remember them all, but:
1. The target and source character sets are different.
2. System-generated names (I think).
The best solution, if you don't need the exact statistics from your source database, would be to add
statistics=none
to your imp command and then regather statistics when the import is done.
Dean -
ECC Client export and import with BW data model
Dear BW Expert,
We are now have BW QAS system which is connecting to ECC600QAS.
Now we are going to:
First: export the ECC600QAS client data.
Second: drop this client.
Third: create a client with the same logical system name ECC600QAS and then import again. Of course we will re-establish the connection between ECC600QAS and BW QAS.
In this case, do we need to create all the data models or mapping data in BW again? Please help with this.
Thanks and Regards,
Kim Cheo.
Hi Reddy,
None of your notes mention the case of a client refresh with the same logical name.
Our case is that we do:
Step 1: Export the client with logical name "ECC600QAS".
Step 2: Drop client "ECC600QAS".
Step 3: Import again and create the same client logical name "ECC600QAS".
In this case, do we need to create again all the data models, mapping rules or transformations?
Really, I cannot find anything about this in any of your notes.
Thanks and regards,
Kim CheolYeon Lee. -
Dear
I have a table DEV_SSN of size 2 GB. I want to take a dump of this table. Can you give me the exp options that I need to use to speed up the process? This is on Oracle 8.1.7, and later I need to import the same into a different schema. It has constraints (primary key, foreign key, check, not null) and two indexes on it.
Please let me know the import options as well.
thanks and regards.
VIJAY
Hi,
The Oracle doc can help you.
Nicolas. -
Dear Guru's,
Does anybody know how to export EWAs from one SolMan to another?
I need historical data for SoDocA from another Solution Manager system.
Please advise!
Regards,
David
Hello,
I know it is possible to archive EWA reports to a third-party archive system.
Then it would simply be a matter of connecting the third-party archive system to the new Solution Manager to have access to historical EWA data.
SAP Note 546685 has more information on this setup.
Hope this helps.
Regards,
Paul