Data Manager - export/import
My question is as follows:
Export from: Oracle 8.1.7 (on Unix)
Import to: Oracle 8.0.5 (Windows 2000)
How should I set the parameters in the Data Manager to make all tables/procedures/triggers etc. come across complete and undamaged?
I have tried many things, but usually the procedures become invalid and no triggers come across.
I have used IMP80 for importing as well, with the same result.
Regards
Olav Torvund
[email protected]
This is exactly why everyone out there should not get too comfortable using OEM to manage their databases. If you know how to do a task using the command line, then by all means do it that way. Use OEM or DBA Studio for things you're not so sure of, or if you need a visual to help you do it right. The command line is quicker and more precise because you have total control of what info you're sending to the db.
Others may disagree with the advice I'm giving - which is to NOT use OEM if you know how to do the task via the command line - and further input is welcome. What do others think about this?
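For example, a command-line sketch for this exact case (connect strings and passwords are placeholders): since 8.0.5 cannot read a dump file written by the newer 8.1.7 exp, one common approach is to run the older 8.0.5 exp client against the 8.1.7 database over SQL*Net, then import with the 8.0.5 imp/IMP80. This is a sketch, not a definitive procedure - adjust to your environment.

```shell
# Write a parameter file for a full export (TRIGGERS=Y and CONSTRAINTS=Y are
# defaults, listed here only for emphasis):
cat > exp_full.par <<'EOF'
FULL=Y
GRANTS=Y
TRIGGERS=Y
CONSTRAINTS=Y
FILE=full.dmp
LOG=exp.log
EOF
# Run the 8.0.5 exp client against the 8.1.7 database ("unix_db" is a
# placeholder TNS alias), then import on the Windows 2000 box:
# exp system/manager@unix_db PARFILE=exp_full.par
# imp80 system/manager FULL=Y IGNORE=Y FILE=full.dmp LOG=imp.log
```

Procedures that arrive invalid can often simply be recompiled afterwards (ALTER PROCEDURE ... COMPILE), since they usually just need their dependencies resolved in the new database.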
Roy
Similar Messages
-
Brief about Console, data manager and Import manager
Hi,
I am new to MDM; we just started installing our MDM components, and I am an MM functional guy. Can anyone tell me about the Console, Data Manager and Import Manager? I need to know the following:
1. What is each of them?
2. How are the three interlinked?
3. Can the Data Manager be loaded directly, without an import?
4. Is there a standard way of creating a repository?
5. I need to do samples on vendor and customer harmonisation, spend analysis, and material master harmonisation.
I am asking too much in this forum, but I know you guys are there to help me out.
Thanks in advance,
Suresh
MDM Console
The MDM Console allows the system manager to administer and monitor the MDM Server, and to create, maintain the structure of, and control access to the MDM repositories themselves. Records are not entered or managed with the MDM Console, but rather with client apps such as the MDM Client.
MDM Client(data manager)
The MDM Client is the primary client of the MDM Server, and is the program most users see on their PCs. It allows users to store, manage and update master data consisting of text, images, and other rich content, and to create taxonomies, families, and relationships. Other clients include the Import Manager, the Image Manager, and the Syndicator.
Import Manager
Allows you to import master data from most types of flat or relational electronic source files (e.g. Excel, delimited text, SQL, XML, and any ODBC-compliant source), and to restructure, cleanse, and normalize master data as part of the import process.
<i>2. How the 3 is interlinked</i>
--->
Suppose you are sending vendor information to the MDM repository by sending an IDoc from R/3. Then the whole process goes as follows: the IDoc is sent to XI, and XI generates an XML file that contains the data. The Import Manager can then import this file and place the data in the repository.
If you want to send data out of the MDM repository, the process goes the reverse way: the Syndicator generates a flat/XML file, this file is given to XI, and XI generates an IDoc out of it and sends it to R/3. This whole process ensures data consistency.
For details refer this link
http://help.sap.com/saphelp_mdm550/helpdata/en/43/D7AED5058201B4E10000000A11466F/frameset.htm -
BPC 7.5 NW -- Data Manager Export: Fixed-Length Fields?
Hi Experts,
Client has a requirement that we export a batch of transaction data from BPC, which is generally no problem, but in this case they want to do it with Fixed-Length fields in the Export file.
I've done a lot of BPC Data Manager Imports with Fixed-Length fields, and that function really works well. But, has anyone tried to use the Export package with this option? It doesn't seem to be working in my case yet.
Any tips?
Thanks so much, as always,
Garrett
=======================
Update -- After going back to review the documentation, it looks like the *PAD() function is actually intended for export, not really for importing, which makes sense. ...The SAP online help library says that it's meant for import, but I now believe that is a typo.
Also, I've added the line "OUTPUTFORMAT = NORMAL" in my *OPTIONS section. Has anyone else managed to get export working on BPC 7.5 NW?
Edited by: Garrett Tedeman on Mar 3, 2011 1:37 PM
Update -- This problem may now be resolved.
I have been able to conduct test IMPORTs of 48,000, then 96,000 and then 1.7 million records. All were fine.
It turns out that the difference is that the text files were sorted by amount in the ones that failed, and sorted by GLAccount in column A in the ones that succeeded.
Edit: Yep, all files loaded normally when re-sorted by GLACCOUNT, etc. on the left-hand side. Apparently, when you're doing a lot of records that might confuse the system or something.
Edited by: Garrett Tedeman on Nov 18, 2010 11:41 AM -
SYSTEM Copy with Data ( SAP Export/Import way)
Hello ,
There is a requirement at my client site to build an SAP system copy from the production system without copying data, but it should have programs/structures/repository/tables & views etc.
We have thought of building the SAP system the export/import way, then deleting the client from the copied system, and after that running a remote client copy from the source to the target system with the SAP_CUST profile.
But I have heard that with the SAP export/import way we can skip the copy of SAP data and copy only the structure. If there is any such way, please help me by letting me know.
Thanks
Deepak Gosain
Hi Deepak,
Kindly refer to the SCN link on the difference between client copy export/import and the remote copy method:
Difference between remote client copy and client Import/Export- Remote client copy steps
BR
SS -
Pre Checks before running Data Pump Export/Import
Hi,
Oracle :-11.2
OS:- Windows
Kindly share the pre-checks required for data pump export and import which should be followed by a DBA.
Thanks
When you do a tablespace mode export, Data Pump is essentially doing a table mode export of all of the tables in the tablespaces mentioned. So if you have this:
tablespace a contains: table 1, table 2, index 3a (on table 3)
tablespace b contains: index 1a (on table 1), index 2a (on table 2), table 3
and if you expdp tablespaces=a ...
you will get table 1, table 2, index 1a, and index 2a.
My belief is that you will not get table 3 or index 3a. The way I understand the code to work is that you get the tables in the tablespaces you mention and their dependent objects, but not the other way around. You could easily verify this to make sure.
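One way to verify it (a sketch; credentials and names are hypothetical, and it assumes a usable DATA_PUMP_DIR directory object) is to run the tablespace-mode export and then use impdp with SQLFILE, which writes the DDL the import would run without actually importing anything:

```shell
# Parameter file for the tablespace-mode export:
cat > ts_a.par <<'EOF'
TABLESPACES=a
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=ts_a.dmp
LOGFILE=ts_a.log
EOF
# expdp system/pass PARFILE=ts_a.par
# Write out the DDL contained in the dump, without importing:
# impdp system/pass DIRECTORY=DATA_PUMP_DIR DUMPFILE=ts_a.dmp SQLFILE=ts_a_ddl.sql
# Then inspect ts_a_ddl.sql: tables 1 and 2 plus indexes 1a and 2a should be
# there; table 3 and index 3a should not.
```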
Dean -
Data Manager : Export records to excel
Hi MDM pals,
The Data Manager allows a user to select records in a repository and export them to an Excel file. Options exist to choose the fields and qualifiers to be exported. The output file will be a single Excel file with all records in distinct columns.
Having said that, I need to export the records in the main table and in the qualified lookup tables (with their respective qualifiers) into different Excel files. Please advise how to split them.
Need stated as below :
1) Excel file 1 : Main table (Customer)
2) Excel file 2 : Qualified Table 1 (Sales Data) - Should contain Sales records(non qualifiers and qualifiers) for each customer.
3) Excel file 3 : Qualified Table 1 (Pricing Data) - Should contain Price records(non qualifiers and Qualifiers) for each customer.
Regards,
Vinay M.S
Hi Vinay,
Yes this can be achieved but you need to perform the export three times.
1) Main table (Customer)
Go to File -> Export To -> Excel, select all the fields, this will simply export all the main table records. If you want to export selected records then either select them manually or perform the search and then go for export.
2) Excel file 2 : Qualified Table 1 (Sales Data) - Should contain Sales records(non qualifiers and qualifiers) for each customer.
Go to File -> Export To -> Excel and remove all the main table fields. Select the Customer Number field and the qualified lookup field Sales Data from the list of available fields and move them to Fields to Export. Once you move the Sales Data field, the Qualifiers check box is enabled; check it and you will get the list of qualifiers in the available fields. You can select all the qualifiers or only some of them, depending on what you want in the Excel file. Just click OK and check the output file.
3) Excel file 3 : Qualified Table 1 (Pricing Data) - Should contain Price records(non qualifiers and Qualifiers) for each customer.
Similar to 2
Regards,
Jitesh Talreja -
Data management - Export Data on a text file and work on Excel/Acces
Hello,
When I use the Data Export tool to retrieve all Accounts data, the CRM creates a txt file with more than 255 columns (number of record types by account). I can't import it into a new Access database because the Access import tool does not accept that many columns. So I tried to open it in Excel by importing the text file. However, some Accounts have a comment in a large text field and, when Excel converts the text file, the data is shifted relative to the column headers because of this specific "comments" record type.
Does anyone know whether it's possible to reduce the number of record types exported by selecting specific ones, or a process for importing into Excel/Access that resolves this positioning problem?
Regards,
Guillaume
Guillaume, you can create a historical account report, specify the fields you want and then download it to Excel. Also, if you use the export utility, which produces a CSV file with more than 255 columns, delete the columns you do not want to get it under 255 and then convert to Excel.
-
Oracle 10g - Data Pump: Export / Import of Sequences ?
Hello,
I'm new to this forum and also to Oracle (version 10g). Since I could not find an answer to my question, I am opening this post hoping to get some help from experienced users.
My question concerns the Data Pump Utility and what happens to sequences which were defined in the source database:
I have exported a schema with the following command:
"expdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp LOGFILE=logfile.log"
This worked fine and also the import seemed to work fine with the command:
"impdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp"
It loaded the exported objects directly into the schema of the target database.
BUT:
Something has happened to my sequences. :-(
When I want to use them, all sequences start again with value "1". Since I have already included data with higher values in my tables, I get into trouble with the PK of these tables because I used sequences sometimes as primary key.
My questions are:
1. Did I do something wrong with the Data Pump utility?
2. What is the correct way to export and import sequences so that they keep their actual values?
3. If the behaviour described here is correct, how can I correct the values so that the sequences continue from the last value used in the source database?
Thanks a lot in advance for any help concerning this topic!
Best regards
FireFighter
P.S.
It might be that my english sounds not perfect since it is not my native language. Sorry for that! ;-)
But I hope that someone can understand nevertheless. ;-)
<i>1. Did I do something wrong with the Data Pump utility?</i>
I do not think so. But maybe with the existing schema :-(
<i>2. What is the correct way to export and import sequences so that they keep their actual values?</i>
If the sequences exist in the target before the import, Oracle does not drop and recreate them. So you need to ensure that the sequences do not already exist in the target, or that the existing ones are dropped before the import.
<i>3. If the behaviour described here is correct, how can I correct the values so that the sequences continue from the last value used in the source database?</i>
You can either refresh with the import after the above correction, or drop and manually recreate the sequences to START WITH the next value of the source sequences.
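A minimal fix script for one sequence might look like this (sequence name and value are hypothetical; take the next value from DBA_SEQUENCES.LAST_NUMBER on the source, per sequence):

```shell
SEQ=MY_SEQ     # hypothetical sequence name
START=12345    # next value of the source sequence
# Generate a small SQL script that drops and recreates the sequence:
cat > fix_seq.sql <<EOF
DROP SEQUENCE ${SEQ};
CREATE SEQUENCE ${SEQ} START WITH ${START};
EOF
# Run it in the target schema:
# sqlplus scott/tiger @fix_seq.sql
```

Note that any other sequence options (CACHE, INCREMENT BY, etc.) from the source definition would need to be carried over as well.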
The easier way is to generate a script from the source if you know how to do it -
Solution Manager -- Export Import Project
Hi,
I want to import a project from SolMan 7.00 to SolMan 7.00 EHP1, but I can't find an option to do it.
As I know I have to create an Implementation Project in tx SOLAR_PROJECT_ADMIN and then import the implementation projects that already exists.
Any clue?
Hi Jorge,
In reply to your previous post you need to take the 3 files saved locally and copy them:
The K-file has to be copied into the /usr/sap/trans/cofiles directory, and the R- and I-files into /usr/sap/trans/data.
The documents are in a transport request dump that you can access in the normal manner.
With regards to the structure in SOLAR01, first make sure you are in the correct project by switching the project using the icon at the top.
At the export and import stage, make sure you have ticked documents and structure so this is transported across.
Finally check your transport request has completed successfully without errors.
Regards,
-Rohan -
ROWID - data type export import
Hi all,
I have a table which has a column of datatype ROWID. The size of the table is 5 GB. When I try to export, it takes 40 mins for this table, while importing takes 7 hrs to complete.
I feel it is taking more time than it is supposed to. Is it because of the ROWID data type? Is it doing something extra? Can someone throw some light on this?
Thanks in Advance,
Jaggyam
No, it will not be because of the ROWID column data type.
Probably because of:
- creation of indexes on the table.
- creation of constraints (PK/UK/FK/CHECK) on the table.
These actions take time, and do not happen when you export a table.
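If the index builds dominate the import time, one common approach (a sketch; table, user and file names are hypothetical) is to import the rows without indexes, extract the index DDL with imp's INDEXFILE parameter, and build the indexes afterwards in SQL*Plus:

```shell
# Pass 1: load the rows, skipping index creation:
cat > imp_data.par <<'EOF'
FILE=tab.dmp
TABLES=mytab
INDEXES=N
EOF
# Pass 2: write the index DDL to a script without importing any rows:
cat > imp_idx.par <<'EOF'
FILE=tab.dmp
TABLES=mytab
ROWS=N
INDEXFILE=mytab_idx.sql
EOF
# imp scott/tiger PARFILE=imp_data.par
# imp scott/tiger PARFILE=imp_idx.par
# sqlplus scott/tiger @mytab_idx.sql   # edit to add PARALLEL/NOLOGGING if desired
```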
Btw:
Are you aware that the rowid values in that particular column will (very likely, depending a bit on your case) have been rendered unusable after the import? -
Data transfer (Export & Import) of Delta Queue between two system.
We are planning to implement the following scenario for a certain R/3 and BW system.
1. Implement storage snap shot copy solution for R/3 database and create replicated system of production R/3 every day in order to avoid load of production R/3 to BW directly. That is to say, we will extract business data from the replicated R/3 system and forward to BW.
2. We need to utilize Delta Queue function in order to feed the only gap data between production R/3 and BW. but as stated in 1. the source data will be replicated from the production R/3 every day and so we could not keep the Delta Queue data in R/3.
3. So, we are examining the following method in order to keep the Delta Data in the replicated R/3 system. i,e after feeding business data from the replicated R/3 to BW, we will extract the following Delta Queue table and keep it.
- TRFCQOUT: client-dependent pointer table per queue name and destination
- ARFCSSTATE: link between TRFCQOUT and ARFCSDATA
- ARFCSDATA: compressed data for tRFC/qRFC
Then, the next day, after replicating the R/3 system, we will import that delta data into the replicated R/3 system before starting to extract business data and feed it to BW.
Is such a method realistic? If not, is there any alternative?
Please advise!
Best regards,
I found the following document, and this is the one I want to implement:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/5d51aa90-0201-0010-749e-d6b993c7a0d6
Additionally, I want to execute this scenario repeatedly. Does someone have experience with such an implementation?
Eguchi. -
10G data pumps export / import
Windows XP Pro SP2.
When I run expdp from the command line, I get this:
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-39087: directory name DATA_PUMP_DIR is invalid
What am I missing ?
Eric
Maybe you've missed some parameters? expdp always needs to have some sort of info on what you want to export, i.e.:
expdp hr/hr TABLES=employees,jobs DUMPFILE=dpump_dir1:table.dmp NOLOGFILE=y
or
expdp hr/hr PARFILE=exp.par
with parfile contents being:
DIRECTORY=dpump_dir1
DUMPFILE=dataonly.dmp
CONTENT=DATA_ONLY
EXCLUDE=TABLE:"IN ('COUNTRIES', 'LOCATIONS', 'REGIONS')"
QUERY=employees:"WHERE department_id !=50 ORDER BY employee_id"
These examples are from the documentation on Utilities, so you can look up all the possible parameters there. (ORA-39087 itself usually means that the DATA_PUMP_DIR directory object does not exist in the database, or that your user lacks READ/WRITE privileges on it; check DBA_DIRECTORIES.) -
Using export/import to migrate data from 8i to 9i
We are trying to migrate all data from 8i database to 9i database. We plan to migrate the data using export/import utility so that we can have the current 8i database intact. And also the 8i and 9i database will reside on the same machine. Our 8i database size is around 300GB.
We plan to follow below steps :
Export data from 8i
Install 9i
Create tablespaces
Create schema and tables
create user (user used for exporting data)
Import data in 9i
Please let me know if below par file is correct for the export :
BUFFER=560000
COMPRESS=y
CONSISTENT=y
CONSTRAINTS=y
DIRECT=y
FEEDBACK=1000
FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
FILESIZE=2048GB
FULL=y
GRANTS=y
INDEXES=y
LOG=export.log
OBJECT_CONSISTENT=y
PARFILE=exp.par
ROWS=y
STATISTICS=ESTIMATE
TRIGGERS=y
TTS_FULL_CHECK=TRUE
Thanks,
Vinod Bhansali
I recommend you change some parameters and remove others:
BUFFER=560000
COMPRESS=y          -- this gives a better storage structure (it is good)
CONSISTENT=y
CONSTRAINTS=y
DIRECT=n            -- if you set this parameter to yes you can have problems with some objects
FEEDBACK=1000
FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
FILESIZE=2048GB
FULL=y
GRANTS=y            -- this value is the default (it is not necessary)
INDEXES=y
LOG=export.log
OBJECT_CONSISTENT=y -- (start the database in restricted mode and do not set this param)
PARFILE=exp.par
ROWS=y
STATISTICS=ESTIMATE -- this value is the default (it is not necessary)
TRIGGERS=y          -- this value is the default (it is not necessary)
TTS_FULL_CHECK=TRUE
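Putting those suggestions together, the cleaned-up parameter file might look like this (a sketch: the PARFILE line is dropped since this file *is* the parfile, the redundant defaults are removed, and FILESIZE is shown here as 2 GB chunks as an example - adjust to your filesystem limits):

```shell
cat > exp.par <<'EOF'
BUFFER=560000
COMPRESS=Y
CONSISTENT=Y
CONSTRAINTS=Y
DIRECT=N
FEEDBACK=1000
FILE=dat1.dmp,dat2.dmp,dat3.dmp
FILESIZE=2048M
FULL=Y
INDEXES=Y
LOG=export.log
ROWS=Y
EOF
# exp system/manager PARFILE=exp.par
```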
you can see what parameters are not needed if you apply
this command:
[oracle@ozawa oracle]$ exp help=y
Export: Release 9.2.0.1.0 - Production on Sun Dec 28 16:37:37 2003
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
You can let Export prompt you for parameters by entering the EXP
command followed by your username/password:
Example: EXP SCOTT/TIGER
Or, you can control how Export runs by entering the EXP command followed
by various arguments. To specify parameters, you use keywords:
Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
USERID must be the first parameter on the command line.
Keyword Description (Default) Keyword Description (Default)
USERID username/password FULL export entire file (N)
BUFFER size of data buffer OWNER list of owner usernames
FILE output files (EXPDAT.DMP) TABLES list of table names
COMPRESS import into one extent (Y) RECORDLENGTH length of IO record
GRANTS export grants (Y) INCTYPE incremental export type
INDEXES export indexes (Y) RECORD track incr. export (Y)
DIRECT direct path (N) TRIGGERS export triggers (Y)
LOG log file of screen output STATISTICS analyze objects (ESTIMATE)
ROWS export data rows (Y) PARFILE parameter filename
CONSISTENT cross-table consistency(N) CONSTRAINTS export constraints (Y)
OBJECT_CONSISTENT transaction set to read only during object export (N)
FEEDBACK display progress every x rows (0)
FILESIZE maximum size of each dump file
FLASHBACK_SCN SCN used to set session snapshot back to
FLASHBACK_TIME time used to get the SCN closest to the specified time
QUERY select clause used to export a subset of a table
RESUMABLE suspend when a space related error is encountered(N)
RESUMABLE_NAME text string used to identify resumable statement
RESUMABLE_TIMEOUT wait time for RESUMABLE
TTS_FULL_CHECK perform full or partial dependency check for TTS
VOLSIZE number of bytes to write to each tape volume
TABLESPACES list of tablespaces to export
TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
TEMPLATE template name which invokes iAS mode export
Export terminated successfully without warnings.
[oracle@ozawa oracle]$
Joel Pérez -
Hi dears.
I am using Oracle 8.0.5. I have always used Data Manager for import and export.
Now I have installed Oracle 9i and 10g; in both of these versions I could not find Data Manager.
Can somebody help me with how to import and export in 9i or 10g?
Thanks
<i>Can somebody help me with how to import and export in 9i or 10g?</i>
For 9i/10g -- [http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/exp_imp.htm#i1023560]
For 10g and above --> [http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/part_dp.htm]
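For quick orientation, the command-line replacements for the old GUI export/import look roughly like this (schema name and passwords are hypothetical, and Data Pump additionally needs a directory object such as DATA_PUMP_DIR):

```shell
# Classic exp/imp (9i and 10g):
# exp scott/tiger FILE=scott.dmp OWNER=scott
# imp scott/tiger FILE=scott.dmp FROMUSER=scott TOUSER=scott

# Data Pump (10g and above):
# expdp scott/tiger DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott.dmp SCHEMAS=scott
# impdp scott/tiger DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott.dmp SCHEMAS=scott
```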
Anand -
How to set a variable value on the BPC Data Manager
Hello BPC Experts,
I'm creating a BPC10 NW version demo environment for our prospect customer.
I made a Data Manager package to import BW data into a BPC model from a BW cube.
I need to import just one month of data into BPC from a BW cube that has more than one month of data.
When I run the data package manually, I can select a member of the time dimension prompt and import the specific month of data I want.
But in the case where the data package runs as a monthly JOB,
I can't select a member of the time dimension prompt manually.
So I want to know how to set a variable value for the time dimension prompt automatically, e.g. from the system date.
Is there any way to set a variable value for the time dimension prompt in the Data Manager automatically from the system date?
Or do you have any other solution for importing just one month of data into BPC from a BW cube that has more than one month of data, with the Data Manager package running as a JOB
(without selecting a member of the time dimension prompt of the data package manually)?
Thanks in advance,
Keisuke
Hi Gersh,
Sorry for my late reply, and thanks for your helpful information.
I tried the second approach from your information and was able to configure it.
And I'll try the first approach as well.
Regards,
Keisuke