Data-Export/Import
Hi,
I have exported my data via SQL Developer. The file was saved as an LDR file. In the file I find:
LOAD DATA
INFILE *
TRUNCATE
INTO TABLE "CONTRL"
FIELDS TERMINATED BY '|'
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS (
DATEINAME ,
DATEITYP ,
PFAD ,
TIMESTMP timestamp "DD.MM.RR" )
begindata
20091001_043629_0000786715|CONTRL|c:\TestCON715.txt|01.10.09 04:34:00,000000000|
Now I am trying to import this file using SQL*Loader, but I cannot get it to work, even on the same computer.
In the log file I find ORA-01830, and even when I try conversions like:
TIMESTMP timestamp "MM-DD-YY hh24:mi:ss"
TIMESTMP to_date(:TIMESTMP "MM-DD-YY hh24:mi:ss")
Or… I am getting the same ORA-01830 every time!
How can I solve this problem? And why does the export function save a file that cannot be imported on the same computer?
Thanks
Kef
Look at this test:
SQL> create table test (t timestamp);
Table created.
SQL> insert into test values (
2 to_timestamp('01.10.09 04:34:00,000000000','mm.dd.rr'));
to_timestamp('01.10.09 04:34:00,000000000','mm.dd.rr'))
ERROR at line 2:
ORA-01830: date format picture ends before converting entire input string
SQL> insert into test values (
2 to_timestamp('01.10.09 04:34:00,000000000','mm.dd.rr hh24:mi:ss'));
to_timestamp('01.10.09 04:34:00,000000000','mm.dd.rr hh24:mi:ss'))
ERROR at line 2:
ORA-01830: date format picture ends before converting entire input string
SQL> insert into test values (
2 to_timestamp('01.10.09 04:34:00,000000000','mm.dd.rr hh24:mi:ss,ff9'));
1 row created.
So I think you must change
TIMESTMP timestamp "DD.MM.RR" )
to
TIMESTMP timestamp "mm.dd.rr hh24:mi:ss,ff9" )
Max
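For reference, the corrected control file would then read as follows (a sketch; note that the data was exported with a day-first DD.MM.RR mask, so keeping dd.mm order in the mask avoids silently swapping day and month):

```text
LOAD DATA
INFILE *
TRUNCATE
INTO TABLE "CONTRL"
FIELDS TERMINATED BY '|'
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS (
  DATEINAME,
  DATEITYP,
  PFAD,
  -- extended mask: time elements plus FF9 for the nine fractional digits,
  -- with the comma as the decimal separator, matching the data below
  TIMESTMP timestamp "dd.mm.rr hh24:mi:ss,ff9"
)
begindata
20091001_043629_0000786715|CONTRL|c:\TestCON715.txt|01.10.09 04:34:00,000000000|
```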
[My Italian Oracle blog|http://oracleitalia.wordpress.com/2010/01/31/le-direttive-di-compilazione-pragma/]
Similar Messages
-
Data export/import , different OS's
HI,
I have to export data from a 10.2.0.1 database on Windows and import that data into a 9.2.0.6 database on IBM AIX.
How can I perform the data export/import between the different operating systems?
Kindly suggest
Thanks
SK
You can use exp from your AIX machine, since it is 9i it will work as a 9i client:
exp username/password@your_10gdb_ip_address/10gdb_instance_name file='your AIX path to store the dump' .......
e.g.: exp hr/[email protected]/orcl file='xxx' .........
Or, if you have a tnsnames entry configured for the 10g DB on the AIX 9i host, use:
exp user/pass@tnsname file='xxxx'.....
e.g.: exp hr/hr@10gdb_on_win file='xxxxx' ........
Then import it on AIX with:
imp file='path of the dump residing on AIX' ..................
-
XML Multiple Data Export/Import into New Form
Is it possible to import an XML file with multiple form data (exported via "Merge data files into spreadsheet" option, then saved as XML) into a "template" form and create individual forms from the multiple data sheet? In other words, I've merged 65 forms' data into an XML file. Now I'd like to import it all back into an updated form.
What I've been doing now is exporting the XML data individually for each form and importing each form individually into the new form.
One option is to extend rights so users can import and export themselves, but I'm still looking into the FormRouter service, which, if implemented, won't happen for a while.
Any solutions to this painful process?
Thanks - Derek
I just realized this may be a question for a different forum... Acrobat... My apologies.
Hi Derek,
Without the LC Enterprise server product(s) I don't think you will be able to achieve this. Acrobat.com gives a mechanism for distributing the form, and I am fairly sure it will allow you to view the responses in a new form.
Also, applying Reader extensions to the form with Acrobat will not help, as this removes the ability to import/export XML. See https://acrobat.com/#d=3lGJZAZuOmk8h86HCWyJKg. If you are extending rights with LC Reader Extensions ES then this restriction should not apply.
If you have the 65 XML responses, I would be inclined to bite the bullet and manually import the XML into the new form.
Good luck,
Niall -
About Oracle 10g Data export/import
I have two DBs, let's call them AProd, BAcpt.
all tables/procedures/sequences etc. are defined same, only schema names are different, and the role names are different (e.g. In AProd, role 1 named as AProd_ROLE1, but corresponding role in BAcpt is named as BAcpt_ROLE1).
In order to use Aprod data to refresh BAcpt, in OEM, I established a DB link from BAcpt to Aprod and planned to use "Import from Database" option.
Option 1: By selecting "Schema" -> "Data Only", Import content type =" Import all objects", I am able to refresh BAcpt's tables with AProd's data, however, I can't refresh Sequences, especially the last_number of the sequences.
Option 2: By selecting "Schema" -> "All", Import content type = "Import all objects". I am afraid this will overwrite all the roles/privileges of BAcpt, and I would have to run scripts to recreate BAcpt's roles and grant them to the users again...
Is there any better solution for this request?
Thanks,
Thank you first.
I dropped and recreated the sequences on the target DB using the source DB's definitions. Extending the original question a bit: if a sequence doesn't exist on the target, I guess Option 1 won't bring over new sequences/procedures from the source DB either, correct?
Also, for Option 1, it seems the indexes are also updated. Is there a document I can refer to for details? I thought that after importing the data I would need to "rebuild indexes online"... is that still true?
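As for carrying sequence values over, one manual approach is to read the source's current high-water mark over the DB link and recreate the sequence on the target (a sketch; the link name aprod_link, the schema names, the sequence name, and the value 12345 are all placeholders):

```sql
-- Read the source sequence's current high-water mark over the DB link
SELECT last_number
  FROM all_sequences@aprod_link
 WHERE sequence_owner = 'APROD_SCHEMA'
   AND sequence_name  = 'MY_SEQ';

-- Recreate the target sequence starting past that value
DROP SEQUENCE bacpt_schema.my_seq;
CREATE SEQUENCE bacpt_schema.my_seq
  START WITH 12345   -- substitute the LAST_NUMBER fetched above
  INCREMENT BY 1;
```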
Thank you for any answers:-) -
Usage of export , import tools in SAP 4.6C
Hi Gurus,
Can you please guide me on how to do SAP data export and import in version 4.6C?
Kindly point me to how-to documents and any related blogs on this.
Siva
Hi Vankan,
Your question is quite broad and needs to be narrowed down a bit.
Could you please be more specific? Which platform are we talking about? DB, OS, etc?
Which kind of export/import do you want to carry out?: e.g.: client transport, client copy, homogeneous system copy, heterogeneous system copy...
The technique you will use for that depends on what you want to do and on the platform you are working on. -
Help on export sybase iq tables with data and import in another database ?
Hi Nilesh,
If you have the table/index create commands (DDLs), you can create the objects on the target database and import the data using one of the methods below:
Extract / load table
Insert location method: requires the IQ servers to be entered in the interfaces file
Backup/restore: copies the entire database content
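The insert location method, for example, pulls rows directly from the remote IQ server (a sketch; the server name src_iq.src_db and the table names are hypothetical, and the remote server must be listed in the interfaces file as noted above):

```sql
-- Run on the target IQ database; 'src_iq.src_db' must resolve via the interfaces file
INSERT INTO target_owner.target_table
LOCATION 'src_iq.src_db'
{ SELECT * FROM source_owner.source_table };
```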
If you do not have the DDLs, you can generate them using IQ Cockpit or SCC.
http://infocenter.sybase.com/help/topic/com.sybase.infocenter.dc01773.1604/doc/html/san1288042631955.html
http://infocenter.sybase.com/help/topic/com.sybase.infocenter.dc01840.1604/doc/html/san1281564927196.html
Regards,
Tayeb. -
Error message when importing data using Import and export wizard
I get the error message below when importing data using the Import and Export Wizard:
Error 0xc0202009: Data Flow Task 1: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
Messages
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80004005 Description: "Could not allocate a new page for database REPORTING' because of insufficient disk space in filegroup 'PRIMARY'.
Create the necessary space by dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.".
(SQL Server Import and Export Wizard)
Error 0xc0209029: Data Flow Task 1: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Destination - Buyer_.Inputs[Destination Input]" failed because error code 0xC020907B occurred, and the error row disposition on "Destination
- Buyer_First_Qtr.Inputs[Destination Input]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
Error 0xc0047022: Data Flow Task 1: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Destination - Buyer" (28) failed with error code 0xC0209029 while processing input "Destination Input" (41). The
identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information
about the failure.
(SQL Server Import and Export Wizard)
Error 0xc02020c4: Data Flow Task 1: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
(SQL Server Import and Export Wizard)
Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Source - Buyer_First_Qtr returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput().
The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
Smash126
Hi Smash126,
Based on the error message "Could not allocate a new page for database 'REPORTING' because of insufficient disk space in filegroup 'PRIMARY'. Create the necessary space by dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup", we can see that the issue is caused by insufficient disk space in the PRIMARY filegroup of the REPORTING database.
To fix this issue, either add a new file to the PRIMARY filegroup on the database's Files page, or set Autogrowth on for the existing files in the filegroup to provide the necessary space.
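Either fix can also be applied with ALTER DATABASE (a sketch; the logical file names, path, and sizes are placeholders to adapt to your environment):

```sql
-- Option 1: add a new data file to the PRIMARY filegroup
ALTER DATABASE REPORTING
ADD FILE (
    NAME = Reporting_Data2,
    FILENAME = 'D:\SQLData\Reporting_Data2.ndf',
    SIZE = 512MB,
    FILEGROWTH = 128MB
) TO FILEGROUP [PRIMARY];

-- Option 2: enable autogrowth on an existing file
ALTER DATABASE REPORTING
MODIFY FILE (NAME = Reporting_Data, FILEGROWTH = 128MB, MAXSIZE = UNLIMITED);
```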
The following document about Add Data or Log Files to a Database is for your reference:
http://msdn.microsoft.com/en-us/library/ms189253.aspx
If there are any other questions, please feel free to ask.
Thanks,
Katherine Xiong
TechNet Community Support -
Using export/import to migrate data from 8i to 9i
We are trying to migrate all data from an 8i database to a 9i database. We plan to migrate using the export/import utilities so that the current 8i database stays intact. The 8i and 9i databases will reside on the same machine. Our 8i database is around 300GB.
We plan to follow below steps :
Export data from 8i
Install 9i
Create tablespaces
Create schema and tables
create user (user used for exporting data)
Import data in 9i
Please let me know if the par file below is correct for the export:
BUFFER=560000
COMPRESS=y
CONSISTENT=y
CONSTRAINTS=y
DIRECT=y
FEEDBACK=1000
FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
FILESIZE=2048GB
FULL=y
GRANTS=y
INDEXES=y
LOG=export.log
OBJECT_CONSISTENT=y
PARFILE=exp.par
ROWS=y
STATISTICS=ESTIMATE
TRIGGERS=y
TTS_FULL_CHECK=TRUE
Thanks,
Vinod Bhansali
I recommend changing some parameters and removing others:
BUFFER=560000
COMPRESS=y          -- this gives a better storage structure (it is good)
CONSISTENT=y
CONSTRAINTS=y
DIRECT=n            -- if you set this parameter to yes you can have problems with some objects
FEEDBACK=1000
FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
FILESIZE=2048GB
FULL=y
GRANTS=y            -- this value is the default (it is not necessary)
INDEXES=y
LOG=export.log
OBJECT_CONSISTENT=y -- start the database in restricted mode and do not set this parameter
PARFILE=exp.par
ROWS=y
STATISTICS=ESTIMATE -- this value is the default (it is not necessary)
TRIGGERS=y          -- this value is the default (it is not necessary)
TTS_FULL_CHECK=TRUE
You can see which parameters are not needed by running this command:
[oracle@ozawa oracle]$ exp help=y
Export: Release 9.2.0.1.0 - Production on Sun Dec 28 16:37:37 2003
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
You can let Export prompt you for parameters by entering the EXP
command followed by your username/password:
Example: EXP SCOTT/TIGER
Or, you can control how Export runs by entering the EXP command followed
by various arguments. To specify parameters, you use keywords:
Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
USERID must be the first parameter on the command line.
Keyword Description (Default) Keyword Description (Default)
USERID username/password FULL export entire file (N)
BUFFER size of data buffer OWNER list of owner usernames
FILE output files (EXPDAT.DMP) TABLES list of table names
COMPRESS import into one extent (Y) RECORDLENGTH length of IO record
GRANTS export grants (Y) INCTYPE incremental export type
INDEXES export indexes (Y) RECORD track incr. export (Y)
DIRECT direct path (N) TRIGGERS export triggers (Y)
LOG log file of screen output STATISTICS analyze objects (ESTIMATE)
ROWS export data rows (Y) PARFILE parameter filename
CONSISTENT cross-table consistency(N) CONSTRAINTS export constraints (Y)
OBJECT_CONSISTENT transaction set to read only during object export (N)
FEEDBACK display progress every x rows (0)
FILESIZE maximum size of each dump file
FLASHBACK_SCN SCN used to set session snapshot back to
FLASHBACK_TIME time used to get the SCN closest to the specified time
QUERY select clause used to export a subset of a table
RESUMABLE suspend when a space related error is encountered(N)
RESUMABLE_NAME text string used to identify resumable statement
RESUMABLE_TIMEOUT wait time for RESUMABLE
TTS_FULL_CHECK perform full or partial dependency check for TTS
VOLSIZE number of bytes to write to each tape volume
TABLESPACES list of tablespaces to export
TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
TEMPLATE template name which invokes iAS mode export
Export terminated successfully without warnings.
[oracle@ozawa oracle]$
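Putting those recommendations together, the trimmed parameter file might look like this (a sketch built only from the parameters above; note that the PARFILE line itself belongs on the exp command line, not inside the file):

```text
BUFFER=560000
COMPRESS=y
CONSISTENT=y
CONSTRAINTS=y
DIRECT=n
FEEDBACK=1000
FILE=dat1.dmp, dat2.dmp, dat3.dmp
FILESIZE=2048GB
FULL=y
INDEXES=y
LOG=export.log
ROWS=y
TTS_FULL_CHECK=TRUE
```

It would then be invoked as: exp system/password parfile=exp.par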
Joel Pérez -
Export & Import data in Oracle (Urgent)
I just wonder whether Oracle 8i has the 'Export & Import data' feature in their DBA Administration tool.
Inside DBA Studio I found an option to export/import data to a text file, but we must connect to Oracle Management Server (OMS) before we can use that feature. I found the same feature in Oracle 7.3.3 in Oracle Data Manager.
How can I make sure that I have an OMS installed on my server? (I purchased Oracle 8i Standard Edition; does it include OMS?)
Can we export from a table in database A to a table in database B, or can we only do this through a dump file?
With every installation of an Oracle DB you get the exp(ort) and imp(ort) utilities. You can use them to move data from one user to another.
Run them from the DOS prompt like:
exp parfile=db_out.par
imp parfile=db_in.par
with db_out.par=
file=db.dmp
log= db_out.log
userid=system/?????
owner=???
constraints=y
direct=n
buffer=0
feedback=100
and db_in.par=
file=db.dmp
log= db_in.log
userid=system/???
touser=??
fromuser=???
constraints=y
commit=y
feedback=100
-
Java exception: Planning Data Form Import/Export Utility: FormDefUtil.sh
Hi,
We have the following in our environment
Oracle 10gAS (10.1.3.1.0)
Shared Services (9.3.1.0.11)
Essbase Server (9.3.1.3.01)
Essbase Admin Services (9.3.1.0.11)
Provider Services (9.3.1.3.00)
Planning (9.3.1.1.10)
Financial Reporting + Analysis UI Services (9.3.1.2)
I got the following error while using the Planning Data Form Import/Export Utility. Does anyone have any idea?
hypuser@server01>$PLANNING_HOME/bin/FormDefUtil.sh import TEST.xml localhost admin password SamApp
[May 6, 2009 6:25:11 PM]: Intializing System Caches...
[May 6, 2009 6:25:11 PM]: Loading Application Properties...
[May 6, 2009 6:25:11 PM]: Looking for applications for INSTANCE: []
[May 6, 2009 6:25:13 PM]: The polling interval is set =10000
Arbor path retrieved: /home/hypuser/Hyperion/common/EssbaseRTC/9.3.1
[May 6, 2009 6:25:14 PM]: Setting ARBORPATH=/home/hypuser/Hyperion/common/EssbaseRTC/9.3.1
Old PATH: /home/hypuser/Hyperion/common/JRE/IBM/1.5.0/bin:/home/hypuser/Hyperion/common/JRE/IBM/1.5.0/bin:/home/hypuser/Hyperion/common/JRE/IBM/1.5.0/bin:/home/hypuser/Hyperion/common/JRE/IBM/1.5.0/bin:/home/hypuser/Hyperion/AnalyticServices/bin:/home/hypuser/Hyperion/common/JRE-64/IBM/1.5.0/bin:/usr/bin:/etc:/usr/sbin:/usr/ucb:/home/hypuser/bin:/usr/bin/X11:/sbin:.
[May 6, 2009 6:25:14 PM]: Old PATH: /home/hypuser/Hyperion/common/JRE/IBM/1.5.0/bin:/home/hypuser/Hyperion/common/JRE/IBM/1.5.0/bin:/home/hypuser/Hyperion/common/JRE/IBM/1.5.0/bin:/home/hypuser/Hyperion/common/JRE/IBM/1.5.0/bin:/home/hypuser/Hyperion/AnalyticServices/bin:/home/hypuser/Hyperion/common/JRE-64/IBM/1.5.0/bin:/usr/bin:/etc:/usr/sbin:/usr/ucb:/home/hypuser/bin:/usr/bin/X11:/sbin:.
java.lang.UnsupportedOperationException
at com.hyperion.planning.olap.HspEssbaseEnv.addEssRTCtoPath(Native Method)
at com.hyperion.planning.olap.HspEssbaseEnv.init(Unknown Source)
at com.hyperion.planning.olap.HspEssbaseJniOlap.<clinit>(Unknown Source)
at java.lang.J9VMInternals.initializeImpl(Native Method)
at java.lang.J9VMInternals.initialize(J9VMInternals.java:187)
at com.hyperion.planning.HspJSImpl.createOLAP(Unknown Source)
at com.hyperion.planning.HspJSImpl.<init>(Unknown Source)
at com.hyperion.planning.HspJSHomeImpl.createHspJS(Unknown Source)
at com.hyperion.planning.HspJSHomeImpl.getHspJSByApp(Unknown Source)
at com.hyperion.planning.HyperionPlanningBean.Login(Unknown Source)
at com.hyperion.planning.HyperionPlanningBean.Login(Unknown Source)
at com.hyperion.planning.utils.HspFormDefUtil.main(Unknown Source)
Setting Arbor path to: /home/hypuser/Hyperion/common/EssbaseRTC/9.3.1
[May 6, 2009 6:25:15 PM]: MAX_DETAIL_CACHE_SIZE = 20 MB.
[May 6, 2009 6:25:15 PM]: bytesPerSubCache = 5654 bytes
[May 6, 2009 6:25:15 PM]: MAX_NUM_DETAIL_CACHES = 3537
Setting HBR Mode to: 2
Unable to find 'HBRServer.properties' in the classpath
HBR Configuration has not been initialized. Make sure you have logged in sucessfully and there are no exceptions in the HBR log file.
java.lang.Exception: HBR Configuration has not been initialized. Make sure you have logged in sucessfully and there are no exceptions in the HBR log file.
HBRServer.properties:HBR.embedded_timeout=10
HBR Configuration has not been initialized. Make sure you have logged in sucessfully and there are no exceptions in the HBR log file.
java.lang.ExceptionInInitializerError
at java.lang.J9VMInternals.initialize(J9VMInternals.java:205)
at com.hyperion.hbr.api.thin.HBR.<init>(Unknown Source)
at com.hyperion.hbr.api.thin.HBR.<init>(Unknown Source)
at com.hyperion.planning.db.HspFMDBImpl.initHBR(Unknown Source)
at com.hyperion.planning.db.HspFMDBImpl.initializeDB(Unknown Source)
at com.hyperion.planning.HspJSImpl.createDBs(Unknown Source)
at com.hyperion.planning.HspJSImpl.<init>(Unknown Source)
at com.hyperion.planning.HspJSHomeImpl.createHspJS(Unknown Source)
at com.hyperion.planning.HspJSHomeImpl.getHspJSByApp(Unknown Source)
at com.hyperion.planning.HyperionPlanningBean.Login(Unknown Source)
at com.hyperion.planning.HyperionPlanningBean.Login(Unknown Source)
at com.hyperion.planning.utils.HspFormDefUtil.main(Unknown Source)
Caused by: Exception HBR Configuration has not been initialized. Make sure you have logged in sucessfully and there are no exceptions in the HBR log file.
ClassName: java.lang.Exception
at com.hyperion.hbr.common.ConfigurationManager.getServerConfigProps(Unknown Source)
at com.hyperion.hbr.cache.CacheManager.<clinit>(Unknown Source)
at java.lang.J9VMInternals.initializeImpl(Native Method)
at java.lang.J9VMInternals.initialize(J9VMInternals.java:187)
... 11 more
[May 6, 2009 6:25:15 PM]: Regeneration of Member Fields Complete
[May 6, 2009 6:25:16 PM]: Thread main acquired connection com.hyperion.planning.olap.HspEssConnection@5f0e5f0e
[May 6, 2009 6:25:16 PM]: Thread main releasing connection com.hyperion.planning.olap.HspEssConnection@5f0e5f0e
[May 6, 2009 6:25:16 PM]: Thread main released connection com.hyperion.planning.olap.HspEssConnection@5f0e5f0e
[May 6, 2009 6:25:16 PM]: Need to create an Object. pool size = 0 creatredObjs = 1
java.lang.RuntimeException: Unable to aquire activity lease on activity 1 as the activity is currently leased by another server.
at com.hyperion.planning.sql.actions.HspAquireActivityLeaseCustomAction.custom(Unknown Source)
at com.hyperion.planning.sql.actions.HspAction.custom(Unknown Source)
at com.hyperion.planning.sql.actions.HspActionSet.doActions(Unknown Source)
at com.hyperion.planning.sql.actions.HspActionSet.doActions(Unknown Source)
at com.hyperion.planning.HspJSImpl.aquireActivityLease(Unknown Source)
at com.hyperion.planning.HspJSImpl.reaquireActivityLease(Unknown Source)
at com.hyperion.planning.utils.HspTaskListAlertNotifier.reaquireTaskListActivityLease(Unknown Source)
at com.hyperion.planning.utils.HspTaskListAlertNotifier.processTaskListAlerts(Unknown Source)
at com.hyperion.planning.utils.HspTaskListAlertNotifier.run(Unknown Source)
[May 6, 2009 6:25:16 PM]: Fetching roles list for user took time: Total: 42
[May 6, 2009 6:25:16 PM]: Entering method saveUserIntoPlanning
[May 6, 2009 6:25:16 PM]: User role is:0
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HP:0005,ou=HP,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:1,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:9,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:3,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:7,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:14,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:15,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:10,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:12,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:13,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:1,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:9,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:3,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:7,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:14,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:15,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:10,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:12,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Skipping unused HUB role: native://DN=cn=HUB:13,ou=HUB,ou=Roles,dc=css,dc=hyperion,dc=com?ROLE
[May 6, 2009 6:25:16 PM]: Hub Roles for user is:991
[May 6, 2009 6:25:16 PM]: Exiting method saveUserIntoPlanning
[May 6, 2009 6:25:16 PM]: Saved the user admin to Planning
[May 6, 2009 6:25:16 PM]: Entering method persistUserChanges()
[May 6, 2009 6:25:16 PM]: Exiting method persistUserChanges()
[May 6, 2009 6:25:16 PM]: Before calling getGroupsList for user from CSS
[May 6, 2009 6:25:16 PM]: After getGroupsList call returned from CAS with groupsList [Ljava.lang.String;@705c705c
[May 6, 2009 6:25:16 PM]: Fetching groups list for user took time: Total: 4
[May 6, 2009 6:25:16 PM]: Entering method persistGroupChanges()
[May 6, 2009 6:25:16 PM]: Exiting method persistGroupChanges()
[May 6, 2009 6:25:16 PM]: User synchronization of 1 user elapsed time: 81, Users: 72, Groups: 9.
[May 6, 2009 6:25:16 PM]: Didnt add child Forms, couldnt find parent 1
Add/Update form under form folder - Corporate
[May 6, 2009 6:25:21 PM]: Propegating external event[ FROM_ID: 7a7ebc1d Class: class com.hyperion.planning.sql.HspObject Object Type: -1 Primary Key: 1454699 ]
[May 6, 2009 6:25:21 PM]: Propegating external event[ FROM_ID: 7a7ebc1d Class: class com.hyperion.planning.sql.HspPDFPrintOptions Object Type: -1 Primary Key: 1454699 ]
[May 6, 2009 6:25:21 PM]: Propegating external event[ FROM_ID: 7a7ebc1d Class: class com.hyperion.planning.sql.HspForm Object Type: 7 Primary Key: 1454699 ]
[May 6, 2009 6:25:21 PM]: Propegating external event[ FROM_ID: 7a7ebc1d Class: class com.hyperion.planning.sql.HspPDFPrintOptions Object Type: -1 Primary Key: 1454699 ]
[May 6, 2009 6:25:21 PM]: Propegating external event[ FROM_ID: 7a7ebc1d Class: class com.hyperion.planning.sql.HspAnnotation Object Type: 14 Primary Key: 1454699 ]
[May 6, 2009 6:25:21 PM]: Propegating external event[ FROM_ID: 7a7ebc1d Class: class com.hyperion.planning.sql.HspAccessControl Object Type: 15 Primary Key: 50001,1454699 ]
[May 6, 2009 6:25:21 PM]: Propegating external event[ FROM_ID: 7a7ebc1d Class: class com.hyperion.planning.sql.HspFormDef Object Type: -1 Primary Key: 1454699 ]
Form imported complete.
[May 6, 2009 6:25:21 PM]: Could not get HBR connection.
Hi,
When I run the FormDefUtil command, the forms are imported successfully, but I get the following message:
Could not get HBR connection.
What does it mean?
Thanks & Regards,
Sravan Kumar. -
Export / import tablespace with all objects (datas, users, roles)
Hi, I have a question about exporting/importing a tablespace.
On the one hand, i have a database 10g (A) and on the other hand, a database 11g (B).
On A there is a tablespace called PRO.
Furthermore 3 Users:
PRO_Main - contains the data - tablespace PRO
PRO_Users1 with a role PRO_UROLE - Tablespace PRO
PRO_Users2 with a role PRO_UROLE - Tablespace PRO
Now I want to transfer the whole tablespace PRO (including the users PRO_MAIN, PRO_USER1, PRO_USER2 and the role PRO_UROLE) from A to B.
On B, I 've created the user PRO_Main and the tablespace PRO.
On A, I execute the following statement:
expdp PRO_Main/XXX TABLESPACES=PRO DIRECTORY=backup_datapump DUMPFILE=TSpro.dmp LOGFILE=TSpro.log
On B:
impdp PRO_Main/XXX TABLESPACES=PRO DIRECTORY=backup_datapump DUMPFILE=TSpro.dmp LOGFILE=TSpro.log
Result:
The user PRO_Main was imported with all its data.
But I'm missing PRO_USER1, PRO_USER2 and the role PRO_UROLE...
I assume I've used the wrong parameters in my expdp and/or impdp.
It would be nice, if anybody can give me a hint.
Thanks in advance.
Best Regards,
Frank
When you do a TABLESPACE mode export by specifying just the tablespaces, all that gets exported are the tables and their dependent objects. The users, roles, and the tablespace definitions themselves are not exported.
When you do a SCHEMA mode export by specifying the schemas, you will get the schema definitions (if the schema running the export is privileged) and all of the objects that each schema owns. A schema does not own roles or tablespace definitions.
In your case, you want to move:
1. schemas - one of which you have already created on your target database
2. roles
3. everything in the tablespaces, owned by multiple schemas.
There is no single export/import command that will do all of this. This is how I would do it:
1. Move the schema definitions:
a. you can either create these manually, or
b1. expdp schemas=<your list of schemas> include=USER
b2. impdp the dump file produced in b1
2. Move the roles:
expdp full=y include=ROLE ...
Remember, this will include all roles. If you want to limit what gets exported, use:
include=ROLE:"IN ('ROLE1', 'ROLE2', ...)"
Then impdp the roles just exported.
3. move the user information
a. If you want to move all of the schema's objects like functions, packages, etc, then you need to use a schema mode
export
expdp user/password schemas=a,b,c ...
b. If you want to move only the objects in those tablespaces, then use the tablespace export
expdp user/password tablespaces=tbs1, tbs2, ...
c. import the dumpfile generated in step 3
impdp user/password ...
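Concretely, using the names from this thread, the three passes might be sketched like this (the system/pw credentials and dump file names are placeholders; the INCLUDE filters are often easiest to keep in a parfile to avoid shell quoting issues):

```shell
# 1. Schema definitions only
expdp system/pw schemas=PRO_MAIN,PRO_USER1,PRO_USER2 include=USER \
      directory=backup_datapump dumpfile=users.dmp logfile=users.log
impdp system/pw directory=backup_datapump dumpfile=users.dmp

# 2. Roles (full mode, filtered to the role you need; note the shell escaping)
expdp system/pw full=y include=ROLE:\"IN \(\'PRO_UROLE\'\)\" \
      directory=backup_datapump dumpfile=roles.dmp logfile=roles.log
impdp system/pw directory=backup_datapump dumpfile=roles.dmp

# 3. Schema contents
expdp system/pw schemas=PRO_MAIN,PRO_USER1,PRO_USER2 \
      directory=backup_datapump dumpfile=data.dmp logfile=data.log
impdp system/pw directory=backup_datapump dumpfile=data.dmp
```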
Hope this helps.
Dean -
Procedure Data Export and Import from One server to another server
Dear all,
I want to transfer all my production server data into the quality server or development server. How can I transfer the data, and what is the procedure for transferring data to a different server? Also, which server do you prefer for storing the production data? Please help me out as soon as possible.
Thanks in Advance
Keyur Chauhan.
That is possible through a remote client copy or a client export/import, depending on the size of the data you want to transfer.
http://www.sap-basis-abap.com/bc/client-copy-from-production-to-quality-server-to.htm.
http://help.sap.com/saphelp_erp2004/helpdata/en/69/c24c4e4ba111d189750000e8322d00/content.htm
Please go through these links.
I would prefer to transfer the data to the QA system so that all user testing can be done easily.
Regards,
Subhash -
Materialized View with "error in exporting/importing data"
My system is 10g R2 on AIX (dev). When I impdp a dump from another box, also 10g R2, the import log shows an error about the materialized view:
ORA-31693: Table data object "BAANDB"."MV_EMPHORA" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
Desc mv_emphora
Name Null? Type
C_RID ROWID
P_RID ROWID
T$CWOC NOT NULL CHAR(6)
T$EMNO NOT NULL CHAR(6)
T$NAMA NOT NULL CHAR(35)
T$EDTE NOT NULL DATE
T$PERI NUMBER
T$QUAN NUMBER
T$YEAR NUMBER
T$RGDT DATE
As I checked here and on Metalink, the information found has little to do with the MV. What was the cause?
The log has 25074 lines in total, so I used grep from the OS to get the lines involving the MV. Here they are:
grep -n -i "TTPPPC235201" impBaanFull.log
5220:ORA-39153: Table "BAANDB"."TTPPPC235201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
5845:ORA-39153: Table "BAANDB"."MLOG$_TTPPPC235201" exists and has been truncated. Data will be loaded but all dependent meta data will be skipped due to table_exists_action of truncate
8503:. . imported "BAANDB"."TTPPPC235201" 36.22 MB 107912 rows
8910:. . imported "BAANDB"."MLOG$_TTPPPC235201" 413.0 KB 6848 rows
grep -n -i "TTCCOM001201" impBaanFull.log
4018:ORA-39153: Table "BAANDB"."TTCCOM001201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
5844:ORA-39153: Table "BAANDB"."MLOG$_TTCCOM001201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
9129:. . imported "BAANDB"."MLOG$_TTCCOM001201" 9.718 KB 38 rows
9136:. . imported "BAANDB"."TTCCOM001201" 85.91 KB 239 rows
grep -n -i "MV_EMPHORA" impBaanFull.log
8469:ORA-39153: Table "BAANDB"."MV_EMPHORA" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
8558:ORA-31693: Table data object "BAANDB"."MV_EMPHORA" failed to load/unload and is being skipped due to error:
8560:ORA-12081: update operation not allowed on table "BAANDB"."MV_EMPHORA"
25066:ORA-31684: Object type MATERIALIZED_VIEW:"BAANDB"."MV_EMPHORA" already exists
25072: BEGIN dbms_refresh.make('"BAANDB"."MV_EMPHORA"',list=>null,next_date=>null,interval=>null,implicit_destroy=>TRUE,lax=>
FALSE,job=>44,rollback_seg=>NULL,push_deferred_rpc=>TRUE,refresh_after_errors=>FALSE,purge_option => 1,parallelism => 0,heap_size => 0);
25073:dbms_refresh.add(name=>'"BAANDB"."MV_EMPHORA"',list=>'"BAANDB"."MV_EMPHORA"',siteid=>0,export_db=>'BAAN'); END;

The number in front of each line is the line number in the import log.
Here is my Data Pump import syntax:

impdp user/pw SCHEMAS=baandb DIRECTORY=baanbk_data_pump DUMPFILE=impBaanAll.dmp LOGFILE=impBaanAll.log TABLE_EXISTS_ACTION=TRUNCATE

Yes, I can create the MV manually, and I have no problem refreshing it manually after the import. -
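The grep pass above can be scripted rather than run once per object name; the following is a minimal sketch in Python (the log line format is assumed from the excerpt above, and the sample lines are abbreviated):

```python
import re

def scan_import_log(lines, object_name):
    """Collect log lines mentioning object_name, keyed by line number,
    and pull out any ORA- error codes they carry."""
    hits = []
    for lineno, line in enumerate(lines, start=1):
        if object_name.upper() in line.upper():
            errors = re.findall(r"ORA-\d{5}", line)
            hits.append((lineno, errors, line.rstrip()))
    return hits

# Tiny inline sample mimicking the impdp log excerpt above
sample_log = [
    'ORA-39153: Table "BAANDB"."MV_EMPHORA" exists and has been truncated.',
    '. . imported "BAANDB"."TTCCOM001201"  85.91 KB  239 rows',
    'ORA-12081: update operation not allowed on table "BAANDB"."MV_EMPHORA"',
]

for lineno, errors, text in scan_import_log(sample_log, "MV_EMPHORA"):
    print(lineno, errors)
```

With the real file you would pass `open("impBaanAll.log")` instead of the sample list. Note that the ORA-12081 on the MV container table is consistent with TABLE_EXISTS_ACTION=TRUNCATE trying to load data directly into a materialized view, which Oracle does not allow; refreshing the MV after the import, as you already do, is the usual workaround.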
BPC10 - Data manager package for dimension data export and import
Dear BPC Expers,
Need your help.
I am trying to set up a data manager package for the first time, to export dimension (master) data from one application and import it into another application (both have the same properties).
I created a test data manager package via Organize > Add Package, with process chain /CPMB/EXPORT_MD_TO_FILE, and clicked Add.
In the Advanced tab of each task some script logic is already populated. Please find below the details of the script logic written under each of the tasks: MD_SOURCE, EXPORT_MD_CONVERT and FILE_TARGET.
I have not made any changes to the script inside the tasks.
But when I run the package and select the dimension 'Entity', the second prompt asks for a transformation file, and the system automatically adds the file ... \ROOT\WEBFOLDERS\COLPAL\FINANCE\DATAMANAGER\TRANSFORMATIONFILES\Import.xls
I have not changed anything there.
In the next prompt it asks for an output file, but it won't let me enter the file name.
I am not sure how to proceed further.
I would be grateful if someone could guide me, from your experience, on how to set up a simple data manager package for exporting master data from a dimension. Should I update the transformation file referenced in the script, and the import and output files in the Advanced tab? What transformation file needs to be created, and how is it linked to the data manager package for export/import?
What are the steps to run the package to export master data from a dimension and import it into another application?
Thanks in advance for your guidance.
Thanks and Regards,
Ramanuj
=====================================================================================================
Details of the tasks
Task : APPL_MD-SOURCE
PROMPT(DIMENSIONMEMBER, %DIMENSIONMEMBERS%, "Please select dimension", "Please select members", %DIMS%)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%TEMPNO2%,%INCREASENO%)
TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
Task : EXPORT_MD_CONVERT
PROMPT(DIMENSIONMEMBER, %DIMENSIONMEMBERS%, "Please select dimension", "Please select members", %DIMS%)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%TEMPNO2%,%INCREASENO%)
TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
Task : FILE_TARGET
PROMPT(DIMENSIONMEMBER, %DIMENSIONMEMBERS%, "Please select dimension", "Please select members", %DIMS%)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%TEMPNO2%,%INCREASENO%)
TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
================================================================================

1. Perhaps you want to consider a system copy to a "virtual system" for UAT?
2. Changes in QAS (as with PROD as well) will give you the delta. They should ideally be clean... You need to check the source system.
Another option is to generate the profiles in the target system. But for that your config has to be squeaky clean and in sync, including very well maintained and sync'ed SU24 data.
Cheers,
Julius -
Loss of people picker column data during import/export of a list in SharePoint 2010
Hi
We are facing the same issue. I have a Meeting Minutes (OOTB) list with 100 items in total, with multiple participants in the Participant column. When I export/import that list to some other site,
the Participant column loses its data; for all the items the data is empty.
any idea?
guruIf the other site is in another site collection then that is to be expected.
User information is cached on each site collection as the user identity is referenced. That means that a user might have ID #7 on one site and #8438 on another. The user lookup column references that ID value rather than holding full information on the user
in each column where it might be needed. To avoid using the wrong IDs when you move a site, I suspect it just throws them out.
If that's the case then you'll need to create your own export/import process, most probably with PowerShell.
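To illustrate what such a custom process has to do, here is a small sketch in plain Python (not the SharePoint API; the user tables and field name are hypothetical): on export you resolve each lookup ID to a stable login name, and on import you resolve the login name back to the target site collection's own ID.

```python
def remap_user_ids(items, field, source_users, target_users):
    """Translate user-lookup IDs from the source site collection to the
    target one by going through the stable login name.
    source_users maps source ID -> login; target_users maps target ID -> login."""
    target_by_login = {login: uid for uid, login in target_users.items()}
    remapped = []
    for item in items:
        new_item = dict(item)
        new_ids = []
        for uid in item.get(field, []):
            login = source_users.get(uid)        # ID -> login on the source
            if login in target_by_login:         # login -> ID on the target
                new_ids.append(target_by_login[login])
            # unknown users are dropped, mirroring what the import does
        new_item[field] = new_ids
        remapped.append(new_item)
    return remapped

# The same person can have ID 7 on one site collection and 8438 on another
source_users = {7: "DOMAIN\\alice", 12: "DOMAIN\\bob"}
target_users = {8438: "DOMAIN\\alice", 9001: "DOMAIN\\bob"}
items = [{"Title": "Kickoff", "Participants": [7, 12]}]

print(remap_user_ids(items, "Participants", source_users, target_users))
# -> [{'Title': 'Kickoff', 'Participants': [8438, 9001]}]
```

In a real PowerShell implementation the same lookup would be done with the target site's user information list (ensuring the user exists there first) before writing the field value back.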