Export/import of mappings using OMBPLUS and Collections?
Hi,
We've got some good TCL code to export (and import) mappings for our projects along the lines of:
OMBEXPORT MDL_FILE '$projname.mdl' \
PROJECT '$projname' \
CONTROL_FILE '$controlfile.txt' \
OUTPUT LOG '$projname.log'
and now we're moving to organizing mappings by OWB COLLECTION and need to export/import by collection instead. Thought it might be as simple as this:
OMBEXPORT MDL_FILE '$collname.mdl' \
FROM COMPONENTS ( COLLECTION '$collname')
CONTROL_FILE '$controlfile.txt' \
but that's not working (it says "Project $collname does not exist."), so it seems to still be treating what I thought was a collection name as a project name. Can someone point out the right syntax to use COLLECTIONs in OMBEXPORT/OMBIMPORT?
All thoughts/tips appreciated...
Thanks,
Jim C.
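A side note on the failing attempt above: the FROM COMPONENTS line is missing its trailing line-continuation backslash, so CONTROL_FILE would be parsed as a separate command. A minimal sketch of the corrected shape (assuming the collection exists in the current project context; not verified against a live OWB repository):

```tcl
# Hedged sketch: export by collection. Note the backslash on every
# continued line, including the FROM COMPONENTS line.
OMBEXPORT MDL_FILE '$collname.mdl' \
  FROM COMPONENTS ( COLLECTION '$collname' ) \
  CONTROL_FILE '$controlfile.txt' \
  OUTPUT LOG '$collname.log'
```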
Hi,
If we use the below TCL script for the entire project export, does it also export the locations, control centers, and configurations along with it?
If yes, can you give me the script for altering the location and control center/configuration connection details?
OMBEXPORT MDL_FILE '$projname.mdl' \
PROJECT '$projname' \
CONTROL_FILE '$controlfile.txt' \
OUTPUT LOG '$projname.log'
Can you also provide the TCL for a project import?
Thanks
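For the import question, a hedged sketch based on the OMBIMPORT call that appears in the longer script later in this thread (UPDATE_MODE with MATCH_BY NAMES is one reasonable choice, not the only one; the OMBALTER LOCATION property list is an assumption and may differ by OWB release):

```tcl
# Import a previously exported project MDL file.
OMBIMPORT MDL_FILE '$projname.mdl' \
  USE UPDATE_MODE \
  MATCH_BY NAMES \
  OUTPUT LOG '${projname}_imp.log'

# Altering a location's connection details afterwards follows the
# OMBALTER LOCATION pattern used further down in this thread.
# 'MY_LOCATION' and the values are placeholders.
OMBALTER LOCATION 'MY_LOCATION' \
  SET PROPERTIES (HOST, PORT, SERVICE, PASSWORD) \
  VALUES ('newhost', '1521', 'orcl', 'secret')
```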
Similar Messages
-
Avoiding creation of DBlink during deployment of mappings using OMBPlus
Hi
we are facing an issue in our OWB 11g R2 (upgraded to patchset 10185523)
We are deploying the mappings using OMBPlus like so
OMBCREATE TRANSIENT DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN' ADD ACTION '${object_type}_DEPLOY' SET PROPERTIES (OPERATION) VALUES ('CREATE') SET REFERENCE $object_type '$objname'
OMBDEPLOY DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN'
set OMBCONTINUE_ON_ERROR true
OMBDROP DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN'
OMBCOMMIT
The $objname is any object (in this case, a mapping) that we are deploying.
Each time it deploys a mapping, OMBPlus generates a script at the OWB level to create the associated DB link, and we get an error:
INFORMATIONAL
multiple rows found for select into statement
DBlink_a Create Error ORA-02011: duplicate database link name
But the mapping deploys fine.
Any tips on how we can avoid the creation of DB links from OMBPlus while deploying only our mappings?
Any help will be appreciated
Birdy
Edited by: birdy on 22-Nov-2011 05:22
I don't think that you can:
"Deploying a mapping or a process flow includes these steps:
•Generate the PL/SQL, SQL*Loader, or ABAP script, if necessary.
•Register the required locations and deploy any required connectors. This ensures that the details of the physical locations and their connectors are available at runtime."
http://www.comp.dit.ie/btierney/Oracle11gDoc/owb.111/b31278/concept_deploy.htm
But the error that you get is only informational, so you don't have to worry about it. -
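One detail worth noting in the deployment snippet quoted at the top of this message: `set OMBCONTINUE_ON_ERROR true` appears after OMBDEPLOY has already run, so it cannot suppress failures of the deploy itself. A hedged reordering (a sketch, untested against a live repository):

```tcl
# Tolerate informational errors (like the duplicate DB link) during deploy.
set OMBCONTINUE_ON_ERROR true
OMBCREATE TRANSIENT DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN' \
  ADD ACTION 'MAP_DEPLOY' \
  SET PROPERTIES (OPERATION) VALUES ('CREATE') \
  SET REFERENCE MAPPING '$objname'
OMBDEPLOY DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN'
OMBDROP DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN'
OMBCOMMIT
```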
Problem deploying certain mappings using OMBPlus
We've got a problem deploying certain mappings using OMBPlus. This applies to all mappings that use a function to look up the effective date of an SCD (type 2) from a lookup table. We use that function in the dimension operator properties as the "Default Effective Time of Open Record". Although validation of those mappings fails, they can nevertheless be deployed using the Control Center Manager. And, best of all, they work fine.
Now we have to deploy those mappings using OMBPlus because we're not allowed to work with tools on the preproduction and production systems. Deploying those mappings with OMBPlus doesn't seem to work. It works fine for all other mappings but not for those mentioned above. Does anybody have a solution?
Regards,
Jörg
Hi Dave,
this is the error message I get when I run my OMBPlus script:
OMB05602: An unknown Deployment error has occured for Object Type PUB03205 : Validation Error: MAP_DIM_FAHRZEUGKLASSE : VLD-5010: The Default Effective Date of open records for dimension DIM_FAHRZEUGKLASSE should be an expression of type DATE. Value "FCT_GET_AKT_LL_vmon()" is not a valid date: PLS-00201: identifier 'FCT_GET_AKT_LL_VMON' must be declared
I'm not quite sure what you mean by creating a stub function. As you can see from the error message above, I use a function called FCT_GET_AKT_LL_vmon that returns the actual date of the load from a lookup table. This function is, of course, present in the target schema.
Regards,
Jörg -
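For readers wondering what the "stub function" suggestion means: the PLS-00201 inside the validation error indicates the function is not visible to the schema where validation runs. A hedged sketch of a stub to create in that schema (the signature is inferred from the error message above; the real implementation stays in the target schema):

```sql
-- Stub only: lets OWB validation resolve the identifier and see a DATE
-- return type. The genuine function in the target schema is what runs.
CREATE OR REPLACE FUNCTION fct_get_akt_ll_vmon RETURN DATE IS
BEGIN
  RETURN SYSDATE; -- placeholder value, never used in production
END fct_get_akt_ll_vmon;
/
```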
hi all
It would be appreciated if anyone could let me know how to edit a bitmap which is imported into Flash using XML, and save the edited bitmap back to XML in Flash.
Is it possible to save the bitmap data in Flash?
Thanks in advance
Yes you can... but like I said before, you need to upload the data from the changes you make to a server.
In terms of the solution... it's unlikely that you'll find one specifically for your needs. You will have to learn whatever you don't already know and maybe adapt some existing examples to your needs.
To change the visual state of a movie clip... you just do all the regular things that you want to do to it using Flash... scale, rotation, the drawing API, textfields etc. in ActionScript. If you don't know how to do that stuff, then you need to learn that first. That's basic ActionScript.
You can capture the visual state of a movieclip using the BitmapData class. That includes a loaded JPEG. You can also manipulate bitmap data using the same class. You should read up on that if you don't know how to use it, or check out the examples below for uploading info.
For uploading to the server:
Here's an AS2 solution that took 15 seconds to find using Google:
http://www.quasimondo.com/archives/000645.php
If you're using AS3, do a Google search for "jpeg encoder as3" and look through that info. There are also historical answers in the forums here related to this type of thing that might help as well. -
Portal export/import between 9.0.2 and 9.0.3
I am aware that you cannot export/import between portal installations on different versions, e.g. 3.0.9 and 9.0.2, but is it possible to export/import between 9.0.2 and 9.0.3?
The FAQ only states:
<<<<Snip>>>>
Portal Export/Import utilities will only work between portal installations that are at the same revision level. For example, Portal Import/Export will copy content successfully between two 9.0.2.x instances of Portal, but not between a 9.0.2 portal instance and a 3.0.9 portal instance.
That said, Portal Export/Import WILL work between two portal instances that are of different "minor" revision levels. For example, Portal Export/Import will work between a 9.0.2.2.14 portal instance to a 9.0.2.2.22 portal instance, and vice versa.
<<<</Snip>>>> -
Can't Import Sony AVCHD Using Log and Transfer
HI
I have been importing AVCHD footage from all kinds of cameras, mainly Panasonics, for over 4 years now using Final Cut Pro 7. I just got a new Sony NEX-EA50, but when I try to import the footage using Log and Transfer, it does not show up. When I try to drop the card folder into the window, I get the message: "PRIVATE" contains unsupported media or has an invalid directory structure. Please choose a folder whose directory structure matches supported media.
The only way I can import the footage is using Toast, but that is a pain in the *** because it is a hassle to rename and scan the clips before I import.
Does anybody know what's going on? Did Sony purposely change their file directory structure so it couldn't be read by FCP?
Thanks in advance.
Hi,
What framerate did you shoot ?
Log and Transfer will not recognise non-standard frame rates such as 50 fps or 60 fps. Toast, ClipWrap etc. are options, but there is another. If you have the Panasonic AVCCAM Importer QuickTime plug-in, it will allow you to convert to ProRes through MPEG Streamclip or Compressor.
DM
Hang on, are you pointing Log and Transfer at the ROOT folder and not the folders inside? -
Importing mappings using OMBPLUS
We just started using OWB, and in our environment we do not want to migrate mappings from DEV to TEST to PRODUCTION using the GUI. I heard that importing and deploying can be done using OMB scripting...
Can anyone please give some example scripts for importing mappings and deploying them onto a different box?
Thanks
Do you need to move the mapping definitions over to each box? Or just deploy them to different boxes?
If you want a single design repository from which you deploy to multiple runtime areas, then this is achieved by creating multiple configurations and changing your project configuration prior to each deployment.
If, on the other hand, you want to MOVE the current mapping definitions to each box where you will, in effect, be setting up a new design AND runtime repository, then there is added complexity.
For example, scripting the dropping of old mappings prior to importing the new version of the project (to avoid legacy objects hanging around) becomes a potential issue. For that, you would need to script deployment plans of type DELETE for objects no longer wanted. This also helps script the new deployment, as you don't have to code which mappings are being CREATEd and which are being REPLACEd.
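The DELETE-style plan mentioned above can be sketched like this (hedged: the operation keyword for removing a deployed object may be DROP rather than DELETE depending on OWB release, and 'OLD_MAPPING' is a placeholder name):

```tcl
# Hedged sketch: undeploy an obsolete mapping before re-importing.
OMBCREATE TRANSIENT DEPLOYMENT_ACTION_PLAN 'UNDEPLOY_PLAN' \
  ADD ACTION 'MAP_DROP' \
  SET PROPERTIES (OPERATION) VALUES ('DROP') \
  SET REFERENCE MAPPING 'OLD_MAPPING'
OMBDEPLOY DEPLOYMENT_ACTION_PLAN 'UNDEPLOY_PLAN'
OMBDROP DEPLOYMENT_ACTION_PLAN 'UNDEPLOY_PLAN'
OMBCOMMIT
```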
I have an OMB+ script I am just testing right now to load a project MDL file and deploy it to a new repository. I have a partial script for unloading an old version of the project, but it is not ready yet. This one also assumes a clean install, where I even have to go so far as to register my runtime user and define and register all schemas to be used by the control center. We also do not deploy normal database objects via OWB. This script assumes that the proper database setup for tables, sequences, views, etc. has been done prior to deploying the OWB mappings and process flows. We use Designer as the repository of record for all non-OWB database objects.
I don't claim to be an experienced TCL or OMB+ developer, but it might help you out as a starting point. Note that the config file and file of library functions used by this script (such as the exec_omb you will see that I use) are also appended to this post.
File: owb_import.tcl
#get db connection info
source c:\\omb\\owb_config.tcl
#get standard library
source c:\\omb\\omb_library.tcl
# Connect to repos
# Commit anything from previous work in this session, otherwise OMBDISCONNECT will fail out.
exec_omb OMBCOMMIT
# If already connected, disconnect first.
set print [exec_omb OMBDISCONNECT]
# Test if message is "OMB01001: Not connected to repository." or "Disconnected."
# any other message is a showstopper!
if [string match Disconn* $print ] {
log_msg LOG "Success Disconnecting from previous repository...."
} else {
# We expect an OMB01001 error for trying to disconnect when not connected
if [string match OMB01001* $print ] {
log_msg LOG "Disconnect unnecessary. Not currently connected...."
} else {
log_msg ERROR "Error Disconnecting from previous repository....Exiting process."
log_msg ERROR "$print"
#exit
set print [exec_omb OMBCONNECT $OWB_DEG_USER/$OWB_DEG_PASS@$OWB_DEG_HOST:$OWB_DEG_PORT:$OWB_DEG_SRVC USE REPOSITORY '$OWB_DEG_REPOS']
if [omb_error $print] {
log_msg ERROR "Unable to connect to repository."
log_msg ERROR "$print" "1"
log_msg ERROR "Exiting Script.............."
return
} else {
log_msg LOG "Connected to Repository"
# Connect to project
set print [exec_omb OMBCC '$PROJECT_NAME']
if [omb_error $print] {
log_msg LOG "Project $PROJECT_NAME does not exist. Creating...."
set print [exec_omb OMBCREATE PROJECT '$PROJECT_NAME']
if [omb_error $print] {
log_msg ERROR "Unable to create project '$PROJECT_NAME'"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
} else {
log_msg LOG "Created Project '$PROJECT_NAME'"
exec_omb OMBCOMMIT
exec_omb OMBSAVE
exec_omb OMBCC '$PROJECT_NAME'
} else {
log_msg LOG "Switched context to project $PROJECT_NAME"
# Check Existance of Oracle Module
set print [exec_omb OMBCC '$ORA_MODULE_NAME']
if [omb_error $print] {
log_msg LOG "Oracle Module $ORA_MODULE_NAME does not exist. Creating...."
set print [exec_omb OMBCREATE ORACLE_MODULE '$ORA_MODULE_NAME']
if [omb_error $print] {
log_msg ERROR "Unable to create module '$ORA_MODULE_NAME'"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
} else {
log_msg LOG "Created Oracle Module '$ORA_MODULE_NAME'"
exec_omb OMBCOMMIT
exec_omb OMBSAVE
exec_omb OMBCC '$ORA_MODULE_NAME'
} else {
log_msg LOG "Switched context to module $ORA_MODULE_NAME"
#switch back up to project level. You cannot attach locations to a module if you are in it.
exec_omb OMBCC '..'
log_msg LOG "Switched context back up to project $PROJECT_NAME"
# Check Existance of OWB Registered USer
set lst [OMBLIST USERS]
if {[string match *$DATA_LOCATION_USER* $lst]} {
log_msg LOG "Verify USer $DATA_LOCATION_USER Exists."
} else {
log_msg LOG "registering User $DATA_LOCATION_USER."
# set print [ exec_omb OMBREGISTER USER '$DATA_LOCATION_USER' SET PROPERTIES (DESCRIPTION, ISTARGETSCHEMA, TARGETSCHEMAPWD) VALUES ('$DATA_LOCATION_USER', 'true', '$DATA_LOCATION_PASS')]
if [omb_error $print] {
log_msg ERROR "Unable to register user '$DATA_LOCATION_USER'"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
exec_omb OMBCOMMIT
# Import MDL File
#switch back up to root level to ensure import succeeds.
exec_omb OMBSAVE
exec_omb OMBCC '..'
log_msg LOG "Switched context back up to root level."
set print [exec_omb OMBIMPORT MDL_FILE '$IMPORT_FILE_PATH/$IMPORT_FILE_NAME' USE UPDATE_MODE MATCH_BY NAMES OUTPUT LOG TO '$IMPORT_LOG_PATH/$IMPORT_LOG_NAME' ]
if [omb_error $print] {
#We expect to get warnings due to differences in Control center names etc,
if {[string match OMB05105* $print]} {
log_msg LOG "MDL File $IMPORT_FILE_NAME imported with warnings"
} else {
log_msg ERROR "Unable to import $IMPORT_FILE_PATH/$IMPORT_FILE_NAME. "
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
} else {
log_msg LOG "MDL File $IMPORT_FILE_NAME imported with no warnings.............."
exec_omb OMBCOMMIT
exec_omb OMBCC '$PROJECT_NAME'
log_msg LOG "Switched context back to project level."
# Validate to Control Center
set lst [OMBLIST CONTROL_CENTERS]
if [string match *$CONTROL_CENTER_NAME* $lst] {
log_msg LOG "Verify Control Center $CONTROL_CENTER_NAME Exists."
log_msg LOG "Setting Passwords to enable import and deploy"
set print [exec_omb OMBALTER LOCATION '$DATA_LOCATION_NAME' SET PROPERTIES (PASSWORD) VALUES ('$DATA_LOCATION_PASS')]
if [omb_error $print] {
log_msg ERROR "Unable to log onto location '$DATA_LOCATION_NAME' with password $DATA_LOCATION_PASS"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
exec_omb OMBCOMMIT
set print [exec_omb OMBALTER LOCATION '$WFLOW_LOCATION_NAME' SET PROPERTIES (PASSWORD) VALUES ('$WFLOW_LOCATION_PASS')]
if [omb_error $print] {
log_msg ERROR "Unable to log onto workflow location '$WFLOW_LOCATION_NAME' with password $WFLOW_LOCATION_PASS"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
exec_omb OMBCOMMIT
log_msg LOG "Connecting to Control Center $CONTROL_CENTER_NAME"
set print [exec_omb OMBCONNECT CONTROL_CENTER USE '$DATA_LOCATION_PASS' ]
if [omb_error $print] {
log_msg ERROR "Unable to add Location $DATA_LOCATION_NAME to Control Center $CONTROL_CENTER_NAME"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
exec_omb OMBCOMMIT
log_msg LOG "Connected.............."
} else {
log_msg LOG "Need to create Control Center $CONTROL_CENTER_NAME."
log_msg LOG "Creating Code Location."
# For Global_names = FALSE
# set print [exec_omb OMBCREATE LOCATION '$DATA_LOCATION_NAME' SET PROPERTIES (TYPE, VERSION, DESCRIPTION, BUSINESS_NAME, HOST, PORT, SERVICE, CONNECT_AS_USER, PASSWORD, DATABASE_NAME) VALUES ('ORACLE_DATABASE','$DATA_LOCATION_VERS','ERS Datamart Code User','$DATA_LOCATION_NAME', '$DATA_LOCATION_HOST','$DATA_LOCATION_PORT','$DATA_LOCATION_SRVC', '$DATA_LOCATION_USER','$DATA_LOCATION_PASS','$DATA_LOCATION_SRVC' ) ]
set print [exec_omb OMBCREATE LOCATION '$DATA_LOCATION_NAME' SET PROPERTIES (TYPE, VERSION, DESCRIPTION, BUSINESS_NAME, HOST, PORT, SERVICE, CONNECT_AS_USER, PASSWORD) VALUES ('ORACLE_DATABASE','$DATA_LOCATION_VERS','ERS Datamart Code User','$DATA_LOCATION_NAME', '$DATA_LOCATION_HOST','$DATA_LOCATION_PORT','$DATA_LOCATION_SRVC', '$DATA_LOCATION_USER','$DATA_LOCATION_PASS') ]
if [omb_error $print] {
log_msg ERROR "Unable to create location '$DATA_LOCATION_NAME'"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
exec_omb OMBCOMMIT
log_msg LOG "Creating Workflow Location."
set print [exec_omb OMBCREATE LOCATION '$WFLOW_LOCATION_NAME' SET PROPERTIES (TYPE, VERSION, DESCRIPTION, BUSINESS_NAME, HOST, PORT, SERVICE_NAME, PASSWORD, SCHEMA) VALUES ('ORACLE_WORKFLOW','$WFLOW_LOCATION_VERS','ERS Datamart Workflow User','$WFLOW_LOCATION_NAME', '$WFLOW_LOCATION_HOST','$WFLOW_LOCATION_PORT','$WFLOW_LOCATION_SRVC', '$WFLOW_LOCATION_PASS','$WFLOW_LOCATION_USER') ]
if [omb_error $print] {
log_msg ERROR "Unable to create Workflow location '$WFLOW_LOCATION_NAME'"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
#return
exec_omb OMBCOMMIT
log_msg LOG "Creating Control Center"
set print [exec_omb OMBCREATE CONTROL_CENTER '$CONTROL_CENTER_NAME' SET PROPERTIES (DESCRIPTION, BUSINESS_NAME, HOST, PORT, SERVICE_NAME, USER, SCHEMA, PASSWORD) VALUES ('ERS Datamart Control Center','$CONTROL_CENTER_NAME', '$DATA_LOCATION_HOST','$DATA_LOCATION_PORT','$DATA_LOCATION_SRVC', '$DATA_LOCATION_USER', '$CONTROL_CENTER_SCHEMA','$DATA_LOCATION_PASS') ]
if [omb_error $print] {
log_msg ERROR "Unable to create control center '$CONTROL_CENTER_NAME'"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
exec_omb OMBCOMMIT
log_msg LOG "Adding Location $DATA_LOCATION_NAME to Control Center $CONTROL_CENTER_NAME"
set print [exec_omb OMBALTER CONTROL_CENTER '$CONTROL_CENTER_NAME' ADD REF LOCATION '$DATA_LOCATION_NAME' SET PROPERTIES (IS_TARGET, IS_SOURCE) VALUES ('true', 'true') ]
if [omb_error $print] {
log_msg ERROR "Unable to add Location $DATA_LOCATION_NAME to Control Center $CONTROL_CENTER_NAME"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
exec_omb OMBCOMMIT
log_msg LOG "Adding Workflow Location $WFLOW_LOCATION_NAME to Control Center $CONTROL_CENTER_NAME"
set print [exec_omb OMBALTER CONTROL_CENTER '$CONTROL_CENTER_NAME' ADD REF LOCATION '$WFLOW_LOCATION_NAME' SET PROPERTIES (IS_TARGET, IS_SOURCE) VALUES ('true', 'true') ]
if [omb_error $print] {
log_msg ERROR "Unable to add Location $DATA_LOCATION_NAME to Control Center $CONTROL_CENTER_NAME"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
#return
exec_omb OMBCOMMIT
log_msg LOG "Setting Control Center as default control center for project"
exec_omb OMBCC 'DEFAULT_CONFIGURATION'
set print [exec_omb OMBALTER DEPLOYMENT 'DEFAULT_DEPLOYMENT' SET REF CONTROL_CENTER '$CONTROL_CENTER_NAME' ]
if [omb_error $print] {
log_msg ERROR "Unable to set Control Center $CONTROL_CENTER_NAME in default configuration"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
exec_omb OMBCOMMIT
exec_omb OMBCC '..'
log_msg LOG "Connecting to Control Center $CONTROL_CENTER_NAME"
set print [exec_omb OMBCONNECT CONTROL_CENTER USE '$DATA_LOCATION_PASS' ]
if [omb_error $print] {
log_msg ERROR "Unable to add Location $DATA_LOCATION_NAME to Control Center $CONTROL_CENTER_NAME"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
exec_omb OMBCOMMIT
log_msg LOG "Registering Code Location."
set print [exec_omb OMBALTER LOCATION '$DATA_LOCATION_NAME' SET PROPERTIES (PASSWORD) VALUES ('$DATA_LOCATION_PASS')]
exec_omb OMBCOMMIT
set print [exec_omb OMBREGISTER LOCATION '$DATA_LOCATION_NAME']
if [omb_error $print] {
log_msg ERROR "Unable to register Location $DATA_LOCATION_NAME"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
exec_omb OMBCOMMIT
log_msg LOG "Registering Workflow Location."
set print [exec_omb OMBALTER LOCATION '$WFLOW_LOCATION_NAME' SET PROPERTIES (PASSWORD) VALUES ('$WFLOW_LOCATION_PASS')]
exec_omb OMBCOMMIT
set print [exec_omb OMBREGISTER LOCATION '$WFLOW_LOCATION_NAME']
if [omb_error $print] {
log_msg ERROR "Unable to register Workflow Location $WFLOW_LOCATION_NAME"
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
exec_omb OMBCOMMIT
# Assign location to Oracle Module
log_msg LOG "Assigning Code Location to Oracle Module."
set print [ exec_omb OMBALTER ORACLE_MODULE '$ORA_MODULE_NAME' ADD REFERENCE LOCATION '$DATA_LOCATION_NAME' SET AS DEFAULT ]
if [omb_error $print] {
log_msg ERROR "Unable to add reference location '$DATA_LOCATION_NAME' to module '$ORA_MODULE_NAME' "
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
set print [ exec_omb OMBALTER ORACLE_MODULE '$ORA_MODULE_NAME' SET REFERENCE METADATA_LOCATION '$DATA_LOCATION_NAME' ]
if [omb_error $print] {
log_msg ERROR "Unable to set metadata location '$DATA_LOCATION_NAME' on module '$ORA_MODULE_NAME' "
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
set print [ exec_omb OMBALTER ORACLE_MODULE '$ORA_MODULE_NAME' SET PROPERTIES (DB_LOCATION) VALUES ('$DATA_LOCATION_NAME') ]
if [omb_error $print] {
log_msg ERROR "Unable to add db_location '$DATA_LOCATION_NAME' to module '$ORA_MODULE_NAME' "
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
exec_omb OMBCOMMIT
# Assign location to Workflow Module
log_msg LOG "Assigning Location to Workflow Module."
set print [ exec_omb OMBALTER ORACLE_MODULE '$ORA_MODULE_NAME' ADD REFERENCE LOCATION '$DATA_LOCATION_NAME' SET AS DEFAULT ]
if [omb_error $print] {
log_msg ERROR "Unable to add reference location '$DATA_LOCATION_NAME' to module '$ORA_MODULE_NAME' "
log_msg ERROR "$print"
log_msg ERROR "Exiting Script.............."
exec_omb OMBROLLBACK
return
# Begin Object Deployment
log_msg LOG "*********** Deploying: Mappings ****************"
exec_omb OMBCC '$ORA_MODULE_NAME'
set mapList [ OMBLIST MAPPINGS ]
foreach mapName $mapList {
log_msg LOG "Deploying: $mapName"
set print [ exec_omb OMBCREATE TRANSIENT DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN' ADD ACTION 'MAPPING_DEPLOY' SET PROPERTIES (OPERATION) VALUES ('CREATE') SET REFERENCE MAPPING '$mapName' ]
if [omb_error $print] {
log_msg ERROR "Unable to create Deployment plan for '$mapName'"
log_msg ERROR "$print"
return
set print [ exec_omb OMBDEPLOY DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN' ]
if [omb_error $print] {
log_msg ERROR "Error on execute of Deployment plan for '$mapName'"
log_msg ERROR "$print"
exec_omb OMBDROP DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN'
exec_omb OMBCOMMIT
return
exec_omb OMBDROP DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN'
exec_omb OMBCOMMIT
log_msg LOG "*********** Deploying: Process Flows **************"
exec_omb OMBCC '..'
exec_omb OMBCC '..'
set all_Owfs [OMBLIST PROCESS_FLOW_PACKAGES]
if {$all_Owfs!="" } {
foreach one_Owf $all_Owfs {
set print [ exec_omb OMBCREATE TRANSIENT DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN' ]
if [omb_error $print] {
log_msg ERROR "Unable to create Deployment plan for '$one_Owf'"
log_msg ERROR "$print"
return
set print [ exec_omb OMBALTER DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN' ADD ACTION 'OWFDEP$one_Owf' SET PROPERTIES (OPERATION) VALUES ('CREATE') SET REFERENCE PROCESS_FLOW_PACKAGE '$one_Owf' ]
if [omb_error $print] {
log_msg ERROR "Unable to alter Deployment plan for '$one_Owf'"
log_msg ERROR "$print"
exec_omb OMBDROP DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN'
return
log_msg LOG "OWF LOCATION: $Process_Flow_Module - OWF: $one_Owf Deployment...";
set print [ exec_omb OMBDEPLOY DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN' ]
if [omb_error $print] {
log_msg ERROR "Unable to execute Deployment plan for '$one_Owf'"
log_msg ERROR "$print"
exec_omb OMBDROP DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN'
return
set print [ exec_omb OMBDROP DEPLOYMENT_ACTION_PLAN 'DEPLOY_PLAN' ]
log_msg LOG "OWF LOCATION: $Process_Flow_Module - OWF: $one_Owf Deployed";
log_msg LOG "OWF LOCATION: $Process_Flow_Module - All Workflows Deployed...";
#OMBDISCONNECT
This script depends on the following two files:
OMB_LIBRARY.TCL
# Default logging function.
# Accepts inputs: LOGMSG - a text string to output
# FORCELOG - if "1" then output regardless of VERBOSE_LOG setting
proc log_msg {LOGTYPE LOGMSG {FORCELOG "0"}} {
    global VERBOSE_LOG
    if { $VERBOSE_LOG == "1" } {
        puts "$LOGTYPE:-> $LOGMSG"
    } else {
        if { $FORCELOG == "1" } {
            puts "$LOGTYPE:-> $LOGMSG"
        }
    }
}
proc exec_omb { args } {
    log_msg OMBCMD "$args"
    # The point of this is simply to return the error message or the return
    # string, whichever is applicable, to simplify error checking via omb_error{}.
    if [catch { set retstr [eval $args] } errmsg] {
        return $errmsg
    } else {
        return $retstr
    }
}
proc omb_error { retstr } {
    # OMB and Oracle errors may have caused a failure.
    if [string match OMB0* $retstr] {
        return 1
    } elseif [string match ORA-* $retstr] {
        return 1
    } else {
        return 0
    }
}
...and a config file where all config/password info is put:
# GLOBAL VARIABLE DECLARATION SECTION
#DESIGN REPOSITORY CONNECTION INFORMATION
# Login info for the main design repository owner
set OWB_DEG_USER owb_user
set OWB_DEG_PASS password
set OWB_DEG_HOST host01
set OWB_DEG_PORT 5555
set OWB_DEG_SRVC orcl_srvc
set OWB_DEG_REPOS owb_user
# CONTROL CENTER AND LOCATION DECLARATION SECTION
set CONTROL_CENTER_NAME ERS_DM_CTLCNTR
set CONTROL_CENTER_SCHEMA owb_user
#Connection info to ers_etl_app schema for deployment
set DATA_LOCATION_NAME ERS_DM_DATA
set DATA_LOCATION_VERS 9.2
set DATA_LOCATION_USER ERS_ETL_APP
set DATA_LOCATION_PASS ers_etl_app
set DATA_LOCATION_HOST newserver
set DATA_LOCATION_PORT 1555
set DATA_LOCATION_SRVC orcl_new
#Connection info to workflow schema for deployment
set WFLOW_LOCATION_NAME ERS_DM_WFLOW
set WFLOW_LOCATION_VERS 2.6.2
set WFLOW_LOCATION_USER OWF_MGR
set WFLOW_LOCATION_PASS owf_mgr
set WFLOW_LOCATION_HOST newserver
set WFLOW_LOCATION_PORT 1555
set WFLOW_LOCATION_SRVC orcl_new
# PROJECT, MODULE AND DIRECTORY DECLARATION SECTION
set PROJECT_NAME ERS_DM
set ORA_MODULE_NAME ERS_ETL_APP
set PFLOW_MODULE_NAME LD_DMRT
set PFLOW_PACKAGE_NAME LD_DMRT
# Directory DECLARATION SECTION
set IMPORT_FILE_NAME "ers_dply_tst_rc01.mdl"
set IMPORT_FILE_PATH "C:/OMB/import/db_imports"
set IMPORT_LOG_NAME "ers_dply_tst_rc01_imp.log"
set IMPORT_LOG_PATH "C:/OMB/import/logs"
# Logging Option
global VERBOSE_LOG
set VERBOSE_LOG "1"
Hope that this helps you out!
Message was edited by: zeppo
Edited to: Try and correct formatting tags. -
Database Migration 10g, difference in export/import from 8.1.6 and DBUA
I would like to know if there is any difference between migrating using export/import methods from 8.1.6 and migrating using DBUA from 8.1.6 -> 8.1.7 -> 10g, in terms of tablespaces, datafiles and performance.
Will the DBUA convert the tablespaces automatically to take advantage of 10g ?
Thanks.
Hi,
If we refer to the Oracle docs, it depends on your 10g release.
For Upgrade Paths 10.2, you need to upgrade your 8i installation first to 8.1.7.4.
For Upgrade Paths 10.1, you can migrate directly.
Either way, you need to use the 8i exp utility and the 10g imp utility: see Using Different Releases of Export and Import 10.1 or Using Different Releases of Export and Import 10.2.
Nicolas. -
Full Export/Import Errors with Queue tables and ApEx
I'm trying to take a full export of an existing database in order to build an identical copy in another database instance, but the import is failing each time and causing problems with queue tables and Apex tables.
I have used both the export utility and Data Pump (both with partial and full exports) and the same problems are occurring.
After import, queue tables in my schema are unstable. They cannot be dropped using the queue admin packages, as they throw ORA-24002: QUEUE_TABLE <table> does not exist exceptions. Trying to drop the tables directly causes the ORA-24005: must use DBMS_AQADM.DROP_QUEUE_TABLE to drop queue tables error.
As a result, the schema cannot be dropped at all unless manual data dictionary clean up steps (as per metalink) are done.
The Apex import fails when creating foreign keys to WWV_FLOW_FILE_OBJECTS$PART. It creates the table ok, but for some reason the characters after the $ are missing so the referencing tables try to refer to WWV_FLOW_FILE_OBJECTS$ only.
I am exporting from Enterprise Edition 10.2.0.1 and importing into Standard edition 10.2.0.1, but we are not using any of the features not available in standard, and I doubt this would cause the issues I'm getting.
Can anyone offer any advice on how I can resolve these problems so a full import will work reliably?
Thanks for the lead!
After digging around MetaLink some more, it sounds like I'm running into Bug 5875568 (MetaLink Note:5875568.8) which is in fact related to the multibyte character set. The bug is fixed in the server patch set 10.2.0.4 or release 11.1.0.6. -
In FCP-X is there a way to import AVCHD clips from an AVCHD file folder I have copied from my Canon hard disk camcorder to my iMac hard disk? With FCE I used log and transfer with no problem; but, FCP-X doesn't import them. It will import AIC files that have been created in FCE from the original AVCHD files. If FCP-X will not import these files directly from the copied AVCHD folder, is there a workaround using other software to convert the files? For example, Compressor? (Canon does not provide any AVCHD conversion software for OSX).
Hi jphil and Tom
I have exactly the same problem but the answer you (Tom) provided did not help in my case. Trying to "Import ->From Camera" and pointing to the AVCHD folder yields the error message: "(Folder) contains unsupported files or has invalid directory structure."
Folder structure is
AVCHD
BDMV
CLIPINF
.CPI files
Index.bdm
Movieobj.bdm
PLAYLIST
.MPL file
STREAM
.MTS files
AVF_INFO
3 files (.bnp, .inp, .int)
DCIM
101MSDCF
empty
MODELCFG.IND (file)
My question is: Is that not a "camera archive" of the type you talked about in the thread? Is it in principle not possible to import files in that format/directory structure? If not, since the shots were not taken by me, what should I ask the guy who shot the movies to give me so that I can work with them in FCP?
I work with a brand new Final Cut Pro X 10.0.4 (still very clumsy with it) on an iMac 11,3 with MacOS 10.6.8.
I downloaded the latest version of codec updates from the Apple support site but no difference.
Thank you in advance
Giorgio
Freiburg, Germany -
Help: Using Record and Collection Methods
I have created a record and I have to loop for as many records as are in the "RECORD". However, if I attempt to use any of the collection methods, I get the following error:
ERROR at line 1:
ORA-06550: line 41, column 14:
PLS-00302: component 'EXISTS' must be declared
ORA-06550: line 41, column 4:
PL/SQL: Statement ignored
ORA-06550: line 47, column 26:
PLS-00302: component 'COUNT' must be declared
ORA-06550: line 47, column 7:
PL/SQL: Statement ignored
Here is the SQL I am trying to execute:
DECLARE
TYPE Emp_Rec is RECORD (
Emp_Id Emp.Emp_no%TYPE
, Name Emp.Ename%TYPE
);
ERec Emp_Rec;
Cursor C_Emp IS
SELECT
Emp_No, Ename
From Emp;
begin
OPEN C_Emp;
LOOP
FETCH C_Emp INTO Erec;
EXIT WHEN C_Emp%NOTFOUND;
END LOOP;
IF Erec.Exists(1) THEN
dbms_output.Put_line('exists' );
else
dbms_output.Put_line('does not exists');
end if;
CLOSE C_Emp;
FOR I IN 1..ERec.COUNT
LOOP
dbms_Output.Put_Line( 'Row: ' || To_Char(I) || '; Emp: ' || ERec(i).Name) ;
END LOOP;
end;
Can anyone help, please?
Thanking you in advance,
You only defined a Record and not a collection, therefore you cannot use .EXISTS.
This is how you would use record, collection and exists together
DECLARE
   TYPE Emp_Rec IS RECORD (
      Emp_Id Emp.Empno%TYPE,
      EName  Emp.Ename%TYPE );
   TYPE Emp_Tab IS TABLE OF Emp_Rec INDEX BY BINARY_INTEGER;
   ERec Emp_Tab;
   Idx  INTEGER;
   CURSOR C_Emp IS
      SELECT EmpNo, Ename FROM Emp;
BEGIN
   Idx := 1;
   OPEN C_Emp;
   LOOP
      FETCH C_Emp INTO ERec(Idx);
      EXIT WHEN C_Emp%NOTFOUND;
      Idx := Idx + 1;
   END LOOP;
   IF ERec.EXISTS(1) THEN
      dbms_output.put_line('exists');
   ELSE
      dbms_output.put_line('does not exist');
   END IF;
   CLOSE C_Emp;
   FOR I IN 1 .. ERec.COUNT LOOP
      dbms_output.put_line('Row: ' || TO_CHAR(I) || '; Emp: ' || ERec(i).EName);
   END LOOP;
END;

I hope you realize that this is only an example of how to use RECORD, COLLECTIONS and collection methods (.EXISTS, .COUNT), and not the best way to display the contents of a table. -
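A footnote to the answer above: PL/SQL can also fill such a collection in a single statement with BULK COLLECT, which avoids managing the index variable by hand. A minimal sketch, assuming the same standard EMP demo table as the code above:

```
DECLARE
   TYPE Emp_Rec IS RECORD (
      Emp_Id Emp.Empno%TYPE,
      EName  Emp.Ename%TYPE );
   TYPE Emp_Tab IS TABLE OF Emp_Rec INDEX BY BINARY_INTEGER;
   ERec Emp_Tab;
BEGIN
   -- One fetch fills the whole collection; no OPEN/LOOP/CLOSE needed
   SELECT Empno, Ename BULK COLLECT INTO ERec FROM Emp;
   IF ERec.EXISTS(1) THEN
      dbms_output.put_line('first element exists');
   END IF;
   FOR i IN 1 .. ERec.COUNT LOOP
      dbms_output.put_line('Row: ' || i || '; Emp: ' || ERec(i).EName);
   END LOOP;
END;
/
```

The collection methods (.EXISTS, .COUNT) behave exactly the same on a collection filled this way.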
Importing/Parsing XML using SQL and/or PL/SQL
What is the recomended way of importing/parsing XML data using SQL and/or PL/SQL?
I have an XSD that defines the structure of the file, and an XML file that has the content in the appropriate structure. I have parsed (checked) the structure of the XML file using JDOM in my java application, and then passed it to a function in a package on the database as a CLOB.
What I need to do is parse the contents of the XML file that is passed into the function and extract the values of each XML element so that I can then do some appropriate validation before inserting and committing the data to the database (hence completing the import function).
A DBA colleague of mine has been experimenting with various ways of achieving this, but has encountered a number of problems along the way, one of which being that he thinks it is not possible to parse XML elements that are nested more than four levels deep - is this the case?
The structure of the XSD/XML that the database function needs to manipulate and import data from is very complex (and recursive by its nature).
I would appreciate any suggestions as to how I can achieve the above in the most efficient manner.
Thanks in advance for your help
David

This is the forum for the SQLDeveloper tool. You will get better answers in the SQL and PL/SQL forum, and especially the XML DB forum.
Oracle has comprehensive and varied support for XML, including a PL/SQL parser. -
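As a concrete pointer before reposting in those forums: Oracle SQL can shred an XML document directly with XMLTABLE, and nesting is handled by the XPath expression, so four levels is not a hard limit. A minimal sketch; the document structure, element names, and attribute names here are invented for illustration:

```
SELECT x.emp_id, x.emp_name
FROM   XMLTABLE(
         '/employees/employee'
         PASSING XMLTYPE('<employees>
                            <employee id="1"><name>Scott</name></employee>
                            <employee id="2"><name>King</name></employee>
                          </employees>')
         COLUMNS emp_id   NUMBER       PATH '@id',
                 emp_name VARCHAR2(30) PATH 'name') x;
```

PASSING can equally take the function's CLOB parameter wrapped in XMLTYPE(p_clob), which matches the setup described above.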
Export/import compatibility between 2.0 and 2.2
If I upgrade my development environment to Apex 2.2 whenever it is released, can I export apps/pages from there and import into a Apex 2.0 environment?
Both export files have the following in them.
-- This date identifies the minimum HTML DB version required to import this file.
wwv_flow_api.set_version(p_version_yyyy_mm_dd=>'2005.05.01');

So I guess the answer is yes, but I don't want to make any assumptions that would prevent me from deploying new/changed functionality into my existing 2.0 Production environment.
Can someone from Oracle comment on this?
Thanks

Vikas,
Yes, in most cases that will work just fine. There are two options on the application export page that would make it so you can't import your application in APEX 2.0, "Export Deployment Attributes" and "Export Comments". Selecting either of those will include metadata in the export which wasn't around in 2.0, so they affect the import compatibility.
Otherwise, apps and pages should be compatible with 2.0.
- Marco -
Export - Import In ABAP ( for variables and internal table)
How can we pass the value of a variable and of an internal table using EXPORT and IMPORT?
data: var type sy-uzeit.
var = sy-uzeit.
EXPORT var TO MEMORY ID 'TIME'.
data: var type sy-uzeit.
IMPORT var FROM MEMORY ID 'TIME'.
write:/ var,sy-subrc,sy-uzeit.
I found that var has the value 0 after importing.
What is the right syntax for passing the value of a variable and an internal table?
regards,
dushyant.

Hi,
There are two possible solutions.
Solution1:
Program 1 (run it at least once first, so that memory ID 'TIME' gets filled):
data: var type sy-uzeit.
var = sy-uzeit.
EXPORT var TO MEMORY ID 'TIME'.
Program 2 (it will produce the result only if 'TIME' has been filled):
data: var type sy-uzeit.
clear var.
IMPORT var FROM MEMORY ID 'TIME'.
write:/ var, sy-subrc, sy-uzeit.
Solution2:
Single program:
data: var type sy-uzeit.
var = sy-uzeit.
EXPORT var TO MEMORY ID 'TIME'.
clear var.
IMPORT var FROM MEMORY ID 'TIME'.
write:/ var, sy-subrc, sy-uzeit.
Kindly reward points by clicking the star on the left of the reply, if it helps. -
Export/import table with XMLTYPE data_type and fine_grained policy
Hi friends!
I'm trying to export a table with an XMLTYPE column and a fine-grained policy.
Source: HP-UX - Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64-bit
Target: Linux 2.6.18-238.el5 - Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64-bit Production
I do it through the old exp/imp utilities, as mentioned in MetaLink note 1318012.1.
One of the things that surprised me after exp/imp is that the number of objects increased:
Objects at source:
exp \"/ as sysdba\" owner=xml log=xml file=xml.dmp

  Type        Number of objects
  TYPE        431
  TRIGGER     6
  TABLE       17
  PROCEDURE   2
  LOB         120
  INDEX       17
  FUNCTION    1
On target:
imp \"/ as sysdba\" file=xml.dmp fromuser=xml touser=xml log=xml.log

  Type        Number of objects
  TYPE        431
  TABLE       32
  PROCEDURE   2
  LOB         429
  INDEX       478
  FUNCTION    1
Why is this happening? Is it normal?
Another problem I found is that triggers are not imported. Why?
Thank you very much for your help!
José

Hi!

The new schema was created empty. If I run the following query:
select '10g', TABLE_NAME,COUNT(*)
from DBA_LOBS@DSN_HP
where OWNER='XML'
group by TABLE_NAME
union
select '11g', table_name,count(*)
from DBA_LOBS
where OWNER='XML'
group by TABLE_NAME
order by 2,1
As a result:
  Version  Table_name                        Total
  10g      ACTION_TABLE                      1
  11g      ACTION_TABLE                      1
  10g      Document1767_TAB                  14
  11g      Document1767_TAB                  13
  10g      Document1852_TAB                  14
  11g      Document1852_TAB                  13
  10g      Document1941_TAB                  16
  11g      Document1941_TAB                  15
  10g      Document2016_TAB                  14
  11g      Document2016_TAB                  13
  10g      Document2087_TAB                  13
  11g      Document2087_TAB                  12
  10g      IBT_XML_RECIBIDOS                 1
  11g      IBT_XML_RECIBIDOS                 1
  10g      LINEITEM_TABLE                    2
  11g      LINEITEM_TABLE                    2
  10g      PURCHASEORDER                     7
  11g      PURCHASEORDER                     7
  10g      PurchaseOrder1145_TAB             9
  11g      PurchaseOrder1145_TAB             7
  10g      RICARDO                           13
  10g      RICARDO2                          1
  11g      RICARDO2                          1
  10g      RITNTFER                          1
  11g      RITNTFER                          1
  10g      RITNTFRE_08                       13
  11g      SYS_NT3+LEU6vbfGLgQ18DLgrURw==    69
  11g      SYS_NT3+LEU6vffGLgQ18DLgrURw==    76
  11g      SYS_NT3+LEU6vjfGLgQ18DLgrURw==    63
  11g      SYS_NT3+LEU6vpfGLgQ18DLgrURw==    1
  11g      SYS_NT3+LEU6vqfGLgQ18DLgrURw==    2
  11g      SYS_NT3+LEU6vTfGLgQ18DLgrURw==    65
  11g      SYS_NT3+LEU6vXfGLgQ18DLgrURw==    66
  10g      TESTCLOB                          1
  11g      TESTCLOB                          1
Many new tables were created... I suppose because of the differences between the versions...
Any ideas?
Thanks a lot!
José
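One observation on those extra tables: names of the form SYS_NT... are Oracle's system-generated names for nested-table storage segments, which object-relational XMLType storage creates under the covers, so a different storage model on the 11g side would explain both the extra tables and the changed LOB and index counts. A query along these lines (a sketch; adjust the owner to suit) maps each one back to its parent table and column:

```
SELECT table_name, parent_table_name, parent_table_column
FROM   dba_nested_tables
WHERE  owner = 'XML'
ORDER  BY parent_table_name;
```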