Scripts generated post deployment in OWB
Hi,
I have a query regarding the scripts generated after deployment of any object in OWB.
When we create a map in OWB and deploy it, two scripts are generated. One is the package that physically creates the mapping process in the target schema. The other is a DDL script, e.g. map_name_drop.ddl.
Can you explain why this DDL script is generated?
Many Thanks
Nothing specific at the moment, but I would like to know whether, and to what extent, it is possible to interact via scripts or other methods with the applications in CS6 after deployment.
Can projects be opened in Captivate via script?
Can publishing settings be preconfigured for labs of Captivate installs so that all have a common set of custom settings?
Can projects be published via scripting?
Similar Messages
-
How to view the DDL script prior to object deployment in OWB 10g R2?
Here is what I'm looking for: in 10gR2, let's say I've built dimension X, but it's not deployed yet. I've selected one of the deployment options, let's say "Deploy to Catalog only". Now, I'd like to see the DDL script that will be executed at deployment time. Where can I find this script? What screen? What menu?
Thanks,
vr
Viewing the Scripts
After you have generated scripts for your target objects, you can open the scripts and
view the code. Warehouse Builder generates the following types of scripts:
■ DDL scripts: Creates or drops database objects.
■ SQL*Loader control files: Extracts and transports data from file sources.
■ ABAP scripts: Extracts and loads data from SAP systems.
To view the generated scripts:
1. From the Generation Results window, select an object in the navigation tree on the
left of the Generation Results dialog.
2. Select the Scripts tab on the right of this dialog.
The Scripts tab contains a list of the generated scripts for the object you selected.
3. Select a specific script and click the View Code button.
Regards,
Marcos -
COMMAND TO EXECUTE SCRIPT GENERATED BY OWB TO EXPORT METADATA
Sorry, but I'm trying to execute the script to export metadata to OLAP as described in the Oracle Warehouse Builder User's Guide, but it doesn't work; it just prints the usage:
TARGET is the schema where the data is:
sqlplus OLAPDBA/[email protected] TARGET
Usage: SQLPLUS [ [<option>] [<logon>] [<start>] ]
where <option> ::= -H | -V | [ [-C <v>] [-L] [-M <o>] [-R <n>] [-S] ]
<logon> ::= <username>[<password>][@<connect_identifier>] | / | /NOLOG
<start> ::= @<URL>|<filename>[.<ext>] [<parameter> ...]
"-H" displays the SQL*Plus version banner and usage syntax
"-V" displays the SQL*Plus version banner
"-C" sets SQL*Plus compatibility version <v>
"-L" attempts log on just once
"-M <o>" uses HTML markup options <o>
"-R <n>" uses restricted mode <n>
"-S" uses silent mode
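(Editorial note: the usage banner appears because SQL*Plus treats everything after the connect string as a start-file argument, so the script has to be invoked with @ and the schema passed as a positional parameter. A rough sketch of assembling such a command line follows; the script name and credentials here are placeholders, not values from the post.)

```python
# Sketch: build a SQL*Plus invocation where the script is passed with '@'
# and the target schema becomes the positional parameter &1 inside it.
# User, password, service, and script name are all hypothetical.
def sqlplus_command(user, password, service, script, *params):
    return ["sqlplus", f"{user}/{password}@{service}", f"@{script}", *params]

cmd = sqlplus_command("OLAPDBA", "secret", "mydb", "export_metadata.sql", "TARGET")
print(" ".join(cmd))
# sqlplus OLAPDBA/secret@mydb @export_metadata.sql TARGET
```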
Thanks in advance.
Simth,
Since you use the OWB framework, you would deploy the generated script using the Deployment Manager, and you can then also execute it using the Deployment Manager.
Please refer to the quick guide:
http://otn.oracle.com/products/warehouse/pdf/92QuickStartGuide.pdf
You also may want to check out the deployment and execution section of the user's guide:
http://otn.oracle.com/documentation/warehouse.html
Thanks,
Mark. -
How to test the script generated by OWB?
Hi! I am now using Oracle 9i. Which Oracle tool should I use to run the script that OWB generated? Thanks.
For example:
OPTIONS ( DIRECT=TRUE,PARALLEL=FALSE, ERRORS=50, BINDSIZE=50000, ROWS=200, READSIZE=65536)
LOAD DATA
CHARACTERSET ZHS16GBK
INFILE 'C:\out.csv'
READBUFFERS 4
INTO TABLE "DIFF_OUT"
APPEND
REENABLE DISABLED_CONSTRAINTS
FIELDS
TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
("CUST_NUM" CHAR,
"ACCT_NUM" CHAR,
"CUSTNAME" CHAR,
"ORDER_NO" CHAR)
Simth,
Since you use the OWB framework, you would deploy the generated script using the Deployment Manager, and you can then also execute it using the Deployment Manager.
Please refer to the quick guide:
http://otn.oracle.com/products/warehouse/pdf/92QuickStartGuide.pdf
You also may want to check out the deployment and execution section of the user's guide:
http://otn.oracle.com/documentation/warehouse.html
Thanks,
Mark. -
VS 2013, DB project how to run post deployment script?
Hi,
In VS 2010 in the DB project I can run pre and post deployment scripts. Both are included in the project by default.
In 2013 there's no such thing. What's the suggested alternative?
Thanks
SSIS question
Hi Andreyni,
You can get them by adding new items to the project.
Best Regards,
Jack
"Same Database" references and post-deploy scripts
Say that I have two database projects - project A and project B. Project A references project B as a same database reference.
When deploying project A (with Include Composite Objects enabled), all of project B's objects are included, as expected. However, project B's post-deployment script is not run.
Is this by design, or a bug?
Hi Jamie,
In our scenario, we are using post-deployment scripts with MERGE statements as a way to manage population of initial data in certain tables. For example, if you imagine a fictitious Gender table:
CREATE TABLE Gender (
Id INT IDENTITY PRIMARY KEY NOT NULL,
Code VARCHAR(2) NOT NULL,
Description NVARCHAR(16) NOT NULL,
CONSTRAINT [UK_Gender_Code] UNIQUE (Code)
);
we would have a corresponding entry in the post-deployment script such as:
MERGE INTO Gender AS Target
USING (VALUES
('M', 'Male'),
('F', 'Female')
) AS Source (Code, Description)
ON Target.Code = Source.Code
WHEN MATCHED THEN
UPDATE SET
Target.Description = Source.Description
WHEN NOT MATCHED BY TARGET THEN
INSERT (Code, Description)
VALUES (Source.Code, Source.Description)
WHEN NOT MATCHED BY SOURCE THEN
DELETE;
For tables that (for better or for worse) are expected to always contain certain values, this works very well to ensure that those tables are populated with the correct entries both when databases are initially deployed and when they are updated, and also
works well to handle updating this data for new or modified entries.
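The reconciliation that the MERGE performs (update matches, insert missing rows, delete extras) can be sketched in Python terms, assuming the table is modeled as a dict keyed by Code:

```python
# Sketch of the MERGE semantics above: source is the canonical seed list,
# target mimics the Gender table keyed by Code. Values are invented.
source = {"M": "Male", "F": "Female"}
target = {"M": "Man", "X": "Unknown"}  # stale description, plus an extra row

for code, desc in source.items():
    target[code] = desc            # WHEN MATCHED / WHEN NOT MATCHED BY TARGET
for code in set(target) - set(source):
    del target[code]               # WHEN NOT MATCHED BY SOURCE THEN DELETE

print(target)  # {'M': 'Male', 'F': 'Female'}
```

Running this repeatedly leaves the target unchanged, which is exactly the idempotence that makes the MERGE suitable for post-deployment seeding.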
But, for same-database references, the post-deploy script never runs, and you end up with empty tables. It's possible to work around this (deploy database B to the target, then deploy database A, excluding composite objects), but it doesn't fit
nicely into the press-F5-to-deploy experience that SSDT provides in Visual Studio. -
Upgrade-deployment in OWB 10.2.
As I need to create new partitions for the fact table (with data!), I want to test upgrade-deployments in Control Center Manager.
But I get the error dialog box.
“RTC-5270: Invalid upgrade plan.
The deployment has been aborted due to a problem with generating a valid upgrade plan.”
I have already executed this Sql: grant_upgrade_privileges.sql
and the Location access is: Host:Port:Service
I am using OWB 10.2.0.3.0/33 on Windows
Is it advisable to do upgrade-deployments in a production environment?
I also had the same problem with the upgrade function in OWB. I executed the SQL script "grant_upgrade_privileges.sql", but in my case the error was still present.
On Oracle Metalink I found Note 387919.1, with this quite interesting side note:
"... Please note that grant_upgrade_privileges.sql script needs to be executed before that table is created (deployed) the first time. If this script has not been executed before the tables are deployed the first time it will not any more be possible to upgrade the table definition. Even if this script is executed it will not any more be possible to upgrade a table deployed before the execution of the script. "
Now, after replacing all the tables, I'm able to upgrade them. Luckily I was still in the development phase when I discovered this quirk of OWB behaviour.
But I still have problems upgrading tables with TIMESTAMP columns. It seems that Oracle does not support the upgrade function for some datatypes.
Does anyone have experience with this problem, or even better, a workaround for it? -
Confusion over DBCA script generated for manual RAC DB creation
Version:11.2.0.4/RHEL 6.3
We would like to create our 3-node RAC DB manually. DBCA cannot meet our requirements because our redo log files, datafiles, tempfiles, and control files are placed in a complicated manner. If we used DBCA, we would have to spend a lot of time reconfiguring to our requirements after the DB creation.
I generated the DB creation scripts from DBCA (DB Name = BRCFPRD )
DBCA placed the db creation scripts in the specified directory in all the 3 nodes !!
They all have almost the same contents. The only difference is the instance name (BRCFPRD2.sql for Node2, ... etc).
Scripts on each node include CreateDB.sql, which has the CREATE DATABASE "BRCFPRD" statement. Why is this? The database needs to be created from only one node. Then why did DBCA place CreateDB.sql on all nodes?
I just want to run the script from just one node , say Node1 and it should create the 3-Node RAC DB. How can I do this manually?
-- The scripts generated by DBCA in Node1
apex.sql
BRCFPRD1.sh
BRCFPRD1.sql
context.sql
CreateClustDBViews.sql
CreateDB.sql
CreateDBCatalog.sql
CreateDBFiles.sql
cwmlite.sql
emRepository.sql
init.ora
interMedia.sql
JServer.sql
lockAccount.sql
ordinst.sql
owb.sql
postDBCreation.sql
spatial.sql
xdb_protocol.sql
-- The contents of the main shell script BRCFPRD1.sh
$ cat BRCFPRD1.sh
#!/bin/sh
OLD_UMASK=`umask`
umask 0027
mkdir -p /optware/product/admin/BRCFPRD/adump
mkdir -p /optware/product/admin/BRCFPRD/dpdump
mkdir -p /optware/product/admin/BRCFPRD/hdump
mkdir -p /optware/product/admin/BRCFPRD/pfile
mkdir -p /optware/product/cfgtoollogs/dbca/BRCFPRD
umask ${OLD_UMASK}
ORACLE_SID=BRCFPRD1; export ORACLE_SID
PATH=$ORACLE_HOME/bin:$PATH; export PATH
echo You should Add this entry in the /etc/oratab: BRCFPRD:/optware/product/oracle/11.2.0:Y
/optware/product/oracle/11.2.0/bin/sqlplus /nolog @/optware/product/BRCFPRD1.sql
-- Contents of BRCFPRD1.sql
$ cat BRCFPRD1.sql
set verify off
ACCEPT sysPassword CHAR PROMPT 'Enter new password for SYS: ' HIDE
ACCEPT systemPassword CHAR PROMPT 'Enter new password for SYSTEM: ' HIDE
ACCEPT sysmanPassword CHAR PROMPT 'Enter new password for SYSMAN: ' HIDE
ACCEPT dbsnmpPassword CHAR PROMPT 'Enter new password for DBSNMP: ' HIDE
host /optware/product/oracle/11.2.0/bin/orapwd file=/optware/product/oracle/11.2.0/dbs/orapwBRCFPRD1 force=y
host /grid/product/11.2.0/bin/setasmgidwrap o=/optware/product/oracle/11.2.0/bin/oracle
host /optware/product/oracle/11.2.0/bin/srvctl add database -d BRCFPRD -o /optware/product/oracle/11.2.0 -p +DATA/BRCFPRD/spfileBRCFPRD.ora -n BRCFPRD -a "DATA,ARCH_DG"
host /optware/product/oracle/11.2.0/bin/srvctl add instance -d BRCFPRD -i BRCFPRD1 -n cimprd175
host /optware/product/oracle/11.2.0/bin/srvctl add instance -d BRCFPRD -i BRCFPRD3 -n cimprd177
host /optware/product/oracle/11.2.0/bin/srvctl add instance -d BRCFPRD -i BRCFPRD2 -n cimprd176
host /optware/product/oracle/11.2.0/bin/srvctl disable database -d BRCFPRD
@/optware/product/CreateDB.sql
@/optware/product/CreateDBFiles.sql
@/optware/product/CreateDBCatalog.sql
@/optware/product/JServer.sql
@/optware/product/context.sql
@/optware/product/xdb_protocol.sql
@/optware/product/ordinst.sql
@/optware/product/interMedia.sql
@/optware/product/cwmlite.sql
@/optware/product/spatial.sql
@/optware/product/emRepository.sql
@/optware/product/apex.sql
@/optware/product/owb.sql
@/optware/product/CreateClustDBViews.sql
host echo "SPFILE='+DATA/BRCFPRD/spfileBRCFPRD.ora'" > /optware/product/oracle/11.2.0/dbs/initBRCFPRD1.ora
@/optware/product/lockAccount.sql
@/optware/product/postDBCreation.sql
-- Contents of CreateDB.sql in Node1
$ cat /optware/product/CreateDB.sql
SET VERIFY OFF
connect "SYS"/"&&sysPassword" as SYSDBA
set echo on
spool /optware/product/CreateDB.log append
startup nomount pfile="/optware/product/init.ora";
CREATE DATABASE "BRCFPRD"
MAXINSTANCES 32
MAXLOGHISTORY 1
MAXLOGFILES 192
MAXLOGMEMBERS 3
MAXDATAFILES 3000
DATAFILE SIZE 700M AUTOEXTEND ON NEXT 10240K MAXSIZE UNLIMITED
EXTENT MANAGEMENT LOCAL
SYSAUX DATAFILE SIZE 600M AUTOEXTEND ON NEXT 10240K MAXSIZE UNLIMITED
SMALLFILE DEFAULT TEMPORARY TABLESPACE TEMP TEMPFILE SIZE 20M AUTOEXTEND ON NEXT 640K MAXSIZE UNLIMITED
SMALLFILE UNDO TABLESPACE "UNDOTBS1" DATAFILE SIZE 200M AUTOEXTEND ON NEXT 5120K MAXSIZE UNLIMITED
CHARACTER SET AL32UTF8
NATIONAL CHARACTER SET AL16UTF16
LOGFILE GROUP 1 SIZE 28672M,
GROUP 2 SIZE 28672M
USER SYS IDENTIFIED BY "&&sysPassword" USER SYSTEM IDENTIFIED BY "&&systemPassword";
set linesize 2048;
column ctl_files NEW_VALUE ctl_files;
select concat('control_files=''', concat(replace(value, ', ', ''','''), '''')) ctl_files from v$parameter where name ='control_files';
host echo &ctl_files >>/optware/product/init.ora;
spool off
If you look at the scripts generated on Node2 and Node3, you can see that all the steps except the instance-specific ones are commented out using REM.
REM host /u01/product/oracle/11.2.0.3/dbhome_1/bin/srvctl add instance -d STOMPER -i STOMPER1 -n ugxtlprd186
REM host /u01/product/oracle/11.2.0.3/dbhome_1/bin/srvctl add instance -d STOMPER -i STOMPER2 -n ugxtlprd187
REM host /u01/product/oracle/11.2.0.3/dbhome_1/bin/srvctl disable database -d STOMPER
REM @/u01/product/CreateDB.sql
REM @/u01/product/CreateDBFiles.sql
REM @/u01/product/CreateDBCatalog.sql
REM @/u01/product/JServer.sql
REM @/u01/product/context.sql
<snipped > -
BI deployment from OWB 10.2 (Paris)
Hi all,
I'm curious what success or lack thereof you have had in deploying dimensions and cubes relationally to Discoverer with OWB 10.2.
Do you deploy item folders for dimensions and cubes separately?
Do you create item folders for every level of your dimensions?
Do you create "complex" item folders instead, having the dimensions and cubes prejoined?
I'm working on our BI deployment and would love to know some of your experiences with either method. Also, are you able to fully manage your EULs with OWB or do you have to use Discoverer Administrator post-deployment?
Thanks in Advance,
Mike
Hi Mike,
Do you create item folders for every level of your dimensions?
You must create item folders for all information you want to show to the end user or that you need to add to a calculation, hierarchy, join, etc. It depends on how you have imported the folder in Discoverer. Make sure the item you want to add to a folder is not already there.
Do you create "complex" item folders instead having the dimensions and cubes prejoined?
Read the best practices document/thread. You'll see that using DB views might be an alternative.
I never tried this using OWB 10g R2. In previous releases you'd have to create collections, export/import it to discoverer and that was the most you could do using OWB.
As far as I've read the documentation (the "Defining Business Intelligence Objects" chapter in the OWB User's Guide), it seems like there's a lot of stuff you can do in OWB.
However you'll see the need to edit the Business Areas using Discoverer Adm Edition. Read this thread, you'll see there's plenty of stuff you won't be able to do using OWB (like editing security/privileges, for instance):
Discoverer Adm. Best Practices
Also, read the following document:
http://www.oracle.com/technology/products/discoverer/pdf/Discoverer_10_1_2_BestPractices.pdf
You'll find answers for most of your questions.
Regards,
Marcos -
Cannot save script generated web page
I found that Safari cannot save a script-generated page via (File -> Save As). I've written a simple test page as follows:
<html>
<body>
<script>
function OpenPopUpWin() {
    var generator = window.open('', 'save_win', 'width=645,height=600,resizable=yes,menubar=yes,toolbar=no,directories=no,location=no,scrollbars=yes,status=yes');
    generator.document.write('testing');
    generator.document.close();
}
</script>
<a href="#" onclick="OpenPopUpWin()">test</a> (<- this link calls the OpenPopUpWin() function)<br>
</body>
</html>
Does anyone have a similar problem? Is it a bug in Safari? Are there any workarounds if I want to save the page?
I guess I'm still not understanding your problem. I took your original script, made the one change mentioned in my earlier post, and it worked.
It's quite likely/possible that Apple filters javascript embedded in posts (all sorts of nasty things could happen if they didn't), but you're not using Safari to create your pages, are you?
Just as an example, this seems to do what you describe - copy and paste it into a new text document and see what happens:
<pre class=command><html>
<body>
<script>
function OpenPopUpWin() {
    var generator = window.open('', 'save_win', 'width=645,height=600,resizable=yes,menubar=yes,toolbar=no,directories=no,location=no,scrollbars=yes,status=yes');
    generator.document.write('testing');
    generator.document.close();
}
</script>
<a href="#" onclick="OpenPopUpWin()">test</a> (<- this link calls the OpenPopUpWin() function)
</body>
</html>
</pre> -
Java Script Error while deploying a Model with Value Help
Hi,
I am using EP 7.0 SP 10.
I am trying to deploy a model which includes Value Help for an input field.
The model compiles successfully, but gives a JavaScript error while deploying the model:
! Error on Page
When I click on this JavaScript error, it shows:
Line: 14985
Char: 1
Error: Object doesn't support this property or method.
code
URL: <serverhost>/VCRes/WebContent/VisualComposer6.0/bin/223334.htm?24102006.1712.
The Same model works in dev server, and it fails in the production server.
Thanks and Regards,
Sekar
Hi Jakob,
Thank you for your quick response.
I did a basic model with the help of documentation I got from these forums. I created an iView and from there used the BAPI "BAPI_SALESORDER".
I created an input form and an output form (table view). I tested the model and am able to get the output, but when I try to deploy it gives me the error.
And I think I am not parsing any formulas here.
Please guide me.
Thanks and regards,
Pradeep.B -
Remote Desktop User - Post Deployment/ Post Build
Hi,
Would like to know if there is a way to enable RDP post-deployment / post-build of a Cloud Service.
Also, can I have multiple users created and enable RDP for each of them?
Thanks,
Pradebban Raja
I know we can do it from Visual Studio by creating the certificate and exporting it to a PFX.
But is the certificate reusable? Can I use the same certificate for a different user that I create at a later point in time?
Thanks,
Pradebban Raja -
Haven't got submenu [generate toplink-deployment-descriptor.xml]
I followed the steps in the tutorial: Building a Simple JSF and TopLink App in JDeveloper 10.1.3 Preview
http://www.oracle.com/technology/products/jdev/101/howtos/jsftoplink/index.html
but at the step 8 ( Now that you've generated your TopLink Class and initial mappings, you'll want to generate the TopLink Deployment Descriptor. Locate the TopLink Mapping node and right click "Generate toplink-deployment-descriptor.xml")
I can't see the submenu [Generate toplink-deployment-descriptor.xml].
Please help me.
Thanks.
And at the TopLink node there are 2 files: sessions.xml and "TopLink Mapping".
When I right-click on "TopLink Mapping" I see 3 sub-menu items:
+Add or remove Descriptor
Generate Mapping Status Report
Create Data Control
How do I get the sub-menu item "Generate toplink-deployment-descriptor.xml"?
thanks -
EPMA Shared Members not reflecting in Planning application post deployment
Hi,
I have added a member in the 'Entity' dimension. The same member is also added in an alternate hierarchy as a shared member (using the Insert Member option in the dimension library). When I refresh the application view, I do see both the original member and the shared member in the application view. But post-deployment, only the base member is reflected in the Planning application, not the shared member.
I have redeployed the application multiple times now, but I am still not able to see the shared members reflected in the Planning application.
Kindly let me know if you have any idea about this or have faced a similar issue.
Regards
Meenal
Hi,
Have you patched EPMA with patches from Metalink?
The 9.3.1.1 patch includes -
6897835 SHARED MEMBERS IN ENTITY DIMENSION CAN NOT BE DEPLOYED TO ASO/BSO APPS
It is advisable to keep a look out for patch fixes with EPMA
Hope this helps
John
http://john-goodwin.blogspot.com/ -
Any body has any idea where i can download the WLST Script Generator?
It used to be under BEA's code samples, which are no longer available after the Oracle acquisition of BEA.
The name of the jar file is wlstScriptGenerator.jar, for WLS 9.x and 10.x.
The number '1129323703654' is a stringified timestamp. If you convert that number to a date you will get:
from java.util import Date
d = Date(1129323703654L)
print d
Fri Oct 14 17:01:43 EDT 2005
Right now there is no way to control the names, the script generator uses the stringified timestamp for each script that it creates (to make the script names unique).
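The same conversion works in plain Python (outside WLST's Jython shell), treating the number as milliseconds since the epoch:

```python
from datetime import datetime, timezone

# The script-name suffix is a millisecond epoch timestamp; java.util.Date
# above takes milliseconds, so divide by 1000 for Python's epoch seconds.
ts_millis = 1129323703654
d = datetime.fromtimestamp(ts_millis // 1000, tz=timezone.utc)
print(d.strftime("%a %b %d %H:%M:%S %Y UTC"))  # Fri Oct 14 21:01:43 2005 UTC
```

21:01:43 UTC is the same instant as the 17:01:43 EDT shown by the Jython snippet above.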
Thanks,
-satya
BEA Blog: http://dev2dev.bea.com/blog/sghattu/