Change T-SQL Script
I know that the OMWB can convert stored procedures to Oracle, but how can I convert a T-SQL script to Oracle?
I know the OMWB can easily do it.
I could turn the T-SQL script into a stored procedure, but I don't want to do that.
So, is there a better way to convert it?
Anil
I'm afraid the OMWB does not support standalone scripts yet.
As noted, you can cut and paste the statements into the body of a stored procedure that has already been captured by the OMWB and parse them there. Unfortunately, there is no other way to access the OMWB parser.
Regards,
Dermot.
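For instance, the statements from a loose T-SQL script could be pasted into a throwaway procedure purely so the OMWB parser can capture and convert them. A sketch (the procedure, table and column names below are invented for illustration):

```sql
-- Hypothetical wrapper: capture this procedure with the OMWB, let it
-- convert the body to PL/SQL, then discard the procedure shell on the
-- Oracle side and keep only the converted statements.
CREATE PROCEDURE dbo.ScriptBody
AS
BEGIN
    -- paste the statements from the original T-SQL script here
    UPDATE Orders
    SET    Status = 'CLOSED'
    WHERE  CloseDate < GETDATE();

    DELETE FROM AuditLog
    WHERE  LogDate < DATEADD(month, -6, GETDATE());
END
```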
Similar Messages
-
Hi everyone,
I want to accomplish a task in TFS: for every build, I need to auto-generate the database changes as a SQL script file in the drop folder.
For example, if I add a table to a database and then check in the change, I need the corresponding CREATE TABLE script to land in the drop folder as a .sql file.
I also want to automate the build for every check-in. Please guide me through a step-by-step procedure, since I am new to TFS builds in Visual Studio.
Thanks
Check out SSDT:
https://msdn.microsoft.com/en-us/data/tools.aspx
It can generate a DACPAC, which can be used to update a SQL database from the command line. To ensure that the executed .sql file is compatible with the target database schema, the DACPAC contains a compiled version of the schema and generates the change script
on-the-fly.
If you want to inspect the script before executing it, you can call SqlPackage.exe to generate a SQL script.
https://msdn.microsoft.com/en-us/library/hh550080%28v=vs.103%29.aspx
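The command line looks roughly like this (a sketch; the server, database and file names are placeholders you would substitute for your environment):

```shell
# Generate the upgrade script instead of applying the DACPAC directly.
# MyDb.dacpac, myserver and MyDb are placeholder names.
SqlPackage.exe /Action:Script \
    /SourceFile:MyDb.dacpac \
    /TargetServerName:myserver \
    /TargetDatabaseName:MyDb \
    /OutputPath:upgrade.sql
```

The generated upgrade.sql can then be reviewed before being run against the target database.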
My blog: blog.jessehouwing.nl -
Run multiple sql scripts using osql
We have 2 databases which should be installed on each and every SQL Server.
STEPS DONE
1. Scripted out these two DBs, e.g. db1.sql and db2.sql.
2. Scripted the lookup tables in these DBs into two scripts, e.g. lookup1.sql and lookup2.sql.
3. Scripted the permissions of the service accounts in the two databases, e.g. perm1.sql and perm2.sql.
To run all these scripts in sequence, I am planning to use a batch file that executes them one after another. We also know that the database creation script (db.sql) expects the DATA and LOG file locations exactly as hard-coded in the script. Is it possible to use parameters so the DBA can set the location paths while running the batch file?
The sequence should be as
db1.sql
lookup1.sql
perm1.sql
db2.sql
lookup2.sql
perm2.sql
Set up a ControlParms table and let the DBA configure the values in it.
Change the .sql scripts to read the path and other configuration values from the ControlParms table.
Dynamic SQL: http://www.sqlusa.com/bestpractices/dynamicsql/
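A minimal sketch of that approach (the ControlParms table layout, the parameter names and the paths are all my own invention, and the dynamic CREATE DATABASE uses the classic EXEC() form since the thread mentions osql):

```sql
-- The DBA edits this table once per server; every install script reads it.
CREATE TABLE ControlParms (
    ParmName  VARCHAR(50)  PRIMARY KEY,
    ParmValue VARCHAR(260) NOT NULL
);
INSERT INTO ControlParms VALUES ('DataPath', 'D:\SQLData');
INSERT INTO ControlParms VALUES ('LogPath',  'E:\SQLLogs');

-- Inside db1.sql: build the CREATE DATABASE statement dynamically
-- from the configured paths instead of hard-coding them.
DECLARE @DataPath VARCHAR(260), @LogPath VARCHAR(260), @Sql VARCHAR(2000)
SELECT @DataPath = ParmValue FROM ControlParms WHERE ParmName = 'DataPath'
SELECT @LogPath  = ParmValue FROM ControlParms WHERE ParmName = 'LogPath'
SET @Sql = 'CREATE DATABASE db1 ON (NAME = db1_data, FILENAME = '''
         + @DataPath + '\db1.mdf'') LOG ON (NAME = db1_log, FILENAME = '''
         + @LogPath + '\db1.ldf'')'
EXEC (@Sql)
```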
Kalman Toth Database & OLAP Architect
SQL Server 2014 Design & Programming
New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012 -
How to reference dynamic parameters in the PL/SQL script
By dynamic parameters I mean that the position and names of the parameters change based on the data structure of a text file read by the PL/SQL script in question. Any sample code posted would be very much appreciated.
The SQL and PL/SQL discussion forum is a good source for this kind of information.
The URL is:
PL/SQL -
I am getting an error in "Step 2 - Setup or Remove Lync Server Components" of "Install or Update Lync Server System" step.
"An error occurred while applying SQL script for the feature BackendStore. For details, see the log file...."
Additionally, all previous steps (Prepare Active Directory, Prepare first Standard Edition server, Install Administrative Tools, Create and publish topology) completed without errors. The user I used to set up the Lync server is a member of:
Administrators
CSAdministrator
Domain Admins
Domain Users
Enterprise Admins
Group Policy Creator Owners
RTCComponentUniversalServices
RTCHSUniversalServices
RTCUniversalConfigReplicator
RTCUniversalServerAdmins
Schema Admins
I have tried re-installing everything and setting up from scratch many times, but the same error still occurs. Please see the log below and give me any ideas/solutions to tackle this problem.
****Creating DbSetupInstance for 'Microsoft.Rtc.Common.Data.BlobStore'****
Initializing DbSetupBase
Parsing parameters...
Found Parameter: SqlServer Value lync.lctbu.com\rtc.
Found Parameter: SqlFilePath Value C:\Program Files\Common Files\Microsoft Lync Server 2013\DbSetup.
Found Parameter: Publisheracct Value LCTBU\RTCHSUniversalServices;RTC Server Local Group;RTC Local Administrators;LCTBU\RTCUniversalServerAdmins.
Found Parameter: Replicatoracct Value LCTBU\RTCHSUniversalServices;RTC Server Local Group.
Found Parameter: Consumeracct Value LCTBU\RTCHSUniversalServices;RTC Server Local Group;RTC Local Read-only Administrators;LCTBU\RTCUniversalReadOnlyAdmins.
Found Parameter: DbPath Value D:\CsData\BackendStore\rtc\DbPath.
Found Parameter: LogPath Value D:\CsData\BackendStore\rtc\LogPath.
Found Parameter: Role Value master.
Trying to connect to Sql Server lync.lctbu.com\rtc. using windows authentication...
Sql version: Major: 11, Minor: 0, Build 2100.
Sql version is acceptable.
Validating parameters...
DbName rtcxds validated.
SqlFilePath C:\Program Files\Common Files\Microsoft Lync Server 2013\DbSetup validated.
DbFileBase rtcxds validated.
DbPath D:\CsData\BackendStore\rtc\DbPath validated.
Effective database Path: \\lync.lctbu.com\D$\CsData\BackendStore\rtc\DbPath.
LogPath D:\CsData\BackendStore\rtc\LogPath validated.
Effective Log Path: \\lync.lctbu.com\D$\CsData\BackendStore\rtc\LogPath.
Checking state for database rtcxds.
Checking state for database rtcxds.
State of database rtcxds is detached.
Attaching database rtcxds from Data Path \\lync.lctbu.com\D$\CsData\BackendStore\rtc\DbPath, Log Path \\lync.lctbu.com\D$\CsData\BackendStore\rtc\LogPath.
The operation failed because of missing file '\\lync.lctbu.com\D$\CsData\BackendStore\rtc\DbPath\rtcxds.mdf'
Attaching database failed because one of the files not found. The database will be created.
State of database rtcxds is DbState_DoesNotExist.
Creating database rtcxds from scratch. Data File Path = D:\CsData\BackendStore\rtc\DbPath, Log File Path= D:\CsData\BackendStore\rtc\LogPath.
Clean installing database rtcxds.
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
****Creating DbSetupInstance for 'Microsoft.Rtc.Common.Data.RtcSharedDatabase'****
Initializing DbSetupBase
Parsing parameters...
Found Parameter: SqlServer Value lync.lctbu.com\rtc.
Found Parameter: SqlFilePath Value C:\Program Files\Common Files\Microsoft Lync Server 2013\DbSetup.
Found Parameter: Serveracct Value LCTBU\RTCHSUniversalServices;RTC Server Local Group.
Found Parameter: DbPath Value D:\CsData\BackendStore\rtc\DbPath.
Found Parameter: LogPath Value D:\CsData\BackendStore\rtc\LogPath.
Trying to connect to Sql Server lync.lctbu.com\rtc. using windows authentication...
Sql version: Major: 11, Minor: 0, Build 2100.
Sql version is acceptable.
Validating parameters...
DbName rtcshared validated.
SqlFilePath C:\Program Files\Common Files\Microsoft Lync Server 2013\DbSetup validated.
DbFileBase rtcshared validated.
DbPath D:\CsData\BackendStore\rtc\DbPath validated.
Effective database Path: \\lync.lctbu.com\D$\CsData\BackendStore\rtc\DbPath.
LogPath D:\CsData\BackendStore\rtc\LogPath validated.
Effective Log Path: \\lync.lctbu.com\D$\CsData\BackendStore\rtc\LogPath.
Checking state for database rtcshared.
Reading database version for database rtcshared.
Database version for database rtcshared - Schema Version5, Sproc Version 0, Update Version 1.
Thanks and Regards,
Thanh Le
Thanks, Lạc Phạm 2
I had a similar issue. I ended up uninstalling and reinstalling, but got the same error; then I changed the drive, still the same error. It turned out to be an I/O issue: after tuning disk I/O, the installation completed without any problem.
If anyone is using KVM, here are detailed articles.
We simply set the option cache='writeback'
following http://www.ducea.com/2011/07/06/howto-improve-io-performance-for-kvm-guests/ and http://itscblog.tamu.edu/improve-disk-io-performance-in-kvm/; this fixed my issue. Thanks -
Using &variables in a SQL Script scheduled job within OEM
Hi...I've been searching through the forum looking for any examples of setting up a job within OEM, using the SQL Script job type, where I can basically use a WHERE clause that says 'where column_name = &variable_name' and somehow provide that at run time, as if I were in a SQL*Plus session and using a PROMPT and ACCEPT command. I thought there might be a way to emulate that situation by placing the value I'd like to qualify on within some placeholder in the Parameter section of the job. We have a few users who have limited access to OEM and need to run queries on GRANTS and ROLES for various users etc. I realize there are other ways to do this, however I'm wondering if OEM has a capability like this. Any info is appreciated! Tks!
Looks like you're missing the schema name and you'll want to use QUOTENAME to add delimiters to the objects
e.g.
DECLARE @DATABASE AS VARCHAR(50)
DECLARE @SchemaName as SYSNAME;
DECLARE @TABLE AS VARCHAR(50)
DECLARE @QUERY AS VARCHAR(MAX)
SELECT @DATABASE = '602'
SELECT @SchemaName = 'dbo' --change as appropriate
SELECT @TABLE = 'Items'
SET @QUERY = 'SELECT TOP 10 * FROM ' + QUOTENAME(@DATABASE)+'.' + QUOTENAME(@SchemaName) + '.'+QUOTENAME(@TABLE)
print @query
EXEC( @QUERY) -
Making a SQL script MS Access compatible
Hello Guys,
I have a SQL script that I want to change so it works in Access. The script uses the Oracle outer-join (+) operator in the WHERE clause, and I want the same outer joins in Access 2002. The script is below. Some of the code has already been changed for Access, such as Choose(), Date(), etc. I just want the outer joins to work; the (+) operators mark the tables I want to outer-join.
SELECT OPB_SUBJECT.SUBJ_NAME,
MAX(OPB_WFLOW_RUN.WORKFLOW_NAME) AS MAXOFWORKFLOW_NAME,
OPB_WFLOW_RUN.START_TIME,
OPB_WFLOW_RUN.END_TIME,
CHOOSE([SESS_TASK_INST_RUN].[RUN_STATUS_CODE],"Succeeded",
"Disabled","Failed","Stopped","Aborted","Running",
"Suspending","Suspened","Stopping","Aborting",
"Waiting","Scheduled","Unscheduled","Unknown",
"Terminated") AS EXPR1,
OPB_WFLOW_RUN.RUN_STATUS_CODE,
(IIF(ISNULL([OPB_WFLOW_RUN].[END_TIME]),DATE()) - [OPB_WFLOW_RUN].[START_TIME]) * 1440 AS EXPR2,
SESS_TASK_INST_RUN.TASK_NAME,
SESS_TASK_INST_RUN.START_TIME,
SESS_TASK_INST_RUN.END_TIME,
SESS_TASK_INST_RUN.RUN_STATUS_CODE,
SESS_TASK_INST_RUN_LOG.LOG_FILE,
CHOOSE([SESS_TASK_INST_RUN].[RUN_STATUS_CODE],"Succeeded",
"Disabled","Failed","Stopped","Aborted","Running",
"Suspending","Suspened","Stopping","Aborting",
"Waiting","Scheduled","Unscheduled","Unknown",
"Terminated") AS EXPR3,
SESS_TASK_INST_RUN_LOG.SRC_SUCCESS_ROWS,
SESS_TASK_INST_RUN_LOG.TARG_SUCCESS_ROWS,
SESS_TASK_INST_RUN_LOG.SRC_FAILED_ROWS,
SESS_TASK_INST_RUN_LOG.TARG_FAILED_ROWS,
SESS_TASK_INST_RUN_LOG.TOTAL_TRANS_ERRS,
(IIF(ISNULL([SESS_TASK_INST_RUN].[END_TIME]),DATE()) - [SESS_TASK_INST_RUN].[START_TIME]) * 1440 AS EXPR4,
OPB_USERS.USER_DESC,
ROUND([OPB_WFLOW_RUN].[START_TIME]) AS EXPR5,
OPB_USERS.USER_NAME
FROM OPB_SUBJECT,
OPB_WFLOW_RUN,
OPB_TASK_INST_RUN AS SESS_TASK_INST_RUN,
OPB_SESS_TASK_LOG AS SESS_TASK_INST_RUN_LOG,
OPB_USERS
WHERE (((SESS_TASK_INST_RUN.TASK_TYPE) = 68)
AND ((OPB_USERS.USER_NAME) = [OPB_WFLOW_RUN].[USER_NAME])
AND ((OPB_SUBJECT.SUBJ_ID) = [OPB_WFLOW_RUN].[SUBJECT_ID])
AND ((SESS_TASK_INST_RUN.INSTANCE_ID) = [SESS_TASK_INST_RUN_LOG].[INSTANCE_ID] (+)) )
AND ((SESS_TASK_INST_RUN.WORKFLOW_ID) = [SESS_TASK_INST_RUN_LOG].[WORKFLOW_ID] (+)) )
AND ((SESS_TASK_INST_RUN.WORKFLOW_RUN_ID) = [SESS_TASK_INST_RUN_LOG].[WORKFLOW_RUN_ID] (+)) )
AND ((SESS_TASK_INST_RUN.WORKLET_RUN_ID) = [SESS_TASK_INST_RUN_LOG].[WORKLET_RUN_ID] (+) )
AND ((OPB_WFLOW_RUN.WORKFLOW_ID) = [SESS_TASK_INST_RUN].[WORKFLOW_ID] )
AND ((OPB_WFLOW_RUN.WORKFLOW_RUN_ID) = [SESS_TASK_INST_RUN].[WORKFLOW_RUN_ID]))
GROUP BY OPB_SUBJECT.SUBJ_NAME,OPB_WFLOW_RUN.START_TIME,
OPB_WFLOW_RUN.END_TIME,OPB_WFLOW_RUN.RUN_STATUS_CODE,
SESS_TASK_INST_RUN.TASK_NAME,SESS_TASK_INST_RUN.START_TIME,
SESS_TASK_INST_RUN.END_TIME,SESS_TASK_INST_RUN.RUN_STATUS_CODE,
SESS_TASK_INST_RUN_LOG.LOG_FILE,SESS_TASK_INST_RUN_LOG.SRC_SUCCESS_ROWS,
SESS_TASK_INST_RUN_LOG.TARG_SUCCESS_ROWS,
SESS_TASK_INST_RUN_LOG.SRC_FAILED_ROWS,SESS_TASK_INST_RUN_LOG.TARG_FAILED_ROWS,
SESS_TASK_INST_RUN_LOG.TOTAL_TRANS_ERRS,OPB_USERS.USER_DESC,
OPB_USERS.USER_NAME,OPB_WFLOW_RUN.WORKFLOW_NAME,
SESS_TASK_INST_RUN.RUN_STATUS_CODE;
The standard SQL syntax for an outer join is:
SELECT * FROM T1 [LEFT] [OUTER] JOIN T2 ON (T2.C1 = T1.C1 .. AND T2.CN = T1.CN)
It is in some situations "equivalent" to Oracle's proprietary syntax:
SELECT * FROM T1, T2 WHERE T2.C1(+) = T1.C1 ...
As far as I remember, MS Access supports the standard SQL syntax for outer joins, so in your case:
SELECT ...
FROM
OPB_SUBJECT,
OPB_WFLOW_RUN,
OPB_USERS,
OPB_TASK_INST_RUN AS SESS_TASK_INST_RUN
LEFT JOIN OPB_SESS_TASK_LOG AS SESS_TASK_INST_RUN_LOG
ON
(SESS_TASK_INST_RUN.INSTANCE_ID) = [SESS_TASK_INST_RUN_LOG].[INSTANCE_ID]
AND (SESS_TASK_INST_RUN.WORKFLOW_ID) = [SESS_TASK_INST_RUN_LOG].[WORKFLOW_ID]
AND (SESS_TASK_INST_RUN.WORKFLOW_RUN_ID) = [SESS_TASK_INST_RUN_LOG].[WORKFLOW_RUN_ID]
AND (SESS_TASK_INST_RUN.WORKLET_RUN_ID) = [SESS_TASK_INST_RUN_LOG].[WORKLET_RUN_ID]
WHERE (((SESS_TASK_INST_RUN.TASK_TYPE) = 68) .........
(Not tested, naturally) -
Unable to upload sql scripts in oracle database express edition 10g
Hi!
To start, I should say that I am a beginner with Oracle. I installed Oracle 10g Express Edition on my Ubuntu 9.10 like this:
:~$ dpkg -i oracle-xe-universal_10.2.0.1-1.0_i386.deb
:~$ /etc/init.d/oracle-xe configure
and I kept the default values. Then I go to http://127.0.0.1:8080/apex, log in with the system account, create another DBA account, etc.; everything looks fine. Then I follow this tutorial: http://st-curriculum.oracle.com/tutorial/DBXETutorial/index.htm
and I am unable to load the SQL script given in the tutorial: load_sample.sql (page: What to do first > Loading Data). I tried a small SQL script with the same result. I use gedit with UTF-8 and load in UTF-8; my browser is Firefox (I also tried Epiphany).
The result: when I upload a script, the browser returns "script uploaded." I can then click on my script, but the file is empty! Nothing happens when I click "run". I also can't create a script; with the Epiphany browser, the editing frame is all red. To import a script I have to copy the statements one by one into SQL Commands. I also tried changing the owner of load_sample.sql:
-rw-r--r-- 1 oracle dba 49969 2010-01-10 12:05 load_sample.sql
But it doesn't work. Does anyone have an idea? Thanks for your help.
regards
Alex.
load_sample.sql:
CREATE TABLE regions
( region_id NUMBER
CONSTRAINT region_id_nn NOT NULL
, region_name VARCHAR2(25)
);
CREATE UNIQUE INDEX reg_id_pk
ON regions (region_id);
ALTER TABLE regions
ADD ( CONSTRAINT reg_id_pk
PRIMARY KEY (region_id)
) ;
Hi,
So, go to a shell, change to the directory where load_sample.sql is, log in to XE as the user who needs to run load_sample.sql, and execute:
@load_sample.sql
Sorry to hijack this thread, but I am having exactly the same problem in Ubuntu. It seems as if IE is the only browser that allows you to do this, and installing that in Ubuntu looks like a whole other world of pain I do not really want to get involved in at the moment.
I like the sound of shortcutting the SQL upload through the shell, but I'm a beginner and I do not understand your comment well enough to implement it myself.
Could you please explain this in more granular layman's terms for someone who is unfamiliar with Linux syntax?
By 'shell' do you mean the Terminal or the SQL Command Line? What exactly should I type in there to execute the SQL?
I have saved the SQL file on my desktop (jon@jon-pc; it's a virtual Ubuntu machine on a Windows Vista host).
Thanks in advance
Jon -
Problem while executing script in Toad - How to use '&' in the sql script ?
I have to execute a SQL script in Toad. The script has one INSERT query in which one of the inserted values is 'USA & CAN'. When I executed the script in Toad by pressing F5, I got a prompt window asking for the value of 'CAN', as it comes after the &.
I tried using {escape '\'} but could not resolve the problem.
Is there any solution or workaround for this? I have thousands of records with such values and I have to use a SQL script only.
There is an option in TOAD to change this behaviour.
Look in VIEW/OPTIONS/SQL Editor/
Uncheck the box for "Scan statements for bound variables before execution".
In SQL*Plus the equivalent would be SET SCAN OFF
(or SET DEFINE OFF)
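In a plain SQL*Plus script the same fix looks like this (a sketch; the table and column names are hypothetical):

```sql
-- Stop SQL*Plus from treating & as a substitution-variable marker,
-- run the inserts containing literal ampersands, then restore the default.
SET DEFINE OFF
INSERT INTO countries_served (region) VALUES ('USA & CAN');
SET DEFINE ON
```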
Message was edited by:
Sven Weller -
SQL Script working differently with 8i and 9i
Hi
I am facing a strange problem with my simple SQL script called from a shell script. It behaves differently with Oracle 8.1.7.4 and 9.2.0.1. The machine is the same.
sqlplus -s / @Tech.sql WKC625 11 11 '11 22' ""
This works with 9i, but it doesn't work with Oracle 8.
What actually happens is that the argument in single quotes (') is taken as 2 separate arguments.
The Tech.sql is :
spool add
insert into TECH values ('&1','&2','&3','&4','&5','');
commit;
quit
I tried putting double quotes as well. Did something change between Oracle 8 and 9i?
Please help
Surendra
Are you sure it works in 9?
If you are using UNIX, then the O/S will strip the quotes while processing the arguments. You need to use 2 single quotes around 11 22, just as you have around the empty string at the end.
I believe that Windows does the same, but I do not use Windows.
John -
How to use parameters in an Oracle SQL script?
Right now I am writing a SQL script to create a schema and build the objects in this schema.
I use a .NET WinForms program to run SQL*Plus, which executes this SQL script.
The problem is that the schema name, the tablespace location, and the SYS password must be entered by the user, so my SQL script should use these runtime input parameters instead of constants.
So, how do I use parameters in a SQL script?
Are there any example scripts in the Oracle home directory I can refer to?
Hi,
UNISTD wrote:
thanks ..... what's the difference between VARIABLE, DEFINE and ACCEPT in sqlplus?
VARIABLE declares (but does not assign a value to) a bind variable. Unlike substitution variables, bind variables are passed to the back end to be compiled, and they can only hold values of certain data types. You cannot use a bind variable in place of an identifier, so for something like
CREATE USER &1 ...
a bind variable won't work.
"DEFINE x = y" sets the substitution variable &x to the value y. There is no user interaction (unless x or y happen to contain undefined substitution variables).
"DEFINE x" shows the value of the substitution variable &x or, if it is undefined, raises a SQL*Plus error. I use this feature below.
ACCEPT sets a substitution variable with user interaction.
And if the user misses some parameters in "sqlplus /nolog ssss.sql par1 par2 par5 par6", how can default values be used for the missing parameters?
Don't you need a @ before the script name, e.g.
sqlplus /nolog @ssss.sql par1 par2 par5 par6
Sorry, I don't know of any good way to use default values.
The following works but, as you can see, it's ugly.
"DEFINE 1" displays a message like
DEFINE 1 = "par1" (CHAR)
if &1 is defined; otherwise, it will display a SQL*Plus error message like
SP2-035: symbol 1 is UNDEFINED
Notice that the former contains an '=' sign, but the latter does not.
The best way I know to use default values is to run the DEFINE command, save the output to a file, read the file back, and see whether or not it is an error message.
So you can use a script like this:
-- This is DEFINE_DEFAULT.SQL
SPOOL got_define_txt.sql
DEFINE &dd_old
SPOOL OFF
COLUMN dd_new_col NEW_VALUE &dd_new
WITH got_define_txt AS
(
    SELECT q'[
@got_define_txt
]' AS define_txt
    FROM dual
)
SELECT CASE
           WHEN define_txt LIKE '%=%'
           THEN REGEXP_REPLACE ( define_txt
                               , '.+"([^"]*)".*'
                               , '\1'
                               )
           ELSE '&dd_default'
       END AS dd_new_col
FROM got_define_txt
;
and start your real script, ssss.sql, something like this:
DEFINE dd_new = sv1
DEFINE dd_old = 1
DEFINE dd_default = FOO
@DEFINE_DEFAULT
DEFINE dd_new = sv2
DEFINE dd_old = 2
DEFINE dd_default = "Testing spaces in value"
@DEFINE_DEFAULT
When this finishes running, the substitution variable &sv1 will have either the value you passed in &1 or, if you didn't pass anything, the default value you specified, namely FOO.
Likewise, &sv2 will have the value you passed or, if you didn't pass anything, the 23-character string 'Testing spaces in value'.
Here's how it works:
Define_default.sql puts the output of the "DEFINE x" command into a column, define_txt, in a query. That query displays either the existing value of the substitution variable indicated by &dd_old or, if it is undefined, the default value you want to use, which is stored in the substitution variable &dd_default. The substitution variable named in &dd_new is always set to something, but that something may be its existing value.
Notice that the parameters to define_default.sql must be passed as global variables.
Why didn't I just use arguments, so that we could simply say:
@DEFINE_DEFAULT sv1 1 FOO
? Because that would set the substitution variables &1, &2 and &3, which are most likely the very ones you're interested in.
I repeat: there must be a better way, but I'm sorry, I don't know what it is.
I usually don't use the method above. Instead, I always pass the required number of parameters, but I pass dummy or place-holder values.
For example, if I wanted to call ssss.sql but use default values for &1 and &3, I would say something like:
@ssss ? par2 ?
and, inside ssss.sql, test whether the values are the place holder '?' and, if so, replace them with real default values. The user has to remember what the special place-holder value is, but does not need to know anything more, and only ssss.sql itself needs to change if the default values change. -
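The '?' place-holder test inside ssss.sql can be done with a COLUMN ... NEW_VALUE query, for example (a sketch; FOO is the assumed default, as above):

```sql
-- Resolve &1 inside ssss.sql: if the caller passed the place holder '?',
-- substitute the default value FOO; otherwise keep the value passed in.
COLUMN p1_col NEW_VALUE p1
SELECT CASE WHEN '&1' = '?' THEN 'FOO' ELSE '&1' END AS p1_col
FROM dual;
-- From here on, the script uses &p1 instead of &1.
PROMPT First parameter resolved to: &p1
```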
Passing variables to a SQL script within GC
Does anyone know if it is possible to pass a variable to a SQL script job within Grid Control? I don't see that I can, but I wanted to ask a larger group before closing the loop on this.
For example, I have a few databases containing schemas with date-range daily-partitioned tables. I have a single code block that normally accepts the table owner as a variable and loops through to analyze the current day's partition. I'd love for OEM to submit the job to different database targets while passing in a different schema name.
Thank you in advance!
This can be done in two steps. First, create a table containing one field (something like table_owner) in all the target databases you want to analyze. Before running the analyze script, change the table_owner in all the databases to be analyzed (as a job?). Then run the analyze script, which first reads the table_owner and uses it as the variable you want.
Even simpler would be to use a database link from the to-be-analyzed DBs to a central DB, so you only have to change the table_owner in one table.
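A sketch of what the analyze job could look like once the parameter table exists (the job_parms table, its owner_name column, and the P_YYYYMMDD partition-naming convention are all assumptions for illustration):

```sql
-- Read the schema to analyze from the one-row parameter table the DBA
-- maintains, then gather stats on today's partition of each of its tables.
DECLARE
    v_owner VARCHAR2(30);
BEGIN
    SELECT owner_name INTO v_owner FROM job_parms;  -- assumed one-row table
    FOR t IN (SELECT table_name FROM dba_tables WHERE owner = v_owner) LOOP
        DBMS_STATS.GATHER_TABLE_STATS(
            ownname     => v_owner,
            tabname     => t.table_name,
            partname    => 'P_' || TO_CHAR(SYSDATE, 'YYYYMMDD'),  -- assumed naming
            granularity => 'PARTITION');
    END LOOP;
END;
/
```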
Eric -
Hi,
I would like to search and replace text in files when I run .sql files.
E.g. when a user provides a script xyz.sql, it might contain a table name like
INSERT INTO xyz abc VALUES ('text');
I would like to replace "xyz abc" with mytablename:
INSERT INTO mytablename VALUES ('text');
Besides this, the user-provided SQL scripts may contain some junk CREATE TABLE statements, which I would like to comment out.
How can I achieve the above?
I will be calling the user-provided SQL scripts from my own SQL scripts using @ ./data/temp.sql
I am using Oracle 8i for this. Any help is highly appreciated.
Regards
Sorry for not providing enough information.
Oracle Version as follows:
Oracle8i Enterprise Edition Release 8.1.7.4.0 - Production
PL/SQL Release 8.1.7.4.0 - Production OS:
Windows XP SP2
damorgan wrote:
No clear description of what you are actually trying to do ... for example, where does "mytablename" come from, where does "xyz" come from, and how is the decision to be made? Based on what logic?
These scripts are provided by customers who use another database, like Sybase, to generate SQL scripts, and the customers are not in a position to change or alter the scripts, as they are part of the solution provided by vendors (legacy systems).
I have a mapping SQL script: when the table name is e.g. "xyz abc", use script TS2 and process the customer-provided SQL script.
Let me know if I am missing any other information.
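Since the customer scripts cannot be changed at the source, one way is to rewrite them outside the database before the @ call, e.g. with sed (a sketch; the file and table names are the ones from the example above):

```shell
# Create a stand-in for the customer-supplied script (example from the thread).
printf "INSERT INTO xyz abc VALUES ('text');\nCREATE TABLE junk (n NUMBER);\n" > temp.sql

# Rewrite the table reference and comment out the junk CREATE TABLE lines.
sed -e "s/INSERT INTO xyz abc/INSERT INTO mytablename/" \
    -e "s/^CREATE TABLE/-- CREATE TABLE/" temp.sql > temp_fixed.sql

cat temp_fixed.sql   # then run it from SQL*Plus with: @ ./data/temp_fixed.sql
```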
Regards -
Used Columns in SQL script/query
Hi,
We really have many SQL scripts.
Since we expect changes to the schema, we need to know which scripts will have to be reworked.
Is there any tool available that identifies all the columns referenced by a SQL script (columns used in the select list, WHERE clause, ORDER BY clause, ...)?
Axel.
There is a function in TOAD (http://www.toadsoft.com/): in the Schema Browser you select the table that needs to be changed, then select the "Used by" tab, and it will show the stored procedures that are using this TABLE (not the columns). Note that this option only covers stored code in the database, not unnamed scripts.
This is a script you can use:
Select owner, object_type, object_name, object_id, status
from sys.DBA_OBJECTS where object_id in (
Select object_id
from public_dependency
connect by prior object_id = referenced_object_id
start with referenced_object_id = (
Select object_id from sys.DBA_OBJECTS
where owner = 'OWNER'
and object_name = 'TABLE_NAME'
and object_type = 'TABLE' ))
I hope this is useful to you.
Message was edited by:
Delfino N. -
Save a script or copy a script with SQL Script editor
Hello,
I am trying to rename a script in the script repository that I loaded using the upload command.
I enter the new script name in the text field and save under the new name, as the PDF doc explains (page 18-8).
It does not create any new script in the repository.
Also, when I make a change and save, it does not save the modification, and the last-modified date remains the upload date.
Can you help me ?
Thx.
Jean-Paul
Scott,
Thanks for your help. I use Application Express 3.1.0.00.32, and the document I refer to is the Application Express User's Guide, Release 3.1, E10499-01, chapter 18, page 18-8, "Copying a script".
What I do is the following:
1) Upload a file named pks_test with encoding scheme 'Western European ISO-8859-1' (to preserve the accented characters used in French).
2) Edit it, modify it in the editor, then save it.
3) Re-open it: the modification made in the previous step has disappeared!
4) Change the name from pks_test to test_pks: the name is not changed.
Also, the 'Last Updated' info is not updated; it reads as if there had been no change since the script was created.
If I do the same steps with the hosted version of APEX (thanks to Oracle), everything works fine.
There is a difference between the two environments:
my local instance (in France) displays text in English (Home/Application Builder/SQL Workshop ...)
the hosted one (somewhere in the world) displays text in French (Page d'accueil/Application Builder/SQL Workshop ...)
Thx for the help