SQL job blocking CDR retrieval
Hi, I have an SQL job problem at a customer site. The problem began when a 3rd-party application could not retrieve the CDR data, and the 3rd-party vendor then asked me to shut down the job "Optimizations for CIPT" in SQL Server Enterprise Manager.
(SQL Enterprise Server Manager -> Console root -> Microsoft SQL Servers -> SQL Server Group -> (Local)(Windows NT) -> Management -> Jobs)
After that, CDR retrieval has been running well; it has been almost a month now.
The question is: what is the job "Optimizations for CIPT" for? Is it important? The CCM version is 3.1.4 and I am going to patch it to SR8a. Is it necessary to turn the job back on before patching the CCM?
Or is it a bug that will persist until I patch the CCM?
Thanks ^^
The Optimizations for CIPT job tries to purge the database at the end of each day. However, if the database is too large, the job takes a long time to find the records to purge. If the total number of records is low, there are fewer records to search through to find the old ones, so the job finishes faster and overall performance improves.
Job "Optimizations for CIPT" has two steps:
1. Optimize CDR database
2. Optimize Config database
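The purging behavior described in the reply can be sketched in miniature (the table name, schema, and 7-day retention window here are all invented for the illustration; the real job's logic is internal to CallManager): old records are deleted by date, and the fewer rows there are, the less there is to scan.

```python
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cdr (id INTEGER PRIMARY KEY, calldate TEXT)")

now = datetime(2024, 1, 31)
# Insert 30 days of one record per day (a toy stand-in for a month of CDRs).
for d in range(30):
    day = (now - timedelta(days=d)).strftime("%Y-%m-%d")
    conn.execute("INSERT INTO cdr (calldate) VALUES (?)", (day,))

# Purge anything older than a hypothetical 7-day retention window.
cutoff = (now - timedelta(days=7)).strftime("%Y-%m-%d")
deleted = conn.execute("DELETE FROM cdr WHERE calldate < ?", (cutoff,)).rowcount
remaining = conn.execute("SELECT COUNT(*) FROM cdr").fetchone()[0]
print(deleted, remaining)  # 22 rows purged, 8 kept
```

The point of the reply above is the cost of the `WHERE calldate < cutoff` scan: the bigger the table, the longer that search takes.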
Similar Messages
-
Can I use Reports Server Queue PL/SQL Table API to retrieve past jobs ?
Hi all,
Can I use Reports Server Queue PL/SQL Table API to retrieve past jobs using WEB.SHOW_DOCUMENT from Forms ?
I have reviewed note 72531.1 about using this feature and wonder if I can use this metadata to retrieve past jobs submitted by a user.
The idea would be to have a form module that can filter data from the rw_server_queue table, say, based on the user running the form, and be able to retrieve past jobs from the Report Server Queue. For this, one would query this table and use WEB.SHOW_DOCUMENT.
Is this possible ...?
Regards, Luis ...!
Based on that Metalink note and the code in the script rw_server.sql, I am pretty sure that by querying the table you would be able to accomplish what you want. I have not tested it myself, but it looks like it will work: you have the jobid available from the queue, so you can use web.show_document to retrieve the output previously generated.
ref:
-- Constants for p_status_code and status_code in rw_server_queue table (same as zrcct_jstype)
UNKNOWN CONSTANT NUMBER(2) := 0; -- no such job
ENQUEUED CONSTANT NUMBER(2) := 1; -- job is waiting in queue
OPENING CONSTANT NUMBER(2) := 2; -- opening report
RUNNING CONSTANT NUMBER(2) := 3; -- running report
FINISHED CONSTANT NUMBER(2) := 4; -- job has finished
TERMINATED_W_ERR CONSTANT NUMBER(2) := 5; -- job has terminated with error
-
CIPT Optimization SQL job crashing ccm.exe
Has anyone seen this before and have an idea how to fix it, or whether this is a known bug? I have searched the Bug Toolkit and have not found anything yet. The CIPT job completes fine, but ccm.exe crashes while it is running.
I have moved the start time of the CIPT SQL job to later in the morning, when other jobs are done running, and it still crashes.
SQL SP3a
OS 2000.2.6sr4
3.3(4)sr2
BARS 4.04
I got this issue fixed by purging old CDR records and shrinking the CDR database.
The way I found out it was something to do with the CDR database was that I executed the steps in the CIPT SQL job one by one, and step 1, involving CDR, took 8-9 minutes to run. After purging records it is down to a minute or so, and nothing has unregistered or crashed like the previous times I had run the CIPT SQL job.
-
Reference value of an SQLPLUS variable in a PL/SQL anonymous block
All,
Is there a way of referencing a SQL*Plus variable within a PL/SQL anonymous block? See my example below:
sqlplus -s /@${L_DB_SID} <<-ENDOFSQL >> ${L_LOGFILE}
SET FEEDBACK OFF
SET PAGES 0
SET SERVEROUTPUT ON
WHENEVER SQLERROR EXIT SQL.SQLCODE
WHENEVER OSERROR EXIT 2
VARIABLE l_ret_sts NUMBER;
VARIABLE l_ret_msg VARCHAR2(300);
exec sh_plsql_owner.sh\$secure_batch.p\$set_role(p_ret_sts => :l_ret_sts);
begin
if :l_ret_sts > 0 then
dbms_output.put_line('l_ret_sts:'||:l_ret_sts||':SECURITY');
else
${L_PLSQL_PROG}(p_ret_type => 0, p_ret_sts => :l_ret_sts, p_ret_msg => :l_ret_msg);
dbms_output.put_line('l_ret_sts:'||NVL(:l_ret_sts,0));
dbms_output.put_line('l_ret_msg:'||:l_ret_msg);
end if;
end;
exit
ENDOFSQL
I need to be able to reference :l_ret_sts in the begin block, in the statement "if :l_ret_sts > 0 then".
:l_ret_sts is populated in a procedure call beforehand.
However, it seems the begin block cannot see the value returned to :l_ret_sts.
Any ideas?
Ian.
Managed to solve this. I made the call to the package that the role enables via dynamic SQL:
sqlplus -s /@${L_DB_SID} <<-ENDOFSQL >> ${L_LOGFILE}
SET FEEDBACK OFF
SET PAGES 0
SET SERVEROUTPUT ON
WHENEVER SQLERROR EXIT SQL.SQLCODE
WHENEVER OSERROR EXIT 2
VARIABLE l_ret_sts NUMBER;
VARIABLE l_ret_msg VARCHAR2(300);
exec dbms_application_info.set_client_info('CONTROL-M');
exec sh_plsql_owner.sh\$secure_batch.p\$set_role(p_ret_sts => :l_ret_sts);
declare
v_text varchar2(500);
begin
if :l_ret_sts > 0 then
dbms_output.put_line('l_ret_sts:'||:l_ret_sts||':SECURITY');
else
v_text := 'begin ${L_PLSQL_PROG}(p_ret_type => 0, p_ret_sts => :1, p_ret_msg => :2);end;';
execute immediate v_text using in out :l_ret_sts, in out :l_ret_msg;
dbms_output.put_line('l_ret_sts:'||NVL(:l_ret_sts,0));
dbms_output.put_line('l_ret_msg:'||:l_ret_msg);
end if;
end;
exit
ENDOFSQL
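The fix works because the configurable program name goes into a dynamic SQL string while the values still travel through bind placeholders (:1, :2) rather than being spliced into the text. The same split between statement text and bound values, in Python terms (sqlite3 standing in for the Oracle client; the table is invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (ret_sts INTEGER, ret_msg TEXT)")

# The statement text can be assembled dynamically (like building v_text
# around ${L_PLSQL_PROG} above)...
table = "results"  # e.g. chosen by configuration
sql = f"INSERT INTO {table} (ret_sts, ret_msg) VALUES (?, ?)"
# ...but the values are passed as binds, never concatenated into the text.
conn.execute(sql, (0, "ok"))

row = conn.execute("SELECT ret_sts, ret_msg FROM results").fetchone()
print(row)  # (0, 'ok')
```

In the PL/SQL version the `USING IN OUT` clause additionally lets the called procedure write back into the SQL*Plus bind variables, which is what makes :l_ret_sts visible after the call.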
Cheers
Ian.
-
HTML Form in a "PL/SQL (Anonymous Block)"
Hello
I need some urgent guidance.
I have created a form within a "PL/SQL (Anonymous Block)" region. The requirement is to show what an HTML form looks like as you build the code.
The problem is I am upsetting the APEX processing, i.e. wwv_flow.accept ... I have added an example below.
All help very welcome
Thanks
Pete
htp.prn('<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">');
htp.prn( '<style type="text/css">');
htp.prn('#form{font-family: "Trebuchet MS", Verdana, sans-serif; width:25em;}');
htp.prn('h2{margin: 0 0 0 0; padding: 0;}');
htp.prn('p{margin: 0 0 1em 0; padding: 0; font-size:90%}');
htp.prn('fieldset{background:#C361D2; border:none; margin-bottom:1em; width:24em; padding-top:1.5em}');
htp.prn('p.legend{background:#DED983;
color:black;
padding: .2em .3em;
width:750px;
font-size:18px;
border:6px outset #DED980;
position:relative;
margin: -2em 0 1em 1em;
width: 20em;}');
htp.prn('fieldset{margin-bottom:1em; width:66em; padding-top:1.5em;}');
htp.prn('#company {background:#F3B4F5; border:outset #F3B4F5; width:700px;}');
htp.prn('#company label{position:absolute;
font-family:arial;
font-size:16px;
padding:.2em;}');
htp.prn('input{margin-left:9em;margin-bottom:.2em; line-height:1.4em;}');
htp.prn('#message1 {background:#a3B4F5; border:outset #a3B4F5; width:700px;}');
htp.prn('#message2 {background:#c3B4F5; border:outset #c3B4F5; width:700px;}');
htp.prn('button1 {font:48px "Trebuchet MS", "Verdana", sans-serif;
background:#F0888A;
border:outset #6EC6F1}');
htp.prn('#buttons1 input {background:#DED983;
font:1.2em "Trebuchet MS", Verdana, sans-serif}');
htp.prn('p#buttons1 {white-space:nowrap}');
htp.prn('</style>');
htp.prn('<table width="760"><tr bgcolor="#D5EAE9">');
htp.prn('<BR><BR>');
htp.prn ('<form method="" action="">');
htp.prn ('<fieldset id="company"><p class="legend" >Company</p>');
htp.prn ('<label>Company Name: </label> <input name="company" type="Text" size="30"/>');
htp.prn ('<br><br>');
htp.prn ('</fieldset>');
htp.prn ('<br><br><br>');
htp.prn ('<fieldset id="message1"><p class="legend">Message One</p>');
htp.prn ('</fieldset>');
htp.prn ('<br><br><br>');
htp.prn ('<fieldset id="message2"><p class="legend">Message Two</p>');
htp.prn ('</fieldset>');
htp.prn ('<br><br>');
htp.prn ('</form>');
htp.prn('</tr></table>');
End;
Pete:
Remove the name attributes from all input elements defined by the pl/sql process. For example
<input name="company" type="Text" size="30"/> should be replaced by <input type="Text" size="30"/> or <input name="f01" type="Text" size="30"/>
The APEX accept process recognises a predefined set of HTML form input names. Any input with a name not from this set will cause the accept process to fail. f01 through f50 are valid names for the accept procedure.
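The accepted-name rule can be expressed as a quick check (illustrative only; the real validation happens inside wwv_flow.accept, and this helper is invented for the demo):

```python
# Names wwv_flow.accept recognises for hand-built inputs: f01 .. f50.
VALID_APEX_NAMES = {f"f{i:02d}" for i in range(1, 51)}

def is_safe_input_name(name):
    """An input is safe if it has no name attribute at all,
    or a name from the predefined f01..f50 set."""
    return name is None or name in VALID_APEX_NAMES

print(is_safe_input_name("company"))  # False - this is what broke the accept process
print(is_safe_input_name("f01"))      # True
print(is_safe_input_name(None))       # True - nameless inputs are ignored
```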
varad
-
I've created an SSIS package to import a comma delimited file (csv) with double quotes for a text qualifier ("). Some of the fields contain the delimiter inside the qualified text. An example row is:
15,"Doe, John",IS2,Alabama
In SSIS I've specified the text qualifier as ". The sample output in the connection manager looks great. The package runs perfectly from VS and when manually executed on the SSIS server itself. The problem comes when I schedule the package to run via SQL job. At this point the package ignores the text qualifier, and in doing so pushes half of a field into the next available column. But instead of having too many columns, it concatenates the last 2 columns, ignoring the delimiter. For example (pipes are fields):
15|"Doe| John"|IS2,Alabama
So the failure happens when the last half of a field is too large to fit into the next available field. In the case above, _John" is 6 characters where the IS2 field is char(3). This would cause a truncation failure, which is the error I receive from the job history.
To further test this I created a failure flow in my data flow to capture the records failing to be pulled from the source csv. Out of ~5000 records, ~1200 are failing, and every one of the failures has a comma delimiter inside the quoted text with a 'split' length greater than the next ordinal field. The ones without the comma were inserted as normal, and records where the split fit into the next ordinal column were also imported, with the last field being concatenated w/delimiter. Example records when selected from table:
25|"Allan Percone Trucking"|GI6|California --Imported as intended
36|"Renolds| J."|UI6,Colorado --Field position offset by 1 to right - Last field overloaded
To further ensure this is the problem, I changed the csv file and flat file connection manager to pipe delimited, and sure enough every record makes it in perfectly from the SQL job execution.
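For comparison, this is how a conformant CSV parser treats the problem row: the text qualifier keeps the embedded comma inside its field. Here Python's csv module stands in for the SSIS flat-file parser, purely to illustrate the split the package should be producing:

```python
import csv
import io

# The problem row from the thread: a comma inside a double-quoted field.
raw = '15,"Doe, John",IS2,Alabama\n'

row = next(csv.reader(io.StringIO(raw), delimiter=",", quotechar='"'))
print(row)  # ['15', 'Doe, John', 'IS2', 'Alabama']
```

Four fields, with "Doe, John" intact; this is what the connection manager preview shows, and what the scheduled run fails to do once the qualifier setting is lost.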
I've tried comma delimited on the following set ups. Each set up failed.
SSIS Server 2008 R2 RTM
DB Server 2008 RTM
DB Compat 80
SSIS Server 2008 R2 RTM
DB Server 2008 R2 RTM
DB Compat 80
SSIS Server 2008 R2 RTM
DB Server 2008 RTM
DB Compat 100
SSIS Server 2008 R2 RTM
DB Server 2008 R2 RTM
DB Compat 100
Since a lot of our data comes in via comma delimited flat files, I really need a fix for this. If not, I'll have to rebuild all my files when I import them to use pipe delimiters instead of commas. I'd like to avoid the extra processing overhead if possible.
Also, is this a known bug? If so can someone point me to the posting of said bug?
Edit: I can't ask the vendor to change the format to pipe delimited because it comes from a federal government site.
Just wanted to share my experience of this for anyone else, since I wasted a morning on it today.
I encountered the same problem where I could run the package fine on my machine, but when I deployed to a server and ran the package via dtexec, the " delimiter was being replaced with _x0022_ and columns were all slurped up together, overflowing columns, truncating data, etc.
Since I didn't want to manually hack the DTS XML and can't install updates, the solution I used was to set an expression on the text delimiter field of the flat file connection with the value "\"" (a double quote). That way, even if the one stored in the XML gets bodged somewhere along the way, it is overridden with the correct value at run time. The package works fine everywhere now.
-
Execute Process Task showing all the SSIS Config variables in SQL Job History
Hi,
I am using an Execute Process task to execute my child package, and executing the parent package from a SQL job.
I am using the same config file for both Parent and Child packages.
Whether the job execution succeeds or fails, the job history shows all the variables from the config file that are not used in the Child package, with the below message.
"The package path referenced an object that cannot be found"
I don't want to capture all the variable information in the job history. Instead I need only the success/failure message.
PFA Screen.
Thanks,
Sailajasee
http://www.mssqltips.com/sqlservertip/1417/custom-logging-in-sql-server-integration-services-ssis/
https://www.simple-talk.com/sql/ssis/ssis-event-handlers-basics/
Visakh
-
Run Multiple SSIS Packages in parallel using SQL job
Hi ,
We have a File Watcher process to determine the load of some files in a particular location. If files are arrived then another package has to be kick-started.
There are around 10 such File Watcher Processes to look for 10 different categories of files. All 10 File Watcher have to be started at the same time.
Now these can be automated by creating 10 different SQL jobs which is a safer option. But if a different category file arrives then another job has to be created. Somehow I feel this is not the right approach.
Another option is to create one more package and execute all the 10 file watcher packages as 10 different execute packages in parallel.
But if they don't execute in parallel, i.e., if any of the packages waits for some resources, then it does not satisfy our functional requirement. I have to be 100% sure that all 10 are executed in parallel.
NOTE: There are 8 logical processors in this server.
(SELECT cpu_count FROM sys.dm_os_sys_info)
i.e., 10 tasks can run in parallel, but somehow I suspect that only 2 run exactly in parallel while the other tasks wait. So I just don't want to try this option.
Can someone please suggest a better way to automate these 10 file watcher processes in a single job.
Thanks in advance,
Raksha
Raksha
Hi Jim,
For each file type there is a separate package which needs to be run.
For example, Package-A processes FileType-A and Package-B processes FileType-B. All these are independent processes which run in parallel as of now.
The current requirement is to have a File Watcher process for each of these packages. So now FileWatcher-A polls for FileType-A and if any FileType-A file is found it will kick-start Package-A. In the same way there is FileWatcher-B, which looks for FileType-B and starts Package-B when any FileType-B file is found. There are 10 such File Watcher processes.
These File Watcher processes are independent and need to start daily at 7 AM and run for 3 hours.
Please let me know if is possible to run multiple packages in parallel using SQL job.
NOTE: Somehow I see it as a risk to run these packages in parallel using an Execute Package task and calling that master package in the job. I suspect only 2 packages run in parallel while the other packages wait for resources.
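One reassurance worth noting: file watchers spend almost all their time waiting, not computing, so 10 of them are not limited by 8 logical processors. The sketch below shows the principle with Python threads standing in for the SSIS packages (purely an illustration, not SSIS internals):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def file_watcher(task_id):
    """Stand-in for one file-watcher package: mostly waiting, not computing."""
    time.sleep(0.2)  # simulate polling/waiting for a file to arrive
    return task_id

start = time.monotonic()
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(file_watcher, range(10)))
elapsed = time.monotonic() - start

print(sorted(results))
# Ten 0.2 s waits finish in roughly 0.2 s total, not 2 s,
# because waiting tasks don't compete for CPU cores.
```

The analogous point for SSIS is that the MaxConcurrentExecutables property of the master package, not the core count, is what would cap how many Execute Package tasks run side by side.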
Thanks,
Raksha
Raksha
-
Hi Everyone,
I am having a problem transferring data from MS SQL 2005 to an IBM AS400. Previously my SSIS package was running perfectly, but some changes needed to be made for the system to work well. Consider my changes minimal and just for upgrades (though I did include DELETE statements to truncate the AS400 table before inserting fresh data from the MS SQL table into the same AS400 table). I compiled my SSIS package and it ran successfully, so I deployed it to MS SQL Integration Services as one of the packages and executed it manually, with the same result: it was successful again. But when I try to run it from a MS SQL job schedule, the job fails with the messages shown below, extracted from the job's View History.
Date today
Log Job History (MSSQLToAS400)
Step ID 1
Server MSSQLServer
Job Name MSSQLToAS400
Step Name pumptoAS400
Duration 00:00:36
Sql Severity 0
Sql Message ID 0
Operator Emailed
Operator Net sent
Operator Paged
Retries Attempted 0
Message
Executed as user: MSSQLServer\SYSTEM. ... 9.00.4035.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: today time
Error: on today time
Code: 0xC0202009 Source: SSISMSSQLToAS400 Connection manager "SourceToDestinationOLEDB"
Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred.
Error code: 0x80004005. An OLE DB record is available.
Source: "IBMDA400 Session"
Hresult: 0x80004005
Description: "CWBSY0002 - Password for user AS400ADMIN on system AS400SYSTEM is not correct ". End Error
Error: today
Code: 0xC020801C
Source: Data Flow Task OLE DB Destination [5160]
Description: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "DestinationClearData" failed with error code 0xC0202009. There may be error messages posted before
this with more information on why the AcquireConnection method ca... The package execution fa... The step failed.
So I hope somebody can shed some hints or tips for me to overcome this problem of mine. Thanks for your help in advance; I have scoured the MSDN forums and found no solution for my problem yet.
PS: In the SQL Integrated Services when I deployed the package I set the security of the packages to Rely on server...
Hope this will help.
Hi Ironmaidenroxz,
From the message “Executed as user: MSSQLServer\SYSTEM”, we can see that the SQL Server Agent job ran under the Local System account. However, a Local System account doesn’t have the network rights natively, therefore, the job failed to communicate with
the remote IBMAS400 server.
To address this issue, you need to create a proxy account for SQL Server Agent to run the job. When creating the credentials for the proxy account, you can use the Windows domain account under which you executed the package manually.
References:
How to: Create a Credential
How to: Create a Proxy
Regards,
Mike Yin
TechNet Community Support
-
Send error messages in DatabaseMail when SQL Job fails/succeeds
Hi ,
I have setup a profile and account for sending mail using DatabaseMail. I have a SQL JOB. If the job fails, I receive an Email.
Till this point, everything works fine. This is what the email looks like :
JOB RUN: 'GenerateJVForLabourAndOverheads_R11Testing' was run on 20-12-2012 at 10:10:50
DURATION: 0 hours, 0 minutes, 0 seconds
STATUS: Failed
MESSAGES: The job failed. The Job was invoked by Schedule 58 (Execute SP). The last step to run was step 1 (Execute SP).
Now what I want is, if the job fails for any reason, that error message should appear in the email message body.
for example, FK_Constraint error , Conversion errors or any error that caused my sql job to fail.
Is this possible ?
Any help appreciated,
Thanks,
Lok..
This is one way to do it, assuming your job has 2 steps and step 1 does the main part.
So, in step 2 of the job, you can add the code below:
declare @body1 varchar(8000)
select top 1 @body1 = A.[Message]
from msdb.dbo.sysjobhistory A
inner join msdb.dbo.sysjobs B on B.job_id = A.job_id
where B.name = <<JobName>> and A.step_id = 1
order by A.instance_id desc
EXECUTE msdb.dbo.sp_send_dbmail
@profile_name = <<YourMailProfile>>,
@recipients = <<RecipientEmail>>,
@subject = <<YourSubject>>,
@body = @body1
I guess you can also log the job step output to an output file and have it sent as an attachment to your mail; this might help if you have many steps in one job.
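The `top 1 ... order by instance_id desc` trick above simply grabs the most recent history row for step 1. The same selection logic in miniature (sqlite3 standing in for msdb; the rows are invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sysjobhistory (instance_id INTEGER, step_id INTEGER, message TEXT)"
)
rows = [
    (1, 1, "old failure: FK_Constraint violation"),
    (2, 2, "notification step ran"),
    (3, 1, "latest failure: conversion error"),
]
conn.executemany("INSERT INTO sysjobhistory VALUES (?, ?, ?)", rows)

# Most recent message for step 1: order newest instance first, keep one row.
body = conn.execute(
    "SELECT message FROM sysjobhistory WHERE step_id = 1 "
    "ORDER BY instance_id DESC LIMIT 1"
).fetchone()[0]
print(body)  # latest failure: conversion error
```

In the real msdb query the join to sysjobs and the job-name filter narrow the history to the one job before this ordering is applied.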
HTH!!
-
Running a SSIS package with SQL Job and Linked Server
I have a SSIS 2008 package. In one of the Script task I am calling a stored procedure which is using Openquery using linked server. I deployed this package with protection level as "EncryptWithPassword" and gave a password to the package.
I created a SQL job and edited its command line to include the password. If I log in to SQL Server Mgmt Studio with Windows Authentication and run the job manually, it runs fine. But when I schedule it, I get an error that "The communication link to the linked server failed".
Please help.
Hi Vivek.B,
The issue should occur because the SQL Server Agent Service Account or SQL Agent Proxy account under which the job step runs doesn’t have sufficient permissions on the linked server.
If the job owner is a member of the sysadmin fixed server role, the job can run under the SQL Server Agent service account or a proxy account; in that case, please make sure the SQL Agent service account or the proxy account has a corresponding login on the linked server. If the job owner is not a member of the sysadmin fixed server role, the job must run under a proxy account. In this case, make sure the proxy account has a corresponding login on the linked server.
Reference:
http://blogs.msdn.com/b/dataaccesstechnologies/archive/2009/10/13/who-owns-my-job-and-who-runs-it.aspx
Regards,
Mike Yin
TechNet Community Support
-
Hi folks,
I have a scheduled maintenance plan and associated jobs in SQL Server 2008 SP2 that had been working but stopped about two months ago. When I run a maintenance task or SQL job using any account (sa, a sysadmin-privileged domain account, etc.) I get the following entries in the SQL Server Agent log (see below). I've tried changing the SQL Server Agent account, applied the latest hotfixes (CU2 for 2008 SP2), and set permissions manually so that the service accounts have dbo access to the msdb database. Anyone got other ideas? I've avoided recreating the msdb database, but that may be my only option.
Date 4/02/2011 3:42:40 PM
Log SQL Server Agent (Current - 4/02/2011 3:23:00 PM)
Message
[298] SQLServer Error: 229, The EXECUTE permission was denied on the object 'sp_sqlagent_log_jobhistory', database 'msdb', schema 'dbo'. [SQLSTATE 42000] (ConnExecuteCachableOp)
Date 4/02/2011 3:42:40 PM
Log SQL Server Agent (Current - 4/02/2011 3:23:00 PM)
Message
[298] SQLServer Error: 229, The EXECUTE permission was denied on the object 'agent_datetime', database 'msdb', schema 'dbo'. [SQLSTATE 42000] (ConnExecuteCachableOp)
Date 4/02/2011 3:42:40 PM
Log SQL Server Agent (Current - 4/02/2011 3:23:00 PM)
Message
[298] SQLServer Error: 229, The EXECUTE permission was denied on the object 'sp_sqlagent_log_jobhistory', database 'msdb', schema 'dbo'. [SQLSTATE 42000] (ConnExecuteCachableOp)
And just to add to the confusion: I created a SQL login called sql-maintjobs and gave it sysadmin privileges. I then logged in to SQL using this account on the instance that is failing and ran the following:
declare @backupjobid uniqueidentifier
select @backupjobid = CONVERT(uniqueidentifier, '2C4974D4-53BE-4E38-8EC0-8F5398CADE88')
exec sp_sqlagent_log_jobhistory
@job_id = @backupjobid,
@step_id = 1,
@sql_message_id = 0,
@sql_severity = 0,
@message = NULL,
@run_status = 1, -- SQLAGENT_EXEC_X code
@run_date = 19900101,
@run_time = 1,
@run_duration = 1,
@operator_id_emailed = 0,
@operator_id_netsent = 0,
@operator_id_paged = 0,
@retries_attempted = 0,
@server = NULL,
@session_id = 0
This SQL worked fine, with no permission errors at all. That would suggest the account does have permission to EXECUTE that stored procedure... very confusing!
-
Calling PL/SQL anonymous block from href in tabular report
apex 2.2.
I've got a tabular report where I've added some image columns with an href to call different processes,
for example
"<a href="f?p=&APP_ID.:22:&APP_SESSION.:BRANCH_TO_PAGE_ACCEPT|NEW_PROXY:NO:::22,ABCDEF ><img src="/i/asyl.gif" border="0" alt="Runprocess"></a>"
When clicking on an image column in row x then I would like to run the process - no page submit.
The pl/sql anonymous block process source calls package.storedproc(p1,p2) - two in parameters
I'm struggling with the syntax and is wondering if there's a smarter way to achieve the same function
Any ideas most welcome
Thanks
Peter
<a href="f?p=&APP_ID.:22:&APP_SESSION.:BRANCH_TO_PAGE_ACCEPT|NEW_PROXY:NO:::22,ABCDEFG" ><img src="/i/asylogin.gif" border="0" alt="Process"></a>
The question is: how can you pass values from a row in a tabular report to the application process?
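The f?p URL being assembled in that href follows APEX's fixed positional structure App:Page:Session:Request:Debug:ClearCache:ItemNames:ItemValues. A small helper makes the pieces easier to see (illustrative only; the &APP_ID. and &APP_SESSION. substitution strings are resolved by APEX at render time):

```python
def apex_url(app, page, session, request="", debug="NO",
             clear_cache="", items="", values=""):
    """Assemble an APEX f?p= URL from its positional parts."""
    parts = [app, page, session, request, debug, clear_cache, items, values]
    return "f?p=" + ":".join(str(p) for p in parts)

url = apex_url("&APP_ID.", 22, "&APP_SESSION.",
               request="BRANCH_TO_PAGE_ACCEPT|NEW_PROXY",
               values="22,ABCDEF")
print(url)
```

Per-row values from the tabular report would have to be substituted into the items/values slots for each row as the link column is generated.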
-
Autonomous Transactions usage in PL/SQL anonymous block coding
Hi,
I am trying to incorporate Autonomous Transaction for our work. I am using the tables provided below,
CREATE TABLE T1
(
F1 INTEGER,
F2 INTEGER
)
CREATE TABLE T2
(
F1 INTEGER,
F2 INTEGER
)
insert into t1(f1, f2)
values(20, 0)
insert into t2(f1, f2)
values(10, 0)
Now, when I use the code snippet given below, it is working as expected.
create or replace procedure p1 as
PRAGMA AUTONOMOUS_TRANSACTION;
begin
update t2
set f2 = 25
where f1 = 10;
commit;
end;
declare
PRAGMA AUTONOMOUS_TRANSACTION;
a integer;
begin
update t1
set f2 = 15
where f1 = 20;
p1();
rollback;
end;
Here, the update to t2 is committed and the update to t1 is rolled back; it is working as expected. I would like to achieve the same functionality in a single PL/SQL anonymous block. To do this, I use the following code snippet:
declare
PRAGMA AUTONOMOUS_TRANSACTION;
a integer;
begin
update t1
set f2 = 15
where f1 = 20;
begin
update t2
set f2 = 35
where f1 = 10;
commit;
end;
rollback;
end;
Here, the data in both tables is committed. How do I change it so that it works as I described above, committing t2 alone? Please help, thank you.
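What the second snippet runs into is that an inner begin/end is only a scoping construct, not a transaction boundary: COMMIT applies to the whole session's transaction. The same session-level behavior, sketched with a single connection (sqlite3 standing in for Oracle; an autonomous transaction would need its own session, which is exactly what the nested PRAGMA AUTONOMOUS_TRANSACTION procedure provides):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t1 (f1 INTEGER, f2 INTEGER)")
conn.execute("CREATE TABLE t2 (f1 INTEGER, f2 INTEGER)")
conn.execute("INSERT INTO t1 VALUES (20, 0)")
conn.execute("INSERT INTO t2 VALUES (10, 0)")
conn.commit()

# "Outer" work: update t1 inside the session's open transaction.
conn.execute("UPDATE t1 SET f2 = 15 WHERE f1 = 20")
# "Inner block" work on t2, followed by COMMIT. The commit is
# session-wide: it also makes the t1 update permanent.
conn.execute("UPDATE t2 SET f2 = 35 WHERE f1 = 10")
conn.commit()
# This rollback has nothing left to undo.
conn.rollback()

t1_val = conn.execute("SELECT f2 FROM t1 WHERE f1 = 20").fetchone()[0]
t2_val = conn.execute("SELECT f2 FROM t2 WHERE f1 = 10").fetchone()[0]
print(t1_val, t2_val)  # both updates survived: 15 35
```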
Regards,
Deva
Can you explain what you're trying to accomplish from a business perspective? This doesn't look like a particularly appropriate way to use autonomous transactions, so you may be causing yourself problems down the line.
That said, padders's solution does appear to work for me
SCOTT @ nx102 Local> CREATE TABLE T1
2 (
3 F1 INTEGER,
4 F2 INTEGER
5 )
6 /
Table created.
Elapsed: 00:00:01.03
SCOTT @ nx102 Local>
SCOTT @ nx102 Local>
SCOTT @ nx102 Local> CREATE TABLE T2
2 (
3 F1 INTEGER,
4 F2 INTEGER
5 )
6 /
Table created.
Elapsed: 00:00:00.00
SCOTT @ nx102 Local>
SCOTT @ nx102 Local> insert into t1(f1, f2)
2 values(20, 0)
3 /
1 row created.
Elapsed: 00:00:00.01
SCOTT @ nx102 Local>
SCOTT @ nx102 Local> insert into t2(f1, f2)
2 values(10, 0)
3 /
1 row created.
Elapsed: 00:00:00.01
SCOTT @ nx102 Local> commit;
Commit complete.
Elapsed: 00:00:00.01
SCOTT @ nx102 Local> DECLARE
2 a INTEGER;
3
4 PROCEDURE update_t2
5 IS
6 PRAGMA AUTONOMOUS_TRANSACTION;
7 BEGIN
8 UPDATE t2
9 SET f2 = 35
10 WHERE f1 = 10;
11
12 COMMIT;
13 END update_t2;
14 BEGIN
15 UPDATE t1
16 SET f2 = 15
17 WHERE f1 = 20;
18
19 update_t2;
20
21 ROLLBACK;
22 END;
23 /
PL/SQL procedure successfully completed.
Elapsed: 00:00:00.04
Have you done something else that would cause a deadlock?
Justin
-
Using Create table command in a Pl/Sql Anonymous Block
Hi,
I need to create a table dynamically based on the table_name and column_names that a user wants. When I use a Pl/sql Anonymous block to do this, it complains. Any suggestions ?
Thanks,
Marisa
Personally this sounds like a bad design to me. I would say that under most "normal" circumstances you should not be creating tables on the fly, especially ones where a user has control over which columns and datatypes to use. Let a developer or DBA take care of that.
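If it must be done anyway, the usual PL/SQL answer is that DDL inside an anonymous block requires dynamic SQL (EXECUTE IMMEDIATE), and user-supplied identifiers have to be validated, since identifiers cannot be passed as bind variables. The shape of that validation, sketched in Python with sqlite3 (the helper and its rules are invented for the demo, not Oracle code):

```python
import re
import sqlite3

def create_table(conn, table_name, columns):
    """Build a CREATE TABLE statement from user-supplied names, validating
    each identifier first, because identifiers cannot be bound as parameters."""
    ident = re.compile(r"^[A-Za-z][A-Za-z0-9_]*$")
    if not ident.match(table_name):
        raise ValueError(f"bad table name: {table_name!r}")
    cols = []
    for name, dtype in columns:
        if not (ident.match(name) and ident.match(dtype)):
            raise ValueError(f"bad column spec: {name!r} {dtype!r}")
        cols.append(f"{name} {dtype}")
    conn.execute(f"CREATE TABLE {table_name} ({', '.join(cols)})")

conn = sqlite3.connect(":memory:")
create_table(conn, "user_request", [("id", "INTEGER"), ("note", "TEXT")])
count = conn.execute(
    "SELECT COUNT(*) FROM sqlite_master WHERE name = 'user_request'"
).fetchone()[0]
print(count)  # 1 - the table now exists
```

The "it complains" in the question is almost certainly the compile-time error from putting a bare CREATE TABLE statement inside the block; wrapping the statement text in dynamic SQL is what defers it to run time.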