Stored procedure fails when run from SSIS DataFlow task
I have a stored procedure (using dynamic SQL) that takes column names as a parameter and returns a dynamic set of columns in its output. E.g. if I pass @ipselect = '*' it returns all columns in the output; when I pass @ipselect = 'column1, column2, column3',
it returns those 3 columns in the output.
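For illustration, here is a minimal sketch (in Python, with a hypothetical table name) of the kind of string-building such a dynamic-SQL procedure does; the real procedure is T-SQL, this just mirrors the logic:

```python
def build_select(ipselect: str, table: str = "dbo.ClientData") -> str:
    """Mimic the dynamic-SQL string building inside the stored procedure:
    '*' passes through, otherwise the column list is spliced in."""
    cols = ipselect.strip()
    if cols != "*":
        # Basic sanity check on each requested column name
        names = [c.strip() for c in cols.split(",")]
        if not all(n.replace("_", "").isalnum() for n in names):
            raise ValueError("invalid column list: " + ipselect)
        cols = ", ".join(names)
    return f"SELECT {cols} FROM {table}"

print(build_select("*"))                          # SELECT * FROM dbo.ClientData
print(build_select("column1, column2, column3"))  # SELECT column1, column2, column3 FROM dbo.ClientData
```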
Now I have an SSIS package. I've a Data Flow task in a Foreach Loop container, and it takes its parameter from a client table, e.g.:
clientid   column name
1          *
2          columns1,columns2,column3
3          columns1,columns2,column3,columns4,columns5,column6
4          *
The Foreach container loops through each client id and sets the ipselect parameter with the
column names; that value is passed to the @ipselect parameter of the stored procedure in the Data Flow task. When @ipselect = '*' it runs fine, returns all columns, and writes to the destination. But when @ipselect = 'columns1,columns2,column3',
it fails with the error below:
[OLE DB Source [1]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E55.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80040E55 Description: "Column does not exist.".
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC0202009. The component returned a failure code when the pipeline engine called PrimeOutput().
The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
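For context: the OLE DB Source fixes its column metadata when the package is designed, so a result set that omits columns the package expects triggers exactly this "Column does not exist" failure. One common workaround (sketched below under assumed column names, not taken from your actual schema) is to make the procedure always return the full, fixed column set, selecting NULL for the columns the client did not request, so the metadata never changes:

```python
FULL_COLUMNS = ["column1", "column2", "column3", "column4", "column5", "column6"]

def stable_select_list(ipselect: str) -> str:
    """Always emit the full column set so downstream metadata stays fixed;
    unrequested columns are returned as NULL with the original alias."""
    if ipselect.strip() == "*":
        wanted = set(FULL_COLUMNS)
    else:
        wanted = {c.strip() for c in ipselect.split(",")}
    parts = [c if c in wanted else f"NULL AS {c}" for c in FULL_COLUMNS]
    return ", ".join(parts)

print(stable_select_list("column1, column3"))
# column1, NULL AS column2, column3, NULL AS column4, NULL AS column5, NULL AS column6
```

With this, every client's result set has identical columns, so the Data Flow's design-time metadata stays valid for all of them.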
Any suggestion or solution would be really helpful.
Thanks!!
Hi Kevin,
Just an addition. To create SSIS package programmatically, please refer to the document:
http://msdn.microsoft.com/en-us/library/ms345167.aspx
To implement dynamic column mapping through SSIS Script Component, please refer to:
http://wikiprogrammer.wordpress.com/2011/04/08/dynamic-column-mapping-in-ssis-part-1/
http://blog.quasarinc.com/ssis/best-solution-to-load-dynamically-change-csv-file-in-ssis-etl-package/
Regards,
Mike Yin
TechNet Community Support
Similar Messages
-
Opening Excel Workbook fails when run from Scheduled Task on Windows Server 2008 R2
Hi,
I have a little vbs script that instantiates the Excel.Application object and then opens a work book to perform some tasks on it. The script runs fine when run from the command line. When I attempt to run it as a scheduled task (it is supposed to update
data that is pulled from a SQL Server at regular intervals), it fails with the following error:
Microsoft Office Excel cannot access the file 'c:\test\SampleWorkbook.xlsm'. There are several possible reasons: .....
The file does exist. The path reported in the error is correct. The account under which the task is running is the same account I use to run it from the command line. User Account Control is not enabled, and the task is set up to run with highest privileges.
When I run the same script through the Task Scheduler from a Windows Server 2003 machine, it works without issue.
I was just wondering if somebody on this forum has run into a similar issue in connection with Windows Server 2008 R2 and figured out what the magic trick is to make it work. I'm sure it is rights related, but I haven't quite figured out which rights are missing.
Thanks in advance for any advice you may have.
This is truly killing me ... trying to get it working on Windows Server 2012 without success.
I desperately need to automate running Excel macros in a "headless" environment, that is non-interactive, non-GUI, etc.
I can get it to work using Excel.Application COM, either via VBScript or Powershell, successfully on many other Windows systems in our environment - Windows Server 2008 R2, Windows 7 (32-bit), etc., -BUT-
The two servers we built out for running our automation process are Windows Server 2012 (SE) - and it just refuses to run on the 2012 servers - it gives the messages below from VBScript and PowerShell, respectively-
I have tried uninstalling and re-installing several different versions of Microsoft Excel (2007 Standard, 2010 Standard, 2010 Professional Plus, 32-bit vs. 64-bit, etc.), but it makes no difference.
Would be extremely grateful if anyone out there who has had any success running Excel automation on Server 2012 in a non-interactive environment could share.
(I have tried adding the "%windir%\syswow64\config\systemprofile\desktop"
folder, which did fix the issue for me when testing on Windows Server 2008 R2, but sadly did not resolve it on Windows Server 2012.)
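For reference, the workaround mentioned above boils down to creating the missing Desktop folder under the system profile; on 64-bit Windows both profile paths are commonly created (untested sketch for an elevated command prompt; paths may vary by installation):

```
mkdir "%windir%\syswow64\config\systemprofile\Desktop"
mkdir "%windir%\system32\config\systemprofile\Desktop"
```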
[VBScript error msg]
Z:\TestExcelMacro.vbs(35, 1) Microsoft Office Excel: Microsoft Office Excel cannot
access the file 'Z:\TestExcelMacro.xlsm'. There are several possible reasons:
• The file name or path does not exist.
• The file is being used by another program.
• The workbook you are trying to save has the same name as a currently open work
[Powershell error msg]
Exception calling "Add" with "0" argument(s): "Microsoft Office Excel cannot open or save any more documents because there is not enough available memory or disk space.
To make more memory available, close workbooks or programs you no longer need.
To free disk space, delete files you no longer need from the disk you are saving to."
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : ComMethodTargetInvocation
You cannot call a method on a null-valued expression.
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : InvokeMethodOnNull -
Stored procedure fails when called through jdbc
Stored procedure returns incorrect results
Hi!
I am really puzzled about this and would really appreciate some help.
We are migrating a servlet/JSP application from WebSphere Application Server 3.5.3 to
Oracle Application Server 10g. The backend is an Oracle 9i database server. Most of the application
is built upon stored procedures in Oracle. So the migration part is to make the servlets and JSPs
work in their new environment, creating deployment descriptors and so on.
We are using JDeveloper 9.0.5.2 for development on Windows XP OS
Java version is 1.4.2_03-b02 (ojvm)
and also have an Oracle Application Server Enterprise Edition 9.0.4.0.0
running on a W2K SP6. Java version is 1.4.2_03-b02, for development and testing.
When running the standalone OC4J (that comes with JDeveloper), the Java version is 1.4.2-b28 (client VM).
The problem:
When calling a stored procedure that calls another user defined procedure the returned result is wrong.
The example snippet I've enclosed only succeeds on my standalone OC4J.
When I run the Java app in JDeveloper using the embedded OC4J it behaves incorrectly.
When I run the Java app deployed on the standalone OC4J that comes with JDeveloper it runs just fine.
When I deploy the Java app on the Oracle Application Server it behaves badly again.
It looks like the procedure called by the procedure cannot execute correctly, or that it got a null value as an in-parameter.
What puzzles me is that this procedure did work correctly on WebSphere (JVM 1.2.2 and Oracle JDBC drivers classes12.jar)
and it executes just fine on the standalone OC4J.
I have tried different classes12.jar, ojdbc14.jar, and different datasources (com.evermind.sql.DriverManagerDataSource,
oracle.jdbc.pool.OracleConnectionCacheImpl) with no success.
/Mats Nord
The code follows....
- The java code
CallableStatement cs =
con.prepareCall("{ call oden.sp_tidGetProjForMember( ?, ?, ?, ?, ? ) }");
cs.registerOutParameter(1, OracleTypes.CURSOR);
cs.setLong(2, employeeNo);
cs.setLong(3, projId);
cs.setLong(4, actId);
cs.setLong(5, period);
cs.execute();
ResultSet rs = (ResultSet) cs.getObject(1);
while (rs.next()) {
MyItem aItem =
new MyItem(
Long.toString(employeeNo),
rs.getString(1),
rs.getString(2),
rs.getString(3),
rs.getString(4),
rs.getString(5),
rs.getString(6),
rs.getString(7),
rs.getString(8), // (Actually a number.) This column returns "0" but it should return a value greater than "0"
rs.getString(9));
items.addElement(aItem);
}
- The stored procedure:
PROCEDURE sp_tidGetProjForMember (
rs_Proj IN OUT report_types.weakCur -- recordset
,p_pnr IN V_ADMANV.anv_PNR%TYPE -- pnr projektmedlem, NUMBER(12)
,p_projid IN PROJ.PROJ_ID%TYPE -- projekt-id, NUMBER(5), kan vara tomt
,p_aktid IN AKT.AKT_ID%TYPE -- aktivitets-id, NUMBER(5), kan vara tomt
,p_yyyymm IN NUMBER) -- period NUMBER(6), YYYYMM, kan vara tomt
<< Removed some lengthy code from the post >>
OPEN rs_Proj FOR
SELECT
DECODE(TIDPROJ.prj_status, v_skickad_id, 0, -- godkänd
v_godknd_id, 0, -- godkänd
NULL, 1, -- under bearbetning
v_ejgodk_id, 2, -- ej godkänd
v_undknd_id, 2 -- ej godkänd
) AS STATUS -- skickad 0: godkänd, 1: under bearbetning, 2: underkänd
,TIDPROJ.prj_projid
,PROJ.proj_namn
,V_ADMANV.anv_namn AS PLEDNAMN
,V_ADMANV.anv_PNR AS PLEDPNR
,TIDPROJ.prj_aktid
,AKT.AKT_NAMN AS AKTNAMN
,NVL(SUM(NVL(UTILITY.sp_ds2Decnum(TIDPROJ.prj_tid), 0)), 0) -- calls the function UTILITY.sp_ds2Decnum
,NVL(TIDPROJKOMMENT.tpk_kommentar, ' ')
FROM TIDPROJ
<< Removed some lengthy code from this post >>
v_utdata := 'Recordset';
Utility.sp_putLogg (v_spname, 0, v_indata, v_utdata, SQLCODE, SQLERRM); -- commits
EXCEPTION
WHEN OTHERS THEN
v_utdata := 'FEL';
Utility.sp_putLogg (v_spname, 0, v_indata, v_utdata, SQLCODE, SQLERRM); -- rollbacks
END sp_tidGetProjForMember;
-
Unit test runs perfectly fine with NUnit but fails when run from TestExplorer
Hello all,
I have a TestProject, Harmony.Tests. In there, I have a method AddApplicationEvent()
which calls another method Send(InvokeRequestMessage requestMessage) which calls a webservice (OperationHandlerBrokerWebService).
The code snippet looks like this. This is not the complete code, but the part where we call the web service. It fails on the last line below, the InvokeOperationHandler call.
OperationHandlerBrokerWebService brokerService = new OperationHandlerBrokerWebService();
brokerService.UseDefaultCredentials = true;
brokerService.Url = address;
brokerService.Timeout = timeoutInMilliseconds;
byte[] serializedResponseMessage = brokerService.InvokeOperationHandler(serializedRequestMessage);
The same test works and passes fine when I run it with NUnit, but fails with the following exception when I try to run it from Test Explorer.
Test Name: AddApplicationEvent
Test FullName: N4S.Harmony.Tests.CaseManagement.HarmonyFacadeTests.AddApplicationEvent
Test Source: d:\TFS\TMW\Dev\TMWOnline\Harmony\N4S.Harmony.Tests\CaseManagement\HarmonyFacadeTests.cs : line 665
Test Outcome: Failed
Test Duration: 0:00:00.296
Result Message:
SetUp : Message returned System.Web.Services.Protocols.SoapException: Server was unable to process request. ---> System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.ArgumentException: Invalid token for impersonation - it cannot be duplicated.
at System.Security.Principal.WindowsIdentity.CreateFromToken(IntPtr userToken)
at System.Security.Principal.WindowsIdentity..ctor(SerializationInfo info)
at System.Security.Principal.WindowsIdentity..ctor(SerializationInfo info, StreamingContext context)
--- End of inner exception stack trace ---
at System.RuntimeMethodHandle._SerializationInvoke(Object target, SignatureStruct& declaringTypeSig, SerializationInfo info, StreamingContext context)
at System.Reflection.RuntimeConstructorInfo.SerializationInvoke(Object target, SerializationInfo info, StreamingContext context)
at System.Runtime.Serialization.ObjectManager.CompleteISerializableObject(Object obj, SerializationInfo info, StreamingContext context)
at System.Runtime.Serialization.ObjectManager.FixupSpecialObject(ObjectHolder holder)
at System.Runtime.Serialization.ObjectManager.DoFixups()
at System.Runtime.Serialization.Formatters.Binary.ObjectReader.Deserialize(HeaderHandler handler, __BinaryParser serParser, Boolean fCheck, Boolean isCrossAppDomain, IMethodCallMessage methodCallMessage)
at System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Deserialize(Stream serializationStream, HeaderHandler handler, Boolean fCheck, Boolean isCrossAppDomain, IMethodCallMessage methodCallMessage)
at System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Deserialize(Stream serializationStream)
at N4S.Forms.OperationHandlerBroker.AMessage.DeserializeMessage(Byte[] serializedMessage)
at N4S.Forms.OperationHandlerBroker.WebServiceServer.BrokerService.InvokeOperationHandler(Byte[] serializedInvokeRequestMessage)
--- End of inner exception stack trace ---
expected: <0>
but was: <1>
Result StackTrace:
at N4S.Harmony.Tests.TestHelper.InvokeOperation(OperationHandler handler, OperationHandlerInput input, Boolean expectedToWork) in d:\TFS\TMW\Dev\TMWOnline\Harmony\N4S.Harmony.Tests\TestHelper.cs:line 136
at N4S.Harmony.Tests.TestHelper.LoginAsUser(String username, String password) in d:\TFS\TMW\Dev\TMWOnline\Harmony\N4S.Harmony.Tests\TestHelper.cs:line 394
at N4S.Harmony.Tests.TestHelper.Login(TestUserName requestedUser) in d:\TFS\TMW\Dev\TMWOnline\Harmony\N4S.Harmony.Tests\TestHelper.cs:line 377
at N4S.Harmony.Tests.TestHelper.LoginAsAdvisor() in d:\TFS\TMW\Dev\TMWOnline\Harmony\N4S.Harmony.Tests\TestHelper.cs:line 230
at N4S.Harmony.Tests.CaseManagement.HarmonyFacadeTests.Login() in d:\TFS\TMW\Dev\TMWOnline\Harmony\N4S.Harmony.Tests\CaseManagement\HarmonyFacadeTests.cs:line 76
at N4S.Harmony.Tests.CaseManagement.HarmonyFacadeTests.SetupTest() in d:\TFS\TMW\Dev\TMWOnline\Harmony\N4S.Harmony.Tests\CaseManagement\HarmonyFacadeTests.cs:line 67
I am not sure what is causing the issue. I checked the credentials and the Windows identity during both test runs and there is no difference. Please advise!
Thanks,
Deepak
Hi Tina,
Thanks for your reply.
I do have the NUnit adapter installed. I even noticed that the test runs fine with the NUnit GUI and also if I run it through the ReSharper Test Explorer window.
As you can see in the image above, the same test passes when I run it from the ReSharper Unit Test Explorer window but fails when I run it from the Test Explorer window. I also captured the information in Fiddler.
There was a significant difference in the header Content-Length. Also, under the User-Agent property, the protocol versions are different.
Not sure why the VSTest execution engine is picking a different version.
The unit test in question calls a web service method which in turn calls a method from another referenced project.
Web Service class
using System;
using System.Web.Services;
using N4S.Forms.OperationHandlerBroker.Server;
using NLog;

namespace N4S.Forms.OperationHandlerBroker.WebServiceServer
{
    /// <summary>
    /// The operation-handler broker service.
    /// </summary>
    [WebService(Description = "The N4S Forms Operation-Handler Broker Web-Service.", Name = "OperationHandlerBrokerWebService",
        Namespace = "N4S.Forms.OperationHandlerBroker.WebServiceServer")]
    [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
    public class BrokerService : WebService
    {
        private static readonly Logger logger = LogManager.GetCurrentClassLogger();

        /// <summary>
        /// Calls <see cref="HandleRequest"/>. Updates performance-counters.
        /// </summary>
        /// <param name="serializedInvokeRequestMessage">the binary-serialized <see cref="InvokeRequestMessage"/></param>
        /// <returns>the binary-serialized response message</returns>
        [WebMethod(BufferResponse = true, CacheDuration = 0, Description = "Invokes the requested operation-handler and returns a binary-serialized response-message.", EnableSession = false)]
        public byte[] InvokeOperationHandler(byte[] serializedInvokeRequestMessage)
        {
            logger.Trace(Strings.TraceMethodEntered);
            PerformanceMonitor.RecordRequestStarted();
            InvokeRequestMessage requestMessage = (InvokeRequestMessage) AMessage.DeserializeMessage(serializedInvokeRequestMessage);
            InvokeResponseMessage responseMessage;
            try
            {
                responseMessage = HandleRequest(requestMessage);
                PerformanceMonitor.RecordSuccessfulRequest();
            }
            catch (Exception)
            {
                PerformanceMonitor.RecordFailedRequest();
                throw;
            }
            finally
            {
                PerformanceMonitor.RecordRequestEnded();
                logger.Trace(Strings.TraceMethodExiting);
            }
            return AMessage.SerializeMessage(responseMessage);
        }
    }
}
UnitTest snippet
OperationHandlerBrokerWebService brokerService = new OperationHandlerBrokerWebService();
brokerService.UseDefaultCredentials = true;
byte[] serializedResponseMessage = brokerService.InvokeOperationHandler(serializedRequestMessage);
Please advise.
Thanks,
Deepak -
OLE program works in debug mode, fails when run with F8
Hello,
I have implemented code from this forum for sending documents to the printer, as below. Although it works well in debug mode, it fails when I execute it directly from SE38. Any idea?
Best Regards,
Didem GUNDOGDU
CREATE OBJECT gs_word 'WORD.APPLICATION'.
SET PROPERTY OF gs_word 'Visible' = '0'.
CALL METHOD OF gs_word 'Documents' = gs_documents .
CALL METHOD OF gs_documents 'Open' = gs_newdoc
EXPORTING
#1 = p_filep .
CALL METHOD OF gs_word 'ActiveDocument' = gs_actdoc.
CALL METHOD OF gs_actdoc 'PrintOut' .
CALL METHOD OF gs_word 'Quit' .
Hi Didem,
Just a suggestion: could you check sy-subrc after each method call? Perhaps that can give you a clue.
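For example (an untested ABAP sketch of that suggestion, using the variable names from the snippet above):

```
CALL METHOD OF gs_documents 'Open' = gs_newdoc
  EXPORTING
    #1 = p_filep.
IF sy-subrc <> 0.
  WRITE: / 'Open failed, sy-subrc =', sy-subrc.
ENDIF.
```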
Regards,
John. -
Stored Procedure Does Not Run Properly When Called From Portlet
We have a simple Java portlet that calls a PL/SQL stored procedure which we wrote. The stored procedure sends a large number of emails to users in our corporation using the "utl_smtp" package. The stored procedure returns a count of the emails back to the Java portlet when it's finished. This number is displayed in the portlet.
Our problem:
The stored procedure functions as expected when run from a PL/SQL block in SQL*Plus, and the Java portlet calls the procedure properly when sending out a smaller number of emails (usually fewer than 200). With a higher number of emails, the procedure hangs when called from the portlet, but it STILL functions properly from SQL*Plus. It does not return the number of emails sent,
and the Java portlet never receives a SQLException. Also, we have noticed that emails are sent at a much slower pace from the stored procedure when it's called from the portlet.
Any ideas?
-
Failed to initialize Databank exception when run from OLT
HI All,
I've created a script with an associated databank that runs perfectly fine from OpenScript. But when I run it from OLT, the following warning is encountered:
Start failure message from agent "OLT Server": oracle.oats.common.databank.DatabankException: Failed to initialize Databank 'Forms.forms'
Stopped Autopilot because of error on agent "OLT Server".
The name of my DB is: forms
and the script name is: Forms.
Please let me know the solution for this.
Regards,
Karthik
Edited by: user777720 on May 21, 2013 5:00 AM
Have you tried changing the parameter in the Assets/Databank "Save Path" from "Relative to current script" to "Relative to a repository"?
Regards, Ian. -
CREATE TRIGGER not audited when run from SQL Developer version 3.2.20.09
Creating or editing a trigger is not being recorded in the audit table when run from SQL Developer.
Here is a sample script to show the issue:
Grant Connect,create table,create trigger To testuser Identified By testuser;
create table testuser.testtab(t1 number);
Select Count(*) From Dba_Audit_Trail Where Owner='TESTUSER';
CREATE OR REPLACE TRIGGER testuser.testtab_bi_trg BEFORE
Insert
ON testuser.testtab FOR EACH ROW
begin
null;
end;
Select Count(*) From Dba_Audit_Trail Where Owner='TESTUSER';
drop user testuser cascade;
If I run the script from SQL Developer, the CREATE TRIGGER statement does not get audited.
If I run the script from SQL*Plus or Allround Automations' PL/SQL Developer, the CREATE TRIGGER statement does get audited.
If I edit the trigger from SQL Developer, the CREATE TRIGGER statement does not get audited.
If I edit the trigger from Allround Automations' PL/SQL Developer, the CREATE TRIGGER statement does get audited.
DoyleFreeman wrote:
Not sure what you mean by "perform the audit".
Have you tested my script? Does the "Select Count(*) From Dba_Audit_Trail Where Owner='TESTUSER';" increment by 1 after each of the ddl statements or only after the Create table statement.
Your question doesn't have ANYTHING to do with sql developer and should be posted in the Database General forum
https://forums.oracle.com/community/developer/english/oracle_database/general_questions
Yes - and it works just fine once you ENABLE AUDITING. Your script does NOT include the statements or code used to ENABLE auditing and, specifically, to enable auditing for triggers.
Auditing doesn't just 'happen'; you have to enable it and you have to specify any non-default auditing that you want to perform.
Have you read any of the extensive documentation about auditing to learn how to use it?
See the Database Security Guide
http://docs.oracle.com/cd/E11882_01/network.112/e16543/auditing.htm
Also see 'Auditing Functions, Procedures, Packages, and Triggers
http://docs.oracle.com/cd/E11882_01/network.112/e16543/auditing.htm#BCGBEAGC
And see the AUDIT statement in the SQL language doc for how to specify auditing of specific operations.
http://docs.oracle.com/cd/E11882_01/server.112/e26088/statements_4007.htm
Select count(*) From Dba_Audit_Trail Where (Owner='SCOTT' or username = 'SCOTT')
and action_name = 'CREATE TRIGGER';
COUNT(*)
0
audit create trigger by scott
CREATE OR REPLACE TRIGGER emp_copy_bi_trg BEFORE
Insert
ON emp_copy FOR EACH ROW
begin
null;
end;
Select count(*) From Dba_Audit_Trail Where (Owner='SCOTT' or username = 'SCOTT')
and action_name = 'CREATE TRIGGER';
COUNT(*)
1 -
I have a production mobile Flex app that uses RemoteObject calls for all data access, and it's working well, except for a new remote call I just added that fails only when running with a release build. The same call works fine when running on the device (iPhone) using a debug build. When running with a release build, the result handler is never called (nor is the fault handler). Viewing the BlazeDS logs in debug mode, the call is received and sent back with data. I've narrowed it down to what seems to be a data size issue.
I have targeted one specific data call that returns a String value 44KB long, which fails in the release build (neither the result nor the fault handler is called), but the result handler is called as expected in the debug build. When I do not populate the String value (in server-side Java code) on the object (just set it to an empty string), the result handler is called and the object is returned (release build).
The custom object being returned by the call is a very simple object, with getters/setters for simple types boolean, int, String, and one org.w3c.dom.Document type. This same object type is used on other RemoteObject calls (different data) and works fine (release and debug builds). I originally returned a Document but, just to make sure this wasn't the problem, changed the value to a String, to rule out XML/DOM issues in serialization.
I don't understand 1) why the release build vs. debug build behavior is different for a RemoteObject call, and 2) why the call works in a debug build when sending a somewhat large (but not unreasonable) amount of data in a String object, but not in a release build.
I haven't tried to find out exactly where the size failure point is but, not sure that's even relevant, since 44KB isn't an unreasonable size to expect.
By turning on debug mode in BlazeDS, I can see the object and its attributes being serialized, and everything looks good there. The calls are received and processed appropriately in BlazeDS for both debug and release build testing.
Anyone have an idea of other things to try to debug/resolve this?
Platform is BlazeDS 4, Flash Builder 4.7, WebSphere 8 server, iPhone (iOS 7.1.2). Tried multiple Flex SDKs from 4.12 to the latest 4.13, with no change in behavior.
Thanks!
After a week's worth of debugging, I found the issue.
The Java type returned from the call was defined as ArrayList. Changing it to List resolved the problem.
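To make the fix concrete, here is a minimal sketch (hypothetical class and data, not the poster's actual service) of declaring the remote method's return type as the List interface instead of the concrete ArrayList:

```java
import java.util.ArrayList;
import java.util.List;

public class NamesService {
    // Declaring the return type as the List interface (rather than the
    // concrete ArrayList) is the change that resolved the failing
    // RemoteObject call described above.
    public static List<String> getNames() {
        List<String> names = new ArrayList<>();
        names.add("alpha");
        names.add("beta");
        return names;
    }

    public static void main(String[] args) {
        System.out.println(getNames().size());
    }
}
```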
I'm not sure why ArrayList isn't a valid return type, I've been looking at the Adobe docs, and still can't see why this isn't valid. And, why it works in Debug mode and not in Release build is even stranger. Maybe someone can shed some light on the logic here to me. -
Can I create a stored procedure that accesses data from tables on another server?
I'm developing a procedure, and within it I'm trying to access another server and run a SELECT on a table that belongs to that other server. When I compile this procedure I get the error message: "PLS-00904: insufficient privilege to access object BC.CADPAP", where BC.CADPAP is the problematic table.
How can I use more than one connection in an Oracle stored procedure?
How can I access tables of one server from a stored procedure when I'm already connected to another server?
Can I create a stored procedure that accesses data from tables on another server?
You need to have a database link between the two servers. Then you can execute that statement without any problem. Try to create a database link with the
CREATE DATABASE LINK command. Refer to the documentation for further details. -
11gR2 RAC install fail when running root.sh script on second node
I get the errors:
ORA-15018: diskgroup cannot be created
ORA-15072: command requires at least 2 regular failure groups, discovered only 0
ORA-15080: synchronous I/O operation to a disk failed
[main] [ 2012-04-10 16:44:12.564 EDT ] [UsmcaLogger.logException:175] oracle.sysman.assistants.util.sqlEngine.SQLFatalErrorException: ORA-15018: diskgroup cannot be created
ORA-15072: command requires at least 2 regular failure groups, discovered only 0
ORA-15080: synchronous I/O operation to a disk failed
I have tried the suggested fixes from the MetaLink note below, but they did not fix the issue:
11GR2 GRID INFRASTRUCTURE INSTALLATION FAILS WHEN RUNNING ROOT.SH ON NODE 2 OF RAC USING ASMLIB [ID 1059847.1]
Hi,
it looks like your "shared device" is not really shared.
The second node tries to create an ASM diskgroup and create the OCR and voting disks. If the device were indeed shared, it would have recognized that your disk is shared.
So your VMware configuration must be wrong, and the disk you presented as a shared disk is not really shared.
Which VMware version did you use? It will not work correctly with the Workstation or Player edition, since shared disks only really work with the Server version.
If you are indeed using the Server edition, could you paste your VM configurations?
Furthermore I recommend using Virtual Box. There is a nice how-to:
http://www.oracle-base.com/articles/11g/OracleDB11gR2RACInstallationOnOEL5UsingVirtualBox.php
Sebastian -
LSMW fails when run in B/G but works fine in front end... why?
Hi All,
I am trying to run a batch process via LSMW. My files are accurate, no problem with them; everything works fine, but it fails when run in B/G, while working absolutely fine in the front end. What is the difference when running in B/G?
The same thing happens when I try to execute an RFC through SAP JCo: it works when the debugger is on (I guess switching on the debugger is similar to running in B/G), but it doesn't work when the debugger is off. Yet when I execute that RFC directly in SE37 from the SAP GUI, it works fine; it fails when connected through JCo.
I am not having this issue with R/3 4.6C or mySAP ECC 6.0; I have this issue only in R/3 4.7.
Has anyone faced a similar situation? Please help.
Thanks.
P.S. If this may help: the RFC and LSMW errors both pertain to a change of address for US employees (infotype 0006).
Applying SAP note 928273 solved this issue.
-
LSMW fails when run in B/G but works fine in front end... why?
Hi All,
I am trying to run a batch process via LSMW. My files are accurate, no problem with them; everything works fine, but it fails when run in B/G, while working absolutely fine in the front end. What is the difference when running in B/G?
The same thing happens when I try to execute an RFC through SAP JCo: it works when the debugger is on (I guess switching on the debugger is similar to running in B/G), but it doesn't work when the debugger is off. Yet when I execute that RFC directly in SE37 from the SAP GUI, it works fine; it fails when connected through JCo.
I am not having this issue with R/3 4.6C or mySAP ECC 6.0; I have this issue only in R/3 4.7.
Has anyone faced a similar situation? Please help.
Thanks.
P.S. If this may help: the RFC and LSMW errors both pertain to a change of address for US employees (infotype 0006).
For LSMW it is the recording of transactions PA40 (employee hire fails when filling address details) and PA30 (change address), and the same is the case with the RFC; it is BAPI_ADDRESSEMPUS_CHANGE.
To elaborate more, the error is: "Fill in all the mandatory fields."
Which I am very much doing; there are no hidden fields or anything, I have seen the screens etc. I AM filling all mandatory fields, in fact I am not leaving anything unfilled. The same screen goes through fine in the front end; I just click OK, OK, OK and boom, transaction complete, no complaints. But running in B/G is killing me.
I have to run the batch for 100,000 employees.
What defeats my logic is that it works fine in 4.6C and mySAP ECC 6.0 but not in 4.7.
Hruser
Message was edited by:
Hruser -
LSMW fails when run in B/G but works in front end... why?
Hi All,
I am trying to run a batch process via LSMW. My files are accurate, no problem with them; everything works fine, but it fails when run in B/G, while working absolutely fine in the front end. What is the difference when running in B/G?
The same thing happens when I try to execute an RFC through SAP JCo: it works when the debugger is on (I guess switching on the debugger is similar to running in B/G), but it doesn't work when the debugger is off. Yet when I execute that RFC directly in SE37 from the SAP GUI, it works fine; it fails when connected through JCo.
I am not having this issue with R/3 4.6C or mySAP ECC 6.0; I have this issue only in R/3 4.7.
Has anyone faced a similar situation? Please help.
Thanks.
P.S. If this may help: the RFC and LSMW errors both pertain to a change of address for US employees (infotype 0006).
Applying SAP note 928273 solved this issue.
Thank you.
Calling a Stored Procedure with output parameters from Query Templates
This is the same problem that Shalaka Khandekar logged earlier. This new thread gives the complete description of our problem. Please go through it and suggest a feasible solution.
We encountered a problem while calling a stored procedure from an MII Query Template, as follows:
1. Stored Procedure is defined in a package. Procedure takes the below inputs and outputs.
a) Input1 - CLOB
b) Input2 - CLOB
c) Input3 - CLOB
d) Output1 - CLOB
e) Output2 - CLOB
f) Output3 - Varchar2
2. There are two ways to get the output back.
a) Using a Stored Procedure by declaring necessary OUT parameters.
b) Using a Function which returns a single value.
3. Consider we are using method 2-a. To call a Stored Procedure with OUT parameters from the Query Template we need to declare variables of
corresponding types and pass them to the Stored Procedure along with the necessary input parameters.
4. This method is not a solution for getting output because we cannot declare variables of some types (CLOB, VARCHAR2) in a Query Template.
5. Even if we succeed (despite step 4) in declaring the OUT variables in the Query Template and passing them to the procedure, our procedure has outputs of type CLOB. That means we would get data longer than the VARCHAR2 length a query template can return (the limit is 32767
characters).
6. So the method 2-a is ruled out.
7. Now consider method 2-b. A function returns only one value, but we have 3 different OUT values. Assume we have appended them using a separator. This value will be more than 32767 characters, which is again a problem for the query template (refer to point 5). So option 2-b is also ruled out.
Apart from the above methods there is a workaround: create a temporary table in the database with the 3 OUT parameters above, along with a session-specific column. We insert the output we get from the procedure into the temporary table and use it further; as soon as we are done using the data, we delete the current session-specific rows, so indirectly we call it a session table. This solution puts unnecessary load on the database.
Thanks in Advance.
Rajesh
Rajesh,
please check whether the following proposal serves you.
Define the query with mode FixedQueryWithOutput. In the package, define a ref cursor as an IN OUT parameter. To get your 3 values back, open the cursor in your procedure like "SELECT val1, val2, val3 FROM dual". Then the values should come back through your query.
Here is an example how this could be defined.
Package:
type return_cur IS ref CURSOR;
Procedure:
PROCEDURE myProc(myReturnCur IN OUT return_cur) ...
OPEN myReturnCur FOR SELECT val1, val2, val3 FROM dual;
Query:
DECLARE
  MYRETURNCUR myPackage.return_cur;
BEGIN
  myPackage.myProc(
    MYRETURNCUR => ?
  );
END;
Good luck.
Michael