Best approach for performing DMLs using stored procedures

Hi,
I have a really general question and would like to hear your thoughts on this.
I want my application to manipulate or read data using stored procedures (or packages, for that matter) rather than by issuing queries directly against the database.
Let's say I have a table with many columns:
create table test (
  pkid number(10),
  col1 varchar2(30),
  col2 number(10),
  col3 date,
  ...
);

For such a DML procedure, is it best to do something like

procedure do_update(
  i_pkid IN number,
  i_col1 IN varchar2,
  i_col2 IN number,
  i_col3 IN date,
  ...
) as
begin
   update test
   set col1 = i_col1,
       col2 = i_col2,
       col3 = i_col3,
       ...
   where pkid = i_pkid;
   commit;
end;

Or do a selective update, meaning update only the column that actually changed, given that usually only one column changes? (And how would I do that - separate procedures for each column? [columns can be null])
Also, is it better to work with test.col1%type instead of specifying the data type in the procedure?
And one last question: if I have a table with 100 columns and I don't want to create a procedure with 100 parameters, would the best approach be to use a record?
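Something like the following is what I have in mind for the record variant - a sketch only, where the name do_update_rec and the choice to leave COMMIT to the caller are just for illustration:

procedure do_update_rec(i_row IN test%rowtype) as
begin
   update test
   set col1 = i_row.col1,
       col2 = i_row.col2,
       col3 = i_row.col3
   where pkid = i_row.pkid;
   -- transaction control (commit/rollback) left to the caller in this sketch
end;

Individual parameters could likewise be anchored to the table, e.g. i_col1 IN test.col1%type.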
I just want to settle on an approach before I start implementing, so that I do things well from the start.
Many thanks.
Edited by: Pyrocks on Nov 17, 2010 1:58 PM

Pyrocks wrote:
One last clarification (although it may be more related to c/c++ developers - maybe one of you will know):
We are working with C++ and VB against a SQLServer and my part is to translate all the existing procedures to Oracle in order to migrate the application to work with Oracle DB.
The existing procedures use an IN parameter for each column in the table, and I would really like to use %ROWTYPE like you mentioned.
Since I'm not a C/C++/VB developer - is there an easy way of working with such types, or even user-defined types, from C/C++/VB (as in passing a %ROWTYPE record from C++ to a stored procedure)?
I'm not looking for the actual how-to - I just want to know how hard it would be and how much code would need to change, so that I can convince the developers that this is the RIGHT way to work.
PS: none of our developers have experience with Oracle, so they wouldn't know the answer...

Not actually an Oracle server-side (SQL or PL/SQL) question - but a client-side one. And it has been a long time since I wrote a fat client using C/C++ or Delphi.
The OCI (Oracle Call Interface) supports advanced (user-defined) SQL data types, and has since Oracle 8i. So in that respect, yes, the client can support custom SQL data types.
How well it does depends entirely on that client language's features with regard to OCI integration. For example, Delphi 4 was released around Oracle 8i and supported custom SQL types. I would expect that most languages today (like Java and C#) provide support for it.
As for using %ROWTYPE - this is a PL/SQL clause as far as I know, and I am unsure whether it is supported by the OCI. What could support it is a pre-compiler like Pro*C. These enable you to mix pseudo SQL source code with client language source code. The pre-compilation step then replaces the pseudo SQL code with native client language calls to the OCI. The code is then compiled by that client language's compiler. Pre-compilers can pull all kinds of interesting "tricks" with their pseudo SQL code support.
The best would be to consult the applicable client language's manuals that describe the interface it supports (via OCI) to Oracle.
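As a rough sketch of the client-friendly alternative (the names test_row_t and do_update_obj are made up for illustration, not from this thread): a SQL object type is visible to OCI clients, whereas %ROWTYPE stays inside PL/SQL.

create or replace type test_row_t as object (
  pkid number(10),
  col1 varchar2(30),
  col2 number(10),
  col3 date
);
/
create or replace procedure do_update_obj(i_row IN test_row_t) as
begin
   update test
   set col1 = i_row.col1,
       col2 = i_row.col2,
       col3 = i_row.col3
   where pkid = i_row.pkid;
end;
/

A client that can bind the SQL type test_row_t through its OCI integration could then call do_update_obj with one structured argument instead of one bind variable per column.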

Similar Messages

  • Best approach for uploading document using custom web part-Client OM or REST API

    Hi,
    I am using my custom upload visual web part for uploading documents into my document library with a lot of metadata.
    The columns include single line of text, drop-down list, lookup columns and managed metadata (taxonomy) columns.
    So I would like to know which is the best approach for uploading.
    Currently I am trying to use the traditional SSOM (server object model). I would like to know which is the best approach for uploading files into document libraries.
    I have hundreds of sub-sites with 30+ document libraries within those sub-sites. Currently it takes a few minutes to upload the files in my dev environment. I am just wondering what would happen if the number of sub-sites reaches a hundred!
    I am looking at this from the performance perspective.
    My thought process is:
    1) Implement Client OM
    2) REST API
    Has anyone tried these approaches before, and which approach provides better performance?
    If anyone has sample source code or links, please share them.
    Also, are there any restrictions on the size of the uploaded files?
    any suggestions are appreciated!

    Try below:
    http://blogs.msdn.com/b/sridhara/archive/2010/03/12/uploading-files-using-client-object-model-in-sharepoint-2010.aspx
    http://stackoverflow.com/questions/9847935/upload-a-document-to-a-sharepoint-list-from-client-side-object-model
    http://www.codeproject.com/Articles/103503/How-to-upload-download-a-document-in-SharePoint
    public void UploadDocument(string siteURL, string documentListName,
        string documentListURL, string documentName, byte[] documentStream)
    {
        using (ClientContext clientContext = new ClientContext(siteURL))
        {
            // Get the document list
            List documentsList = clientContext.Web.Lists.GetByTitle(documentListName);
            var fileCreationInformation = new FileCreationInformation();
            // Assign the content byte[], i.e. documentStream
            fileCreationInformation.Content = documentStream;
            // Allow overwrite of the document
            fileCreationInformation.Overwrite = true;
            // Upload URL
            fileCreationInformation.Url = siteURL + documentListURL + documentName;
            Microsoft.SharePoint.Client.File uploadFile =
                documentsList.RootFolder.Files.Add(fileCreationInformation);
            // Update the metadata for a field named "DocType"
            uploadFile.ListItemAllFields["DocType"] = "Favourites";
            uploadFile.ListItemAllFields.Update();
            clientContext.ExecuteQuery();
        }
    }

  • Best Approach for Performance/Event Monitoring

    I'm completely new to SAP and would appreciate the input of some more experienced folks.
    I have been given two tasks for SAP: performance monitoring of the SAP application and associated modules (e.g. SAP memory utilization, number of processes, etc.) and alarm event monitoring (anything that would be characterized as an error and logged).
    For the performance monitoring I would like to access the metrics once every 5 minutes to keep a running history.  Several things I've started to look at are the Alert Manager and perhaps a Perl API interface.
    For alarm events, I assume there is some SAP module that can monitor for these and generate SNMP traps or perhaps a custom application that does log mining.  As part of this I am trying to determine where error events are written to in SAP (e.g. a log file??).
    If anyone has insight or can point me in the right direction, I would be very appreciative.  Thanks in advance.
    Ed

    Dear Ed,
    If Wily Introscope is configured for your Portal, which should be the case because SAP delivers it along with SAP Solution Manager, all the metrics you mention would be available.
    If you are running a testing instance which is not integrated with Solution Manager, you could try the Web Administrator
    http://<host>:<port>/nwa ->  Availability and Performance -> Resource Monitoring
    Best Regards,
    Sylvia

  • Best practices to use stored procedure

    Just wondering about best practices for using stored procedures in TopLink with respect to objects. Any thoughts on this?
    I find the approach suggested in the thread "Re: Coding for Stored Procedures is a lot of work!" to be fine.
    Are there any thoughts on converting results directly into Java objects?
    Murali

    I encountered the same problems.
    See the topic I posted: Re: Mapping a Java attribute to the result of a function call
    The solution I used is good, but has its restrictions.
    I created a database view on the query that uses stored functions. Then, I mapped my object to the database view. Problem solved.
    However, like I said, this solution has its restrictions:
    1) Your database must support views
    2) This only works for read-only queries
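    As a rough illustration of that view-based workaround (the table, function and view names here are hypothetical, not from my actual model):
    CREATE OR REPLACE VIEW customer_balance_vw AS
    SELECT c.customer_id,
           c.name,
           get_outstanding_balance(c.customer_id) AS outstanding_balance
      FROM customers c;
    The object is then mapped to customer_balance_vw as if it were a table, which is also why this only suits read-only queries.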
    I hope this helps you further.
    Kind regards,
    Erwin

  • Using stored procedures for insert, update and delete

    Hello all;
    We have a question from our customer (who is the DBA), who has not used TopLink in the past, about whether it makes sense to tie TopLink into existing stored procedures to save and retrieve information.
    Is it possible?
    Is there any circumstance under which it is a good idea?
    Thanks

    In TopLink, any operation for which TopLink generates SQL can be replaced by custom SQL or a stored procedure call.
    Custom SQL or stored procedures can be used for each of the descriptor's CRUD operations, but also for mapping queries and named queries. The Mapping Workbench only supports defining custom SQL for the descriptor CRUD operations and named queries, so many of the stored procedure calls from the descriptor and mappings will need to be defined through amendment methods.
    Whether it makes sense or not depends on the application, the company and their requirements. It will add significant overhead to the development process to have to define and maintain all of the stored procedures, and the stored procedure calls in the descriptors. You may wish to develop your application using TopLink-generated SQL, and once you have the model and queries stabilized, switch to using stored procedures.
    Whether it is a good idea depends on the application, the company and their requirements. Stored procedures may give the DBA more freedom to change the data model once in production, and may allow for adding database-level security checks. In general, using stored procedures will not improve performance if the procedures contain the same SQL that would have been executed anyway, but they may allow the DBA to tune the SQL better.
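    For a rough idea of what this looks like on the database side, a hypothetical update procedure that a descriptor's update operation could call (the table and column names are invented here) might be:
    CREATE OR REPLACE PROCEDURE update_employee (
      p_emp_id IN NUMBER,
      p_name   IN VARCHAR2,
      p_salary IN NUMBER
    ) AS
    BEGIN
      UPDATE employee
         SET name   = p_name,
             salary = p_salary
       WHERE emp_id = p_emp_id;
    END;
    /
    The descriptor (or an amendment method) would then be configured to call update_employee instead of letting TopLink generate the UPDATE statement.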

  • SSRS adhoc (2005/2008) - Need to create Report Model (.smdl file) for adhoc using Stored procedure

    Hi All,
    I need to create Report Models using stored procedures. But whenever I go to create a data source for this, I only get tables and views to use. Please see the screenshot below:
    Is there any way to create a data source based on a stored procedure? Please help.
    Thanks in Advance
    Regards
    Kumud

    Hi Kumud,
    As per my understanding, it is not supported to create a Data Source View (DSV) using stored procedures.
    In a Data Source View, only views, tables or named queries can be used. We can use stored procedures in a query, but no parameterized queries, parameterized stored procedures or parameterized UDFs can be used in a DSV named query. You can try the approach below to achieve what you require:
    Build views in your relational source and add the views to the data source view, then proceed with modelling.
    There is a similar issue, you can refer to it.
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/22207c21-03c7-4e5a-bb67-0372f29220a3/sql-server-reporting-services-2008-creating-report-models-using-stored-procedures
    Regards,
    Alisa Tang
    TechNet Community Support

  • How to use stored procedure in crystal report for eclipse

    Hi;
    I am working with Crystal Reports for Eclipse 2.0 and trying to use a stored procedure in a report. It was created in Oracle, but it is not showing any dependencies:
    CREATE OR REPLACE procedure ACT_DB.getCostTransferDetails (
            p_In_Contract_ID         varchar2 DEFAULT  '*',
            p_In_Status_ID           varchar2 DEFAULT  '*',
            p_In_COST_TRAN_DATE      varchar2 DEFAULT  '*',
            p_In_DEPT_ID             varchar2 DEFAULT  '*',
            p_In_PROJECT_ID          varchar2 DEFAULT  '*',
            p_In_EMPLOYEE_ID         varchar2 DEFAULT  '*',
            p_Out_CostTransfer_Cusor OUT COST_TRANSFER_PACKAGE.COST_TRANSFER_TYPE )
    AS
    BEGIN    
            OPEN p_Out_CostTransfer_Cusor FOR
            SELECT
                header.cost_tran_id,
                header.cost_tran_date,
                header.total_credit_amount,
                header.total_debit_amount,
                header.transfer_status,
                header.updated_by,
                header.added_by,
                header.trf_over_ninety_days,
                header.business_unit_id,
                header.equip_approval,
                header.batch_desc,
                details.opr_unit_id,
                status.status_id,
                details.fund_code_id,
                details.prgm_code_id,
                details.class_id,
                details.activity_id,
                details.product_id,
                details.status_id,
                details.transaction_amount,
                details.jrnl_date,
                details.adjust_prcnt,
                details.adjust_amnt,
                details.details_equip_approval,
                details.detail_desc,
                details.account_id,
                details.project_id,
                details.dept_id,
                details.contract_id,
                details.reason_code_id,
                details.jrnl_id,
                employee.f_name,
                employee.l_name,
                employee.employee_id,
                project.project_id,
                project.project_status,
                --header.transfer_status,
                contracts.contract_desc 
              FROM header, details, status, employee, department, project, contracts
             WHERE     (header.cost_tran_id = details.cost_tran_id)
                    AND (header.cost_tran_date = details.cost_tran_date)
                    AND (header.transfer_status = status.status_id)
                    AND (department.dept_id = details.dept_id)
                    AND (department.dept_id = employee.dept_id)
                    AND (project.project_id = details.project_id)
                    AND (contracts.contract_id = details.contract_id)
                    AND (details.Contract_ID = p_In_Contract_ID OR p_In_Contract_ID ='*' )
                    AND (header.TRANSFER_STATUS = p_In_Status_ID OR p_In_Status_ID ='*' )
                    AND (TO_CHAR(details.COST_TRAN_DATE, 'MM/DD/YYYY') = p_In_COST_TRAN_DATE OR p_In_COST_TRAN_DATE ='*' )
                    AND (details.DEPT_ID  = p_In_DEPT_ID  OR p_In_DEPT_ID  ='*' )
                    AND (details.PROJECT_ID  = p_In_PROJECT_ID  OR p_In_PROJECT_ID  ='*' )
                    AND (header.ADDED_BY = p_In_EMPLOYEE_ID OR p_In_EMPLOYEE_ID = '*' );
    END;
    /
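    (For reference, the OUT parameter depends on a REF CURSOR type declared in a package spec. A minimal sketch of what such a spec might look like - the actual definition in ACT_DB may differ - is:)
    CREATE OR REPLACE PACKAGE COST_TRANSFER_PACKAGE AS
      TYPE COST_TRANSFER_TYPE IS REF CURSOR;
    END COST_TRANSFER_PACKAGE;
    /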
    Is there any other way to create the procedure?
    Thanx
    Regards ,
    Amol

    Hi Amol,
    Are you able to execute your stored procedure from Oracle?
    - If yes, then use the Crystal Reports XI R2 Designer to create the report.
    - Import this report into the Crystal Reports for Eclipse workbench.
    - The latest version of Crystal Reports for Eclipse is SP1.
    Please let me know the results.
    Thanks,
    Neeraj

  • What are the best approaches for mapping re-start in OWB?

    What are the best approaches for mapping re-start in OWB?
    We are using OWB repository 10.2.0.1.0 and OWB client 10.2.0.1.31. The Oracle version is 10 G (10.2.0.3.0). OWB is installed on Linux.
    We have a number of mappings, and we built process flows for the mappings as well.
    I would like to know what the best approaches are to incorporate re-start options in our process, i.e. handling a failure of a mapping in a process flow.
    How do we re-cycle failed rows?
    Are there any builtin features/best approaches in OWB to implement the above?
    Do the runtime audit tables help us to build a re-start process?
    If not, do we need to maintain our own (custom) tables to hold such data?
    How have other forum members handled the above situations?
    Any idea ?
    Thanks in advance.
    RI

    Hi RI,
    How many mappings (range) do you have in a process flow?
    Several hundred (100-300 mappings).
    If we have three mappings (e.g. m1, m2, m3) in a process flow, what will happen if m2 fails?
    Suppose the mappings are connected sequentially (m1 -> m2 -> m3). When m2 fails, the process flow is suspended (the transition to m3 is not performed). You should remove the cause of the error (modify the mapping and redeploy, correct the data, etc.) and then repeat the m2 mapping execution from the Workflow Monitor - open the diagram with the process flow, select mapping m2, click the Expedite button and choose the option Repeat.
    On re-start, will it run m1 again and then m2 and so on, or will it re-start at row 1 of m2?
    You can specify the restart point. "At row 1 of m2" - I don't understand what you mean (all mappings run in set-based mode, so in case of error all table updates are rolled back, but there are several exceptions - for example multiple target tables in a mapping without correlated commit, or an error in a post-mapping process - you must carefully analyze the results of the error).
    What will happen if m3 fails?
    The process is suspended and you can restart execution from m3.
    By running without failover and with the maximum number of errors set to 0, you achieve re-cycling of failed rows down to zero (0). These settings guarantee only two possible results of a mapping - SUCCESS or ERROR.
    What is the impact if we have a large volume of data?
    In my opinion, for large volumes set-based mode is the preferred data processing mode.
    With this mode you have the full range of enterprise features of the Oracle database - parallel query, parallel DML, nologging, etc.
    Oleg

  • How to run ssis 2012 package deployed on SSISDB using stored procedures from [SSISDB].[catalog] by regular AD accounts ?

    Hello, I need to run an SSIS package from a stored procedure (passing parameters in and getting results back using the stored procedures from [SSISDB].[catalog], such as [create_execution]). Please do not suggest using SQL Agent here, as that is not the case, or bypassing this using SSIS configuration. I need to know what permission I am missing when those stored procedures are run by an AD account with the rights described in BOL, because I am getting "The current security
    context cannot be reverted. Please switch to the original database where 'Execute As' was called and try it again." If I schedule the stored procedure to run via Agent and specify the parameters, it works; from a remote PC it does not. I have sysadmin
    rights on the server where SSIS and the calling stored procedure exist. I was digging around and have an ugly solution that makes it work, but I want an elegant solution using those stored procedures from [SSISDB].[catalog] ...
    Any help especially from Microsoft guys here...?
    Vladimir

    >>>>If I schedule stored proc to run by Agent and specify parameters - will work,
    from remote PC - does not. 
    Vova
    If you schedule the job it will be running under SQL Server Agent account security....  What error
    do you get when you run the SP from remote PC?
    Best Regards, Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/

  • Using stored procedures

    Hi all,
    How can I use stored procedures for tuning the database?
    thanks

    If you mean "will using stored procedures help improve application performance?" the answer is they may, depending on your application, number of users, etc, etc.
    Stored procedures are just application code stored in the database. This means the network traffic of information being sent back and forth between the server and clients is reduced and the application has more resources (usually) as it is running on the server and not the client. Also stored procedures can be used to cache information for future use by the same user.
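    As a hypothetical illustration of the round-trip point (the tables here are invented), one call such as the following replaces several separate statements sent from the client:
    create or replace procedure close_order(p_order_id in number) as
    begin
       -- both statements run inside the database in a single call from the client
       update orders
       set status = 'CLOSED'
       where order_id = p_order_id;
       insert into order_audit (order_id, action, action_date)
       values (p_order_id, 'CLOSE', sysdate);
    end;
    /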
    These two tutorials on PL/SQL may help:
    http://www.asktheoracle.net/plsql-tutorial.html
    and
    http://www.smart-soft.co.uk/Oracle/oracle-plsql-tutorial-part1.htm
    They are specifically about PL/SQL, but the general principles of when and why to use them are the same for Java stored procedures or .NET stored procedures.
    Edited by: ow007829 on 15/06/2010 14:40

  • How to use dynamic parameter when a report is created using Stored Procedures

    Hi all,
    Does anyone have an idea of how to use a dynamic parameter in Crystal Reports XI R2
    when the report is created using a stored procedure?
    Regards
    shashi kant chauhan

    Hi
    You can create an SQL command in Database Expert > Expand your datasource > Add command
    Then enter the SQL query that will create the list of values to supply to the user
    eg select field1,field2 from table
    Then edit the parameters of the report.  These will be the SP parameters and can be seen in the Field Explorer.
    Change the parameter type to Dynamic
    Under the word Value click on Click here to add item
    Scroll down to your Command and select one of the values that you want to appear in the list
    e.g field1
    Then click on the Parameters field - this is essential to create the param
    You can edit other options as required
    That should do it for you.
    I must say that I use CR 2008 connected to an Oracle 10g stored procedure, but I reckon this will work for a SQL Server database and CR XI as well.
    Best of luck

  • Why Dynamic Parameter is not working, when i create report using stored procedure ?

    Post Author: Shashi Kant
    CA Forum: General
    Hi all
    Why is the dynamic parameter not working when I create a report in XI using a stored procedure?
    I only see those parameters which I used in my stored procedure; the parameter which I created dynamically using the stored procedure
    is not shown to me when I refresh the report to view the results.
    I have used the same procedure that I mention below, but I cannot see the last screen shown in it.
    ============================================================================================
    1. Select View > Field Explorer.
    2. Right-click on Parameter Fields and select New from the right-click menu.
    3. Enter "Customer Name" as the name for your parameter.
    4. Under "List of Values" select "Dynamic".
    5. Under the Value column, click where it says "click here to add item" and select Customer Name from the drop-down list. The dialog should now look like the one shown in Figure 1. Click OK to return to your report design.
    [Figure: Dynamic Parameter Setup]
    6. Next, select Report > Select Expert, select the Customer Name field and click OK.
    7. Using the drop-down list beside it, select "Is Equal To" and, using the drop-down list, select your parameter field (it should be the first field).
    8. Click OK to return to your report design and see the parameter dialog. The parameter dialog will appear and show you a dynamic list of values that is updated each time you run your report. It couldn't be easier! In our next tutorial, we will look at how to use this feature to create cascading parameter fields, where the values are filtered by the preceding selection.
    [Figure: Dynamic Parameters in Action]
    My question is whether the dynamic parameter works with a stored procedure or not.
    When I added one table and tried to fetch records using dynamic parameters, I was not able to find the dynamic parameter option when I refreshed my report.
    One more thing: when I try a static parameter for my report, the option does appear when I refresh the screen.
    Please reply soon, it's urgent.
    Regards
    shashi kant

    Hi Kishore,
    I have tested the issue step by step following your description, and the first issue works well in my local environment. Based on my research, this can be caused by the lookup expression, or it may indeed return the Male value based on the logic. If you use the
    expression below, it will indeed only return the Male record. So please double-check the records in the two datasets and the expression in your environment:
    =lookup(first(Fields!ProgramID.Value,"DataSet1"),Fields!ProgramID.Value,Fields!Gender.Value,"DataSet2")
    As to the second issue, please try to use the following expression:
    =Count(Lookup(fields!ProgramID.value,fields!ProgramID.value,fields!Gender.value,"DataSet2"))
    Besides, if this issue still exists, in order to troubleshoot it more efficiently, could you please send us the .rdl file with all the size properties at the following e-mail address? It would help us do further analysis.
    E-mail: [email protected]
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • What's the best approach for handeling about 1300 connections in Oracle.

    What's the best approach for handling about 1300 connections in Oracle 9i/10g through a Java application?
    1. Using separate schemas for the various types of users. (We can store only the relevant data in a particular schema, so the number of records per table can be reduced by replicating tables, but then we have to maintain all the data in another schema as well. That means we need to update two schemas in a given session, because we maintain a separate schema for one user and another schema for all the data, and then there may be update problems.)
    OR
    2. Using a single schema for all users.
    Note: all users may access the same tables, and there may be many more records than in the previous case.
    Which is the best case?
    Please give your valuable ideas.

    That is true, but I want a solution from you all.
    I want you to tell me how to fix my friend's car.

  • Unable to run SSIS Package using Stored Procedure

    Hi Guys,
            I have created one simple SSIS package (it moves the records of the first table to a second table). It executes fine locally.
    But if I call/execute the .dtsx file using a stored procedure, I get an error, so please let me know the solution.
    My Package Path : D:\Temp\SSIS_TESTING\TESTSSIS\TESTSSIS\Package.dtsx
    Stored Procedure
    ================
    CREATE PROCEDURE SPEXECUTESSISPACKAGE
        @FILEPATH VARCHAR(8000),
        @STATUS   VARCHAR(500) = NULL OUTPUT
    AS
    BEGIN
        DECLARE @SQLQUERY VARCHAR(8000)
        DECLARE @STATUSCODE INT
        -- Build the DTEXEC command line for the package file
        SET @SQLQUERY = 'DTEXEC /FILE "' + @FILEPATH + '"'
        SELECT @SQLQUERY AS 'SSIS FULL PATH'
        -- Run the command through xp_cmdshell; a return code of 0 means success
        EXEC @STATUSCODE = master..xp_cmdshell @SQLQUERY
        SELECT @STATUSCODE AS 'STATUSCODE'
        IF @STATUSCODE <> 0
        BEGIN
            SET @STATUS = 'PACKAGE EXECUTE FAILED'
            PRINT @STATUS
        END
        ELSE
        BEGIN
            SET @STATUS = 'PACKAGE EXECUTE SUCCESS'
            PRINT @STATUS
        END
    END
    GO
    RUN
    ============================================================================
    EXEC SPEXECUTESSISPACKAGE 'D:\Temp\SSIS_TESTING\TESTSSIS\TESTSSIS\Package.dtsx', NULL
    Error
    =============================================================================
    Microsoft (R) SQL Server Execute Package Utility
    Version 9.00.4035.00 for 32-bit
    Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
    NULL
    Started:  5:06:26 AM
    Error: 2014-11-14 05:06:26.07
       Code: 0xC0011007
       Source: {1244CD18-F96A-4DAD-8FC2-35F794015CA1}
       Description: Unable to load the package as XML because of package does not have a valid XML format. A specific XML parser error will be posted.
    End Error
    Error: 2014-11-14 05:06:26.07
       Code: 0xC0011002
       Source: {1244CD18-F96A-4DAD-8FC2-35F794015CA1}
       Description: Failed to open package file "D:\Temp\SSIS_TESTING\TESTSSIS\TESTSSIS\Package.dtsx" due to error 0x80070003 "The system cannot find the path specified.".  This happens when loading a package and the file cannot
    be opened or loaded correctly
    into the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format.
    End Error
    Could not load package "D:\Temp\SSIS_TESTING\TESTSSIS\TESTSSIS\Package.dtsx" because of error 0xC0011002.
    Description: Failed to open package file "D:\Temp\SSIS_TESTING\TESTSSIS\TESTSSIS\Package.dtsx" due to error 0x80070003 "The system cannot find the path specified.".  This happens when loading a package and the file cannot be opened
    or loaded correctly int
    o the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format.
    Source: {1244CD18-F96A-4DAD-8FC2-35F794015CA1}
    Started:  5:06:26 AM
    Finished: 5:06:26 AM
    Elapsed:  0.047 seconds
    NULL
    I'm getting Execute SSIS Package Status: 4
    ===================================
    4

    Two things to check:
    1. The account executing the package has access to the path. If the package is stored on a different server, you need to pass the UNC path (i.e. like \\machinename\...).
    2. Make sure the version of the SSIS service is the same on both servers, i.e. where the package is created/stored and where it is executed. A lower-version SSIS service will not be able to load and execute a higher-version package, and it will throw similar
    error messages to the ones you posted above.
    Visakh

  • Best approach for IDOC - JDBC scenario

    Hi,
    In my scenario I am creating a sales order (ORDERS04) in the R/3 system, and it needs to be replicated in a SQL Server system. I am sending the order to XI as an IDoc and want to use JDBC for sending the data to SQL Server. I need to insert data into two tables (header and details). Is it possible without BPM? Or what is the best approach for this?
    Thanks,
    Sri.

    Yes, this is possible without BPM.
    Just create the corresponding data type for the insertion.
    If the records to be inserted are different, then there will be 2 different data types (one for the header and one for the detail).
    Do a multi-mapping, where your source is mapped into the header and details data types, and then send it using the JDBC adapter.
    For the structure of your data type for the insertion, just check this link:
    http://help.sap.com/saphelp_nw04/helpdata/en/7e/5df96381ec72468a00815dd80f8b63/content.htm
    To access any database from XI, you will have to install the corresponding driver on your XI server.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3867a582-0401-0010-6cbf-9644e49f1a10
    Regards,
    Bhavesh
