Possible to test Dev Transaction with Production data in eCatt?

Hi,
I am new to eCATT, but I got the impression that it is possible to test your transactions in dev with the data from production, presumably through an RFC connection.
Is this at all possible?
Thanks,
Will

Hi Will,
You are right about RFC, but the direction of thinking is the other way around.
eCATT lets you develop test scripts in one central test system and execute them in and against other systems.
That means you can write your scripts in a Solution Manager system and then execute them via RFC connections against an ERP development
or ERP test system, with that system's specific ERP data.
Please take a look at the eCATT [Online Help|http://help.sap.com/saphelp_nw70ehp1/helpdata/en/d0/e3933f09a5fb47e10000000a114084/frameset.htm]
I hope that helps you understand distributed testing better.
Best regards,
Christoph

Similar Messages

  • Warehouse stock with production date of batch

    Hello,
    Can somebody tell me of a report where the user can see warehouse stocks along with the production date of each batch? The user wants to follow FIFO for inventory transactions, i.e. transfer the oldest produced batch first.
    Please suggest.

    Hi Sudhir,
    Thanks for your response. LX02 shows only the GR date for a quant; for certain quants the GR date is updated to today's date if a stock transaction is done for that quant.
    The user wants a report so that he can move the oldest stock based on production date. For example, a quant has a GR date of 24th Feb 2011, but its production date is 24th Feb 2010, so ideally this stock should be moved first under FIFO based on production date.
    To summarise, I require a standard SAP report showing the stock as in LX02 as well as the production date/receiving date of the batch in each quant.

  • Making Sandbox server with production data

    Hi experts ,
    We are going to build a sandbox server. We are presently using Windows Server 2003 R2, SQL Server 2005 and SAP ECC 6.0,
    and so far we have only installed the operating system. Please guide me: should we go with the system copy method, or with a
    normal installation followed by a full backup/restore? Please kindly guide me on how to build the sandbox server with
    production data.
    Best Regards
    SURESH .M

    Hi Suresh,
    Refer to the SAP notes below, and check out the links within them for the necessary documentation:
    683447 - SAP Tools for MS SQL Server
    1152240 - Setting Up Microsoft SQL Server 2008 (R2) (refer to point II in this note)
    151603 - Copying an SQL Server database
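    If you end up going the full backup/restore route, the database-level part boils down to a BACKUP DATABASE on the production host and a RESTORE DATABASE ... WITH MOVE on the sandbox host; the SAP notes above cover the SAP-specific system copy steps around it. Purely as a hedged sketch (server names, database name, logical file names and paths are placeholders, not taken from your system), the two statements could even be driven from a small C# program:

    using System;
    using System.Data.SqlClient;

    class SandboxRefresh
    {
        static void Main()
        {
            // Placeholder connection strings - adjust server names and credentials.
            const string prodConn = "Data Source=PRODHOST;Initial Catalog=master;Integrated Security=SSPI";
            const string sandConn = "Data Source=SANDHOST;Initial Catalog=master;Integrated Security=SSPI";

            // 1) Full backup of the production database to a shared location.
            Run(prodConn, @"BACKUP DATABASE [PRD] TO DISK = N'\\share\backup\PRD.bak' WITH INIT");

            // 2) Restore it on the sandbox instance, relocating the data and log files.
            Run(sandConn, @"RESTORE DATABASE [PRD] FROM DISK = N'\\share\backup\PRD.bak'
                            WITH MOVE 'PRDDATA1' TO N'E:\PRDDATA1\PRDDATA1.mdf',
                                 MOVE 'PRDLOG1'  TO N'F:\PRDLOG1\PRDLOG1.ldf',
                                 REPLACE, RECOVERY");
        }

        static void Run(string connectionString, string sql)
        {
            using (SqlConnection cn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand(sql, cn))
            {
                cmd.CommandTimeout = 0;   // backups and restores can run for a long time
                cn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }

    The same two statements can of course be run directly in SQL Server Management Studio or sqlcmd.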
    Hope this helps.
    Regards,
    Deepak Kori

  • Creation of business transactions with product information from a campaign

    Hi,
    I need to create business transactions from a campaign execution, but these business transactions must be created with a specific product and a quantity of 1.
    Is this possible?
    I tried assigning products on the Product tab of the campaign, and assigning a document template in the campaign, but nothing worked.
    Thanks a lot.
    Susana Messias

    Hi
    You can create business transactions from executed campaigns using the following settings.
    In order to create the transaction after executing the campaign, you need to maintain the following customizing settings:
    Define Communication Medium
    Select the channel for which you want to create the transaction automatically under the communication medium, and under Define Communication Method assign the transaction type to the communication method (for example, assign the transaction type to internet mail SMTP). You can then use an appropriate BAdI to assign the product with the quantity you desire.
    reward points if helpful
    regards
    Dinaker vikas

  • Unable to use transactions with System.Data.OracleClient data provider

    I am using VS2008, System.Data.OracleClient, Oracle 10g, and ODAC 10.2.0.20. I haven't been able to get transactions to work. When I use 'connection.BeginTransaction()', the rollback doesn't work. When I use TransactionScope, the output parameter is always DBNull. Any ideas/comments?
    Here's the sample code:
    // #define ENABLE_TRANSACTION // failure is 'rollback not working'
    #define ENABLE_TRANSACTION_SCOPE // failure is 'no output parameter value'
    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Text;
    using System.Data.OracleClient;
    #if ENABLE_TRANSACTION_SCOPE
    using System.Transactions;
    #endif

    namespace TestOracleTransaction
    {
        class Program
        {
            static void Main(string[] args)
            {
    #if ENABLE_TRANSACTION_SCOPE
                using (TransactionScope scope = new TransactionScope())
    #endif
                {
                    string connectionString = "Data Source=ORADEV;User ID=user;Password=pwd";
                    using (OracleConnection connection = new OracleConnection(connectionString))
                    {
                        try
                        {
                            connection.Open();
    #if ENABLE_TRANSACTION
                            using (OracleTransaction transaction = connection.BeginTransaction())
    #endif
                            {
                                try
                                {
    #if ENABLE_TRANSACTION_SCOPE
                                    if (Transaction.Current == null)
                                        throw new ArgumentException("no ambient transaction found for OracleClient");
    #endif
                                    // Call the stored procedure with three input parameters
                                    // and one output parameter.
                                    OracleCommand command = connection.CreateCommand();
    #if ENABLE_TRANSACTION
                                    command.Transaction = transaction;
    #endif
                                    command.CommandType = CommandType.StoredProcedure;
                                    command.CommandText = "TIS.P_TIS_GATEWAY_INFO_ADD";

                                    OracleParameter param = command.CreateParameter();
                                    param.ParameterName = "p_gateway_id";
                                    param.Direction = ParameterDirection.Input;
                                    param.DbType = DbType.Int64;
                                    param.Value = 18;
                                    command.Parameters.Add(param);

                                    param = command.CreateParameter();
                                    param.ParameterName = "p_info_id";
                                    param.Direction = ParameterDirection.Input;
                                    param.DbType = DbType.Int64;
                                    param.Value = 79;
                                    command.Parameters.Add(param);

                                    param = command.CreateParameter();
                                    param.ParameterName = "p_user";
                                    param.Direction = ParameterDirection.Input;
                                    param.DbType = DbType.String;
                                    param.Value = "spms";
                                    command.Parameters.Add(param);

                                    param = command.CreateParameter();
                                    param.ParameterName = "p_gateway_info_id";
                                    param.Direction = ParameterDirection.Output;
                                    param.DbType = DbType.Int64;
                                    param.Size = sizeof(Int64);
                                    command.Parameters.Add(param);

                                    int count = command.ExecuteNonQuery();

                                    // Read back the output parameter.
                                    object value = command.Parameters["p_gateway_info_id"].Value;
                                    long id = (value == DBNull.Value) ? -1 : Convert.ToInt64(value);
                                    if (id < 0)
                                    {
                                        // FAILURE - no output parameter value when TransactionScope enabled
                                        throw new ArgumentException("no return value");
                                    }
    #if ENABLE_TRANSACTION
                                    // FAILURE - rollback doesn't work when Transaction enabled
                                    transaction.Rollback();
    #endif
    #if ENABLE_TRANSACTION_SCOPE
                                    scope.Complete();
    #endif
                                }
                                catch (Exception ex)
                                {
                                    System.Console.WriteLine("ERROR: " + ex.Message);
    #if ENABLE_TRANSACTION
                                    transaction.Rollback();
    #endif
                                }
                            }
                        }
                        finally
                        {
                            if (connection.State == ConnectionState.Open)
                                connection.Close();
                        }
                    }
                }
            }
        }
    }

    Hi,
    First, this is not the place for System.Data.OracleClient questions; this is the Oracle Data Provider for .NET forum. Having said that, I went ahead and tested your code with some slight modifications, because you did not provide the stored procedure information. I am assuming your stored procedure is doing some sort of DML, since you are using transactions and attempting to commit and roll back.
    I tested the following with both TransactionScope and a local transaction object, and it worked fine with System.Data.OracleClient. I have provided the create table statement and stored procedure I used.
    Observations
    ========
    When using transaction scope, a distributed transaction was executed, and the data was inserted and returned in the output variable.
    From console
    p1 value is Hello World
    From SQL Plus
    SQL> select * from foo;
    C1
    Hello World
    When using a local transaction, the DML was not inserted when calling rollback and when I changed it to commit, the row was inserted successfully.
    Maybe you can test the simple foo example below to see if it works for you. Maybe there is something going on in your SP that is causing your specific observations.
    The code as posted here uses a local transaction and calls transaction.Commit(); the rollback is commented out. But I tested all scenarios and they worked as expected.
    HTH
    Jenny
    #define ENABLE_TRANSACTION // failure is 'rollback not working'
    //#define ENABLE_TRANSACTION_SCOPE // failure is 'no output parameter value'
    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Text;
    using System.Data.OracleClient;
    #if ENABLE_TRANSACTION_SCOPE
    using System.Transactions;
    #endif

    /* SQL used for the test (run separately, e.g. in SQL*Plus):
       create table foo (c1 varchar2(50));
       create or replace procedure getstr (p1 out varchar2) as
       begin
         insert into foo(c1) values ('Hello World') returning c1 into p1;
       end;
    */

    namespace TestOracleTransaction
    {
        class Program
        {
            static void Main(string[] args)
            {
    #if ENABLE_TRANSACTION_SCOPE
                using (TransactionScope scope = new TransactionScope())
    #endif
                {
                    string connectionString = "Data Source=orcl;User ID=scott;Password=tiger";
                    using (OracleConnection connection = new OracleConnection(connectionString))
                    {
                        try
                        {
                            connection.Open();
    #if ENABLE_TRANSACTION
                            using (OracleTransaction transaction = connection.BeginTransaction())
    #endif
                            {
                                try
                                {
    #if ENABLE_TRANSACTION_SCOPE
                                    if (Transaction.Current == null)
                                        throw new ArgumentException("no ambient transaction found for OracleClient");
    #endif
                                    // Call the test procedure, which inserts a row into FOO and
                                    // returns the inserted value in the output parameter.
                                    OracleCommand command = connection.CreateCommand();
    #if ENABLE_TRANSACTION
                                    command.Transaction = transaction;
    #endif
                                    command.CommandType = CommandType.StoredProcedure;
                                    command.CommandText = "SCOTT.GETSTR";

                                    OracleParameter param = command.CreateParameter();
                                    param.ParameterName = "p1";
                                    param.Direction = ParameterDirection.Output;
                                    param.DbType = DbType.AnsiString;
                                    param.Size = 20;
                                    command.Parameters.Add(param);

                                    int count = command.ExecuteNonQuery();

                                    object value = command.Parameters["p1"].Value;
                                    Console.WriteLine("p1 value is {0}", value.ToString());
    #if ENABLE_TRANSACTION
                                    // FAILURE - rollback doesn't work when Transaction enabled
                                    transaction.Commit();
                                    //transaction.Rollback();
    #endif
    #if ENABLE_TRANSACTION_SCOPE
                                    scope.Complete();
    #endif
                                }
                                catch (Exception ex)
                                {
                                    System.Console.WriteLine("ERROR: " + ex.Message);
    #if ENABLE_TRANSACTION
                                    transaction.Rollback();
    #endif
                                }
                            }
                        }
                        finally
                        {
                            if (connection.State == ConnectionState.Open)
                                connection.Close();
                        }
                    }
                }
            }
        }
    }

  • Is it possible to create an application in APEX with semantic data?

    I'm sorry for this new thread,
    but I was looking for information about APEX and semantic data (RDF) and could not find an answer to my question:
    Is it possible to create an application in APEX for RDF data (I mean semantic data)? I have Oracle DB 10.2g with the semantic RDF framework. I created a test RDF database, and now I need to know whether it is possible to create a simple application on this RDF data in APEX.
    I also found this topic: Semantics at Oracle Open World, where someone wrote that they planned to "demo APEX with RDF data at the Oracle semantic technology" event.
    Does anyone know if this is possible? If it is, how? And is this demo semantic app ready?
    Thank you for answer.
    Best Regards
    Vladimír Vávra
    Edited by: user8321295 on 14.2.2010 2:32 - Edit some Grammar mistakes.

    Hi Sarbjeet,
    Both are giving the same result.
    In the view I have a button, and in the button action I have this code:
    DATA LR_NODE TYPE REF TO if_wd_context_node.
    DATA ls_data TYPE wd_this->element_node.
    data lt_data type wd_this->elements_node.
    lr_node = wd_context->get_child_node( 'NODE' ).
    ls_data-button1 = 'Add '.
    ls_data-name = 'Name1'.
    APPEND LS_DATA TO LT_dATA.
    ls_data-button1 = 'Add Sub'.
    ls_data-name = 'Name2'.
    APPEND LS_DATA TO LT_dATA.
    LR_NODE->BIND_TABLE( LT_DATA ).
    This creates two records. On the button click of ADD, the code is like this:
    DATA LR_NODE TYPE REF TO if_wd_context_node.
    DATA ls_data TYPE wd_this->element_node.
    data lt_data type wd_this->elements_node.
    lr_node = wd_context->get_child_node( 'NODE' ).
    ls_data-button1 = 'SUB'.
    ls_data-name = 'name3'.
    APPEND LS_DATA TO LT_dATA.
    LR_NODE->BIND_TABLE( LT_DATA ).
    LR_NODE->BIND_TABLE(
    NEW_ITEMS = LT_DATA
    SET_INITIAL_ELEMENTS = ABAP_TRUE ).
    This code adds one record, but it replaces the old records; I want this record to be the third one. How can I do this?
    My output is like this:
    After the first button action there are 2 rows, like
    ADD(button) - Name(text view) - value(input).
    SUB(button) - Name1(tv) - Value(ip).
    After clicking the ADD button in the first row I want output like this:
    ADD(button) - Name(text view) - value(input).
    SUB(button) - Name1(tv) - Value(ip).
    ADD1(button) - Name2(TV) - value(ip).
    but I am getting only ADD1(button) - Name2(TV) - value(ip).
    Thanks,
    Kris.

  • Possibility of combining exception broadcasting with master data bursting

    Hi Experts,
    We have a scenario where it is required to combine exception broadcasting with master data bursting.
    As per my understanding if you choose broadcast method as 'exception broadcasting', there is no option for choosing 'master data bursting'.
    Am I missing something, or is there a workaround or some method to achieve this?
    Thanks
    Rajeev

    Hi,
        62 Scheduling: Master data inconsistent Check master data !
    Remove "In House Processing Time" from the Material Master MRP2
    view. Ensure that the Production Version & Routing are correct. Run
    MPS/MRP manually once corrected.
    Regards
    Sunil

  • Error in Distributed transactions with tx data source

    Hi, I need to write into two DB2 databases using two container-managed entity beans
    from a container-managed session bean. I use a JDBC 3.0 compatible JDBC driver
    and I have configured a tx data source. When I run the session bean it gives me the
    following exception:
    java.rmi.RemoteException: EJB Exception: ; nested exception is: javax.ejb.EJBException:
    EJB Exception:; nested exception is: java.sql.SQLException: Connection has already
    been created in this tx context for pool named DB2ConnectionPool2. Illegal attempt
    to create connection from another pool: DB2ConnectionPool1
    Thank you, Lahiru

    For a connection pool with XA drivers, create a tx data source and select 'Honor
    Global Transactions'. For a connection pool with non-XA drivers, create a tx data source,
    select 'Honor Global Transactions', and select 'Enable Two-phase commit'.
              "Lahiru Karunatilake" <[email protected]> wrote:
              >
              >Hi, I need to write into two DB2 databases using two container managed
              >entity beans
              >from a container managed session bean. I use a JDBC 3.0 compatible JDBC
              >driver
              >and I have configured tx data source. When I run the session bean it
              >gives me
              >following exception.
              >
              >java.rmi.RemoteException: EJB Exception: ; nested exception is: javax.ejb.EJBException:
              >EJB Exception:; nested exception is: java.sql.SQLException: Connection
              >has already
              >been created in this tx context for pool named DB2ConnectionPool2. Illegal
              >attempt
              >to create connection from another pool: DB2ConnectionPool1
              >
              >thank you Lahiru
              >
              

  • How to handle refreshing TEST schema with PRODUCTION schema ?

    - we have database 10g ( standard edition), database name : ABC
    - Schema name: ABC.PRODUCTION (which is our production schema)
    - Schema name: ABC.TEST (which is our testing schema, where developers work)
    Both the production & Test schemas exist in the same database.
    Now, once a week, I want to refresh the TEST schema with PRODUCTION data.
    Here is what I have been doing all these years:
    => Take a logical backup (EXPDP) of PRODUCTION schema (prod.dmp)
    => Drop user TEST cascade ( i don't need a backup of this TEST schema)
    => Create user TEST
    => Import PROD.DMP data into TEST schema
    All the above 4 steps are being done manually.
    Questions:
    ======
    1. Is there any easier way of doing the above steps using some tool?
    2. Does Oracle Enterprise Manager, which comes free with the database installation (http://localhost:118/em),
    have any utility or tool to do this job?
    3. I want everything to be refreshed (all database objects, including data).
    Thanks
    John P
    Edited by: johnpau2013 on Feb 23, 2011 4:32 AM

    This is crazy. One inadvertent typo and you'll overwrite your Production schema. Plus, what happens if a developer 'tests' against the test schema and slows the Production database to a crawl?
    I presume you know all about this, though, and can't make the case to management. I hope it's not a business-critical Production database!
    Anyway, your method is decent. I would advise against doing it automatically, to be honest, especially when your system is so precariously set up. But if you insist, you could encapsulate all the steps in a script and use crontab to automate the process. I, personally, wouldn't use DBMS_SCHEDULER, as you have to be careful with priorities and workload sometimes (at least in my experience), and you might end up having your export/import clash with other jobs in the system if you don't pay attention.
    Here are the steps I would use (a rough sketch of the export/import commands follows the list):
    Create a 'create user' script for the test schema based on dynamic SQL. That way you can be sure you have all the grants necessary for the user, in case things change.
    Drop the test user (use EXTRA caution and be defensive when coding this part!)
    Export the schema using FLASHBACK_SCN to ensure you have a consistent export
    Run your 'create user' script to create the test user
    Import the schema with a REMAP_SCHEMA option (use EXTREME caution with this!!!!)
    Compile invalid objects
    Compare objects and exclude any recycle_bin objects. Send an email alert if the object counts are different.
    Compare invalid objects. Any objects which aren't invalid in Production should be flagged up if they're invalid in test.
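    To make the export and import steps concrete, here is a minimal, hedged sketch of the Data Pump calls, wrapped for illustration in a small C# console program (the credentials, directory object, dump file names and SCN value are placeholders, not taken from your system; a plain shell script on cron would do the same job):

    using System;
    using System.Diagnostics;

    class TestSchemaRefresh
    {
        static void Main()
        {
            // Placeholder SCN; capture it just before the export, e.g. with
            // SELECT current_scn FROM v$database;
            string scn = "1234567";

            // 1) Consistent export of the PRODUCTION schema.
            Run("expdp", "system/***@ABC schemas=PRODUCTION directory=DATA_PUMP_DIR " +
                         "dumpfile=prod.dmp logfile=prod_exp.log flashback_scn=" + scn);

            // 2) Drop and re-create the TEST user here, via the generated 'create user' script.

            // 3) Import into TEST, remapping the schema name.
            Run("impdp", "system/***@ABC directory=DATA_PUMP_DIR dumpfile=prod.dmp " +
                         "logfile=test_imp.log remap_schema=PRODUCTION:TEST");
        }

        static void Run(string exe, string args)
        {
            // Launch the Data Pump client and wait for it to finish.
            ProcessStartInfo psi = new ProcessStartInfo(exe, args);
            psi.UseShellExecute = false;
            using (Process p = Process.Start(psi))
            {
                p.WaitForExit();
                if (p.ExitCode != 0)
                    throw new InvalidOperationException(exe + " failed with exit code " + p.ExitCode);
            }
        }
    }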
    Again, it's absolute insanity to have a test schema in a Production database. You absolutely must insist on addressing that with management.
    Mark

  • 2 Clients with Production Role in Production server , possible ?

    My company has implemented a 2-system SAP landscape with one development and one production server. So far, we have one client in the production server configured with the production role and one more configured with the customizing role.
    Is it possible to change the client with the customizing role in the production server to the production role? I tried, but the system always tells me that a client with the production role already exists.
    Kindly advise .
    Thanks.
    Leon

    Hi Rohit,
    Once again, thanks for your input. But I really can't try what you are suggesting here, as my production system is running 24 hours per day and I can't take the risk of just trying to open and close the existing client with the production role. Based on your first reply, you said that it is possible for a production system to have 2 clients with the production role. If that is so, there should be settings that I could configure to achieve this, rather than trying to reconfigure the existing client with the production role.
    Thanks.
    Leon
    Edited by: Leon Wong on Apr 27, 2009 11:49 AM

  • Service Manager 2010 lab environment syncing with production database

    I am trying to setup a lab environment for SCSM 2010 to test the upgrade to SCSM 2012. I have been following the instructions found here:
    http://technet.microsoft.com/en-us/library/hh914226.aspx
    I have successfully made it to step 16. However, when I open the SCSM console in the lab environment it still shows the Data Warehouse and Reporting buttons. When I create a new ticket in either the lab or production environment it shows up in both consoles,
    even though the production console is connected to the production server and the lab console is connected to the lab server.
    Any ideas on why the lab environment is still syncing with the production server?
    Thanks

    Thread of the Necro-Dancer:
    Regardless, the step Katie was apparently implying (but which seems to be missing from her description) is the bit where you back up the database and restore it to a separate instance. After this is done, you can install a new management server targeting this new instance, then promote the new management server to the workflow server, and run the isolated database using the new management server.
    I would recommend, however, that you use the supported method of producing an upgrade lab with production data, which is very similar to the method Katie implied, includes directions covering all of the steps involved, and allows you to periodically restore production data to the lab database with minimal overhead.
    I didn't notice the date before now. I just look at the latest unanswered posts, so not sure how I got into this one. But technet has been acting weird lately. I get an internal server error far too often.
    http://codebeaver.blogspot.dk/

  • Setting up a Service Manager Lab environment with prod. data

    Hi
    I'm about to set up a lab environment with production data. I'm following the guide here: https://technet.microsoft.com/en-us/library/jj900180.aspx
    I have a question regarding the Management Group. The lab environment will have the same management group name. In the guide nothing is mentioned about changing the name of the management group.
    Will this be a problem when I fire up the lab environment?
    /Regards

    Well, the guide https://technet.microsoft.com/en-us/library/jj900180.aspx states that you should
    install a secondary management server and stop the services, back up the prod DB, restore the prod DB on the secondary server (lab server), and change a bunch of rows in different tables.
    So naturally it will have the same management group name.
    Am I missing something?

  • Using Fields in Reports with stored Data

    Hello,
    I would like to use fields in reports with saved data in them.
    Is this possible?
    Example:
    An invoice report with saved data in a file. Now I would like to get the field Invoice_ID to use in VB.NET.
    Any idea?
    VS2008 with CR reports in 8.5
    Thanks

    You wrote:
    Example:
    A Invoicereport with saved Data in a File. Now i like to get the Field Invoice_ID to use this in VB.net.
    You can export a report with saved data and then change the filtering without hitting the DB again. If you are asking about using a flat file as the data source, there are multiple ways to do this: define an ODBC data source to the file and specify the structure, use ADO.NET and a record set, or use XML as the data source. There are lots of ways in newer versions of CR; 8.5 was limited, so the ODBC approach is your best option.
    Or, if you are asking how to get the data from the report, you can use ReadRecords. I am not sure if 8.5 has this functionality, though; it's way past its end of life so I can't check.
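    As a rough illustration of the ODBC/ADO.NET route only (how you then hand the data to the report depends on your CR version, as noted above, and the same ADO.NET classes are available from VB.NET), here is a minimal C# sketch that loads a flat file into a DataSet; the driver name is the standard Microsoft Text Driver, while the folder, file name and Invoice_ID column are placeholders:

    using System;
    using System.Data;
    using System.Data.Odbc;

    class FlatFileSource
    {
        static void Main()
        {
            // ODBC text driver reading CSV/TXT files from a folder (placeholder path).
            string connStr = @"Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=C:\data;";

            using (OdbcConnection cn = new OdbcConnection(connStr))
            using (OdbcDataAdapter da = new OdbcDataAdapter("SELECT * FROM [invoices.csv]", cn))
            {
                // Pull the file contents into an in-memory table.
                DataSet ds = new DataSet();
                da.Fill(ds, "Invoices");

                // The Invoice_ID column is now available to the application, and the
                // DataSet could be handed to a report as its data source.
                foreach (DataRow row in ds.Tables["Invoices"].Rows)
                    Console.WriteLine(row["Invoice_ID"]);
            }
        }
    }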
    You wrote:
    VS2008 with CR_reports in 8.5
    Not supported: 8.5 has no .NET assemblies and no RDC; the only option would be to use RAS to make changes or to read records, so you need to upgrade.

  • MRP run - misalignment for planned order between Date and Production Date

    Hi All,
    We've just integrated SAP for a new plant.
    The MRP run is based on the routing lead time for make materials instead of the material master lead time, for capacity assessment purposes.
    When requirement dates lie in the past, MRP performs forward scheduling starting from the current date and properly defines the production end date according to the routing.
    Checking MD04 for the proposed planned order, the basic date is the same as the current date (= same as the requirement date) instead of being the same as the production date.
    It works as if it still checks the material master lead time instead of the routing; in fact, if I update the lead time in the material master, MD04 provides the right date.
    For future requirements, MRP works properly, with the basic dates (both start and end date) aligned with the production dates of the planned order.
    How can I achieve the alignment without updating the in-house production lead time?
    Thanks,
    D

    Hi Caetano,
    We have already done this configuration in OPU5, but it doesn't provide any benefit.
    Below is how we configured it:
    Scheduling level: Via detailed scheduling
    Adjust dates: adjust basic dates, adjust dep. reqmts to operation date
    For capacity scheduling: Always basic dates, dep. reqmts to operation dates
    Scheduling type: Backwards.
    Regards,
    D.

  • Production date in inbound delivery in batch split

    Hi,
    I have an issue with the production date update in an inbound delivery.
    I have created an inbound delivery for 10 PC. The inbound delivery has only one line item and there are no batch details on it. As soon as the inbound delivery is created, a message is sent to the 3PL. Once the 3PL receives the goods from the vendor, he sends us a message with the batch details. Say the 3PL has received multiple batches from the vendor; he then sends us back an SHPCON message with multiple batch details, that is, batch A with qty 6 and batch B with qty 4. He also sends us the production date for each batch via the SHPCON IDoc, say batch A has production date 10.02.2011 and batch B has production date 21.03.2011.
    The IDoc is created successfully in our system and the inbound delivery is also updated correctly with the batch split details and the correct quantities. That is, line item 10 has batch split line items 900001 with batch A and qty 6, and 900002 with batch B and qty 4. The issue is only with the update of the production date for each of the above batches. Both batch A and batch B are updated with a production date of 21.03.2011; instead of batch A being updated with its production date of 10.02.2011, it is updated with batch B's production date. I checked the IDoc: in the IDoc each batch has the correct production date; only when the inbound delivery is updated is it wrong. Has anyone come across a similar situation? How did you manage it?
    Ram

    I found the issue, and the appropriate changes have been made in the message handling.
