Dev to test: SAP SWCV

Hello, I have the following scenario:
It is an SAP SWCV. Everything was already imported into the test box when it was installed; the only extra thing I will be doing is importing the IDocs. We have the same scenario in dev as well. Now, even after configuration, when IDocs are generated I see messages only in the dev box. What might be the reason?
Krishna

Hi Krishna,
Check the business system in the test environment. Is it pointing to the proper system? Also check the Adapter Specific Properties of the business system (open the business system, then Menu -> Services -> Adapter Specific Properties).
Also make sure that all configuration is done correctly by checking the business systems and the adapter configuration in the Integration Directory.
Hope this helps,
Regards,
Moorthy

Similar Messages

  • Install a Test/Dev instance for SAP MII.

    Hi,
    I want to install a test/development instance of SAP MII on my laptop [2 GB RAM].
    Per my understanding, for this I would need to set up an SAP NetWeaver CE instance and deploy the SAP MII 12.1 component.
    I also want to do some custom development and integration with SAP through MII to SAP ME.
    As per the Master Guide SAP MII 12.1.pdf, we need to install SAP NetWeaver CE 7.1 EHP1 SP03.
    Since it is a test/development instance, can I install [SAP NetWeaver Composition Environment 7.2 Developer Edition|http://www.sdn.sap.com/irj/scn/downloads?rid=/library/uuid/a0a6bd7b-3dfc-2c10-eb95-aae0f777d4ab] and deploy MII?
    Or do I need some specific version of NetWeaver CE?
    Please let me know the steps to install a test/dev environment for SAP ME/MII integration.
    Please advise.
    Thanks in advance.

    MII 12.1 has not been validated on NWCE 7.2.  And MII 12.2 is validated on NW 7.3, skipping NWCE 7.2.  It may install fine, but you will most likely run into problems executing some of the functions and features of MII.
    Regards,
    Mike

  • One machine for dev and test

    Hello,
    I have one machine for dev and test. I know that it is possible to do this with two instances; SAP recommends separating dev and test onto separate machines, but it is possible. Does anyone have experience with that? If I don't want two instances, is it possible to have one instance and separate only the dev and test systems in the SLD? Do I have to copy the business systems, change the servers, and make the configurations in the Integration Directory?
    Thanks in advance...
    Frank Schmitt

    Hi Frank,
    One reason for having two systems is that you won't be able to import the Directory content into your TEST system easily, because both R/3 systems (dev and test) will have the same Integration Server and you won't be able to add them to two different transport groups.
    This means you will have to create almost EVERYTHING in the Directory twice, at least without doing some tricks.
    Make a developer's life easier and create two servers; it may cost less than using developers to create many things twice.
    Regards,
    michal

  • TOBJ entries missing from upgraded dev and test systems

    Hi,
    We are in the process of an ECC upgrade from 4.7 to ECC 6.0 and have upgraded our dev and test systems. We are missing some SAP-defined object entries in the TOBJ table in dev and test. Because of this, some roles get transport errors if they contain those missing objects. We tried to transport the missing entries, but we don't see any provision for that. Does anyone have an idea how to move those missing entries from dev to test? We are not sure whether we will get the same error in production. This table is updated during the upgrade, and maybe we missed something in test, even though we followed the same upgrade process for both systems.
    Thank you
    Venkat

    Possibly you are no longer licensed for the components from whose concepts these objects are called.
    In that case the ability to start the component objects should be removed as well.
    Of course if some Z-programs checked these objects or SAP's own "package concept" extended syntax checks did not respect them then you might have problems.
    This should normally only apply to manual authorization instances, as if you had upgraded the roles and regenerated them, then you would not have this problem (in the roles).
    It tells us something about how you or someone else without any training have built these roles (or subsequently mucked them up...)...
    When reading stuff like this, I am always of two minds...
    1) SAP should make the concepts simpler and more intuitive to use without "cowboy" activities allowed by users with SAP_ALL etc.
    2) Customers should bleed for their own sins for not training people appropriately (here, faking CVs is also a major pest!).
    It is always a bit of a cat & mouse game and ends up in forums such as SDN in the long run.
    Some of it might also have become "available" to be used by SAP (I believe it was the 6.20 upgrade which installed everything...).
    A real bugger. I would accept it and test as best you can. You can also compare where-used lists and code scans before and after the upgrade to see whether they are used by foreign repository objects.
    The scan is more reliable here, as sometimes the object name itself is a variable from a data declaration or condition which you cannot see in the "procedural" code.
    Cheers,
    Julius

  • Error while testing an SAP web service built from an RFC-enabled function module

    Hi All,
    I am getting the error "You are not authorized to view the requested resource" while testing an SAP web service using transaction WSADMIN or SOAMANAGER. I have successfully tested the WSDL and completed the required configuration and settings.
    Can any help me with this ?

    Hi Friends ,
    My problem got solved: I added the SAP_J2EE_ADMIN role to my profile and it worked.
    enjoying...
    Edited by: Sridhar Maheswar on Jun 20, 2008 1:21 PM

  • Best practice on the 'from dev to test' move

    Hello.
    My repository: 10.2.0.1.0
    My client: 10.2.1.31
    I am writing to ask what would be the best-practice and most common-sense way to move OWB objects from the dev to the QA/test environment.
    I have read a number of recommendations on this forum and in other Oracle docs, and it seems a somewhat tedious exercise...
    At the moment I simply copy and paste (yes, not very professional, but it works a treat!) and then just re-sync the tables to point to the correct location on some of my smaller projects.
    Now I have a huge project with hundreds of maps with different source locations etc.
    I want to move it into test.
    My test environment is where we test the ETL process before implementing it in live, as opposed to UAT testing.
    I imported the tables from test into OWB; now I want to move my maps from dev to test, and this is where my HOW TO question comes in.
    I have different runtime repositories on test, as Oracle recommends (the same names apply to the dev, test and live repositories for consistency). Importing the maps from a dev export into test doesn't really work, and I don't really want to start tweaking export files.
    For some reason the import only imports them back into the project the export was taken from... (which is about as useful as a smack on the head, in my humble opinion).
    Copy-and-paste then re-sync all the tables would be madness, misery and pain all in one!
    So what do I need to do?
    1) OK, I imported all the tables and views from the test environment into OWB.
    2) How do I move my maps from dev to test?
    3) Even if I copy them over, would I honestly have to re-sync the tables in every single map (I am already crying at the thought of it)?!
    It seems a little tedious to me.
    I can imagine there is no silver bullet and everyone has different ideas, but please share your experience of how you would do it.
    Here is something from the user guide, and no matter how many times I read it, I just don't get how to relate it to what I need to achieve.
    The following quote is from "OWB User Guide", Chapter 3.
    "Each location defined within a project can be registered separately within each Runtime Repository, and each registration can reference different physical information. Using this approach, you can design and configure a target system one time, and deploy it many times with different physical characteristics. This is useful if you need to create multiple versions of the same system such as development, test, and production."
    As I said, I have all my tables imported from the DB into OWB; now how do I make my maps appear in the repository on test? I can see the relevance of locations for deploying maps into the test runtime repository, but before that I somehow have to make the maps appear in my test runtime repository in Design and make sure they reference the correct tables etc.
    Any help would be greatly appreciated.
    Kind Regards
    Vix

    Hello Oleg.
    Thank you very much for such a detailed and very helpful reply.
    You are correct - I have my Design Centre and within it two projects - dev and test.
    Dev has all the locations pointing to the development DB, and it has its own runtime repository/control centre configured.
    Test has all the locations pointing to the test DB, and it also has its own runtime repository/control centre configured.
    I have one design centre and two runtime environments.
    Both dev and test have identical tables etc. I moved the logic over from dev to test (all the functions, procedures etc.), and I have also imported the tables and logic from the TEST DB into the test project.
    All I want to do now is move the maps over from DEV to TEST. That is not a problem (copy and paste are helpful), but the copied maps are still pointing to the tables in dev, which means I have to sync them with the test tables - I hope I am making sense here!
    I was hoping there was some clever way of just changing something to effectively tell a table in the map to 'point to the table in this database'. If the map is already configured, the only way to do it is to sync the tables, which lets you select the DB and table you want the table in the map to point to.
    The reason I do not use imp/exp between projects is that it is not really reliable. You then have to jump through hoops ensuring all the constraints etc. are there. It is safer to just import whatever I need from the DB, ensuring all my constraints etc. are there.
    I do regular exports as a means of having a backup copy of the project, but I have never managed to import anything from one project to another (it was easier with OWB 9, where it was possible to amend the .mdl file). It works fine to import back into the project the export was taken from.
    I don't have problems with the locations etc. - it took me hours to set everything up the way I wanted it, and now all the deployments go to the right schemas, DBs etc.
    Is there any other way of re-pointing the tables in a map to another DB? Like with flat files - there is an option to choose the location of the file, so once the location is defined/registered etc. you can choose whichever one is needed from the drop-down on the left of the map.
    I hoped there would be something similar for tables. Like a big bulk option of the 'tick here if you want all tables in the maps to point to identical tables in another DB' type. Guess something like a bulk sync option...
    Oh well, guess I just have to stick with the sync option (sobbing uncontrollably) - and it hasn't stopped raining here for days!
    Once again - thank you very much for all your kind help and advice.
    Kind Regards
    Vix

  • Create dev and test instances of Apex on the same server and database

    I have dev and prod instances of APEX on different servers. I want a test instance on the same server as the dev instance.
    I am using workspace export/import, so all instance workspaces have the same workspace ID. The application ID is the same in each instance, in the same workspace. This allows pages to be exported/imported between the different instances.
    My question is (I am sure it is obvious): can I have more than one instance of APEX on the same database (dev and test) and have each instance with identical workspace IDs, etc.?
    Sam

    Hi Sam,
    But you can have more than one database on the same server.
    What we do is create a separate database for each APEX version we support (we still have a customer using APEX 2.0).
    All the databases are accessed with the same Apache config. All you have to do is define a separate DAD for each database (i.e. for each APEX version), e.g. /pls/apex_dev and /pls/apex_test.
    This way I can run different APEX versions on the same server (see the sketch below).
    Francis.
    http://insum-apex.blogspot.com/
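    As a rough illustration only (the hostnames, SIDs and DAD names below are made-up placeholders, not Francis's actual setup, and a real DAD also needs the usual remaining Plsql* directives), the two DADs boil down to something like:
    <Location /pls/apex_dev>
    SetHandler                     pls_handler
    PlsqlDatabaseUsername          APEX_PUBLIC_USER
    PlsqlDatabaseConnectString     devhost:1521:APEXDEV
    </Location>
    <Location /pls/apex_test>
    SetHandler                     pls_handler
    PlsqlDatabaseUsername          APEX_PUBLIC_USER
    PlsqlDatabaseConnectString     testhost:1521:APEXTEST
    </Location>
    Each DAD points mod_plsql at a different database, so each database can run its own APEX version behind the same Apache configuration.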

  • Best Practice - Move from DEV to TEST

    OK guys - we OWB'ers have tried many approaches to moving OWB Objects. With the new Ver 2 I like to use only 1 Design Repository - then I create 2 Projects 1 for DEV and 1 for TEST. I also have 2 Control Centers with associated Target Schemas - 1 for DEV and 1 for TEST.
    This works great since having only 1 Design Repository makes things less complicated and we can do Snapshots and Compare Objects since we use only 1 repository.
    The only problem is when I do a copy and paste from DEV to TEST with say a Fact Table it is difficult to reset all the location info - especially the Lookup Tables.
    Any suggestions?
    Do not suggest using one Project with multiple Configurations, since I tried this and it created other issues.

  • AIA FP 11.1.1.3 - Deployment Plan Generation for migrating from Dev to Test

    Hi All,
    Can you please validate the following from a dev-to-test deployment perspective? We have almost completed the development of an interface using AIA 11g and are in the process of moving it from one instance to another; any help is highly appreciated.
    Let us take the Purchase Order integration between a legacy application and Oracle Purchasing: File Adapter -> Requester ABCS -> EBS -> Provider ABCS -> DB Adapter, which is a classical flow.
    1. The Functional person defines a project in the Project Life Cycle Workbench.
    2. Functional decomposition is done through the Purchase Order Business Task, with Service Solution Components for each composite in the classical flow.
    3. The ABCSs are created by linking the corresponding Service Solution Components from the LCW using the Service Constructor. These composite.xml files will have the annotations populated by the Service Constructor. It is okay for these composites to have concrete WSDLs during the development phase, but the concrete ones have to be replaced with abstract ones before generating deployment plans.
    4. The EBS is not changed; it will have the annotations prepopulated, and there is no need to add any annotations to it.
    5. The File Adapter and DB Adapter (transport adapters?) have to be annotated based on the developer guide.
    6. The XSDs and WSDLs go into the MDS using the scripts provided. Any other common components can also be placed into MDS.
    7. Once the development is done, harvest the composites using the AIA Harvester.
    8. After harvesting, from the AIA LCW, generate the BOM. If we harvest all 5 composites for the Business Task together and do Generate BOM, does it capture all the composites? Or does it capture only the composites created using the Service Constructor? Or do we have to add all the composites manually to the Business Task?
    9. Add Harvested Contents by editing the BOM -> "Search and Add Existing Composite" option. We couldn't locate the "Add Harvested Composite" option while right-clicking the Business Task.
    10. Once all the harvested composites have been added to the BOM, export it as XML.
    11. Using the BOM, generate the DeploymentPlan. The deployment plan will have references to the XSDs and WSDLs in the MDS.
    12. Using the Deployment Plan and AID, deploy the composites into a new instance.
    These are the questions I have:
    1. Is the above understanding correct?
    2. Since the MDS is in the dev database, a prerequisite for AID seems to be deployment of the XSDs and WSDLs into the test MDS schema. Can you please validate?
    3. What happens to the BPELs, XSLs, .mplan files etc.? How do they move from one instance to another?
    Regards,
    Anish

    Hi Anish,
    Following are the responses -
    These are the questions I have,
    1. Is the above understanding correct?
    1. The Functional person defines a project in the Project Life Cycle Workbench - Yes.
    2. Functional decomposition is done through the Purchase Order Business Task, with Service Solution Components for each composite in the classical flow - Yes.
    3. The ABCSs are created by linking the corresponding Service Solution Components from the PLWB using the Service Constructor. These composite.xml files will have the annotations populated by the Service Constructor. It is okay for these composites to have concrete WSDLs during the development phase, but the concrete ones have to be replaced with abstract ones before generating deployment plans. - Yes, but the location attribute in the binding.ws section of composite.xml should always have the concrete WSDL location. Use <replaceToken> in the DeploymentPlan to replace the IP and port.
    4. The EBS is not changed; it will have the annotations prepopulated, and there is no need to add any annotations to it - Yes.
    5. The File Adapter and DB Adapter (transport adapters?) have to be annotated based on the developer guide - Yes.
    6. The XSDs and WSDLs go into the MDS using the scripts provided. Any other common components can also be placed into MDS. - Yes.
    7. Once the development is done, harvest the composites using the AIA Harvester. - Yes.
    8. After harvesting, from the AIA PLWB, generate the BOM. If we harvest all 5 composites for the Business Task together and do Generate BOM, does it capture all the composites? Or does it capture only the composites created using the Service Constructor? Or do we have to add all the composites manually to the Business Task? - Each of the 5 composites created by you should have a corresponding Service Solution Component defined. When you define a Service Solution Component in the PLWB, a GUID is generated in the database table. You need to retrieve that value and manually enter it in the annotation section of composite.xml, except for the ABCSs. Only then will the BOM contain information on all the composites.
    9. Add Harvested Contents by editing the BOM -> "Search and Add Existing Composite" option. We couldn't locate the "Add Harvested Composite" option while right-clicking the Business Task. - Once you harvest, it will be visible.
    10. Once all the harvested composites have been added to the BOM, export it as XML - Yes.
    11. Using the BOM, generate the DeploymentPlan. The deployment plan will have references to the XSDs and WSDLs in the MDS - Yes.
    12. Using the Deployment Plan and AID, deploy the composites into a new instance. - Yes.
    2. Since the MDS is in the dev database, a prerequisite for AID seems to be deployment of the XSDs and WSDLs into the test MDS schema. Can you please validate?
    Yes, the 'UpdateMetaData' section under 'Configurations' in the DeploymentPlan will upload the XSDs, WSDLs, DVMs and Xrefs into MDS.
    3. What happens to the BPELs, XSLs, .mplan files etc.? How do they move from one instance to another?
    The 'Deployments' section will deploy the project onto the server and write the information into MDS.
    Rgds,
    Mandrita

  • LCM - copy (promote) jobs from DEV to TEST

    Hi all!
    I am searching for a way to copy the LCM jobs (to promote objects from DEV to TEST) from the DEV environment to the TEST environment.
    Is there a way to do so? Or do I need to recreate the jobs on TEST from scratch when I want to transfer the same objects from TEST to PROD?
    Thanks,
    Hakan

    And do you know how an existing job can be refreshed?
    I can create a job that promotes a report from DEV to TEST. When the report then changes, e.g. a universe it uses is changed, the job itself is not updated.
    The only way I have found is to recreate the job. If I could solve the refresh problem for a job, I could copy the jobs from DEV to TEST and exchange the logins from DEV -> TEST to TEST -> PROD.
    ciao Hakan

  • DataSource / Structures mismatched between Dev and Test Systems. ..!!

    Hi,
    We are implementing a scenario where XI updates data into the PSA through an ABAP proxy.
    The scenario worked perfectly in the development system.
    We transported the objects from dev to test. The structures in SE11 are mismatched between the development and test systems, as shown below.
    The DataSources (ZDS_RECIHDR and ZDS_RECTPALL) look okay, but when I looked at the structures in SE11 they are not correct; they got swapped.
    Development:
    /BIC/CQZDS_REC00001000 - Header (ZDS_RECIHDR)
    /BIC/CQZDS_REC00003000 - Allocation (ZDS_RECTPALL)
    Test:
    /BIC/CQZDS_REC00001000 - Allocation (ZDS_RECTPALL)
    /BIC/CQZDS_REC00003000 - Header (ZDS_RECIHDR)
    Kindly let me know where it might have gone wrong?
    Thanks
    Deepthi

    We have done that already. Still it is failing.
    While transporting, it is failing and showing the error as
    Program ZPI_CL_IA_PAYMENT_ALLOCATION1=CP, Include ZPI_CL_IA_PAYMENT_ALLOCATION1=CM001: Syntax error in line 000016
    The data object 'L_S_DATA' has no component called'/BIC/ZSALENUM', but there is a component called
    Program ZPI_CL_IA_PAYMENT_HEADER======CP, Include ZPI_CL_IA_PAYMENT_HEADER======CM001: Syntax error in line 000016
    The data object 'L_S_DATA' has no component called'/BIC/ZTRANDATE', but there are the following com
    The structures are mismatched in SE11 between the header and allocation structures; that is the reason it is failing.
    Any more ideas, please?

  • Modified SAP SWCV and CR Content Upgrade

    Hello Experts,
    I am reviewing an existing XI installation and found that the developers added interfaces and other design objects to standard SAP SWCVs like BW and R/3 instead of creating a project SWCV to put their objects in.
    The question is: if we update the CR Content in the SLD, would these developments residing in the SAP SWCVs be affected, and what would be the consequences for them?
    Thanks in advance for your answers.
    -Sam.

    At least, I assume, they must have created their own namespaces. If that is the case, updating should not hurt. But if they have also used the SAP namespaces as-is, you might want to do a trial run on a sandbox!
    VJ

  • Setup of Customer Dev and Test environment

    Customer wants to set up a Dev and Test env for Apex on the same machine. Been trying to wrap my head around the way to do this. Seems like these are the options:
    1) Separate databases, Two APEX homes, Two HTTP Servers
    2) One database, different Workspaces, One APEX home, One HTTP Server
    #1 is the most 'pristine' from the perspective of separating everything (and patching), but #2 seems like a more manageable environment. The only downside I see to #2 is that DEV can never have a newer release of APEX than TEST.
    BTW, the customer wants TEST to look like PROD except during code-push time. They also want the TEST env refreshed regularly from PROD between code releases.
    Suggestions? Other options?
    Thanks,
    Dwight

    Hello Dwight,
    Check your dads.conf file. There is something like this:
    <Location /pls/apex>
    Order deny,allow
    PlsqlDocumentPath              docs
    AllowOverride                  None
    PlsqlDocumentProcedure         wwv_flow_file_mgr.process_download
    PlsqlDatabaseConnectString     localhost:1521:XE ServiceNameFormat
    PlsqlNLSLanguage               AMERICAN_AMERICA.AL32UTF8
    PlsqlAuthenticationMode        Basic
    SetHandler                     pls_handler
    PlsqlDocumentTablename         wwv_flow_file_objects$
    PlsqlDatabaseUsername          APEX_PUBLIC_USER
    PlsqlDefaultPage               apex
    PlsqlDatabasePassword          apex_public_user
    PlsqlRequestValidationFunction wwv_flow_epg_include_modules.authorize
    Allow from all
    </Location>
    You can easily change the Location - /pls/apex - to (e.g.) /dev, add another Location (/test), and provide it with a different connect string (a sketch follows at the end of this reply).
    Then you can access your Dev environment using http://localhost:7778/dev and Test with http://localhost:7778/test.
    You can also change the (default) settings of your /i/ (virtual) directory to point to different dev and test image directories.
    Example:
    Alias /dev/ "c:\Oracle\OAS\Apache/dev/images/"
    Alias /tst/ "c:\Oracle\OAS\Apache/test/images/"
    and change the value of the Image Prefix in your Application Definition accordingly.
    Greetings,
    Roel
    http://roelhartman.blogspot.com/
    You can reward this reply by marking it as either Helpful or Correct ;-)
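    As a rough sketch of the change described above (the XETEST connect string is a made-up placeholder; use whatever connect string your test database actually has), the added DAD boils down to:
    <Location /test>
    ... same Plsql* directives as in the /pls/apex block above ...
    PlsqlDatabaseConnectString     localhost:1521:XETEST
    </Location>
    The original Location, renamed to /dev, keeps pointing at the dev database, and the same HTTP Server then serves both environments.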

  • Testing SAP HCM application using SAP TAO

    I have the following questions regarding SAP HCM testing.
    1) Do we use any third-party tools, e.g. HP's Quality Center or QTP, to test SAP HCM applications?
    2) Has anyone tested SAP HCM using the SAP TAO tool?
    I am new to SAP HCM testing; I would appreciate it if someone could send some documents related to testing using SAP TAO and other third-party tools.

    Basuvarma:
    I understand that we use Quality Center to maintain the test cases. I want to know if you can provide me with any documents related to SAP HCM testing using QC or Rational tools...
    Appreciate your response.

  • BPEL DEV and TEST Environment

    Hello All,
    I have a situation with a client. The client wants to install BPEL DEV and TEST environments on one server instance and one database instance.
    I know we can have multiple App Server instances on a single physical server.
    But the question is about having just one database instance. Can multiple BPEL schemas reside in the same database with different schema names, like orabpelDEV and orabpelTEST? Can the default schema names orabpel and oraesb be renamed for the different environments? If so, how do I do it?
    Any other suggestions are appreciated.
    Thanks,
    Abhay.
    PS: BPEL 10.1.2 is being used with MID TIER Installation

    Chapter 4 of the App Server documentation is all about BPEL clustering:
    http://download-west.oracle.com/docs/cd/B31017_01/integrate.1013/b28980/clusteringsoa.htm
    This can also apply to your environment. It is possible to have multiple BPEL PM servers using one single database and the same dehydration store; this helps you increase throughput, scalability and reliability.
    You can also implement multiple BPEL domains on a single BPEL server. In this way you can implement an OT(AP) environment.
