Pre-production deployment checklist
Hi all.
I am in the process of adding a couple of channels to the running production environment.
Is there anything that I should consider technically before deploying my new channels to the production environment?
I have two challenges:
1) To add the channels so that they fit into the existing layout exactly. What should I consider before importing the channels? Should I concentrate on the tab provider currently being used?
2) How would I make sure these new channels appear for all existing LDAP users?
I appreciate any feedback that gets me started confidently.
Thanks in advance
Veera
I would add to the list:
- run a reconciliation report between AM accounting and GL accounting
- set the company code as "in production" for FI
- set the company code as "in production" for assets
- create variants for the programs that create DME files (typically F110...)
- check number ranges for documents, vendors, assets and customers
- check that all the customer and vendor banks are already created in the system
- modify bank descriptions if needed
regards
Similar Messages
-
Setting Office 2013 on Windows 8.x in Pre-Production Mode
Hello
We are in the process of evaluating AD RMS for protecting documents. Currently, I have AD RMS running on Windows Server 2012 x64. I have two other machines which talk to this: one with Office 2013 installed on Windows 8.1 set in production mode, and another similar machine with a custom program based on the AD RMS SDK 2.1 APIs.
How can I set the server running AD RMS, as well as Office 2013, in pre-production mode? If there are any documents on this, could they be shared or pointed to, please?
Thanks
SN
Hello,
Does anyone know how to do this, or is it done differently?
This is what I want to test to start with: I have a document (.docx) that I have protected using a custom client application based on the AD RMS SDK 2.1. I want to test this protected document by opening it in Office 2013 and seeing whether the permissions I set are working.
The custom application is installed on a machine (Windows 8.1, set to pre-production by setting MSIPC and uDRM \Hierarchy to 1) connecting to a server (Windows Server 2012, set to pre-production by setting DRMS\Hierarchy to 1) with both AD RMS and AD DS installed on it. For some reason the client works even with uDRM\Hierarchy set to 0, so I am not sure what the relevance of uDRM is.
I have Office 2013 installed on a Windows 8.1 machine different from the above two and joined to the same domain, but I am unable to open the document protected by the custom RMS app. In fact, when I try to protect a document created using Word 2013 on this machine, it says "The system cannot find the file specified". The MSIPC trace follows:
00000001 0.00000000 [23512] /-----------------------------------------------------------------------
00000002 0.00000000 [23512] Public API called: IpcFreeMemory
00000003 0.00000000 [23512] pb: 0000005F3D432070
00000004 3.09854221 [23512] /-----------------------------------------------------------------------
00000005 3.09854221 [23512] Public API called: IpcpBootstrapUser
00000006 3.09854221 [23512] pConnectionInfo: NULL
00000007 3.09854221 [23512] pUser: PCIPC_USER
00000008 3.09854221 [23512] -->dwType: 1
00000009 3.09854221 [23512] -->wszID: [email protected]
00000010 3.09854221 [23512] pToken: PCIPCP_TOKEN
00000011 3.09854221 [23512] -->dwType: 2
00000012 3.09854221 [23512] -->pcvTokenData: NULL
00000013 3.09854221 [23512] -->cbTokenData: 0x00000000
00000014 3.09854221 [23512] pContext: PCIPC_PROMPT_CTX
00000015 3.09854221 [23512] -->cbSize: 40
00000016 3.09854221 [23512] -->hwndParent: NULL
00000017 3.09854221 [23512] -->dwFlags: 0x00000000
00000018 3.09854221 [23512] -->hCancelEvent: 0000000000000558
00000019 3.09854221 [23512] -->pcCredential:
00000020 3.09854221 [23512] NULL
00000021 3.09854221 [23512] pvReserved: NULL
00000022 3.09866428 [23512] [msipc]:Disabled SSPI auth for RMSO
00000023 3.09868693 [23512] [msipc]:IpcTokenCache::GetCachedToken Creating a new IIpcToken
00000024 3.09872818 [23512] [msipc]:+LoggedOnUserToken Created
00000025 3.09875011 [23512] [msipp]:+ippGetUser
00000026 3.09876585 [23512] [msipp]:+ippGetUserForUserInfo
00000027 3.09879208 [23512] [msipp]:+IPPUserIdentity::InitializeIdentityForUserInfo entered
00000028 3.09882998 [23512] [msipp]:+IPPLicenseStore::GetMachineCert
00000029 3.09911990 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\msdrm\clientsdk\store\drmstore.cpp(2222)
00000030 3.09911990 [23512] HRESULT = 0x8004f00d
00000031 3.09914303 [23512] [msipc]:+ippActivateMachine
00000032 3.09933877 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\msdrm\clientsdk\store\drmstore.cpp(708)
00000033 3.09933877 [23512] HRESULT = 0x8004f00d
00000034 3.09944534 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\msdrm\clientsdk\store\drmstore.cpp(763)
00000035 3.09944534 [23512] HRESULT = 0x8004f00d
00000036 3.09955812 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\msdrm\clientsdk\store\drmstore.cpp(708)
00000037 3.09955812 [23512] HRESULT = 0x8004f00d
00000038 3.09959126 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\msdrm\clientsdk\store\drmstore.cpp(763)
00000039 3.09959126 [23512] HRESULT = 0x8004f00d
00000040 3.09977841 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\msdrm\clientsdk\store\drmstore.cpp(708)
00000041 3.09977841 [23512] HRESULT = 0x8004f00d
00000042 3.09981275 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\msdrm\clientsdk\store\drmstore.cpp(763)
00000043 3.09981275 [23512] HRESULT = 0x8004f00d
00000044 3.09994483 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\msdrm\clientsdk\store\drmstore.cpp(708)
00000045 3.09994483 [23512] HRESULT = 0x8004f00d
00000046 3.09997869 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\msdrm\clientsdk\store\drmstore.cpp(763)
00000047 3.09997869 [23512] HRESULT = 0x8004f00d
00000048 3.10005188 [23512] Exception at d:\bt\2274\private\client\source\port\windows\ipcossecuritygeneral.cpp(90): hr = CDRMtoSP::CreateMachineCerts()
00000049 3.10005188 [23512] HRESULT = 0x80070002
00000050 3.10007334 [23512] [msipc]:Error encountered while activating the machine, hr = 0x80070002
00000051 3.10011625 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\core\ippmsdrmwrapper.cpp(650)
00000052 3.10011625 [23512] HRESULT = 0x80070002
00000053 3.10017371 [23512] [msipp]:-ippGetUser HR=80070002
00000054 3.10020924 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\framework\win7\win7ippuser.cpp(92): ippGetUser(IPP_GU_CONNECTION_INFO, pConnectionInfo, dwFlags, ipcContext, m_pUser)
00000055 3.10020924 [23512] HRESULT = 0x80070002
00000056 3.10022879 [23512] Error HRESULT 0x80070002 mapped to 0x80070002
00000057 3.10026145 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\api\ippapi.cpp(2525): IppGetUser( IPP_GU_CONNECTION_INFO, &pUserFromConnInfo, pUserFromConnInfo.wszID ? dwFlags : (dwFlags | IPP_GU_NEW), pIpcContext, &hUser)
00000058 3.10026145 [23512] HRESULT = 0x80070002
00000059 3.10028005 [23512] Error HRESULT 0x80070002 mapped to 0x80070002
00000060 3.10030127 [23512] Public API IpcpBootstrapUser exited with return code 0x80070002
00000061 3.10030127 [23512] \-----------------------------------------------------------------------
00000062 3.10036349 [23512] /-----------------------------------------------------------------------
00000063 3.10036349 [23512] Public API called: IpcGetTemplateIssuerList
00000064 3.10036349 [23512] pConnectionInfo: NULL
00000065 3.10036349 [23512] dwFlags: 0x00000000
00000066 3.10036349 [23512] pContext: PCIPC_PROMPT_CTX
00000067 3.10036349 [23512] -->cbSize: 40
00000068 3.10036349 [23512] -->hwndParent: NULL
00000069 3.10036349 [23512] -->dwFlags: 0x00000003
00000070 3.10036349 [23512] -->hCancelEvent: 0000000000000558
00000071 3.10036349 [23512] -->pcCredential:
00000072 3.10036349 [23512] NULL
00000073 3.10036349 [23512] pvReserved: NULL
00000074 3.10039186 [23512] [msipp]:+ippGetUser
00000075 3.10040927 [23512] [msipp]:+ippGetUserForUserInfo
00000076 3.10043311 [23512] [msipp]:+IPPUserIdentity::InitializeIdentityForUserInfo entered
00000077 3.10046244 [23512] [msipp]:+IPPLicenseStore::GetMachineCert
00000078 3.10075140 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\msdrm\clientsdk\store\drmstore.cpp(2222)
00000079 3.10075140 [23512] HRESULT = 0x8004f00d
00000080 3.10077405 [23512] [msipc]:+ippActivateMachine
00000081 3.10079026 [23512] [msipc]:ippActivateMachine Skipping Machine activation because of Offline mode
00000082 3.10082817 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\core\ippmsdrmwrapper.cpp(621)
00000083 3.10082817 [23512] HRESULT = 0x8004020d
00000084 3.10084748 [23512] [msipp]:-ippGetUser HR=8004020d
00000085 3.10087729 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\framework\win7\win7ippuser.cpp(92): ippGetUser(IPP_GU_CONNECTION_INFO, pConnectionInfo, dwFlags, ipcContext, m_pUser)
00000086 3.10087729 [23512] HRESULT = 0x8004020d
00000087 3.10089636 [23512] Error HRESULT 0x8004020d mapped to 0x8004020d
00000088 3.10091305 [23512] [msipc]:+IpcGetTemplateIssuerList IppGetUser Failed hr = 0x8004020d
00000089 3.10093284 [23512] [msipp]:+IPPLicenseStore::GetCLCWithRAC
00000090 3.10112619 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\msdrm\clientsdk\store\drmstore.cpp(2222)
00000091 3.10112619 [23512] HRESULT = 0x8004f00d
00000092 3.10114932 [23512] [msdrm]:-CDRMStore::GetLicense type=3 hr=8004f00d
00000093 3.10118246 [23512] Exception at d:\bt\2274\private\client\source\adrms\native\protection\core\ipplicensestore.cpp(399): m_cdrmStore->GetLicense( DRM_LICENSE_TYPE_CLC, pwszSearchPattern, uSearchPatternCount, fDefaultForUser, puCount, prgwszLicense, prgwszRACs)
00000094 3.10118246 [23512] HRESULT = 0x8004f00d
00000095 3.10121131 [23512] Public API IpcGetTemplateIssuerList exited with return code 0x8004020D
00000096 3.10121131 [23512] \-----------------------------------------------------------------------
00000097 3.10124445 [23512] /-----------------------------------------------------------------------
00000098 3.10124445 [23512] Public API called: IpcFreeMemory
00000099 3.10124445 [23512] pb: 0000005F3F8B0DB0
00000100 3.10132265 [23512] /-----------------------------------------------------------------------
00000101 3.10132265 [23512] Public API called: IpcGetErrorMessageText
00000102 3.10132265 [23512] hrError: 0x80070002
00000103 3.10132265 [23512] dwLanguageId: 1033
00000104 3.10215545 [23512] Public API IpcGetErrorMessageText exited with return code 0x00000000
00000105 3.10215545 [23512] \-----------------------------------------------------------------------
00000106 4.53870678 [23512] /-----------------------------------------------------------------------
00000107 4.53870678 [23512] Public API called: IpcFreeMemory
00000108 4.53870678 [23512] pb: 0000005F3D4324F0
00000109 109.36756897 [24016] SHIMVIEW: ShimInfo(Complete)
00000110 118.99750519 [24088] SHIMVIEW: ShimInfo(Complete)
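For anyone reproducing this setup: the pre-production switches mentioned earlier (the \Hierarchy values set to 1) are registry settings. A minimal .reg sketch for the client side follows; the exact key path varies by MSIPC/SDK version, so treat it as an assumption and verify it against the RMS SDK documentation before applying:

```
Windows Registry Editor Version 5.00

; Pre-production hierarchy switch for the MSIPC client.
; Key path is an assumption based on the MSIPC SDK docs; verify for your version.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSIPC]
"Hierarchy"=dword:00000001
```

Set the value back to 0 to return the client to the production hierarchy.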
Any help will be much appreciated
Thanks
SN -
FCS for animation pre-production
I'm searching for an asset management solution for our animation production. The assets in this case will primarily be designs and storyboards, none of which would be used directly in an FCP project. This would all be part of pre-production.
As the artwork moves through the pipeline, having one central location for everyone to work from would be great. Someone recommended FCS, and though it looks great, I'm wondering if it will work for us.
Thanks.
Final Cut Server would be a great solution for these needs regardless of whether you plan to use the digital assets in FCP or not. Final Cut Server can be installed on both Windows and Mac OS X. I am assuming that at some point you may use After Effects to produce an animatic, and Final Cut Server can be used for this as well. Also, if you have a lot of image sequences, Final Cut Server 1.5 has native support for image sequences and will automatically create a preview version of each image sequence.
Hope this helps.
Nicholas Stokes
XPlatform Consulting -
Standard tcode FBL5N dump in pre-production
I am getting a dump when I execute the standard tcode FBL5N (customer line items) with one company code. FBL5N does not dump at all in the production system; however, it dumps in pre-production.
The dump occurs in the function module FDM_LOCAL_GET_OBJECTS_RANGE while executing the SELECT statement:
SELECT * FROM fdm_dcobj
INTO CORRESPONDING FIELDS OF TABLE et_object_data
WHERE logsys LIKE i_logsys
AND obj_type LIKE i_obj_type
AND obj_key IN it_range_obj_key
AND relation LIKE i_relation
AND case_guid_loc IN it_guid
AND is_confirmed LIKE i_confirmed
AND is_voided LIKE i_voided.
Do I need to search for an SAP note?
Below is the dump analysis.
Error analysis
An exception occurred that is explained in detail below.
The exception, which is assigned to class 'CX_SY_OPEN_SQL_DB', was not caught
in procedure "FDM_LOCAL_GET_OBJECTS_RANGE" "(FUNCTION)", nor was it propagated by
a RAISING clause.
Since the caller of the procedure could not have anticipated that the
exception would occur, the current program is terminated.
The reason for the exception is:
The SQL statement generated from the SAP Open SQL statement violates a
restriction imposed by the underlying database system of the ABAP
system.
Possible error causes:
o The maximum size of an SQL statement was exceeded.
o The statement contains too many input variables.
o The input data requires more space than is available.
o ...
You can generally find details in the system log (SM21) and in the
developer trace of the relevant work process (ST11).
In the case of an error, current restrictions are frequently displayed
in the developer trace.
How to correct the error
The SAP Open SQL statement concerned must be divided into several
smaller units.
If the problem occurred due to the use of an excessively large table
in an IN itab construct, you can use FOR ALL ENTRIES instead.
When you use this addition, the statement is split into smaller units
according to the restrictions of the database system used.
If the error occurs in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following keywords.
The note is still valid in the SAP Service Marketplace:
SAP Note 1111155 - FSCM-DM: Termination "Open SQL statement is too large"
Note Language: English Version: 10 Validity: Valid Since 07-28-2010
Summary
Symptom
When you access dispute case using the integration (creating dispute case
using FI transactions (for example, FBL5N), clearing items to which dispute
case have been assigned), the system issues the dump "Open SQL statement is
too large".
The error occurs if there are many (more than 1000) dispute cases for a
customer.
Other terms
FBL5N, FBZ1, F-32, integration
Reason and Prerequisites
This problem is due to a very large dispute case volume.
Solution
Import the relevant Support Package or implement the attached correction
instructions.
Header Data
Release Status: Released for Customer
Released on: 07-28-2010 08:02:45
Master Language: German
Priority: Correction with medium priority
Category: Program error
Primary Component: FIN-FSCM-DM Dispute Management
Valid Releases
Software Component | Release | From Release | To Release (and subsequent)
SAP_APPL | 600 | 600 | 600
SAP_APPL | 602 | 602 | 602
SAP_APPL | 603 | 603 | 603
SAP_APPL | 604 | 604 | 604
SAP_APPL | 605 | 605 | 605
PI | 2004.1 | 2004_1_47
-
Production deployment - Execution agent , Port connection
Hello Friends,
We are new to ODI. As part of the production deployment, we have one master repository in the PROD environment and one work repository per environment [Dev & QA]. I would like to clarify: does the execution port defined (e.g. 20912) need to be opened between the server where the ODI agent is installed and the repository DB server? Or is it also required to open port 20912 for access from where the ODI client is installed as well, in case we plan to use another production agent installation as a scheduler agent to test large-volume data loads?
Here's our setup:
ODI client install -> Win2003 Server
ODI Agent -> Unix - Solaris 10
Master Rep -> SQL 2005 - Win2003 Server
Thank you.
It should be open where you have scheduled the agent (in your case, Unix Solaris 10).
-
Pre-Production Setup for Failover
Hi
I have installed my SAP production environment on Windows 2008 R2, SAP ECC 6.0 EHP5, Oracle 11.2.0.3.
I have set up my DR system in a different seismic zone on Windows 2008 R2, SAP ECC 6.0 EHP5, Oracle 11.2.0.3.
Oracle Data Guard is used for data replication from production to the SAP DR system.
Now I want to test the failover scenario. I would like to install a separate SAP environment at the primary site as pre-production for the DR test, on a different hardware box. The system identification (SID) for production and pre-production is the same: RP9.
I would like to know whether I can use the same root domain controller (RDC) available at the primary site for both the production and pre-production environments. The domain users are the same for both SAP systems.
Please clarify: will this situation create any problem for my production or pre-production environment with the same root domain controller?
or
should I go ahead with a different root domain controller for the pre-production environment?
Regards
vimal
Hello
It is not recommended to have SAP systems sharing the same SID in a landscape, at least because it is not possible to have both systems in the same TMS transport domain (and this can have other drawbacks too, especially if both systems have clients with the same logical system name).
Even if it is possible to use the same domain users for two different systems, I would not recommend this option, especially if one of the systems is a production one...
Suppose that for any reason the SAP domain accounts get locked / modified / deleted ...
Either you create the pre-prod system with a different SID (the best option), and then you can share the same Windows domain; or, if you are willing to keep the same SID, I would suggest that you install your pre-prod system either in another domain or in a workgroup...
If your purpose is to test your Oracle DR capabilities, that is not a domain-related mechanism, so you can test it in a workgroup...
Regards -
Cloud service restored to the last production deployment
I updated the production deployment yesterday morning, then made changes to the service files using a remote connection,
adding and updating files, and everything was OK.
This morning, all the changes I made after the deployment were undone and customers are using the old version, which has cost us hundreds of thousands of pounds.
I need to know what happened; nothing appeared in the operations log.
Hi,
>> I've made changes to service files using remote connection
As far as I know, this is the reason. We can use a remote connection to do some development or troubleshooting without redeploying the application, but I don't suggest using it to update an application in production, because the cloud service will be restored if the role needs to be recycled for whatever reason. I suggest you redeploy the application after making any changes.
Best Regards,
Jambor
-
Can Tcode exist in Pre-production and Production without transporting it...
Hi ABAP Guru's,
Can a T-code that has been created in the development system and saved in a package be available in the pre-production and production environments without transporting it?
Apt answers will be rewarded.
Raghav
Hi,
I believe it is not possible in a normal scenario. But it is possible if the following has happened.
E.g.: you created a transport in the development system and assigned it to some package/development class.
If someone then created another TR and transported the whole package to quality/production, your transaction code would also be transported even though your own TR was not. I am not sure about this, but I think it can happen.
Check whether the above has happened.
Thanks,
Vinod. -
UCS pre-production test plan from cisco
Hi,
Is there a pre-production test plan from Cisco to validate that a new UCS setup is A-OK?
I found some stuff on google. But nothing from cisco.
Thanks for any help.
Stephane.
Hi Stephane,
I am unaware of a particular pre-production test plan from Cisco to validate a UCS setup, but I do believe that Advanced Services is available to assist in this regard. However, I suspect that this is not the level of help you are requesting.
Thanks.
-Bruce -
Hi, how do I do pre-production planning for the possible production with the current resources in a stipulated time, and determine the resources required for completing a specified production quantity within the stipulated time? I also need to consider the maintenance schedule, etc. Please advise.
Dear Yadav,
In my understanding, LTP (long-term planning) is one way of doing pre-production planning for a material or a list of materials through simulated planning, using the simulated planned orders.
This can be used for vendor analysis and also to check capacity analysis.
Based on this you can modify the production plan, go for an extra shift or extra resources to meet the plan, or adjust the plan to the available capacity.
To explore more about LTP, check this thread:
LTP - CM38
Just hold on for others' suggestions on the same.
Regards
Mangalraj.S -
Can I merge a site from pre-production with the content already in production - ALM query
Hi, I have a number of queries around the movement of content between environments in SP2010.
1) Move a site from production to test without the production data
Is the best option here to save the site as a template and not tick the "include content" box? Or is it better to take a site collection backup and then restore it to the test environment?
2) Once I have my site in test, is there any way I can make changes and then move the site back to production while still maintaining the production data?
I don't know how this would work, but maybe there is a way somehow. Say, for example, the site has been modified in test by adding new site columns or content types; how should this be deployed to production? Is it simply a matter of replicating all the changes manually in production?
It would be useful to get some best practices around this topic. Thanks.
Hi Speedbird85,
This depends on the changes you need to move.
For empty or small sites, using a site template can meet the requirement; you can consider using this.
Backing up and moving the content database between environments will lose production changes, so this is not really recommended; however, if you keep the test environment up to date with production, you can use this approach.
For moving various kinds of changes, there are several ways to sync the content: lists, master pages, CSS, etc. should be packaged in a solution file and deployed between the environments.
Please find related information from following article:
SharePoint 2010: Copy production content to acceptance/test environment:
http://www.appdelivery.com/2012/09/sharepoint-2010-copy-production-content-to-acceptancetest-environment/
SharePoint 2013 Dev/Test/Production environment – Moving content:
http://sharepoint.stackexchange.com/questions/78483/sharepoint-2013-dev-test-production-environment-moving-content
Thanks,
Qiao
Forum Support
Qiao Wei
TechNet Community Support -
Hello All
We are soon going live with the integration of a third-party CRM system with our SAP system. I would like to know from the experts the tasks and checklists that need to be performed before going live.
Our scenarios all involve master and sales data, and all scenarios are IDoc/BAPI to and from SOAP services. All of your suggestions are welcome.
Thank you
Sudheer
Contact SAP about your SAP GoingLive Check!
Guides You to a Smooth Start of Production and Technically Robust Operations
The SAP GoingLive Check safeguards your SAP solution for a smooth start of production, which is the final milestone of a successful implementation project. This proactive service mitigates the risks that arise from non-optimal system and business process configuration. It ensures a technically robust operation from the beginning and therefore protects your business.
During this service, certified SAP service consultants check the solution in a standardized procedure and give recommendations ensuring optimal performance and system availability for the core business processes. The SAP GoingLive Check is a remote service that is performed in SAP Solution Manager and uses the end-to-end root cause analysis tools. The results are delivered in a detailed service report including an action plan.
Depending on your SAP solution and contract type, you will receive a customized GoingLive Service portfolio tailored to your needs. The portfolio consists of up to three service sessions: analysis, optimization, and verification. The delivery of the individual sessions takes place at key phases in the implementation project:
Analysis session: checks the configuration of the SAP system, operating system, and database. The session is scheduled a minimum of four weeks before your start of production
Optimization session (optional): optimizes the response times of the core business transactions and is scheduled normally two weeks before your start of production depending on the availability of the productive data
Verification session: verifies the smooth operation and performance of the productive environment and is scheduled six to eight weeks after start of production
http://service.sap.com/~form/sapnet?_SHORTKEY=01100035870000707856&
Also check: https://service.sap.com/~sapidb/012006153200000920262007E/Readiness_Check_Version4.pdf -
Best practice - Production deployment
What is the best practice for deploying a custom-built application in production?
Should it be deployed to the Administration Server or to a Managed Server?
Thanks,
Shankar
Hi Shankar,
It is always better to deploy your application to a Managed Server. You do not need to restart the Admin Server, whereas you do need to restart the Managed Server when changing the environment related to your application. The Admin Server is meant for administering all the Managed Server instances. You can have many different Managed Servers for different specific applications. So it is best practice to deploy your application to a Managed Server.
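For reference, deployment to a managed server can also be scripted with the weblogic.Deployer utility. A hedged command sketch, where the admin URL, credentials, application name, and the target name "managedServer1" are all placeholders for your own environment:

```
# Deploy myapp.war to a managed server (not the admin server).
# Admin URL, credentials, and target name below are placeholders.
java weblogic.Deployer \
  -adminurl t3://adminhost:7001 \
  -username weblogic \
  -password welcome1 \
  -name myapp \
  -targets managedServer1 \
  -deploy myapp.war
```

The -targets flag is what keeps the application off the Admin Server: it lists only the managed server(s) that should host it.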
Regards,
Ilaya -
Java webservice production deployment
I have installed JWSDP 1.0.01 and created a web service 'VisitantWS' using it. I can deploy, access, and consume the web service in the JWSDP deployment environment.
Next, I am trying to create a production-level environment, for which I have installed IIS and Tomcat 4.0 on a separate machine with the IIS redirector in place.
For deployment of my web service in Tomcat, I place the .war file (from the \dist directory of JWSDP) into \webapps and restart IIS and Tomcat. Tomcat extracts the .war file and creates the folder hierarchy in the \work directory automatically. Now, when I try to check my deployment using
http://localhost:8080/VisitantWS-jaxrpc/VisitantWS
it says 'the requested resource (/VisitantWS-jaxrpc/VisitantWS) is not found'.
I have changed my server.xml and added the context for VisitantWS to it.
When I check the status of the service using
http://localhost:8080/manager/list
it says
'/VisitantWS-jaxrpc:stopped:0
/running:0'
trying to reload the service gives me
'FAIL - Encountered exception java.lang.IllegalStateException: Container StandardContext[VisitantWS-jaxrpc] has not been started'
How can I deploy the web service created using JWSDP for production use, where I don't have to use Ant (which is part of JWSDP) for the deployment or ship JWSDP along with my product? Am I missing something in the deployment of the web service on Tomcat?
Hemal
It was almost a year ago and I have been distracted by other things, but as I think of it now, having Tomcat and IIS on different machines must be the problem, as the web server needs the web container to be on the same machine to redirect the requests. Not sure, but I plan to get back to it soon.
-
** Production Deployment Topologies ... **
Hi all,
I've been reading up on Coherence deployment topologies and would like some assistance on best practices for deployment into a production environment.
For production deployments, what are the recommended best practices?
I understand that:
* Coherence cache clusters should be configured to run on separate multicast/unicast addresses to avoid impacting any other applications.
However:
* Should Coherence cache servers be deployed into their own JVMs, separate from the applications that use them?
* Or should the Coherence cache servers be configured to use the same JVM as the applications?
* In a multi-container environment, where many containers hosting many different applications are possible, what is the best deployment topology?
* If Coherence cache servers and applications are separated into different JVMs, how should they be configured to communicate with each other (e.g. Extend TCP proxy)?
Any help would be appreciated.
Hello,
I suggest taking a look at this document (especially towards the bottom):
http://coherence.oracle.com/display/COH34UG/Best+Practices
In general we do recommend separate JVMs to host cache servers. As you mention, you have the option of having cache client JVMs either join the cluster or connect to a proxy using Extend TCP. Here are the pros and cons of each approach:
Cluster Membership
Pros: less network "hops" per operation, highest performance
Cons: for best results, requires clients to be "near" servers, preferably in the same subnet/switch; poorly tuned GC on clients can affect cluster
Extend
Pros: allows for more flexible network topology since it uses TCP (i.e. clients can be in a separate network or separated from storage nodes through a firewall), poorly tuned GC will not have as adverse an effect
Cons: requires more configuration, more "hops" per operation (although affinity with a properly tuned near cache can make this moot)
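If you do go the Extend route, the cache servers need a proxy service in their cache configuration for clients to connect to. A minimal sketch of that fragment follows; the service name, address, and port are placeholders I have chosen for illustration, not values from this thread, so check them against the Coherence documentation for your release:

```xml
<caching-schemes>
  <!-- Proxy service that Extend TCP clients connect to; autostarted on cache servers -->
  <proxy-scheme>
    <service-name>ExtendTcpProxyService</service-name>
    <acceptor-config>
      <tcp-acceptor>
        <local-address>
          <address>192.168.1.10</address>
          <port>9099</port>
        </local-address>
      </tcp-acceptor>
    </acceptor-config>
    <autostart>true</autostart>
  </proxy-scheme>
</caching-schemes>
```

Client JVMs would then use a matching remote-cache-scheme pointing at that address and port instead of joining the cluster directly.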
Thanks,
Patrick