WebLogic Server 10.3.6.0.8: one data source failing while the other six are not

Hello,
I have 7 data sources connecting to the same database schema. After I upgraded the production environment two evenings ago, I noticed that one of the data sources started failing with the exception below:
"Connection test failed with the following exception: weblogic.common.resourcepool.ResourceUnavailableException: No resources currently available in pool ePayBatchDS to allocate to applications.
Either specify a time period to wait for resources to become available, or increase the size of the pool and retry.."
The data source pool capacity was Min: 1, Max: 30; I increased it to 1-50, and after the change it worked for about 15 minutes before the same error returned. Please note that I have two production servers as a high-availability approach; so far I have upgraded only one of them, and the same data source on the old (not yet upgraded) server is not failing.
Please advise.
Old Environment Specifications:
Weblogic Server version: 10.3.4.0
OS: Sun OS10 SPARC 64-bit
JDK: 1.6.0 update 32 64-bit
New Environment Specifications:
Weblogic Server version: 10.3.6.0.8
OS: Solaris 11.1 (64-Bit Mode) SRU 20 (18 June 2014 patchset).
JDK: 1.7.0 update 55 64-bit
Thanks,
Mohd.

Hi,
           No resources currently available in pool ePayBatchDS to allocate to applications.
This exception clearly shows that no connections are available to allocate to the ePayBatchDS data source.
As you said, there are seven data sources altogether, of which six are working and one is not.
In general, when creating a data source and sizing its connection pool, you should also consider the number of connections available or allowed on the database side.
Please cross-check the number of connections available on the database side and, based on that, tune the connection pool sizes in your data sources.
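For example, if the backing database is Oracle (the post does not say which database it is, so treat this as an assumption), a quick check of the session and process limits could look like the sketch below; compare the limits against the sum of the maximum capacities of all seven pools, on both servers:

    -- Hedged sketch, assuming an Oracle backend (not stated in the post).
    -- Current usage versus the configured limits for sessions and processes:
    SELECT resource_name, current_utilization, max_utilization, limit_value
    FROM   v$resource_limit
    WHERE  resource_name IN ('sessions', 'processes');

    -- Sessions currently held per database user (all pools use one schema,
    -- so one user should account for roughly the sum of the pool sizes):
    SELECT username, COUNT(*) AS open_sessions
    FROM   v$session
    GROUP BY username;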
Regards,
Abdul

Similar Messages

  • Crystal Reports connection to JD Edwards One World Data Source

    Has anyone connected to a JD Edwards One World data source? I believe I am filling in the correct server name (using the IP address), port, environment, ID, password and system role, but I am getting this fairly generic message: "Failed to communicate with the Database Server for JD Edwards Enterprise One process". Are there any logs somewhere, or a way to better determine the problem?
    Thanks

    Finally got to collect the logs, but need help with their interpretation.
    Relevant excerpts are
    DbServerHandle::logonServer - connecting: JSY812, JIM, 172.24.241.59
    [Tue Nov 29 19:13:16 2011]     3444     3640     -
    JBInitLocalBridge
    [Tue Nov 29 19:14:06 2011]     3444     3640     -
    DbServerHandle::logonServer - error 2021 initializing bridge object.
    CJBLocalBridge::startChild - Started query server process.
    [Tue Nov 29 19:14:06 2011]     3444     3640     -
    CJBLocalBridge::stopChild - Stopping query server process.
    [Tue Nov 29 19:14:06 2011]     3444     3640     -
    CJBLocalBridge::stopChild - Stopped query server process.
    [Tue Nov 29 19:14:06 2011]     3444     3640     -
    CJBLocalBridge::startChild - Error verifying that query server process has started.
    [Tue Nov 29 19:14:06 2011]     3444     3640     -
    CJBLocalBridgeFactory::createLocalBridge - initialization error
    [Tue Nov 29 19:14:06 2011]     3444     3640     -
    CJBLocalBridge::stopChild - Stopping query server process.
    CJQProcess::dispatch - pid: 3444
    [Tue Nov 29 19:13:16 2011]     3964     1328     CJQQRYServer::startServer - entered
    [Tue Nov 29 19:13:16 2011]     3964     1328     -
    CJQSettingsMgr::buildClsPath(): JDE_JAR_FOLDER is : C:\Program Files\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\java\lib\jdedwards\default\
    [Tue Nov 29 19:13:16 2011]     3964     1328     -
    CJQSettingsMgr::buildClsPath(): jars from JDE does not exist in folder :C:\Program Files\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\java\lib\jdedwards\default\jdedwards\ .
    [Tue Nov 29 19:13:16 2011]     3964     1328     -
    CJQSettingsMgr::buildJVMPath(): full JVM path: C:\Program Files\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win32_x86\jdk\jre\bin\client\jvm.dll
    [Tue Nov 29 19:13:16 2011]     3964     1328     CJQQRYServer::stopServer - returning 200
    [Tue Nov 29 19:13:16 2011]     3964     1328     CJQQRYServer::startServer - error 222 initializing server
    [Tue Nov 29 19:13:16 2011]     3964     1328     CJQProcess::dispatch - error 222 starting
    [Tue Nov 29 19:13:16 2011]     3964     1328     CJQProcess::logError - error code: 10222
    [Tue Nov 29 19:13:16 2011]     3964     1328     CJQProcess::logError - error text: Failed to instantiate Java object. The system may be out of memory, or the Java components may be invalid.
    Help. Thanks

  • Data load failed while loading data from one DSO to another DSO..

    Hi,
    The data load failed on SID generation while loading data from the source DSO to the target DSO.
    The following errors occurred:
    Value "External Ref # 2421-0625511EXP  " (HEX 450078007400650072006E0061006C0020005200650066
    Error when assigning SID: Action VAL_SID_CONVERT, InfoObject 0BBP
    So I don't understand why the load succeeded in one DSO (the source) but failed in the other (the target).
    While analyzing, I found that 'SID Generation upon Activation' is checked in the source DSO but not in the target DSO. Is that the reason it failed?
    Please explain..
    Thanks,
    Sneha

    Hi,
    I hope your data flow is designed so that the first DSO acts as a staging layer, all transformation rules and routines are maintained between the first and second DSO, and 'SID Generation upon Activation' is set on the second DSO. That way the data in the first DSO stays the same as the source system data (since no transformation rules or routines are applied there), which helps avoid data load failures.
    Please analyze the following:
    Have you loaded master data before transaction data? If not, please do that first.
    Go to the properties of the first DSO and check whether 'SID Generation upon Activation' is maintained (I suspect it is not).
    Go to the properties of the second DSO and check whether 'SID Generation upon Activation' is maintained (I expect it is).
    This may be the reason.
    Also check whether there are any special characters in your transaction data (even lowercase letters).
    Regards
    BVR

  • Can't Create a Data Source - Failed to test connection. [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied

    Hi there,
    I am having a serious issue with The Power BI Data Management Gateway which I am hoping that someone can help me with.
    Basically I am setting up a connection between a Power BI demo site and a SQL Server 2012 database hosted on Azure. The Data Management Gateway is up and running, and Power BI has managed to connect to it successfully.
    By following the tutorials here, I was able to successfully create my Data Connection Gateway with a self-signed certificate.
    However, when trying to create the data source I run into problems. The Data Source Manager does successfully resolve the hostname.
    Bear in mind that I exposed the required ports in Azure as endpoints, and I modified the hosts file on my local machine so I could access the SQL Server hosted in Azure by its internal name -- otherwise I would not have been able to get this far.
    However, the creation of the data source also fails when I try to create it while logged in to the SQL Server in question:
    The Data Source Manager returns the error when using the Microsoft OLE DB Provider for SQL Server:
    Failed to test connection. [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied
    I tried using the SQL Server Native Client 11.0 instead but I also get an error. This time the error is:
    Failed to test connection. Login timeout expiredA network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.Named Pipes Provider: Could not open a connection to SQL Server [53]. 
    Some considerations
    If I provide an invalid username/password, the Data Source Manager does say that the username and password are incorrect.
    The firewall is turned off on the SQL Server (either way, this error also happens if I try to use the Data Source Manager while logged in to the SQL Server itself).
    SQL Profiler does not show any connection attempt.
    The SQL Server instance in question is the default one.
    The error happens regardless of whether I select the option to encrypt the connection or not.
    In SQL Server Configuration Manager I can see that all protocols are enabled (TCP/IP, Named Pipes and Shared Memory).
    The Event Viewer does not provide any further errors than the one I have copied in this post.
    I'm at a loss here. Could someone please advise what might I be doing wrong?
    Regards,
    P.

    Here is what I had to do to solve this issue:
    Basically I had to add the MSSQL TCP/IP port as an endpoint in Azure. After I did that, I was able to create the data source. However, I was only able to authenticate with a SQL account, as any domain account would return an error saying that the domain isn't trusted.
    What puzzles me here is that the Data Source Manager would tell me when an account's username/password was invalid, but it would fail or time out when I provided valid credentials.
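    As a side note, a small hedged check (not from the original post) that can help with this kind of endpoint problem: run the query below over a TCP connection to the instance to confirm which port SQL Server is actually listening on, and open exactly that port as the Azure endpoint.

        -- Which local address/port is serving the current connection?
        -- Run while connected over TCP (the columns are NULL for shared-memory
        -- connections); assumes SQL Server 2008 or later.
        SELECT local_net_address, local_tcp_port
        FROM   sys.dm_exec_connections
        WHERE  session_id = @@SPID;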

  • Data source change while migrating report to another server

    Hi all.
    I have actually two questions:
    1) How exactly does the data source connection work in CR? If I create a package in the database and then connect it to my report, what happens at the time the report is created? Does CR use the current package version stored in the database, or the version of the package as it was when it was connected to the report (so that it looks as if CR loaded the code into the report and is resistant to any later changes in the database)? If the second option is correct (I think it is, unfortunately), can this behaviour be changed somehow?
    2) And now: if I create a report and connect it to a data source (a database package) on the development server, and then I just want to move the completed, working report to the test server (and later to other environments), is there any way to change the data source to the test server database automatically? Everything is the same on the DEV and TEST servers and their databases, except the data, of course.
    The prospect of having to open every report after migrating it to another environment and manually change its data source is very unappealing.
    Thanks for answers!

    Hi Filip,
    I don't think I understand the first question.
    For the second one though, if you want your reports to work seamlessly across environments, you should have them connect to the database using an ODBC System DSN.
    Each environment (Dev, Test and Prod) should have the same DSN name, with each DSN pointing to the respective database server.
    -Abhilash

  • Create two or more flash files from one (Excel) data source

    Hi experts,
    I have the following requirement about Xcelsius.
    Our data source is a relational database. Via the ODBC driver we manage to create several queries and execute them into Microsoft Excel. Those Excel sheets are the basis for the Xcelsius reports.
    Now we want to build some highly visual reports on top of that, but the crucial point is this: we don't want all the reports in one Flash file; we need several Flash files, depending on the area of the queries.
    Hence our requirement would be one of the following:
    - create multiple Flash files from one .xlf
    - create multiple .xlf files from the data source
    Another requirement is automatic execution of the process. We don't want a person in between who has to open all the .xlf files and create the Flash reports step by step. What we need is an automatic process.
    Can this requirement be fulfilled somehow?
    Maybe by using the Xcelsius SDK?
    Thanks for any help and comments!
    Sebastian

    Sebastian,
    First, regarding the most important requirement, i.e. automating the process:
    In your case you can achieve this by using XML maps. These will pick up the data automatically whenever the report is refreshed.
    Second, both approaches are valid; however, I would go with the first one.
    1. Create multiple Flash files from one .xlf
         You just need to create one dashboard, add an (invisible) filter on areas, and then export to Flash for every area.
    2. Create multiple .xlf files from the data source
         This approach is also fine; however, you need to create multiple dashboards and do the same thing, i.e. filter the data based on area.
    P.S. Did you get a chance to explore options for integrating Xcelsius directly with your relational database? That would be much more effective.
    -Anil

  • Stored procedure to insert into multiple tables in sql server 2012, using id col from one table to insert into the other 2

    Hi all,
    Apologies if any of the following sounds at all silly, but I am fairly new to this, so here goes...
    I have 3 tables that require data insertion at the same time. The first table is the customers table; I then want to take the automatically generated CustId from that table and insert it into the 2 other tables along with some other data.
    Here's what I have so far which does not work:
    CREATE PROCEDURE CustomerDetails.bnc_insNewRegistration @CustId int,
    @CompanyName varchar(100),
    @FirstName varchar(50),
    @LastName varchar(50),
    @Email nvarchar(254),
    @HouseStreet varchar(100),
    @Town smallint,
    @County tinyint,
    @Postcode char(8),
    @Password nvarchar(20)
    AS
    BEGIN
    begin tran
    insert into CustomerDetails.Customers
    (CompanyName, FirstName, LastName, EmailAddress)
    Values (@CompanyName, @FirstName, @LastName, @Email)
    set @CustId = (select CustId from inserted)
    insert into CustomerDetails.Address
    (CustomerId, HouseNoAndStreet, Town, County, PostCode)
    values (@CustId, @HouseStreet, @Town, @County, @Postcode)
    insert into CustomerDetails.MembershipDetails
    (CustomerId, UserName, Password)
    values (@CustId, @Email, @Password)
    commit tran
    END
    GO
    If anyone could help with this I would very much appreciate it, as I am currently building an online store, and without registration there are no customers.
    So to whom ever is able to help, I thank you whole heartedly :)

    I hope by now it is apparent that statements like "doesn't work" are not particularly helpful. The prior posts have already identified your first problem, but there are others.
    First, you have declared @CustId as an argument for your procedure, but it is obvious that you do not expect a useful value to be supplied when the procedure is executed. Perhaps it should be declared as an output argument so that the caller of the procedure can learn the PK value of the newly inserted customer; otherwise, replace it with a local variable, since it serves no purpose as an input argument.
    Next, you are storing the email address twice. Duplication of data contradicts relational theory and will only cause future problems.
    Next, I get the sense that your "customer" can be a person or a company. You may find that using the same table for both is not the best approach. I hope you have constraints to prevent a company from having a first and last name (and vice versa).
    Next, your error checking is inadequate. We can only hope that you have the appropriate constraints to prevent duplicates. You should expect failures to occur, from basic data errors (duplicates, null values, inconsistent values) to system issues (out of space). I'll leave you with Erland's discussion for more detail:
    erland - error handling.
    Lastly, you should reconsider the datatypes you are using for the various bits of information. Presumably Town and County are foreign keys to related tables, which is why they are numeric; be careful you don't paint yourself into a corner with such small datatypes. One can also debate the wisdom of using separate tables for Town and County (and perhaps the decision to limit yourself to a particular geographic area with a particular civic hierarchy). Password seems a little short to me. And if you are going to use nvarchar for some strings, you might as well use it for everything - especially names. Also, everyone should be security conscious by now: passwords should be encrypted at the very least.
    And one last comment: you really should allow 2 address lines. Yes, two separate ones, and not just one much larger one.
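    Building only on the points above, here is a rough sketch of how the procedure could look with @CustId as an OUTPUT parameter and SCOPE_IDENTITY() instead of reading from the inserted pseudo-table (which only exists inside triggers). Error handling, constraints and the datatype/design concerns discussed above are deliberately left out:

        -- Sketch only: minimal correction of the original procedure, not a
        -- production-ready version (no TRY/CATCH, no duplicate checks).
        CREATE PROCEDURE CustomerDetails.bnc_insNewRegistration
            @CustId      int OUTPUT,
            @CompanyName varchar(100),
            @FirstName   varchar(50),
            @LastName    varchar(50),
            @Email       nvarchar(254),
            @HouseStreet varchar(100),
            @Town        smallint,
            @County      tinyint,
            @Postcode    char(8),
            @Password    nvarchar(20)
        AS
        BEGIN
            BEGIN TRAN;

            INSERT INTO CustomerDetails.Customers (CompanyName, FirstName, LastName, EmailAddress)
            VALUES (@CompanyName, @FirstName, @LastName, @Email);

            -- Key of the customer row just inserted in this scope
            SET @CustId = SCOPE_IDENTITY();

            INSERT INTO CustomerDetails.Address (CustomerId, HouseNoAndStreet, Town, County, PostCode)
            VALUES (@CustId, @HouseStreet, @Town, @County, @Postcode);

            INSERT INTO CustomerDetails.MembershipDetails (CustomerId, UserName, Password)
            VALUES (@CustId, @Email, @Password);

            COMMIT TRAN;
        END
        GO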

  • Using several data connections in one SSIS data source?

    I am loading data from several SQL Server 2012 databases into a data mart. Currently, I am simply using SQL sources and destinations for that. However, some of the queries require JOINs between the source databases. Therefore I currently refer to the databases within the command text of the SQL source, e.g.
        SELECT t1.Field, t2.Field
        FROM [server1].[db1].[dbo].[table1] t1
        JOIN [server1].[db2].[dbo].[table1] t2 ON t1.Table1Id = t2.RemoteTable1Id;
    This is flawed, since all the SQL commands have to be changed if the database or server names change. It would be far better to refer to connections that can be changed at the package level, and my question is how to get that done.
    One way I can think of is to use several SQL source queries and join the data with SSIS join operations. For that, all the current data flows would need to be reworked, and I am unsure how joining large databases that way performs compared to a single T-SQL query.
    The other approach would be some dynamic SQL, which I want to avoid whenever possible. However, if it were possible to use a parameter to insert the server/database into the SQL command somehow, then that could be an option.

    What you're doing now is a bad practice inside SSIS. You should be using data flow tasks for this when data comes from different sources, with connection managers pointing to each of them. You can add these connection properties as configuration items, which will enable you to change them from outside the package based on your environment.
    If performance is your concern, you can go for a staging approach where you bring in only the deltas (changes) on a daily basis and then compare them with the destination tables using MERGE or set-based T-SQL. You need to have audit columns in your tables for this, though.
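    As a rough illustration of the MERGE suggestion above (table and column names are placeholders, not from the original post, and it assumes a data flow task has already landed the deltas in a staging table):

        -- Sketch only: apply staged deltas to the mart table in one set-based pass.
        MERGE dm.Table1 AS tgt
        USING stg.Table1 AS src
              ON tgt.Table1Id = src.Table1Id
        WHEN MATCHED AND tgt.SomeField <> src.SomeField THEN
            UPDATE SET tgt.SomeField = src.SomeField
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (Table1Id, SomeField)
            VALUES (src.Table1Id, src.SomeField);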
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • One section data of adobe form is not read by web dynpro code

    Hi,
    I am facing a problem with an Adobe form integrated with Web Dynpro. The form contains JavaScript. There is a section in the form related to the settlement rule of a WBS element; it can be added to the form by pressing the Add Record button.
    When I use this form offline and press Add Record, another section is added. I fill in the data in those two sections and upload the form to the portal for online processing; the data in the second section is not read by the code, whereas the data in the first section is present.
    But if the same activity is performed online, the data is read properly by the code.
    Could you please advise whether there is any issue with saving data in the form offline, and how this is handled in an offline form?
    Thanks.

    Hi,
    Try this.
    I think you have a corresponding context node in the Web Dynpro component. Using 'APPEND INITIAL LINE TO', just add some blank rows to the context.
    Inside the form, create a table for the context node, so that at runtime all the empty entries are displayed. Make this table hidden.
    Create another table that displays all the non-empty rows from the hidden table above. Whenever rows are added or deleted, the corresponding entries will update the hidden table; alternatively, you can update the hidden table on the click event of SUBMIT.
    Hope this helps.
    Regards,
    Shaira

  • Question: one weblogic server listening on several port

    Can I start one WebLogic server that listens on several ports, one for each application?
    For example,
    7001 for general users, and
    7005 for administrators, with two-way authentication required?
    Can I do this, or do I have to start two WebLogic instances? Does that violate the license for one computer and one IP address?
    Thank you.

    Ummm.. how would this help security? If I want to bypass authentication, I just go to the unprotected port.
    I don't think you can listen on different ports in the same instance. You can listen on different IP addresses in the same instance.
    WL is licensed by CPU so this would not cost any more to license.
    mike
    "Gong Wenxue" <[email protected]> wrote:
    can i start one weblogic server that listening on several port, one for
    different application?
    for example,
    7001 for general user, and
    7005 for admistrators and ask for two way authentification?
    can i do this? or do i have to start two weblogic instance? does that
    violate it's license for one computer and one ip address?
    thank u.

  • Weblogic Server 9.2 not starting

    Hello All,
    Our WebLogic Server 9.2 admin server is not starting. This is an admin instance, and we are having issues with the managed server startup as well.
    We get the following error message. I have also read another thread on this forum for a different version of WebLogic, but nothing seems to be working. Currently the log says that some LDAP files inside the ldap directory, as well as the log files, are owned by root, and when we try to start the server with the dmadmin user, those files are not accessible.
    Our problem is:
    (1) It is a production box, so we have to be careful when changing the ownership of the files, if required, from root to dmadmin.
    (2) If we change the ownership or delete the LDAP files, how will it impact the production environment?
    (3) We have a test environment where all these files are owned by the dmadmin user, but in production somehow a few of them are owned by root and the server is not starting up, since we are using a dmadmin session to start it.
    I am pretty new to WebLogic Server, so I don't know how to solve this. Please share your thoughts.
    JAVA Memory arguments: -Xms256m -Xmx512m
    WLS Start Mode=Production
    CLASSPATH=:/documentum/bea9.2/patch_weblogic920/profiles/default/sys_manifest_classpath/weblogic_patch.jar:/documentum/bea9.2/jdk/lib/tools.jar:/documentum/bea9.2/weblogic/server/lib/weblogic_sp.jar:/documentum/bea9.2/weblogic/server/lib/weblogic.jar:/documentum/bea9.2/weblogic/server/lib/webservices.jar::/documentum/bea9.2/weblogic/common/eval/pointbase/lib/pbclient51.jar:/documentum/bea9.2/weblogic/server/lib/xqrl.jar::
    PATH=/documentum/bea9.2/weblogic/server/bin:/documentum/bea9.2/jdk/jre/bin:/documentum/bea9.2/jdk/bin:/oracle/app/oracle/product/10.2.0.3/bin:/documentum/product/6.0/convert:/documentum/java/1.5.0_00/bin:/documentum/product/6.0/bin:/documentum/dba:/usr/bin:/etc:/usr/sbin:/usr/ucb:/home/dmadmin/bin:/usr/bin/X11:/sbin:.
    * To start WebLogic Server, use a username and *
    * password assigned to an admin-level user. For *
    * server administration, use the WebLogic Server *
    * console at http://hostname:port/console *
    starting weblogic with Java version:
    Starting WLS with line:
    /documentum/bea9.2/jdk/bin/java -Xms256m -Xmx512m -Dcom.sun.xml.namespace.QName.useCompatibleSerialVersionUID=1.0 -da -Dplatform.home=/documentum/bea9.2/weblogic -Dwls.home=/documentum/bea9.2/weblogic/server -Dwli.home=/documentum/bea9.2/weblogic/integration -Dweblogic.management.discover=true -Dwlw.iterativeDev=false -Dwlw.testConsole=false -Dwlw.logErrorsToConsole= -Dweblogic.ext.dirs=/documentum/bea9.2/patch_weblogic920/profiles/default/sysext_manifest_classpath -Dweblogic.Name=adminServer -Djava.security.policy=/documentum/bea9.2/weblogic/server/lib/weblogic.policy weblogic.Server
    <Apr 17, 2012 10:11:44 PM EDT> <Notice> <WebLogicServer> <BEA-000395> <Following extensions directory contents added to the end of the classpath:
    /documentum/bea9.2/weblogic/platform/lib/p13n/p13n-schemas.jar:/documentum/bea9.2/weblogic/platform/lib/p13n/p13n_common.jar:/documentum/bea9.2/weblogic/platform/lib/p13n/p13n_system.jar:/documentum/bea9.2/weblogic/platform/lib/wlp/netuix_common.jar:/documentum/bea9.2/weblogic/platform/lib/wlp/netuix_schemas.jar:/documentum/bea9.2/weblogic/platform/lib/wlp/netuix_system.jar:/documentum/bea9.2/weblogic/platform/lib/wlp/wsrp-common.jar>
    <Apr 17, 2012 10:11:45 PM EDT> <Info> <WebLogicServer> <BEA-000377> <Starting WebLogic Server with IBM J9 VM Version 2.3 from IBM Corporation>
    <Apr 17, 2012 10:11:47 PM EDT> <Info> <Management> <BEA-141107> <Version: WebLogic Temporary Patch for CR322044 Tue Jul 31 12:48:52 IST 2007
    WebLogic Temporary Patch for CR334842 Thu Aug 2 14:45:06 2007
    WebLogic Temporary Patch for CR284142 Tue Oct 10 16:41:47 2006
    WebLogic Temporary Patch for CR299086 Wed Nov 15 18:45:32 2006
    WebLogic Temporary Patch for CR276285 Thu Nov 16 10:27:34 2006
    WebLogic Server 9.2 Tue Dec 12 14:26:03 PST 2006 876043 >
    <Apr 17, 2012 10:11:54 PM EDT> <Info> <WebLogicServer> <BEA-000215> <Loaded License : /documentum/bea9.2/license.bea>
    <Apr 17, 2012 10:11:54 PM EDT> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to STARTING>
    <Apr 17, 2012 10:11:54 PM EDT> <Info> <WorkManager> <BEA-002900> <Initializing self-tuning thread pool>
    <Apr 17, 2012 10:11:54 PM EDT> <Error> <Log Management> <BEA-170020> <The server log file ./servers/adminServer/logs/adminServer.log could not be opened successfully.>
    <Apr 17, 2012 10:11:56 PM EDT> <Error> <EmbeddedLDAP> <000000> <Error opening the Transaction Log: ./servers/adminServer/data/ldap/ldapfiles/EmbeddedLDAP.tran (Permission denied)>
    <Apr 17, 2012 10:11:56 PM EDT> <Error> <EmbeddedLDAP> <000000> <Error Instantiating 'dc=DctmDomain': null>
    <Apr 17, 2012 10:11:56 PM EDT> <Critical> <EmbeddedLDAP> <BEA-171522> <An error occurred while initializing the Embedded LDAP Server. The exception thown is java.lang.ClassCastException: com.octetstring.vde.backend.BackendRoot incompatible with com.octetstring.vde.backend.standard.BackendStandard. This may indicate a problem with the data files for the Embedded LDAP Server. If the problem is with the data files and it can not be corrected, backups of previous versions of the data files exist in ./servers/adminServer/data/ldap/backup.>
    <Apr 17, 2012 10:11:56 PM EDT> <Critical> <WebLogicServer> <BEA-000362> <Server failed. Reason:
    There are 1 nested errors:
    java.lang.ClassCastException: com.octetstring.vde.backend.BackendRoot incompatible with com.octetstring.vde.backend.standard.BackendStandard
         at weblogic.ldap.EmbeddedLDAP.start(EmbeddedLDAP.java:273)
         at weblogic.t3.srvr.SubsystemRequest.run(SubsystemRequest.java:64)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:181)
    >
    <Apr 17, 2012 10:11:56 PM EDT> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to FAILED>
    <Apr 17, 2012 10:11:56 PM EDT> <Error> <WebLogicServer> <BEA-000383> <A critical service failed. The server will shut itself down>
    <Apr 17, 2012 10:11:56 PM EDT> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to FORCE_SHUTTING_DOWN>

    Thanks Arun, your reply is helpful. How can I identify whether I am using the node manager or not? We use the following script to start the server:
    #this script will start Documentum Application
    . /home/dmadmin/.profile
    nohup /documentum/bea9.2/domains/DctmDomain/startWebLogic.sh > /documentum/dba/log/Weblogic/StartWL.log &
    sleep 90
    nohup /documentum/bea9.2/domains/DctmDomain/startMethodServer.sh >/documentum/dba/log/Weblogic/startMethodServer.log &
    sleep 90
    These are the files currently owned by root. I have asked the Unix team to change the ownership, because I cannot even delete the files inside the ldap directory that are owned by root. Then I will try to restart the server:
    <Domain_Home>\data\ldap\ldapfiles\EmbeddedLDAP.tran
    <Domain_Home>\logs\adminServer.log
    <Domain_Home>\logs\DctmDomain.log
    Kindly suggest whether we are on the right track, and please advise about the node manager.
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    One more thing: if you look at our script -
    #this script will start Documentum Application
    . /home/dmadmin/.profile
    nohup /documentum/bea9.2/domains/DctmDomain/startWebLogic.sh > /documentum/dba/log/Weblogic/StartWL.log &
    sleep 90
    nohup /documentum/bea9.2/domains/DctmDomain/startMethodServer.sh >/documentum/dba/log/Weblogic/startMethodServer.log &
    sleep 90
    We start the Method Server after the WebLogic server, and we get a nested error saying it could not obtain the lock on
    <Domain_Home>\tmp\DctmServer_MethodServer.lok
    because the server might already be running.
    Thanks for your help

  • How to avoid JMS validation when starting weblogic server

    Hi All,
    When WebLogic Server starts up, it validates the JMS destinations of the deployed applications one by one.
    If I am not connected to the VPN, these JMS destinations are not reachable, and WebLogic Server spends a lot of time trying to connect to them.
    Thus it takes a long time to start up WebLogic Server.
    How can I disable JMS validation when starting weblogic server?
    Thanks and Regards!

    Hi Daniel,
    By blank do you mean that the screen is black? Is it gray? Is it blue? The word "blank" is vague when trying to determine and isolate startup issues.

  • Weblogic Server Installation in Master/Slave

    Hi All,
    I have two Linux servers dedicated to WebLogic Server, and I want to configure WebLogic Server on both machines so that one of them becomes the master and the other becomes the slave.
    I have searched a lot but couldn't find adequate information. Could someone please shed some light on how to install and configure WebLogic Server in a master/slave pattern?
    I appreciate any help.
    Thanks,
    Sanjay

    I think you need to do some reading on how domains are organized, with one Admin Server and zero to many Managed Servers. The Managed Servers can be grouped into clusters if you want. The servers can be spread across multiple machines. Check out the following link, in particular the detail under clustering, and respond if you have a more precise question.
    http://edocs.beasys.com/wls/docs103/domain_config/understand_domains.html#wp1101973

  • Weblogic server 10.0

    Hi,
    I want WebLogic Server 10.0 MP1. Can anyone tell me how to download it? I can only find versions from 10.3 onwards.
    Regards,
    Haranadh

    You can use the Advanced section of the Logging tab of your server. There you find an attribute called 'Date Format Pattern'. By default this is set to MMM d, yyyy h:mm:ss a z.
    It uses a date-format pattern string which conforms to the specification of the java.text.SimpleDateFormat class, so you can use 'S' for milliseconds (also see the Javadocs: http://java.sun.com/j2se/1.4.2/docs/api/java/text/SimpleDateFormat.html).

  • Weblogic server console error.

    Hi,
    I am working on WebLogic Server. I have only one domain, and in that domain I have one managed server and one admin server.
    I can start the managed server from the command line but not from the WebLogic console; it shows as being in the SHUTDOWN state. Please help; what should I do? Please suggest some key areas to look at for this error. I need to deploy an application on the managed server from the console.

    Thanks for the reply.
    1) Is there any way to start a managed server from the admin console without configuring the node manager?
    2) In my case I have started the managed server from the command line.
    3) So do I have to configure the node manager to start the managed server from the console?
