Monitoring/debugging the database connection process

Hi,
I am trying to define a new database connection using the JDBC-ODBC Bridge (to connect to a legacy application), and I keep getting an error message from the ODBC driver manager, even though the same connection works perfectly fine outside the JDeveloper environment.
Is there a way for me to monitor/debug the database connection process, prior to a successful test connection, in order for me to find out exactly where/what the problem is?
Thanks for the help,
Arie.

Repost

Similar Messages

  • Timeout: Tuxedo kills the service but not the database connection

    Hi all,
    I am experiencing some performance problems on my system due to an inefficient SQL statement and improper Tuxedo timeout handling.
    A service is using a "problematic" SQL (we will tune it, but that is not the main problem). Sixty seconds after the execution starts, Tuxedo kills the service for a timeout.
    At this point I would like Tuxedo to notify the DB2 database as well, so that it stops processing the SQL. Instead, the SQL continues running on the database (even though the service has been killed), and this produces a gradual slowdown in performance.
    In the UBBCONFIG, we are using a timeout configuration like the following:
    *RESOURCES
    SCANUNIT 5
    SANITYSCAN 6
    BLOCKTIME 12
    *SERVICES
    DEFAULT: SVCTIMEOUT=45
    service1 SVCTIMEOUT=60 TRANTIME=60
    service2 SVCTIMEOUT=60 TRANTIME=60
    Note: not all the services are listed in the .SERVICES section and we are using the default NOTIFY as well as an OPENINFO.
    Can you please help me find a configuration that stops both the service and the SQL it started on the database?
    Thanks in advance,
    Benedetto

    Hi Benedetto.
    First of all, Tuxedo doesn't kill services, it kills servers. Your UBBCONFIG file specifies three timeouts, BLOCKTIME, SVCTIMEOUT, and TRANTIME.
    BLOCKTIME specifies how long a Tuxedo API that needs a response will wait for that response. If the response isn't received in that period of time, Tuxedo will return TPETIME to the caller. As with any failure, if the request was part of a transaction, the transaction is marked rollback only. Note, this timeout does not affect the request, whether sitting in a server's IPC queue or currently executing in a server.
    SVCTIMEOUT is a much more severe timeout and determines how long Tuxedo will allow a service implementation to execute. If a service implementation doesn't reply within the SVCTIMEOUT period, Tuxedo will issue an OS level KILL request to kill the process. If the server is marked restartable, Tuxedo will then try to restart the server assuming none of the restart limits have been reached. Killing the server causes the request to be lost within the server so the caller will stay blocked until BLOCKTIME is reached at which point the above actions will take place.
    TRANTIME is the amount of time Tuxedo allows a transaction to remain active and viable. When this period expires, Tuxedo will mark the transaction as timed out with the only option being rollback. As well, Tuxedo aborts any API requests that would normally cause messaging to occur, i.e., making a tpcall() within a timed out transaction will fail without any attempt to call the service.
    So in your case, the issue is partially that you have the values of your timeouts in somewhat reverse order. Typically we see BLOCKTIME being the smallest value, with TRANTIME typically larger than BLOCKTIME, and SVCTIMEOUT larger still, although there are good reasons for exceptions to this guideline. Part of the reasoning behind this is that killing a server is a significant thing, and it's usually best to try to let the server complete whatever it's doing, even if the work has been timed out due to BLOCKTIME or TRANTIME, since the cost of killing and restarting a server is significant.
    Tuxedo will notify the database of the transaction status when the application finally issues a tpcommit() or a tpabort() request, but not until then. However, if SVCTIMEOUT is hit, killing the server should cause the database connection to be lost.
    If you could describe the behavior you are seeing and the relevant portions of your ULOG we can try to make some sense of what is happening.
    Regards,
    Todd Little
    Oracle Tuxedo Chief Architect
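    As a rough sketch of the ordering Todd describes (the values below are placeholders, not tuned recommendations, and note that if I read the UBBCONFIG semantics correctly, BLOCKTIME is expressed in SCANUNIT multiples, so SCANUNIT 5 with BLOCKTIME 12 already works out to roughly 60 seconds, the same as the original SVCTIMEOUT), the configuration could be rearranged so that BLOCKTIME < TRANTIME < SVCTIMEOUT:
    *RESOURCES
    SCANUNIT 5
    SANITYSCAN 6
    # smallest timeout: the caller gets TPETIME after roughly 30 seconds
    BLOCKTIME 6
    *SERVICES
    # largest timeout: killing the server is the last resort
    DEFAULT: SVCTIMEOUT=120
    service1 SVCTIMEOUT=120 TRANTIME=60
    service2 SVCTIMEOUT=120 TRANTIME=60
    With this ordering the caller times out first, the transaction times out next, and the server is killed only if the service is badly stuck.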

  • Closing the database connection after report in a server application

    I searched these forums and the internet for a definitive answer on asking the Crystal SDK for Java to close the JDBC connection after it has generated a report. We have been using the Crystal Report SDK to generate reports within our JEE application, built upon the Spring framework, for the past two years. It works well, especially if you prepare views in the database for your reports.
    From what I can tell, once you have used ReportClientDocument to create your report you call the close() method to release resources associated with report generation, but this does not close the JDBC database connection.
    Further research states that if you are using the CrystalReportViewer you can call the dispose method to close the database connection.  We are not using JSP nor this class, so that does us little good.
    Finally I found a post that one could call ((AdvancedReportDocument)reportClientDocument.getReportSource()).dispose().  This doesn't drop the connection either.
    Each report actually opens 3 connections according to SQL Server. Each report will reuse the connections it has open, so for 50 reports we could theoretically have up to 150 connections. We explained to our client that those connections remain inactive; however, this is unacceptable to our client, as they would like to minimize the number of connections left open to their database.
    If anyone can post any further information on this issue, it is much appreciated.

    Yes, another team member found the issue. Quite embarrassing, really, that I didn't see it. I was looking for the answer within Crystal's libraries, but it had nothing to do with Crystal.
    The developer who wrote the helper code for using Crystal first opened a connection to the datasource for the live production database and read that connection information for the report. Next he set that connection information in the report template's PropertyBag, then ran the report. The developer, however, forgot to close the connection he used to look up the connection info, leaking a connection on every report and eventually using up all the available connections.
    I'm glad you inquired.  I forgot to post the resolution here.
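    A minimal sketch of the kind of fix described above (the helper names here are hypothetical; the actual helper code is not shown in this thread): the connection used only to look up the datasource information is closed in a finally block so it can never leak, regardless of what the report generation does afterwards.
    import java.sql.Connection;
    import javax.sql.DataSource;
    public class ReportConnectionInfoHelper {
        // Reads the connection information the report template needs,
        // making sure the lookup connection itself is always closed.
        public static String readConnectionInfo(DataSource dataSource) throws Exception {
            Connection lookupConnection = dataSource.getConnection();
            try {
                // Use the connection only to obtain the info that will be
                // placed into the report template's PropertyBag.
                return lookupConnection.getMetaData().getURL();
            } finally {
                // This close was the missing step that leaked connections.
                lookupConnection.close();
            }
        }
    }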

  • Error while applying a patch "Unable to get the database connection"

    Dear Experts,
    A patch which had been applied successfully is now failing in Production, and the error is kind of surprising to me.
    Apps version is 11.5.10.2
    db version is 10.2.0.4
    The worker log file shows
    Time when worker restarted job: Thu Nov 24 2011 22:14:52
    Start time for file is: Thu Nov 24 2011 22:14:52
    adjava -ms128m -mx256m -nojit oracle.apps.fnd.odf2.FndXdfCmp &un_apps &pw_apps &un_apps &pw_apps &jdbc_protocol &jdbc_db_addr table &fullpath_pa_patch/115
    Reading product information from file...
    Reading language and territory information from file...
    Reading language information from applUS.txt ...
      Temporarily resetting CLASSPATH to:
      "/erp/oracle/prodappl/ad/11.5.0/java/adjri.zip:/usr/java14/jre/lib/charsets.jar:/usr/java14/jre/lib/core.jar:/usr/java14/jre/lib/graphics.jar:/usr/java1
      Calling /usr/java14/bin/java ...
    Exception occured
                  Copyright (c) 2003 Oracle Corporation
                     Redwood Shores, California, USA
             XDF(XML Object Description File) Comparison Utility
                            Version 1
    NOTE: You may not use this utility for custom development
          unless you have written permission from Oracle Corporation.
    Unable to get the database connection using schema username/password
    Io exception: The Network Adapter could not establish the connection
    AD Run Java Command is complete.
                         Copyright (c) 2002 Oracle Corporation
                            Redwood Shores, California, USA
                                        AD Java
                                     Version 11.5.0
    NOTE: You may not use this utility for custom development
          unless you have written permission from Oracle Corporation.
    AD Worker error:
    The above program failed.  See the error messages listed
    above, if any, or see the log and output files for the program.
    Time when worker failed: Thu Nov 24 2011 22:14:53
    The error says database connection error.  I am able to connect to the database using sqlplus.  I tried to restart the failed worker, but the same error is repeating. 
    Any help would be appreciated.
    Thanks
    qARS
    Edited by: user7640966 on Nov 24, 2011 9:07 AM

    Hussein,
    One thing I noticed now is that on the apps tier, the tnsnames.ora under $TNS_ADMIN shows an entry like this:
    PROD=
            (DESCRIPTION=
                    (ADDRESS=(PROTOCOL=tcp)(HOST=<appsServerName>)(PORT=1521))
                    (ADDRESS=(PROTOCOL=tcp)(HOST=<dbServer>)(PORT=1521))
                (CONNECT_DATA=
                    (SID=PROD)
                )
            )
    Actually, the (ADDRESS=(PROTOCOL=tcp)(HOST=<appsServerName>)(PORT=1521)) line in the tnsnames.ora is not correct. In fact, the appsServerName should be replaced with the dbServerName.
    I deleted the tnsnames.ora and reran autoconfig but it is again recreating the same entry.
    Any clue how this can be fixed?
    Thanks
    qARS
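    For illustration only (a sketch reusing the placeholder names from this thread, not verified autoconfig output), the corrected entry would point the address at the database server:
    PROD=
            (DESCRIPTION=
                    (ADDRESS=(PROTOCOL=tcp)(HOST=<dbServerName>)(PORT=1521))
                    (CONNECT_DATA=
                            (SID=PROD)
                    )
            )
    Since autoconfig regenerates tnsnames.ora, the lasting fix is usually to correct the host value in the source that autoconfig reads (the context file) rather than hand-editing the generated file.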

  • Why do I get a class conflict between the Prepare SQL.vi and the Get Column Name.vi with the SQL Toolkit compatibility vis from the Database Connectivity Toolkit?

    I have done extensive programming with the SQL Toolkit with LabVIEW versions through 6.1. My customer now wants to upgrade to Windows 7, so I am trying to upgrade to LabVIEW 2009 (my latest purchased version) using the Database Connectivity Toolkit and the SQL Toolkit compatibility VIs. Everything seemed to be going okay with the higher-level SQL operations, but I ran into trouble with the Get Column Name.vi.
    The pictures below show the problem. The original SQL Toolkit connected the Prepare SQL.vi with the Get Column Name.vi with a cluster of two references, one for connection and one for SQL. The new compatibility VIs have a class conflict in the wire because the Prepare SQL.vi contains a cluster with connection and command references, but the Get Column Name.vi expects a cluster with connection and recordset references.
    How do I resolve this conflict?
    Thank You.
    Dan

    I've never worked with the old version of the toolkit, so I don't know how it did things, but looking inside the Prepare SQL VI, it only generates a command, and the Get Column Name VI wants a recordset. I'm not super familiar with all the internals of ADO, but my understanding is that this is standard: you only have the columns after you execute the command and get the recordset back. What you can apparently do here is insert the Execute Prepared SQL VI in the middle, and that will return what you need.
    I'm not sure why it worked before. Maybe the execute was hidden inside the prep VI, or maybe you can get the column names out of the command object before execution. In general, I would recommend considering switching to the newer VIs.
    Try to take over the world!

  • Problem in getting the database connection from a connection pool

    Hai All,
    I am facing a problem in getting a database connection from a connection pool created on WebLogic Server 8.1.
    I am using Oracle database 8.1.7.
    I have configured my connection pool, data source, and JNDI in WebLogic.
    In my Java program I have the following code to retrieve the connection:
    import java.sql.*;
    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.InitialContext;
    class jdbcshp1 {
        public static void main(String[] args) {
            Connection connection = null;
            try {
                Hashtable ht = new Hashtable();
                ht.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory"); // Want to get rid of this.
                ht.put(Context.PROVIDER_URL, "t3://localhost:7001"); // Want to get rid of this.
                // Get a context for the JNDI look-up
                Context ctx = new InitialContext(ht);
                javax.sql.DataSource ds = (javax.sql.DataSource) ctx.lookup("myjndi1");
                // Create a connection object from the pooled data source
                connection = ds.getConnection();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
    The above code is working fine, but the two ht.put statements are creating a problem.
    The problem is that after packaging the application into a WAR file, it can be deployed on any machine, or on a different port on the same machine. My application fails if it is deployed on a WebLogic Server that runs on a different port.
    Is there any way I can get rid of those ht.put statements, or any other way to solve the problem?
    Any help is appreciated.
    Thanks in advance
    Pooja.

    Hai All,
    Firstly, thanks for your reply.
    I have even seen some code which uses the InitialContext constructor without any parameters and works fine.
    I don't understand why it's not working for my code.
    When I remove those ht.put calls and use the InitialContext constructor without any parameters, it gives an error.
    Context ctx = new InitialContext();
    javax.sql.DataSource ds = (javax.sql.DataSource) ctx.lookup ("ocjndi");
    connection = ds.getConnection();
    The error is as follows:
    javax.naming.NoInitialContextException: Need to specify class name in environment or system property, or as an applet parameter, or in an application resource file: java.naming.factory.initial
    The above error is forcing me to include that code, but if the port number is changed the code will not work. Please let me know if some setting has to be made.
    I appreciate all your valuable help.
    Thanks once again.
    Pooja.
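    One way to avoid hard-coding the factory class and URL (a sketch using the standard JNDI mechanism, nothing WebLogic-specific): put the two values in a jndi.properties file on the classpath, one copy per environment, so the no-argument InitialContext() constructor can find them. Code that runs inside the target WebLogic server itself normally needs no properties at all, because the container supplies the JNDI environment.
    # jndi.properties, placed on the application classpath
    java.naming.factory.initial=weblogic.jndi.WLInitialContextFactory
    java.naming.provider.url=t3://localhost:7001
    Inside the same try block as in the code above, the Java code then stays environment-neutral:
    // Reads jndi.properties from the classpath, so no Hashtable is needed.
    Context ctx = new InitialContext();
    javax.sql.DataSource ds = (javax.sql.DataSource) ctx.lookup("myjndi1");
    connection = ds.getConnection();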

  • When using the Database Connectivity Toolset, reads and writes with long binary fields are incompatible.

    I am trying to write LabVIEW Variants to long binary fields in a .mdb file using the Database Connectivity Toolset. I get errors when trying to convert the field back to a variant after reading it back from the database.
    I next tried flattening the variant before writing it and ultimately wound up doing the following experiments:
    1) If I use DB Tools Insert Data to write an ordinary string and read it back using a DB Tools Select Data, the string is converted from ASCII to Unicode.
    2) If I use DB Tools Create Parameterized Query to do an INSERT INTO or an UPDATE operation, specifying that the data is BINARY, then read it back using a DB Tools Select Data,
    the length of the string is prepended to the string itself as a big-endian four-byte integer.
    I can't think of any way to do a parameterized read, although the mechanism exists to return data via parameters.
    Presuming that this same problem affects Variants when they are written to the database and read back, I could see why I get an error. At least with flattened strings I have the option of discarding the length bytes from the beginning of the string.
    Am I missing something here?

    David,
    You've missed the point. When a data item is flattened to a string, the first four bytes of the string are expected to be the total length of the string in big-endian binary format. What is happening here is that preceding this four-byte length code is another copy of the same four bytes. If an ordinary string, "abcdefg", is used in place of the flattened data item, it will come back as <00><00><00><07>abcdefg. Here I've used <xx> to represent a byte in hexadecimal notation. This problem has nothing to do with flattening and unflattening data items. It has only to do with the data channel, that is, writing to and reading from the database.
    I am attaching three files that you can use to demonstrate the problem. The VI file contains an explanation of the problem and instructions for installing and operating the demonstration.
    Ron Martin
    Attachments:
    TestLongBinaryFields.vi ‏132 KB
    Sample.UDL ‏1 KB
    Sample.mdb ‏120 KB
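    For anyone hitting the same symptom outside LabVIEW, here is a small plain-Java illustration (not part of the toolkit) of what a big-endian four-byte length prefixed to the data looks like and how it can be stripped before further processing:
    import java.nio.ByteBuffer;
    import java.util.Arrays;
    public class LengthPrefixDemo {
        public static void main(String[] args) {
            // What comes back from the database: a 4-byte big-endian length
            // code followed by the original payload "abcdefg".
            byte[] fromDb = {0x00, 0x00, 0x00, 0x07, 'a', 'b', 'c', 'd', 'e', 'f', 'g'};
            int length = ByteBuffer.wrap(fromDb, 0, 4).getInt();          // 7
            byte[] payload = Arrays.copyOfRange(fromDb, 4, 4 + length);   // "abcdefg"
            System.out.println(length + ": " + new String(payload));
        }
    }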

  • I am using the database connectivity toolkit to retrieve data using a SQL query. The database has 1 million records, I am retrieving 4000 records from the database and the results are taking too long to get, is there any way of speeding it up?

    I am using the "fetch all" vi to do this, but it is retrieving one record at a time, (if you examine the block diagram) How can i retrieve all records in a faster more efficient manner?

    If this isn't faster than your previous method, then I think you found the wrong example. If you have the Database Connectivity Toolkit installed, go to the LabVIEW Help menu and select "Find Examples". It defaults to searching by task, so open "Communicating with External Applications" and "Databases" and open the Read All Data example. The List Column Names VI just gives the correct header to the resulting table and is a fast operation. That's not what you are supposed to be looking at ... it's the DBTools Select All Data subVI that is the important one. If you open it and look at its diagram, you'll see that it uses a completely different set of ADO methods and properties to retrieve all the data.

  • The Database Connection could not be found

    Hi guys,
    we did a completely new installation of EPM 11.1.2.1 for a testing environment.
    We did the installation one-to-one to match our production system.
    But on TEST we have a problem creating database connections:
    Tools > Database Connection Manager > New Database Connection > Fails:
    8001: The Database Connection could not be found: -15da10bd_1378bcfb29f_-7d2c
    Any ideas?
    I am Admin and have all privileges.
    The problem exists for Essbase connections and HFM, too.
    Regards,
    Bernd

    Are you using a NAS share for storing reporting and analysis files? I mean the RM1 folder?

  • Where to put break point when debugging the inbound IDOC processing

    Hi, Dear Experts,
    If an IDoc has an error and I want to debug it, in which program or function module should I put a breakpoint when debugging inbound IDoc processing?
    Thank you so much!
    Helen

    It depends on whether you have a custom FM or a standard FM... Is it an MM invoice or an FI invoice? ... You can find your FM by going to the partner profile (WE20) for your sender partner and partner function. Drill down to your partner and the message type and find the inbound process code. Double-clicking on the process code will help you find the FM...
    If you don't have access to WE20, you can also put a breakpoint into any of the IDOC_INPUT_INVOIC* FMs and see if it gets triggered (assuming your IDoc uses a standard FM)...

  • Dynamic Portlet: Setting the database connection

    With OmniPortlet, I can set the database connection I wish to use for that instance of the portlet in the configuration, but I can't seem to find a way to accomplish this with Dynamic Page portlets. How can I configure which server/user the portlet is using?
    Thanks,
    Rick

    Damn. That is exactly what I was afraid of. We'll have dblinks all over the place!
    Is there a way to create multiple instances of the Dynamic Page Portlet that use different database providers, or is that impossible? (I'm somewhat new to Portal, so forgive me if that is an ignorant question)
    Thanks for the help!

  • Where do I put the .jar file for the database connection?

    I am trying to connect to an Oracle 11g database. I see under the ColdFusion settings summary that the Java version is 1.7.0_55, so I downloaded the ojdbc7.jar file from Adobe.com. Now the question is: where do I put it so that ColdFusion can access it?
    Of course I need to set up the database connection (for which I still don't have the connection string), but I still need to know where to put the ojdbc.jar file.
    Thanks.

    jasonwryan wrote: What does the man page say?
        The program has builtin defaults and temperature thresholds but users can specify their own settings in the configuration files /etc/default/i8kmon and ~/.i8kmon. The daemon defines 4 states with different fan speeds ({0 0}, {1 0}, {1 1}, {2 2}) and for each state are defined the temperature thresholds which cause the switching to a higher or lower state. Furthermore each state can have different thresholds for operation on ac power or battery. For example the following configuration:
    I've put the following file in /etc/default/i8kmon:
    # Run as daemon, override with --daemon option
    set config(daemon) 0
    # Automatic fan control, override with --auto option
    set config(auto) 1
    # Report status on stdout, override with --verbose option
    set config(verbose) 1
    # Status check timeout (seconds), override with --timeout option
    set config(timeout) 1
    # Temperature thresholds: {fan_speeds low_ac high_ac low_batt high_batt}
    set config(0) {{-1 0} -1 50 -1 50}
    set config(1) {{-1 1} 50 70 50 70}
    set config(3) {{-1 2} 70 128 70 128}
    But it doesn't seem to do anything. I'm running a Dell Inspiron 3521 with no dedicated GPU, so it only has one fan. I was somehow able to get i8kmon to work in Ubuntu by putting this config in /etc/i8kmon, but can't get it to work in Arch. Putting it in /etc/i8kmon and /etc/default/i8kmon doesn't do anything.

  • Workspace Error 8001: The Database Connection could not be found

    We had to reboot all our Windows servers recently, and since then all the database connections in the database connection manager in Workspace are gone.
    I can see all the existing reports but cannot use any of them, as I don't have any connections.
    Also, I cannot create any new connections.
    I get the error 8001: The Database Connection could not be found: 13cbd63_12b5e2a8b73_-7f8f when I try to create a new connection.
    Not sure what the problem is.
    Any suggestions?

    Hi Pat,
    It looks like your BI+ relational DB is corrupted; try these steps:
    1. De-register BI+ from Shared Services (Config Utility->Hyperion Reporting and Analysis > Deregister from Shared Services)
    2. Configure against the backup database (Drop and recreate tables option).
    3. Re-register BI+ with Shared Services.
    4. Re-deploy BI+.
    5. Test Workspace and add new connections.
    Regards.

  • How to work offline without the database connection

    The first thing that makes me dislike this tool is that it is not possible to work offline, or at least I could not easily find out how.
    Why should I need a database connection to write PL/SQL, SQL, or whatever?

    Thanks Duncan,
    I want exactly what I said: to have the ability to modify the PL/SQL or SQL code, stored of course in the local file system or some kind of version control system.
    This came up because I was at home and could not create a simple file just to hold the main methods and SQL I wanted while not having a connection.
    I do not want to use 3 different tools to do simple things. I expect this tool to do everything meaningful for me, as I have been working with Oracle for more than 10 years and know exactly how robust Oracle tools usually are, covering more than simple things.
    Offline work is more than a simple requirement.
    Source control is more than a simple requirement, as SQL Developer is extracted from JDeveloper.
    It is so obvious that it is not possible to modify PL/SQL code inside the database without having the connection :)
    What do you do when the database is not available, or you need to work from home with no VPN or such, or on a plane (I never did that, but there are people who will), ...?
    Both TOAD and, if you want, SQL*Plus can do it without a connection, of course.
    sqlplus /nolog will do it :)
    Oracle SQL Developer can do the same, but you first need to manually create some SQL or other file somewhere and then open it.
    While we are here: SQL Developer is not source-control aware, and that is not good.

  • Can i change the database connections through MAXL Scripts?

    Hi,
    I just want to know whether I can change the database connections through MaxL scripts. I am using Essbase 9.3.1.

    Hi John,
    I have built my rule file by connecting to a database, and now I want to change the database connection. I know I can change the database connection through the front end (File > Open File). I want to know whether I can change the database connection by writing any MaxL scripts.
