Sybase JDBC Connection - OWB 11gR2 Issue

Hi All,
We are using OWB 11gR2 on a 64-bit Linux box. The database is Oracle 11.1.0.6. We are trying to connect to a Sybase ASA 9.0 database.
Added the JAR files "jconn2.jar" and "jodbc.jar" under OWB_HOME/owb/lib/ext.
Restarted Control Centre Agent and OWB client.
Created a new platform for Sybase database connection.
Under location Details -
I entered Username and Password.
Port Number of the database.
I kept Schema option Blank.
Driver Class - com.sybase.jdbc2.jdbc.SybDriver
Url - jdbc:sybase:Tds:machine-name:port-number/databasename
When I hit the Test Connection button it returns Successful.
But when I try to import database objects, I cannot see any database objects.
After testing the connection (successfully), when I hit the Browse button to get the schema names I get the following error -
Failed: java.sql.SQLException:jZ0sj:Metadata accessor information was not found on this database. Please install the required tables as mentioned in the jconnect documentation.
What could be the issue? I am not sure whether this problem is related to OWB or to Sybase.
When I use an ODBC connection on a Windows box with the same username and password for the same database, I see a few views and tables.
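For reference, the schema browse presumably boils down to a DatabaseMetaData call, which is exactly what raises jZ0SJ when the jConnect metadata stored procedures are not installed on the ASA server. Below is a minimal standalone sketch of the same call (host, port and credentials are placeholders); jConnect also has a USE_METADATA connection property that can be set to false to avoid the error, although metadata calls then return only limited information.

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.util.Properties;

public class SybaseSchemaBrowse {
    public static void main(String[] args) throws Exception {
        Class.forName("com.sybase.jdbc2.jdbc.SybDriver"); // driver class shipped in jconn2.jar

        Properties props = new Properties();
        props.put("user", "myuser");           // placeholder credentials
        props.put("password", "mypassword");
        // Uncomment to avoid the jZ0SJ metadata-accessor requirement (metadata will be limited):
        // props.put("USE_METADATA", "false");

        try (Connection con = DriverManager.getConnection(
                "jdbc:sybase:Tds:machine-name:2638/databasename", props)) {
            DatabaseMetaData md = con.getMetaData();
            try (ResultSet rs = md.getSchemas()) {   // the metadata call that fails with jZ0SJ
                while (rs.next()) {
                    System.out.println(rs.getString("TABLE_SCHEM"));
                }
            }
        }
    }
}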
Any help is appreciated.
Regards,
Samurai.

Hi David,
I just edited the script. Since there are no tables for the user I am running as (the user just has read rights over a few views), I modified the script to run against the views.
The following is the output for the view which had the text column -
DataType : varchar Column : nfsExportName Table : nfsExportView
DataType : varchar Column : nfsExportActualPath Table : nfsExportView
DataType : varchar Column : nfsExportAnonymousUserID Table : nfsExportView
DataType : varchar Column : nfsExportNoSUID Table : nfsExportView
DataType : long varchar Column : nfsExportReadOnlyAccess Table : nfsExportView
DataType : long varchar Column : nfsExportReadWriteAccess Table : nfsExportView
DataType : long varchar Column : nfsExportRootAccess Table : nfsExportView
DataType : long varchar Column : nfsExportSecurity Table : nfsExportView
DataType : unsigned int Column : nfsExportHostId Table : nfsExportView
DataType : unsigned int Column : nfsExportVolId Table : nfsExportView
DataType : unsigned int Column : nfsExportQtreeId Table : nfsExportView
If you need the entire output, please let me know and I will paste it.
It looks like the text datatype is substituted by long varchar. What can be done to overcome this situation?
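If it helps, the JDBC type codes the driver actually reports for those columns can be dumped directly; the sketch below (connection details are placeholders) should show whether the TEXT columns come back as java.sql.Types.LONGVARCHAR, which is what "long varchar" suggests.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class DumpColumnTypes {
    public static void main(String[] args) throws Exception {
        Class.forName("com.sybase.jdbc2.jdbc.SybDriver");
        try (Connection con = DriverManager.getConnection(
                "jdbc:sybase:Tds:machine-name:2638/databasename", "myuser", "mypassword");
             Statement st = con.createStatement();
             // WHERE 1 = 0 fetches no rows; we only want the result-set metadata for the view.
             ResultSet rs = st.executeQuery("SELECT * FROM nfsExportView WHERE 1 = 0")) {
            ResultSetMetaData md = rs.getMetaData();
            for (int i = 1; i <= md.getColumnCount(); i++) {
                System.out.println(md.getColumnName(i)
                        + "  typeName=" + md.getColumnTypeName(i)  // e.g. "long varchar"
                        + "  jdbcType=" + md.getColumnType(i));    // java.sql.Types constant
            }
        }
    }
}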
Thanks,
Samurai.

Similar Messages

  • Issue JDBC connection pool with Glassfish 3.1.2.2 and Oracle XE 11gR2

    Hello,
    I am experiencing an issue with pinging a JDBC connection Pool.
    I installed the following without any warnings or errors:
    Operating System: Oracle Enterprise Linux 5
    Oracle XE 11gR2 (11.2.0.2.0) database
    Glassfish 3.1.2.2
    Here are the steps I performed after the installations:
    1) In the .profile file of the OS I added the following:
    JRE_HOME=/usr/java/jre1.6.0_31; export JRE_HOME
    JAVA_HOME=/usr/java/jdk1.6.0_31; export JAVA_HOME
    GLASSFISH_DIR=/u01/glassfish3
    GLASSFISH_HOME=/u01/glassfish3/glassfish
    DERBY_HOME=$GLASSFISH_DIR/javadb
    OPEN_MQ_HOME=$GLASSFISH_DIR/mq
    PATH=:$JAVA_HOME/bin:$JRE_HOME/bin:$PATH:$HOME/bin:$GLASSFISH_HOME/bin:$DERBY_HOME/bin:$OPEN_MQ_HOME/bin
    export GLASSFISH_HOME
    export DERBY_HOME
    export OPEN_MQ_HOME
    export PATH
    LD_LIBRARY_PATH=/u01/app/oracle/product/11.2.0/xe/lib; export LD_LIBRARY_PATH
    . /u01/app/oracle/product/11.2.0/xe/bin/oracle_env.sh
    2) I copied the ojdbc6.jar to the $GLASSFISH_HOME/domains/domain1/lib
    3) I logged in to the Glassfish admin console and created a new JDBC Connection Pool.
    Pool Name: ds_orasys
    Resource Type: javax.sql.DataSource
    Datasource Classname: oracle.jdbc.pool.OracleDataSource
    User: [myschema]
    Password: [myschema password]
    URL: jdbc:oracle:thin:@localhost:1521:xe
    When I ping the connection pool I get the following message in the server log:
    [#|2012-10-23T12:14:37.069+0300|WARNING|glassfish3.1.2|javax.enterprise.resource.resourceadapter.com.sun.enterprise.connectors.service|_ThreadID=22;_ThreadName=Thread-2;|RAR8054: Exception while creating an unpooled [test] connection for pool [ ds_orasys ], Connection could not be allocated because: Invalid Oracle URL specified|#]
    [#|2012-10-23T12:14:37.071+0300|SEVERE|glassfish3.1.2|org.glassfish.admingui|_ThreadID=19;_ThreadName=Thread-2;|RestResponse.getResponse() gives FAILURE. endpoint = 'http://212.205.62.217:4848/management/domain/resources/ping-connection-pool.json'; attrs = '{id=ds_orasys}'|#]
    I tried to use different jar files. I used ojdbc6dms.jar and ojdbc14.jar.
    I also copied the jar files in the $GLASSFISH_HOME/domains/domain1/lib/ext directory as some people suggested. Still no luck. I keep getting the same error messages in the server.log
    Can anybody help me out or point me in the right direction?
    Thank you in advance

    The error is in the URL. It was in front of my eyes and I couldn't see the error. I skipped the ':' before the '@' when I created the pool. It is working fine now.
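    For anyone who hits the same "Invalid Oracle URL specified" message, the difference really is a single character; a minimal standalone check (credentials are placeholders, SID and port are the XE defaults from the post):

    import java.sql.Connection;
    import oracle.jdbc.pool.OracleDataSource;

    public class PingXe {
        public static void main(String[] args) throws Exception {
            OracleDataSource ds = new OracleDataSource();
            // Correct form: "jdbc:oracle:thin:@host:port:SID" -- note the ':' before the '@'.
            // Dropping that ':' is what produces "Invalid Oracle URL specified".
            ds.setURL("jdbc:oracle:thin:@localhost:1521:xe");
            ds.setUser("myschema");            // placeholder credentials
            ds.setPassword("myschema_pwd");
            try (Connection con = ds.getConnection()) {
                System.out.println("Connected: " + con.getMetaData().getDatabaseProductVersion());
            }
        }
    }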

  • Issues with JDBC Connection Pooling

    Hi all,
    I'm experiencing some unexpected behaviour when trying to use JDBC Connection Pooling with my BC4J applications.
    The configuration is -
    Web Application using BC4J in local mode
    Using Default Connection Strategy
    Stateless Release Mode
    Retrieving Application Modules using Configuration.createRootApplicationModule( am , cf );
    Returning Application Modules using Configuration.releaseRootApplicationModule( am, false );
    Three application modules
    AppModuleA - connects to DatabaseConnection1
    AppModuleB - connects to DatabaseConnection2
    AppModuleC - connects to DatabaseConnection2
    My requirement is to -
    Use App Module Pooling and have individual pool for each Application Module
    Use JDBC Pooling and have individual pool for each Database connection
    Note: All configuration was achieved in design mode (i.e. right clicking AppModule->Configurations...)
    1. Initial approach -
    In the configuration for each Application Module I specified the connection type as 'JDBC Datasource' and specified the appropriate datasource.
    Tried setting doConnectionPooling to 'true' as well as 'false'.
    In the data-sources.xml I specified all the appropriate info including min-connections and max-connections.
    I would expect, with the above config that BC4J would use OC4J's built in JDBC connection pooling.
    2. Second approach -
    In the configuration for each Application Module I specified the connection type as JDBC URL.
    In the configuration I specified doConnectionPooling = 'true' as well as the max connections, max available and min available.
    What I experienced in both cases was that the max connections setting seemed to be ignored, as the number of connections reported by the database (v$session) exceeded it by more than 10.
    In addition, once the load was removed the number of JDBC connections did not drop (I would have expected it to drop to the max available connections).
    My questions are -
    1. When specifying the 'JDBC Datasource' style of connection, is it in fact OC4J that is then responsible for pooling JDBC connections? And in this case, should BC4J's doConnectionPooling parameter be set to true or false?
    2. Are there any known issues with the use of the JDBC connection pool in the above two approaches?

    Thanks for the additional info. Please see my comments below.
    Sorry, should have been more specific -
    1. Is each application pool using a different JDBC user? You mentioned DatabaseConnection1 and DatabaseConnection2
    above; are these connections to different schemas / users? If so, BC4J will create a separate connection pool for each
    JDBC user. Each connection pool will have its own maximum pool size.
    Each 'DatabaseConnection' refers to a different database, actually hosted on a separate physical server, with a different schema and a different user.
    BC4J will maintain a separate connection pool for each permutation of JDBC URL / schema. If each user is connecting to a different DB instance then I would expect no more than 10 DB sessions. However, if a DB instance is hosting more than one user then I would expect more than 10 DB sessions (though still no more than 10 DB sessions per user).
    2. Are all the v$session sessions related to the JDBC clients? There should be at least one additional database
    session which will be related to the session that is querying v$session.
    When querying the v$session table I specifically look for connections from the user in question and from the machine name in question, and in doing so eliminate the database system's connections as well as the query tool's connection. One area I'm not sure about is the connection BC4J uses to write to its temporary tables. I am using Stateless release mode and have not explicitly told it to save to the database, but I'm wondering if it still does, and if so, how it comes into the equation with max connections.
    BC4J's internal connections are also pooled and the limits apply as mentioned above. So, if you have specified internal connection info for a schema which is different from the users above, I would expect the additional connections.
    One helpful diagnostic tool, albeit programmatic, might be to print the information about the connection pools after
    your test client(s) have finished. This may be accomplished as follows:
    // get a reference to the BC4J connection pool manager
    import java.io.PrintWriter;
    import java.util.Enumeration;
    import oracle.jbo.pool.ResourcePool;
    import oracle.jbo.server.ConnectionPoolManagerFactory;
    import oracle.jbo.server.ConnectionPoolManagerImpl;
    // get the ConnectionPoolManager; assume that it is an instance of the supplied manager
    ConnectionPoolManagerImpl mgr =
        (ConnectionPoolManagerImpl) ConnectionPoolManagerFactory.getConnectionPoolManager();
    Enumeration keys = mgr.getResourcePoolKeys();
    PrintWriter pw = new PrintWriter(System.out, true);
    // walk every pool known to the manager and dump its statistics
    while (keys.hasMoreElements()) {
        Object key = keys.nextElement();
        ResourcePool pool = (ResourcePool) mgr.getResourcePool(key);
        System.out.println("Dumping pool statistics for pool: " + key);
        pool.dumpPoolStatistics(pw);
    }
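    If it is the configuration side you want to rule out, the pooling limits can also be forced as system properties for a quick experiment; a rough sketch (the property names and the module/configuration names are my assumptions, so verify them against your BC4J release):

    import oracle.jbo.ApplicationModule;
    import oracle.jbo.client.Configuration;

    public class PoolLimitCheck {
        public static void main(String[] args) {
            // Pooling-related properties normally live in the AM configuration;
            // overriding them as system properties here is only for a quick test.
            System.setProperty("jbo.doconnectionpooling", "true");
            System.setProperty("jbo.maxpoolsize", "10");   // intended cap on pooled JDBC connections
            System.setProperty("jbo.initpoolsize", "0");

            ApplicationModule am =
                    Configuration.createRootApplicationModule("model.AppModuleA", "AppModuleALocal");
            System.out.println("Application module created; exercise it here, then release it.");
            Configuration.releaseRootApplicationModule(am, false);
        }
    }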

  • Sybase ODBC Connection Issue in Windows Server 2008 R2

    Hi ,
    We have three Sybase ODBC connections on a Windows Server 2008 R2 server. One connection is not working now; it was working fine until about 30 days ago.
    I can ping the server address successfully from the BO server through "dsedit".
    I have created the same connection on a different server (Windows Server 2003), and it is working fine.
    Sybase ASE Driver version 3.50.00.10
    Please help me to overcome this issue.
    Thanks,
    Saurabh upadhyay

    Hi Saurabh,
    The error message you are reporting is telling us that you are using the very old ctlibrary ODBC Driver.
    That is pretty much the only way you can get that error.
    Run isql -v from a command prompt; that will return a version string. We can then verify the version.
    However, the old ODBC driver will function with the newer client.
    The old driver would read the sql.ini for the network address, and was also a translation layer between
    the client application and the ASE.
    Both were eliminated when Sybase released the "Native" ODBC Driver.
    The ct_connect error states that the connection attempt is using ctlibrary.
    In your ODBC configuration, check in the ODBC Administrator to make sure that the driver
    you are using for your DSN is the Adaptive Server Enterprise driver and not the
    Sybase ASE ODBC Driver.
    Also check the Drivers tab in the ODBC Administrator and see which file is actually listed for the driver;
    sysybnt.dll and syodase.dll are unsupported drivers.
    It would be to your advantage to open an incident with SAP for more complete support
    Thank you,
    Kevin

  • JDBC Connection Jobs Continue to Run After Network Issues

    We've got 7.0 SP13 installed in our DEV and QAS environments. We connect to iSeries DB2 tables using a JDBC connection. We recently experienced a network issue that shut down PI. The problem is that the system jobs used to connect to the iSeries are still running and requiring more and more temporary storage. We've rebooted our PI boxes, but these connection jobs continue to run and gobble up more memory.
    Are we missing something in our configuration of these communication channels that would force these jobs to end?
    Thanks in advance,
    Chad

    iSeries is on a separate server altogether.  JDBC connection to the iSeries is setup through the communication channel.
    Problem is that the network issue that disconnected PI from the iSeries left these communication jobs running on the iSeries. 
    Rebooting PI only created MORE of these communication jobs.
    Thanks for the reply VJ, but I don't believe the poll interval would help me in this case.
    Chad

  • JDBC connection issue with multiple DBs on same instance

    I have two databases on one SQL Server 2012 instance, one called 'demotime' and the other called 'demotime_dev'. However, when I change the DB in my JDBC connection from demotime to demotime_dev, the connection still remains established with demotime. Is there any known reason that would cause this? Does both databases being on the same instance have anything to do with the problem?

    Does your application perhaps send a "USE [demotime]" command to switch to the database? Or do you use a config file in your application where the database is still pointing to the other database?
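    One thing worth checking is that the database is named explicitly in the JDBC URL rather than relying on the login's default database; a minimal sketch assuming the Microsoft JDBC driver (host, port and credentials are placeholders):

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class WhichDatabase {
        public static void main(String[] args) throws Exception {
            // databaseName pins the session to demotime_dev; without it, the login's
            // default database (possibly demotime) is used instead.
            String url = "jdbc:sqlserver://myhost:1433;databaseName=demotime_dev";
            try (Connection con = DriverManager.getConnection(url, "myuser", "mypassword")) {
                System.out.println("Connected to: " + con.getCatalog()); // should print demotime_dev
            }
        }
    }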
    Olaf Helper

  • JDBC connection issue to 9.2.0.7

    Hello,
    I am trying to make a JDBC connection to an Oracle 9i database (on AIX, Enterprise 9.2.0.7 - 64 bit) from a Linux box.
    On the client, I have the latest JDBC driver available:
    jdbc driver name: Oracle JDBC driver
    jdbc driver version: 9.2.0.5.0
    jdbc driver major version: 9
    jdbc driver minor version: 2
    db product name: Oracle
    db product version: Oracle9i Release 9.2.0.6.0 - Production
    JServer Release 9.2.0.6.0 - Production
    However, when trying to connect I got the following exception:
    java.sql.SQLException: Io exception: Connection refused(DESCRIPTION=(TMP=)(VSNNUM=153093888)(ERR=12505)(ERROR_STACK=(ERROR=(CODE=12505)(EMFI=4))))
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:179)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:334)
    at oracle.jdbc.ttc7.TTC7Protocol.handleIOException(TTC7Protocol.java:3678)
    at oracle.jdbc.ttc7.TTC7Protocol.logon(TTC7Protocol.java:352)
    at oracle.jdbc.driver.OracleConnection.<init>(OracleConnection.java:365)
    at oracle.jdbc.driver.OracleDriver.getConnectionInstance(OracleDriver.java:547)
    at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:347)
    at java.sql.DriverManager.getConnection(DriverManager.java:512)
    at java.sql.DriverManager.getConnection(DriverManager.java:172)
    at austin.db.TestDB.main(Unknown Source)
    Note that with the same code, I can successfully connect to another server (Oracle9i Release 9.2.0.6.0 running on Windows).
    Does anyone have an idea of what is going wrong?

    Hello,
    so here is the solution : in your tnsnames.ora file, the entry name must be the same as the DB SID.
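    For what it's worth, ERR=12505 generally means the listener does not recognise the SID used in the thin URL, so another workaround is to connect with the service name instead of the SID; a small sketch with placeholder host, names and credentials (the long DESCRIPTION form is used because older 9.2 thin drivers may not accept the short //host:port/service syntax):

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class ThinUrlForms {
        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.driver.OracleDriver");

            // SID form -- fails with ERR=12505 if the listener does not know this SID.
            String sidUrl = "jdbc:oracle:thin:@dbhost:1521:MYSID";
            System.out.println("SID form (fails with 12505 when the SID is unknown): " + sidUrl);

            // Service-name form -- uses the service registered with the listener instead.
            String serviceUrl = "jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)"
                    + "(HOST=dbhost)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=myservice)))";

            try (Connection con = DriverManager.getConnection(serviceUrl, "myuser", "mypassword")) {
                System.out.println(con.getMetaData().getDatabaseProductVersion());
            }
        }
    }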

  • Crystal Report 2008 IDE - JDBC connection issue

    Hello,
    I'm using the Crystal Reports 2008 IDE. I've created a report that goes against an Oracle 11g database.
    I have installed Oracle Instant Client (32-bit) 11.2.0.2.0.
    I can establish an Oracle connection via the IDE.
    When I attempt to establish a JDBC connection through the database expert wizard, I keep getting JDBC driver not found message.
    I've updated my CRConfig.xml, specifically the classpath, to point to where my ojdbc5.jar file is located.
    What else do i need to do in order to establish a JDBC connection?

    Hi,
    First of all, which service pack are you using for Crystal Reports 2008? Oracle 11g R2 is supported from Service Pack 4 onwards in CR 2008.
    For a JDBC connection with CR 2008 you can have a look at this link, which might help you:
    http://www.sheroz.com/articles/crystal-reports-2008-mysql-and-oracle-databases
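    Independent of CRConfig.xml, a quick way to confirm that the JVM can see ojdbc5.jar at all is a tiny classpath test; a sketch with the URL and credentials as placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class DriverOnClasspath {
        public static void main(String[] args) throws Exception {
            // Throws ClassNotFoundException if ojdbc5.jar is not on the classpath --
            // the same root cause as a "JDBC driver not found" message.
            Class.forName("oracle.jdbc.OracleDriver");
            System.out.println("Driver class found.");

            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:orcl", "myuser", "mypassword")) {
                System.out.println("Connected: " + con.getMetaData().getDatabaseProductVersion());
            }
        }
    }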
    Regards,
    Kuldeep G

  • JDBC connection pool failures when used by JMS stores

    We are using WebLogic 6.1 SP2. We defined a separate connection pool for use by a JMS store.
    <JDBCConnectionPool Name="sybaseJMSPool"
    Targets="cluster00"
    InitialCapacity="2"
    MaxCapacity="10"
    DriverName="com.sybase.jdbc2.jdbc.SybDriver"
    Properties="[email protected]@;[email protected]@;charset=utf8"
    URL="jdbc:sybase:Tds:@jms.db.host@/@jms.db.name@"/>
    (note that the @xxx@ strings are replaced by actual values).
    We are using Sybase jConnect 5.5 against a Sybase ASE 12.5 database.
    We deployed this configuration on a number of environments (testing, staging, ...). The actual hardware and network configuration is different for the different systems, but the WebLogic domain stays the same regarding this issue.
    On the test system we frequently get the following exceptions:
    <Aug 13, 2002 1:56:04 PM CEST> <Alert> <JMS> <www00-test> <node00> <ExecuteThread: '6' for queue: 'JMS.TimerClientPool'> <> <> <040048> <JMSServer "JMSServer00", store failure while writing message for topic OrderChangeTopic, java.io.IOException: JMS JDBC store, connection pool = <sybaseJMSPool>, prefix = <JMS00>: write failed
    java.sql.SQLException: JZ006: Caught IOException: com.sybase.jdbc2.jdbc.SybConnectionDeadException: JZ0C0: Connection is already closed.
        at com.sybase.jdbc2.jdbc.ErrorMessage.raiseErrorCheckDead(ErrorMessage.java:715)
        at com.sybase.jdbc2.tds.Tds.handleIOE(Tds.java:3124)
        at com.sybase.jdbc2.tds.Tds.cancel(Tds.java:1412)
        at com.sybase.jdbc2.tds.Tds.cancel(Tds.java:1341)
        at com.sybase.jdbc2.jdbc.SybStatement.doCancel(SybStatement.java:564)
        at com.sybase.jdbc2.jdbc.SybStatement.updateLoop(SybStatement.java:1672)
        at com.sybase.jdbc2.jdbc.SybStatement.executeUpdate(SybStatement.java:1625)
        at com.sybase.jdbc2.jdbc.SybPreparedStatement.executeUpdate(SybPreparedStatement.java:91)
        at com.p6spy.engine.logging.P6LogPreparedStatement.executeUpdate(P6LogPreparedStatement.java:179)
        at weblogic.jdbc.pool.Statement.executeUpdate(Statement.java:293)
        at weblogic.jms.store.JDBCIOStream.write(JDBCIOStream.java:1246)
        at weblogic.jms.store.StoreRequest.doTheIO(StoreRequest.java:250)
        at weblogic.jms.store.JMSStore.execute(JMSStore.java:182)
        at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:139)
        at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:120)
    .>
    java.io.IOException: JMS JDBC store, connection pool = <sybaseJMSPool>, prefix = <JMS00>: write failed
    java.sql.SQLException: JZ006: Caught IOException: com.sybase.jdbc2.jdbc.SybConnectionDeadException: JZ0C0: Connection is already closed.
        at com.sybase.jdbc2.jdbc.ErrorMessage.raiseErrorCheckDead(ErrorMessage.java:715)
        at com.sybase.jdbc2.tds.Tds.handleIOE(Tds.java:3124)
        at com.sybase.jdbc2.tds.Tds.cancel(Tds.java:1412)
        at com.sybase.jdbc2.tds.Tds.cancel(Tds.java:1341)
        at com.sybase.jdbc2.jdbc.SybStatement.doCancel(SybStatement.java:564)
        at com.sybase.jdbc2.jdbc.SybStatement.updateLoop(SybStatement.java:1672)
        at com.sybase.jdbc2.jdbc.SybStatement.executeUpdate(SybStatement.java:1625)
        at com.sybase.jdbc2.jdbc.SybPreparedStatement.executeUpdate(SybPreparedStatement.java:91)
        at com.p6spy.engine.logging.P6LogPreparedStatement.executeUpdate(P6LogPreparedStatement.java:179)
        at weblogic.jdbc.pool.Statement.executeUpdate(Statement.java:293)
        at weblogic.jms.store.JDBCIOStream.write(JDBCIOStream.java:1246)
        at weblogic.jms.store.StoreRequest.doTheIO(StoreRequest.java:250)
        at weblogic.jms.store.JMSStore.execute(JMSStore.java:182)
        at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:139)
        at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:120)
        at weblogic.jms.store.JDBCIOStream.throwIOException(JDBCIOStream.java:1213)
        at weblogic.jms.store.JDBCIOStream.write(JDBCIOStream.java:1256)
        at weblogic.jms.store.StoreRequest.doTheIO(StoreRequest.java:250)
        at weblogic.jms.store.JMSStore.execute(JMSStore.java:182)
        at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:139)
        at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:120)
    Before that, this message appeared:
    <Aug 13, 2002 11:31:16 AM CEST> <Error> <ConnectionManager> <www00-test> <node00> <ExecuteThread: '26' for queue: 'default'> <> <> <000000> <Closing: 'weblogic.rjvm.t3.T3JVMConnection@795af6' because of: 'Server received a message over an uninitialized connection: 'JVMMessage from: 'null' to: '-4555218188801970213S:192.168.13.1:[7001,7001,7002,7002,7001,7002,-1]:ADIS:node00' cmd: 'CMD_REQUEST', QOS: '101', responseId: '1', invokableId: '287', flags: 'JVMIDs Not Sent, TX Context Not Sent', abbrev offset: '34'''>
    This problem did not occur on another system which was used during a 2-day stress testing session.
    It seems that the problem occurs after a period in which no user requests were made. The user requests trigger EJBs that start sending JMS messages.
    When the problem occurs, the JMS messaging system seems to lock up, as no messages are received anymore by the different listeners (MDBs).
    Undeploying and redeploying the JDBC connection pool solves the problem. This solution is unacceptable for a production system.
    A similarly defined connection pool, which is used by the EJBs to make database connections, does not manifest this problem.
    <JDBCConnectionPool Name="sybasePool"
    Targets="cluster00"
    InitialCapacity="10"
    CapacityIncrement="5"
    MaxCapacity="50"
    PreparedStatementCacheSize="150"
    DriverName="com.sybase.jdbc2.jdbc.SybDriver"
    Properties="[email protected]@;[email protected]@;JCONNECT_VERSION=6;charset=utf8"
    URL="jdbc:sybase:Tds:@db.host@/@db.name@"/>
    The JDBC connection pool is used as follows by the JDBC store:
    <JMSJDBCStore ConnectionPool="sybaseJMSPool" Name="JDBCStore00" PrefixName="JMS00"/>
    <JMSServer Name="JMSServer00" Store="JDBCStore00" Targets="node00">
    <JMSTopic JNDIName="ADIS.JMSError" JNDINameReplicated="false" Name="ErrorTopic"/>
    <JMSTopic JNDIName="ADIS.Status" Name="StatusTopic" RedeliveryDelayOverride="300000"/>
    <JMSTopic JNDIName="ADIS.OrderChange" JNDINameReplicated="false" Name="OrderChangeTopic" RedeliveryLimit="3"/>
    </JMSServer>
    Turning on "Test Reserved Connection" with an appropriate test table does not help.
    Some sources on the internet tell us that JZ0C0 errors in the jConnect driver can be related to network problems. Nevertheless, the connection pool should be able to cope with this.
    Can you provide any solution for this? Or give us hints on what can cause the problem?

    Zhenhao Qi wrote:
    thanks! Joe.
    The SQL statement itself can no longer be simplified; the long execution time is due to the database size and complicated SELECT criteria. I can easily reproduce the problem by using this SQL. I tried "BEA's Oracle driver (Type 4): Version 8.1.7, 9.0.1, 9.2.0". The question can be dissected into 2 pieces:
    1) why the JDBC connection (using oracle.jdbc.OracleDriver) won't return anything if the SQL execution time is > 5 min - that is probably Oracle's problem
    2) why the occupied connection pool won't release even though I set "Statementtimeout=600" - this is WebLogic's problem.
    Zhenhao
    Hi. Yes, (1) is Oracle's problem. (2) may also be. The JDBC spec has very few
    allowances for one thread to interrupt a second thread's JDBC call. If we
    transmit your timeout request by calling setQueryTimeout() on the oracle
    statement, and if you have a weblogic-controlled transaction we call
    Statement.cancel() on any ongoing statement, we end up relying on whether
    the Oracle driver implements and responds to those calls.
    Are you doing weblogic-controlled transactions? Are you/can you
    call Statement.setQueryTimeout() on your statements, or are these
    generated JDBC queries?
    If you can duplicate the problem using the weblogic.jdbc.oracle.OracleDriver
    we have some other debug avenues. This would be good even if you really
    want to use the thin driver, because we will do the same JDBC calls to
    either driver, and the debug would prove (if) we set up a query timeout
    and if we call cancel(). If we do, then we can know that it is the Oracle
    driver failing in these regards.
    Joe
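    To make the timeout point concrete, the mechanism Joe describes ultimately relies on the driver honouring Statement.setQueryTimeout(); a minimal sketch of setting it explicitly on a long-running statement (the DataSource, table and query are placeholders):

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import javax.sql.DataSource;

    public class QueryWithTimeout {
        // ds would be the pool's DataSource obtained from JNDI.
        static void runReport(DataSource ds) throws SQLException {
            try (Connection con = ds.getConnection();
                 Statement st = con.createStatement()) {
                st.setQueryTimeout(600); // seconds; only effective if the driver implements it
                try (ResultSet rs = st.executeQuery("SELECT /* long-running report */ * FROM big_table")) {
                    while (rs.next()) {
                        // process rows
                    }
                }
            }
        }
    }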

  • Sybase JDBC driver & Invalid column name error

    I submitted a note a year ago concerning JDBC-ODBC bridge and SQL Server db. Same Invalid column name error. The resolution was a bug in the XSU code.
    This time the error is with a jconnect5 JDBC driver from Sybase to a Sybase ASA db. ASA is Adaptive Server Anywhere.
    <ERROR xsql-timing="140">oracle.xml.sql.OracleXMLSQLException: S0022: Invalid column name 'name'.</ERROR>
    However if I use a Sybase JDBC driver from INet Software of Germany, I get the desired result of my query.
    Below are sample XSQLConfig.xml definitions.
    INet Software Sybase JDBC driver definition:
    - <connection name="deasa">
    <username>dba</username>
    <password>sql</password>
    <dburl>jdbc:inetsyb:LEMKAU:2638?database=asademo</dburl>
    <driver>com.inet.syb.SybDriver</driver>
    <autocommit>true</autocommit>
    </connection>
    Sybase jconnect5 Sybase JDBC driver definition:
    - <connection name="asa">
    <username>dba</username>
    <password>sql</password>
    <dburl>jdbc:sybase:Tds:lemkau:2638?ServiceName=asademo</dburl>
    <driver>com.sybase.jdbc.SybDriver</driver>
    <autocommit>true</autocommit>
    </connection>
    I believe the bug has to do with the numeric codes that the drivers use to identify data types not being properly interpreted.
    See the XML General note titled "insert-request, xsu 2.1.0 beta & SQL Server" for reference.
    Steve.

    Thanks for the notification. We will look into this issue...

  • Report with JDBC connection does not work when it includes a CommandTable

    I am trying to render, using the new version of the Crystal Reports Java component (CRJ), a report that contains database fields of type Command Table (row set), and this does not seem to work.
    When I use the previous version of the Crystal SDK it works fine.
    After debugging the sample CrystalHelper.java file, which contains the method .changeDatasource(),
    I found that the DatabaseController.setLocation() method changes the CommandTable class to Table when used on a CommandTable instance, and as a result all the fields defined in that CommandTable disappear.

    That appears to be a known issue:  Eclipse JRC: To change the JDBC connection at run time
    Sincerely,
    Ted Ueda

  • How to configure a JDBC connection in Oracle BI Publisher with a Teradata database

    Hi,
    I am going to use Oracle BI publisher to create report.
    Our database is Teradata.
    How to create database connection for Teradata in BI publisher.
    How to create JDBC connection.
    What should be the Database Driver Class?
    What should be the connection string format?
    Please provide me with your suggestions.
    Thanks,
    Santanu Manna
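    To answer the driver-class and URL-format questions directly (please verify against the documentation of your Teradata JDBC driver version): the driver class is com.teradata.jdbc.TeraDriver and the URL takes the form jdbc:teradata://host/DATABASE=dbname. A minimal sketch with placeholder host and credentials:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class TeradataConnect {
        public static void main(String[] args) throws Exception {
            // terajdbc4.jar (and tdgssconfig.jar for older driver versions) must be on the classpath.
            Class.forName("com.teradata.jdbc.TeraDriver");
            String url = "jdbc:teradata://tdhost/DATABASE=mydb,CHARSET=UTF8";
            try (Connection con = DriverManager.getConnection(url, "myuser", "mypassword")) {
                System.out.println("Connected: " + con.getMetaData().getDatabaseProductName());
            }
        }
    }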

    Hi;
    I suggest you refer to the notes below, which could be helpful for your issue:
    How To Generate XML Output (Excel, HTML, PDF) for FSG Reports [ID 804913.1]
    E-XMLP: BI Publisher Report RTF Template to Excel output is not same as PDF,RTF or HTML format.[Article ID 1515711.1]
    BI Publisher Enterprise Excel Output File Size is Too Large [ID 1271544.1]
    Regards
    Helios

  • How can I check a jdbc connection

    With Sun Java App Server, after I create a JDBC connection, I use a button to check whether the connection is OK.
    I cannot find the same button or method to check whether my Sybase connection is correct, because all I get is "is not bound in this Context".
    I installed the Sybase JDBC driver in the install_dir\lib directory.
    In the JDBC wizard I use the Sybase driver: com.sybase.jdbcx.SybDataSource
    TIA,
    Lorenzo Jimenez

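    One way to verify the Sybase connection outside the app server is a small standalone test using the plain driver and URL; a sketch assuming jconn2.jar (with jconn3.jar the driver class is com.sybase.jdbc3.jdbc.SybDriver), with host, port and credentials as placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class SybaseConnectionCheck {
        public static void main(String[] args) throws Exception {
            Class.forName("com.sybase.jdbc2.jdbc.SybDriver");
            try (Connection con = DriverManager.getConnection(
                    "jdbc:sybase:Tds:sybhost:5000/mydb", "myuser", "mypassword")) {
                System.out.println("Connected: " + con.getMetaData().getDatabaseProductName());
            }
        }
    }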

  • Remote JDBC connect error; missing SecStore.properties

    Hi
    After creating the initial context for a remote JDBC connection from a Java client to the J2EE engine, when calling createConnection() I get the following error on version 6.4:
    com.sap.engine.services.dbpool.exceptions.BaseSQLException: ResourceException in method ConnectionFactoryImpl.getConnection(): com.sap.engine.services.dbpool.exceptions.BaseResourceException: SQLException thrown by the physical connection:
    com.sap.sql.log.OpenSQLException: Error while accessing secure store: File "SecStore.properties" does not exist although it should..
    I cannot find the file on the server, although what limited documentation I can find suggests SecStore should be created at install time. Am I missing something here?
    Thanks, Leonard

    I tried, but with no success. I am definitely connecting to JNDI, getting the context, etc. The issue seems to be here:
    Caused by: com.sap.security.core.server.secstorefs.FileMissingException: File "SecStore.properties" does not exist although it should.
            at com.sap.security.core.server.secstorefs.StorageHandler.openExistingStore(StorageHandler.java:372)
            at com.sap.security.core.server.secstorefs.SecStoreFS.openExistingStore(SecStoreFS.java:1946)
            at com.sap.sql.connect.OpenSQLConnectInfo.getStore(OpenSQLConnectInfo.java:803)
            at com.sap.sql.connect.OpenSQLConnectInfo.lookup(OpenSQLConnectInfo.java:784)
            at com.sap.sql.connect.OpenSQLDataSourceImpl.setDataSourceName(OpenSQLDataSourceImpl.java:206)
    Why is StorageHandler.openExistingStore trying to open SecStore.properties from the remote java client and not on the j2ee server? This makes no sense.
    Thanks
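    For comparison, the client-side flow being attempted is roughly the standard JNDI DataSource lookup below; the context factory, provider URL, JNDI name and credentials are all placeholders for whatever the 6.40 client setup actually uses, and the connection call at the end is where an error like this would surface:

    import java.sql.Connection;
    import java.util.Properties;
    import javax.naming.Context;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class RemoteDataSourceLookup {
        public static void main(String[] args) throws Exception {
            Properties env = new Properties();
            // Placeholder values -- take the factory class and provider URL from the engine's client docs.
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.example.SomeInitialContextFactory");
            env.put(Context.PROVIDER_URL, "enginehost:50004");
            env.put(Context.SECURITY_PRINCIPAL, "user");
            env.put(Context.SECURITY_CREDENTIALS, "password");

            Context ctx = new InitialContext(env);
            DataSource ds = (DataSource) ctx.lookup("jdbc/MyDataSource"); // placeholder JNDI name
            try (Connection con = ds.getConnection()) {
                System.out.println(con.getMetaData().getDatabaseProductName());
            }
        }
    }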

  • How to set roles from JDBC connections

    Hi guys,
    I have a JDBC connection whose purpose is to run queries based on a string that I construct in my program.
    My question is: if I have to run a DCL, like: SET ROLE RL_XXX TO USER1;
    What's the easiest way to do it with my same connection?
    Thanks.
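    If the database accepts it as ordinary SQL, the simplest route is to execute the DCL through a Statement on the very same Connection, before running the constructed query; a sketch (the role/user names are the ones from the question, and the exact SET ROLE syntax depends on the database):

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class RoleThenQuery {
        // con is the same JDBC connection the program already uses for its generated queries.
        static void runWithRole(Connection con, String generatedSql) throws SQLException {
            try (Statement st = con.createStatement()) {
                // Run the DCL first, on the same connection/session...
                st.execute("SET ROLE RL_XXX TO USER1");
                // ...then the dynamically built query executes with the role in effect.
                try (ResultSet rs = st.executeQuery(generatedSql)) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1));
                    }
                }
            }
        }
    }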

    Hi Marc,
    Sorry for the typo. It's a BDC source; I use a WCF client to access a SQL database (HR external system) that has 4 fields that need to be presented in the SharePoint User Profile. The issue occurs with a Full or a Delta Sync. The problem is that if the BDC source is not present the fields are deleted (I get an SPS-Dummy Added and all of the objects in the BDC Connector Space are deleted).
    I do not want this to happen. I do not want the User Profile attributes/fields to be empty/deleted if there is no connection; I simply want them to stay what they are. I have two issues.
    1) Even if I change my data on the SQL Server side, the changes do not get picked up by the sync. Since the only field that is being tested for change is an AD id, and the id does not change, the BDC does not consider them changes.
    2) If there is no connection I do not want the attributes to be deleted. I have not figured out a way to effectively do this.
    So my issue appears to be simple to solve, but after 4 days and hundreds of tutorial pages read, I have yet to figure out a proper way to do this.
    Here is the pseudo-specification:
    The fields that come from the HR system (SQL Server) are to be presented in the user profile. If there is no connection to the BDC file, the fields remain as they are until there is a connection and updates can be made. Changes to any of the fields are performed manually in the HR system. These changes must be picked up by the daily sync.
