Execution time of transactions

Hi,
I'm a bit confused about the execution time of transactions called from within another transaction.
I just wrote a transaction that calls four other transactions (just for timing tests) and I get the following result:
[INFO ]: Execution Started At: 09:11:37
[INFO ]: Execution Started At: 09:11:37
[INFO ]: Execution Completed At: 09:11:37 Elapsed Time was 453 mS
[INFO ]: Execution Started At: 09:11:38
[INFO ]: Execution Completed At: 09:11:38 Elapsed Time was 31 mS
[INFO ]: Execution Started At: 09:11:38
[INFO ]: Execution Completed At: 09:11:38 Elapsed Time was 109 mS
[INFO ]: Execution Started At: 09:11:38
[INFO ]: Execution Completed At: 09:11:38 Elapsed Time was 125 mS
[INFO ]: Execution Completed At: 09:11:38 Elapsed Time was 1547 mS
The first four timings are from the called transactions. The last timing is from the "outer" transaction.
But I only call the four transactions and don't do anything else. Why is there such a big time difference (more than double the time), and how can I solve this problem?
Thanks
Matthias

Jeremy,
If the box is checked and you only call the TRX action once in your logic flow, I don't think it really makes any difference, because the runner servlet has to load/parse the file at least once no matter what.
Do you mean it doesn't make any difference in general, or in performance?
As the "log" shows, there is a difference. Even in my original transaction I feel that it is noticeably faster. But the overall results are the same, so I get the same results faster.
Just to be sure: the outer transaction and the called transactions are loaded every time they are called, whether by a user, a web service, or whatever? So if a transaction is started twice in the same second in different threads, it is also loaded twice?
And another question: I don't think it is possible to start a new thread in a transaction, is it? Why I'm asking: in my transaction I have some actions that should be executed immediately and other actions that can be done at a later time (some updates/inserts into tables, and they aren't critical if they fail). So the best solution from a performance point of view would be for the main thread to run fast so the user gets the result, and everything else to run after that. Do I have any chance to do so, or is something like this planned for future versions of MII? (A plain-Java sketch of this deferred-work pattern appears at the end of this thread.)
I mark this question as answered, as my original question has been answered.
Thank you.
Regards,
Matthias
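For reference, the deferred-work idea described in the follow-up question looks roughly like the sketch below in plain Java. This is only a generic illustration (the class and method names are invented, and it is not an MII API); whether an MII transaction can hand work off to a background thread like this is exactly the open question above.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class DeferredWorkSketch {

    // single background worker; failures of deferred work are tolerated by design
    private static final ExecutorService BACKGROUND = Executors.newSingleThreadExecutor();

    public static String handleRequest() {
        String result = doCriticalWork();                          // must finish before the caller gets its reply
        BACKGROUND.submit(DeferredWorkSketch::doDeferredInserts);  // fire-and-forget: non-critical inserts/updates
        return result;                                             // the caller is not blocked by the deferred work
    }

    private static String doCriticalWork() { return "OK"; }

    private static void doDeferredInserts() {
        // non-critical table updates/inserts would go here
    }
}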

Similar Messages

  • Transaction Execution Time

    Hii,
    Is there any procedure to check the execution time of a transaction?
    For example, how do we know the trigger time for an output?
    Regards
    Amit
    Edited by: Amit Gupta on Dec 2, 2008 5:43 AM

    Hi Amit,
    Normally the performance can be tested with the tcodes SE30 and ST05 (for SELECT queries); this is done by the ABAPers to check the performance of a program.
    The Basis team has another set of tcodes, such as ST03, through which they know the execution time of a program; please check with them.
    Regards,
    Vvieks

  • Transaction execution time and block size

    Hi,
    I have an Oracle Database 11g R2 64-bit database on Oracle Linux 5.6. My system has ONE hard drive.
    Recently I experimented with an 8.5 GB database in a TPC-E test. I was watching transaction times for 2K, 4K, and 8K Oracle block sizes. Each time I started a new test with a different block size, I created a new database from scratch to avoid messing something up (each time the SGA and PGA parameters were identical).
    In all experiments I gave my own tablespace (NEWTS) a different configuration because of Oracle block/datafile size limits:
    2K oracle block database had 3 datafiles, each 7GB.
    4K oracle block database had 2 datafiles, each 10GB.
    8K oracle block database had 1 datafile of 20GB.
    Now, the best transaction (transaction execution) time was with the 8K block, the 4K block had a slightly longer transaction time, and the 2K Oracle block definitely had the worst transaction time.
    I identified an SQL query (when using the 2K and 4K blocks) that was creating hot segments on the E_TRANSACTION table, which is the largest table in the database (2.9 GB), and was executed slowly (the number of executions was low compared to the 8K numbers).
    Now here is my question: is it possible that multiple datafiles are the reason for these slow transaction times? I have AWR reports from that period, but as someone who is still learning about DBA work, I would like to ask how I could identify this multi-datafile problem (if that is THE problem) by looking at the AWR statistics.
    Thanks to all.

    It's always interesting to see the results of serious attempts to quantify the effects of variation in block sizes, but it's hard to do proper tests and eliminate side effects.
    > My system has ONE hard drive.
    A single drive does make it a little too easy for apparently random variation in performance.
    > Each time I started a new test with a different block size, I created a new database from scratch...
    Did you do anything to ensure that the physical location of the data files was a very close match across databases? Inner tracks vs. outer tracks could make a difference.
    > (each time the SGA and PGA parameters were identical)
    Can you give us the list of parameters you set? As you change the block size, identical parameters DON'T necessarily result in the same configuration. Typically a large change in response time turns out to be due to changes in execution plan, and this can often be associated with a different configuration. Did you also check that the system statistics were appropriately matched (which doesn't mean identical across all databases)?
    > 2K oracle block database had 3 datafiles ... 8K oracle block database had 1 datafile of 20GB.
    If you use bigfile tablespaces I think you can get 8TB in a single file for a tablespace.
    > Now the best transaction time was with the 8K block ... the 2K Oracle block definitely had the worst transaction time.
    We need some values here, not just "best/worst" - it doesn't even begin to get interesting unless you have at least a 5% variation - and then it has to be consistent and reproducible.
    > I identified an SQL query (when using the 2K and 4K blocks) that was creating hot segments on the E_TRANSACTION table...
    Query, or DML? What do you mean by "hot"? Is E_TRANSACTION a partitioned table? If not then it consists of one segment, so did you mean to say "blocks" rather than segments? If blocks, which class of blocks?
    > Is it possible that multiple datafiles are the reason for these slow transaction times? ... how could I identify this multi-datafile problem ... by looking at the AWR statistics?
    On a single disc drive I could probably set something up that ensured you got different performance because of different numbers of files per tablespace. As SB has pointed out, there are some aspects of extent allocation that could have an effect - roughly speaking, extents for a single object go round-robin on the files, so if you have small extent sizes for a large object then a tablescan is more likely to result in larger (slower) head movements if the tablespace is made from multiple files.
    If the results are reproducible, then enable extended tracing (dbms_monitor, with waits) and show us what the tkprof summaries for the slow transactions look like. That may give us some clues.
    Regards
    Jonathan Lewis

  • To Check the execution time for each transaction.

    Abapers,
    How can I find out the processing time for each transaction, e.g. order entry, shipping, billing, etc., in SAP?
    TIA,
    sinthu

    Hi,
    By default you can see the execution time in the right-hand corner of the SAP session.
    You can use SE30 to get into more details like database time, ABAP time, etc.
    Hope it helps...
    Regards,
    Vijay

  • Multi-level error propagation during the execution of CFM2 transaction

    Hi,
    We are implementing SAP SCM 5.0 integrated to SAP R/3 Mills IS.
    During the execution of the CFM2 transaction, while trying to activate the integration model for the object "Orders" for the very first time, the following error message appears:
    "Multi-level error propagation carried out"
    There are also messages related to blocked inbound queues. In the case of maintenance orders, the CIF_MNT_INBOUND message shows up. While working with production orders, the CIF_ORDER_INBOUND message emerges.
    The items listed below should be taken into account:
    -The production orders that are meant to be transferred to APO are regular production orders corresponding to several products.
    -We are working with configurable materials.
    -We are working with the active model (000) and for each plant we are using 2 different planning versions.
    Regards,
    Analía Nahmías

    Dear Analía,
    The corresponding error messages issued when executing //CCR indicate that the peg-ids are missing in liveCache.
    Recommendations:
    - Run transaction /SAPAPO/OM17 to check the internal consistency between the database and liveCache within the SCM system, and correct the inconsistencies.
    - Execute report /SAPAPO/CIF_DELTAREPORT3 to correct the inconsistencies between the ECC and SCM systems.
    Regards,
    Tibor

  • Execution time issues with SU01 demo script

    Having worked with Scripting in a Box for a while now I wanted to try out the examples there. I read FM: SO_USER_LIST_READ or another one explaining why my attempt to narrow the returned users failed (Craig, did you find out why the functionality was removed?) and Re: Issue with "Scripting in a Box" seeing that  Harry had the same problems with only ~200 users in his system. However, Craig's original post states he successfully managed with over 400 users. I'm a bit confused...
    I included some simple timing stuff and found out that processing of one user in the loop takes about 1.7 seconds - little surprise then that every script times out. This seems to be due to the additional calls to GetStatus() and GetValid() - by commenting them out I get the whole list rather quickly.
    Unfortunately, commenting them out also means no nice icons for 'Status' and 'Valid', which is not desired. I could probably create a Z FM to deliver the user list with these two fields already added (which would save on RFC calls, assuming the operation is much quicker directly on the server), but I hoped to get a solution based purely on PHP, without my own ABAP coding (being aware that Craig also used a Z FM anyway, but still...).
    I'm a bit unsure now how easy it is to actually create useful frontends in PHP with such long execution times. I assume this will happen on other occasions as well, not only for user lists. Is there an alternative? Or a general way to do these things more quickly?
    :Frederic:

    Craig: you say it's easy to go from 1.7 seconds per user lookup down to a small fraction of it? Then apparently I'm lacking these skills. Could you please give me a hint what should be done there?
    I thought about creating a Z function, but I wanted to avoid having to write custom wrappers, possibly for just about any transaction in this style.
    Bala: the two functions only take one user as input, not a list. So without modifying the ABAP side I can't feed the whole list in there. I wonder how much it would improve the response time anyway, so perhaps I'll try it. It's just not a solution I'd prefer.
    Paging is a good idea, the actual call to get the whole userlist is quite quick. Having like 20 users displayed at a time is manageable - still slow, but the script won't timeout anymore. I think I'll implement this today.
    About AJAX: yes, I want to play around a bit with AJAX, but having the two columns Valid and Status removed and the information only displayed on mouseover etc. is a bit like cheating. And 1.7+ seconds of waiting for a hover info is too long. So I'd like to optimize primarily on the RFC-calling side.
    Craig: surely it was just a demo, and I'm just trying to get it to work for understanding it
    :Frederic:

  • Reduce execution time

    How to reduce the execution time of this code? 
    LOOP AT porder1.
      SELECT aufnr bstmg hsdat sgtxt bwart charg
        FROM mseg
        INTO (porder-aufnr, porder-bstmg, porder-hsdat, porder-sgtxt, porder-bwart, porder-charg)
        WHERE matnr = porder1-matnr
          AND aufnr = porder1-aufnr
          AND werks = porder1-pwerk
          AND ( bwart = '101' OR bwart = '102' ).
      ENDSELECT.
    ENDLOOP.
    Regards
    Praju .

    Hi prajwal.
    I would like to suggest,
    It is possible to reduce the execution time by increasing the number of fields in the WHERE clause, as only the specific records needed are fetched, which results in comparatively less execution time.
    Also, SAP provides powerful tools for this, like transactions ST05, ST07, ST30, and many more.
    I would like to suggest a couple of references relating to your case:
    SDN reference for long execution time during processing of a select query (/thread/477540 - original link is broken)
    SDN reference for reducing the execution time of a program - tools ("How can i reduce time of execution")
    SDN reference for solutions to reduce the execution time of a program ("How to reduce my query execution time?")
    Hope that's useful.
    Good Luck & Regards.
    Harsh Dave
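    Not ABAP, but as a plain-JDBC illustration of the usual fix for the pattern in the question: instead of issuing one SELECT per loop pass, fetch all matching rows in a single query keyed by the whole set of orders (in ABAP terms, a single SELECT ... FOR ALL ENTRIES IN porder1 instead of SELECT ... ENDSELECT inside the LOOP). The table and field names are taken from the question; the matnr/werks conditions are left out for brevity, so this is a sketch of the idea, not a drop-in replacement.
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.List;

    public class BatchedReadSketch {
        // One round trip for the whole set of orders instead of one SELECT per order.
        // Assumes a non-empty order list of moderate size.
        public static void readMovements(Connection conn, List<String> orderNumbers) throws SQLException {
            // build an IN list with one placeholder per order number
            StringBuilder in = new StringBuilder();
            for (int i = 0; i < orderNumbers.size(); i++) in.append(i == 0 ? "?" : ", ?");
            String sql = "SELECT aufnr, bstmg, hsdat, sgtxt, bwart, charg FROM mseg "
                       + "WHERE bwart IN ('101', '102') AND aufnr IN (" + in + ")";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (int i = 0; i < orderNumbers.size(); i++) ps.setString(i + 1, orderNumbers.get(i));
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        // process one movement row (in the report this fills the porder fields)
                        System.out.println(rs.getString("aufnr") + " " + rs.getString("bwart"));
                    }
                }
            }
        }
    }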

  • Execution time in microseconds

    Hi,
    Can someone tell me what the ideal execution time of a program should be, in microseconds? My program consists of 2,586 lines and has around 10 subroutines and some 10 function modules. Could you suggest how the execution time is determined for a program?
    Thanks,
    Binay.

    You can use a simple approach.
    Declare three variables:
    DATA: a TYPE i,
          b TYPE i,
          c TYPE i.
    START-OF-SELECTION.
    Write this command at the start:
    GET RUN TIME FIELD a.   "a holds the initial start time
    Write one more command at the end:
    GET RUN TIME FIELD b.   "b holds the final time
    c = b - a.              "c holds the total execution time in microseconds
    Or you can use transaction ST05 to find the time; there you need to add up all the individual times.
    Thanks
    Seshu

  • Execution Time for T.Code

    Hi Experts,
    I want to know the exact execution time for a t-code. I checked in ST03 or ST03N but I can't get proper data. In ST03 I get the average and total response times, but I want the exact execution or response time.
    Waiting for your inputs.
    Regards,
    Nisit

    From SAP:
    For old versions, in transaction ST03:
    Click on the "Performance Database" tab.
    Then double-click on Total.
    It will open the CHOOSE TIME PERIOD dialog box; select the time period and then click on the required date.
    Then click on Transaction Profile; here it will give the list of transactions executed and their execution times.
    In ST03N you will find "Transaction Profile" under Analysis Views; when you double-click on it, it will give the transaction codes executed for the present day.
    If you want the month's data, then go to Expert mode in ST03N; you will get the data under "Transaction Profile" in Analysis Views.
    Regards,
    Beena

  • Query Execution Time for a Query causing ORA-1555

    dear Gurus
    I have an ORA-01555 error. Earlier I used the query duration mentioned in the alert log and increased the undo retention, as I did not find the UNDOBLKS column of v$undostat to be high at the time of occurrence of ORA-01555.
    But a new ORA-01555 is now occurring whose query duration exceeds the undo retention time.
    My question -
    1. Is it possible to accurately find the query duration other than from the alert log file?

    Abhishek, as you are using an undo tablespace and have already increased the time that undo data is retained via undo_retention, you might want to consider the following ideas, which were useful with the 1555 error under manual rollback segment management.
    1- Tune the query. The faster a query runs, the less likely a 1555 will occur.
    2- Look at the processing. If a process was reading and updating the same table while committing frequently, then under manual rollback segment management the process would basically create its own 1555 error, rather than just being the victim of another process changing data and the rollback data being overlaid while the long-running query was still running. With undo management, the process could be generating more data than can be held for the undo_retention period, but because it is committed, Oracle has been told it doesn't really have to keep the data for rolling back a current transaction, so it gets discarded to make room for new changes.
    If you find item 2 is true, then separating the select from the update will likely eliminate the 1555. You do this by building a driving table that holds the keys of the rows to be updated or deleted. Then you use the driver to control access to the target table (sketched at the end of this thread).
    3- If the cause of the 1555 is or may be delayed block cleanout then select * from the target prior to running the long running query.
    Realistically you might need to increase the size of the undo tablespace to hold all the change data and the value of the undo_retention parameter to be longer than the job run time. Which brings up back to option 1. Tune every query in the process so that the job run time is reduced to optimal.
    HTH -- Mark D Powell --
    Dear Mark,
    Thanks for the excellent advice. I found that the error is coming because of frequent commits, which is item 2 as you rightly mentioned.
    I think I need to keep a watch on the queries running; I was just trying to find the execution time for the queries, and whether there is any way to find the query duration without running a trace.
    regards
    abhishek
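    A hedged JDBC sketch of the driving-table idea from item 2 above: read the keys of the rows to be changed in one pass, then drive the updates from that key list, committing in batches. The table and column names here are invented for illustration only.
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;

    public class DrivingTableSketch {
        public static void run(Connection conn) throws SQLException {
            // 1) collect the keys of the rows to change (the "driving" set) in one quick pass
            List<Long> keys = new ArrayList<>();
            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery("SELECT id FROM target_table WHERE needs_update = 'Y'")) {
                while (rs.next()) keys.add(rs.getLong(1));
            }
            // 2) drive the updates from that key list, committing in batches
            conn.setAutoCommit(false);
            try (PreparedStatement ps = conn.prepareStatement(
                    "UPDATE target_table SET status = 'DONE' WHERE id = ?")) {
                int n = 0;
                for (Long id : keys) {
                    ps.setLong(1, id);
                    ps.addBatch();
                    if (++n % 1000 == 0) { ps.executeBatch(); conn.commit(); }
                }
                ps.executeBatch();
                conn.commit();
            }
        }
    }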

  • Execution time for an insert/update

    Hello!
    We are using EJB 3.0 entities and JPA, configured to run on WAS and DB2. We are also using container-managed persistence.
    We have a transactional method, let's name it addA(), which, when executed, ultimately inserts data into 11 DB2 tables.
    In some of the 11 tables there could be multiple rows inserted, on average about 2 inserts.
    We are using the EntityManager.persist method to handle each entity.
    The method completes in about 11 seconds when the resources on the server (CPU,memory) are in a good state (so not overloaded).
    Is this a reasonable/decent time for the operation we are trying to do?
    If not, what would be a reasonable running time for such an operation?
    What do we need to do in order to improve the performance and decrease the execution time, other than switching to BMP and coding manual SQL inserts?

    user2617486 wrote:
    > Do you have any idea how we can localize/isolate the problem better at the DB level?
    > Can we programmatically insert log statements to see how long the processing takes on WAS and how long the actual SQL statements take to execute once they hit the DB2 database?
    You need help from a DBA; you can't reason this problem away. You need cold hard facts from whatever tooling the database provides. Of course you could try adding log statements to see how long each database operation is taking on the Java side of things (a sketch follows below), but that only proves that it is slow, not WHY it is slow.
    > The network latency cannot be considered in this case, since we run the test application on the same WAS where the application resides, so there is no networking involved.
    And the database runs on that machine as well? This is new information you are pulling out of your hat, by the way; now all of a sudden there are two applications? And with the limited information you give, am I to assume you are having performance problems with the test application and not with your "main application"? Otherwise I see no point in your making this argument.
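    A minimal sketch of the kind of Java-side timing log suggested above (the entity list and names are assumptions; the real addA() spans 11 tables): time each persist() and the final flush so you can see how much of the 11 seconds is spent before the SQL even reaches DB2.
    import javax.persistence.EntityManager;
    import java.util.List;

    public class AddATimingSketch {
        public static void addA(EntityManager em, List<Object> entities) {
            long start = System.nanoTime();
            for (Object entity : entities) {
                long t0 = System.nanoTime();
                em.persist(entity);                      // queues the insert in the persistence context
                System.out.printf("persist %s took %d ms%n",
                        entity.getClass().getSimpleName(), (System.nanoTime() - t0) / 1_000_000);
            }
            long f0 = System.nanoTime();
            em.flush();                                  // pushes the pending SQL to DB2 inside the current transaction
            System.out.printf("flush took %d ms, total addA %d ms%n",
                    (System.nanoTime() - f0) / 1_000_000, (System.nanoTime() - start) / 1_000_000);
        }
    }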

  • Last execution time of REPORT

    hi all gurus,
    How can I know the last execution time of any ABAP report? I mean, how can I know when this particular report was last executed?
    Is there any transaction code for this?
    Or is there any other method for it?
    please let me know.
    thanks in advance.
    regards,
    hardik.

    Hi Araujo,
    Thanks for your response.
    But from STAD I am not able to get the information about when I last executed this particular named report.
    Can you try once more?
    thanks,
    regards,
    Hardik.

  • AJAX, how to check execution time of queries?

    CF8, using CFDIV to bind to a CFC. The CFC has several SQL queries, including select, insert, update, and delete. I need to check the execution time of these queries. I have the CF Ajax logger turned on (cfdebug), but it doesn't seem to list query execution times. I see the query execution times from normal form pages in the log info on those pages, at the bottom as normal, but not for the queries in Ajax-bound CFCs.
    Am I missing a server setting? I have everything checked in the Ajax logger: Global, LogReader, http, bind, debug, info, error, window.
    Thanks in advance for any help!
    Mike

    I use Ajax (in CF7) a lot to access a SQL Server database. To determine execution times for Ajax queries, I run the application in Firefox with the "Firebug" add-on. Firebug reports the execution time of each (and every) Ajax transaction in milliseconds.

  • SMART FORMS  execution time slow

    Dear SAP,
    We have been applying support packs to our system over the last few days.
    We have upgraded our ABAP and Basis levels from support package 10 to 17.
    It has been found that tcodes take time to open for the first time after applying the components.
    In the case of SMART FORMS, the execution time of every process is taking very long. This has been the case for quite a while now.
    Kindly suggest what should be done to speed up working in Smart Forms.
    - Jitesh
    ( Basis )

    Hi Jitesh,
    After an upgrade you should run SGEN (SAP Load Generator), which regenerates the ABAP loads for the affected objects so that transactions do not have to be compiled the first time they are called; this makes the system faster.
    Run SGEN in background mode; it usually takes a long time to compile all the objects.
    Hope you get the solution.
    Thanks
    Arbind

  • Loading jar files at execution time via URLClassLoader

    Hello All,
    I'm making a Java SQL client. I have practically all the basic work done; now I'm trying to improve it.
    One thing I want it to do is to allow the user to specify new drivers and to use them to make new connections. To do this I have this class:
    public class DriverFinder extends URLClassLoader{
        private JarFile jarFile = null;

        private Vector drivers = new Vector();

        public DriverFinder(String jarName) throws Exception{
            super(new URL[]{ new URL("jar", "", "file:" + new File(jarName).getAbsolutePath() + "!/") }, ClassLoader.getSystemClassLoader());
            jarFile = new JarFile(new File(jarName));

            /*
            System.out.println("-->" + System.getProperty("java.class.path"));
            System.setProperty("java.class.path", System.getProperty("java.class.path") + File.pathSeparator + jarName);
            System.out.println("-->" + System.getProperty("java.class.path"));
            */

            Enumeration enumeration = jarFile.entries();
            while(enumeration.hasMoreElements()){
                String className = ((ZipEntry)enumeration.nextElement()).getName();
                if(className.endsWith(".class")){
                    className = className.substring(0, className.length()-6);
                    if(className.indexOf("Driver")!=-1)System.out.println(className);

                    try{
                        Class classe = loadClass(className, true);
                        Class[] interfaces = classe.getInterfaces();
                        for(int i=0; i<interfaces.length; i++){
                            if(interfaces[i].getName().equals("java.sql.Driver")){
                                drivers.add(classe);
                            }
                        }
                        Class superclasse = classe.getSuperclass();
                        interfaces = superclasse.getInterfaces();
                        for(int i=0; i<interfaces.length; i++){
                            if(interfaces[i].getName().equals("java.sql.Driver")){
                                drivers.add(classe);
                            }
                        }
                    }catch(NoClassDefFoundError e){
                    }catch(Exception e){}
                }
            }
        }

        public Enumeration getDrivers(){
            return drivers.elements();
        }

        public String getJarFileName(){
            return jarFile.getName();
        }

        public static void main(String[] args) throws Exception{
            DriverFinder df = new DriverFinder("D:/Classes/db2java.zip");
            System.out.println("jar: " + df.getJarFileName());
            Enumeration enumeration = df.getDrivers();
            while(enumeration.hasMoreElements()){
                Class classe = (Class)enumeration.nextElement();
                System.out.println(classe.getName());
            }
        }
    }
    It loads a jar and searches it looking for drivers (classes implementing, directly or indirectly, the interface java.sql.Driver). At the end of the execution I have found all drivers in the jar file.
    The main application loads jar files from an XML file and instantiates one DriverFinder for each jar file. The problem is at execution time: it finds the drivers and I think loads them by issuing this statement (Class classe = loadClass(className, true);), but what I think is not what is happening... the execution of my code throws this exception
    java.lang.ClassNotFoundException: com.ibm.as400.access.AS400JDBCDriver
            at java.net.URLClassLoader$1.run(URLClassLoader.java:198)
            at java.security.AccessController.doPrivileged(Native Method)
            at java.net.URLClassLoader.findClass(URLClassLoader.java:186)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:299)
            at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:265)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:255)
            at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:315)
            at java.lang.Class.forName0(Native Method)
            at java.lang.Class.forName(Class.java:140)
            at com.marmots.database.DB.<init>(DB.java:44)
            at com.marmots.dbreplicator.DBReplicatorConfigHelper.carregaConfiguracio(DBReplicatorConfigHelper.java:296)
            at com.marmots.dbreplicator.DBReplicatorConfigHelper.<init>(DBReplicatorConfigHelper.java:74)
            at com.marmots.dbreplicator.DBReplicatorAdmin.<init>(DBReplicatorAdmin.java:115)
            at com.marmots.dbreplicator.DBReplicatorAdmin.main(DBReplicatorAdmin.java:93)
    The driver jar file is not in the classpath!
    I have also tried (as you can see in the commented lines) to update the system property java.class.path by adding the path to the jar, but that didn't work either...
    I'm sure I'm making a/some mistake/s... can you help me?
    Thanks in advance.
    (If there is some incorrect word or expression, excuse me.)

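    A likely explanation for the ClassNotFoundException above (not confirmed in this thread): Class.forName(String), as called at com.marmots.database.DB line 44 in the stack trace, resolves classes through the caller's class loader - the application class loader - which knows nothing about the driver jar; only the DriverFinder instance does. A small sketch of the difference, reusing the class names from the post:
    import java.sql.Driver;

    public class DriverLoadSketch {
        public static void main(String[] args) throws Exception {
            DriverFinder finder = new DriverFinder("D:/Classes/db2java.zip");

            // Fails: the application class loader cannot see the driver jar.
            // Class.forName("com.ibm.as400.access.AS400JDBCDriver");

            // Works: resolve the class through the URLClassLoader that actually holds the jar.
            Class<?> driverClass = Class.forName("com.ibm.as400.access.AS400JDBCDriver", true, finder);
            Driver driver = (Driver) driverClass.getDeclaredConstructor().newInstance();
            System.out.println("Loaded driver: " + driver.getClass().getName());
        }
    }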
