JDBC Bulk Insert to MS Access

I am trying to do a bulk insert into an MS Access database from a text file. One of the solutions recommended by bbritta is as follows:
import java.sql.*;

public class Test3 {
  public static void main(String[] args) {
    try {
      Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
      String database =
          "jdbc:odbc:Driver={Microsoft Access Driver (*.mdb)};DBQ=C:/DB1.MDB";
      Connection con = DriverManager.getConnection(database, "", "");
      Statement statement = con.createStatement();
      statement.execute("INSERT INTO Table1 SELECT * FROM [Text;Database=C:\\;HDR=YES].[TextFile.txt]");
      statement.close();
      con.close();
    } catch (Exception e) {
      e.printStackTrace();
    }
  }
}

Whenever I try that approach, I get the following error message:
java.sql.SQLException: [Microsoft][ODBC Microsoft Access Driver] Number of query values and destination fields are not the same.
at sun.jdbc.odbc.JdbcOdbc.createSQLException(JdbcOdbc.java:6958)
at sun.jdbc.odbc.JdbcOdbc.standardError(JdbcOdbc.java:7115)
at sun.jdbc.odbc.JdbcOdbc.SQLExecDirect(JdbcOdbc.java:3111)
at sun.jdbc.odbc.JdbcOdbcStatement.execute(JdbcOdbcStatement.java:338)
Fields in the Access destination table are exactly the same as in the text file, yet I still get the error. I can import the same file into Access manually without any problem.
I was wondering if someone out there could suggest another approach.
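For what it's worth, that particular error usually means the field counts differ between the text file and the destination, often because of a trailing delimiter, a header row counted as data, or an extra AutoNumber/ID column in the Access table. Listing the columns explicitly on both sides of the INSERT sidesteps the mismatch. A minimal sketch of building such a statement; the table and column names (ID, Name) are placeholders, not taken from the original post:

```java
import java.util.List;

public class AccessBulkSql {

    // Hypothetical helper: build the Jet text-IISAM INSERT with an explicit
    // column list on both sides, so the number of query values always
    // matches the destination fields.
    static String buildInsert(String destTable, String textFile, List<String> columns) {
        StringBuilder cols = new StringBuilder();
        for (String c : columns) {
            if (cols.length() > 0) cols.append(", ");
            cols.append('[').append(c).append(']');
        }
        return "INSERT INTO " + destTable + " (" + cols + ")"
             + " SELECT " + cols
             + " FROM [Text;Database=C:\\;HDR=YES].[" + textFile + "]";
    }

    public static void main(String[] args) {
        // Placeholder column names: substitute the real header fields.
        System.out.println(buildInsert("Table1", "TextFile.txt", List.of("ID", "Name")));
    }
}
```

The resulting string would then be passed to statement.execute() exactly as in the code above.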

>
1) Is there a type-4 JDBC connector available to
connect directly to MS Access databases and if so
would it be difficult to implement or migrate to?
This is important because dbAnywhere does not appear
to be supported on Windows 2000, which is the
platform we are migrating to. We need to eliminate
dbAnywhere if possible.
By definition, no such driver can exist. A type 4 driver is pure Java and connects directly to the database; excluding file writes, the only connection method is via sockets, and there is nothing in an MS Access database for a socket to connect to. MS Access is a file-based engine, not a server, and simply doesn't work that way.
You could look into a type 3 driver; I believe there are a number of them. They use an intermediate server. Search here: http://industry.java.sun.com/products/jdbc/drivers
You could also implement your own using RmiJdbc at http://www.objectweb.org/. Personally, though, I would take a serious, long look at the security issues before exposing such a solution to the internet.

Similar Messages

  • SSIS BULK INSERT using UNC inside of ForEach Loop Container failed: could not be opened. Operating system error code 5 (Access is denied.)

    Hi,
    I am trying to figure out how to fix my problem
    Error: Could not be opened. Operating system error code 5(Access is denied.)
    Process Description:
    Target Database Server Reside on different Server in the Network
    SSIS Package runs from a Remote Server
    SSIS Package use a ForEachLoop Container to loop into a directory to do Bulk Insert
    SSIS Package use variables to specified the share location of the files using UNC like this
    \\server\files
    The database service account that the database engine runs under has full permission on the share where the files reside.
    The Execution Results tab shows the prepared SQL statement for the BULK INSERT, and I can run the exact same bulk insert in SSMS without errors, both from the database server and from the server where the SSIS package is executed.
    I am at a dead end, and I don't want to re-write the SSIS package to use a Data Flow Task, because that is not flexible to update when the table's metadata changes.
    The post below describes almost the same situation:
    https://social.msdn.microsoft.com/Forums/sqlserver/en-US/8de13e74-709a-43a5-8be2-034b764ca44f/problem-with-bulk-insert-task-in-foreach-loop?forum=sqlintegrationservices

    Interestingly, here is how I fixed the issue: adding the Application Name to the SQL OLAP connection string resolved it. I am not sure why SQL Server wasn't able to open the file remotely without this.

  • How to insert bulk data into MS Access

    Hi,
    I am trying to insert bulk data into MS Access. I used Statement, which works fine but does not allow inserting a single quote. I then tried
    PreparedStatement, which allows single quotes but not bulk data. I am getting the following error:
    javax.servlet.ServletException: [Microsoft][ODBC Microsoft Access Driver]String data, right truncated (null)
    please help me..
    guru

    Have you tried the Memo datatype in Access?
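    Two notes on that, as a sketch rather than a definitive fix. The Statement problem with single quotes comes from building SQL text by hand: an embedded quote must be doubled (PreparedStatement parameters handle this automatically). The "String data, right truncated" error with PreparedStatement is consistent with the reply above, since an Access TEXT column holds at most 255 characters while a Memo column does not. A minimal escaping helper (the name is made up for illustration):

```java
public class SqlQuoteDemo {

    // Hypothetical helper: double each single quote so the value can be
    // embedded in a hand-built SQL literal for java.sql.Statement.
    // PreparedStatement parameters make this step unnecessary.
    static String escapeSqlLiteral(String value) {
        return value.replace("'", "''");
    }

    public static void main(String[] args) {
        String name = "O'Brien";
        // Builds: INSERT INTO t (name) VALUES ('O''Brien')
        System.out.println("INSERT INTO t (name) VALUES ('" + escapeSqlLiteral(name) + "')");
    }
}
```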

  • File access problem while using BULK INSERT

    I'm creating a script to automatically convert a large mess of data.  Here's a test query I was using to bring a file into the database:
    INSERT [ImportTestTable]
    SELECT a.*
    FROM
    OPENROWSET(
    BULK 'D:\TestFile.csv',
    FORMATFILE = 'D:\TestStyle.fmt',
    FIRSTROW = 2
    ) AS a;
    SELECT * FROM ImportTestTable
    I've used queries like this on other networks and machines before, but when I run that query on the particular machine I'm working with now, I get the following error:
    Msg 4861, Level 16, State 1, Line 13
    Cannot bulk load because the file "D:\TestFile.csv" could not be opened. Operating system error code 21
    (The device is not ready.).
    Here are some relevant facts that might help: I am running the query from SQL Server Management Studio on a remote machine running Windows 7 Ultimate, connected to the server using SQL authentication. I believe we are on the same domain/network. I am not the DBA, but I do
    have the "Administer Bulk Operations" permission explicitly granted to me by the DBA. The Windows user I am currently logged in as can open, edit, and save the file in Windows Explorer. The format file is a non-XML
    format file.
    Any pointers as to where to look for more detailed information would be greatly appreciated!

    I know this is a little old by now, but this is what I used just this week:
    bulk insert [dbo].[Test_Table]
    from 'C:\Documents and Settings\rshuell\Desktop\Test_File.txt'
    WITH (
    FIELDTERMINATOR=',',
    ROWTERMINATOR = '\n',
    KEEPNULLS,
    FIRSTROW=2
    )
    That worked fine for me.
    Of course, as Naomi stated, the file has to be ON THE SERVER. Or, if you're using an FTP site, for instance, the path would have to point to the FTP site.
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

  • Error while running bulk insert in SSIS package

    Hi:
    I have an error when I am running bulk insert in SSIS package.
    I have implemented an SSIS package to update master data directly from R/3, R/3 gives the file in a specified format, I take this and insert all the records into a temporary table and then update mbr table and process the dimension.
    This works perfectly well in our development system, where our app server and SQL server are on the same box. But in QAS the two servers are separate, and when I try to run the SSIS package I get the error below.
    We have tested all connections and are able to access the path and file from both the app server and the SQL server using the shared folder. Our Basis team says it is a problem with the Bulk Insert task and has nothing to do with authorization.
    Has anyone experienced this sort of problem in a multi-server environment? Is there another way to load all the data from a file into a bespoke table without using bulk insert?
    Thanks,
    Subramania
    Error----
    SSIS package "Package.dtsx" starting.
    Error: 0xC002F304 at Insert Data Into Staging Table (Account), Bulk Insert Task: An error occurred with the following error message: "Cannot bulk load because the file "\\msapbpcapq01\dim\entity.csv" could not be opened. Operating system error code 5(Access is denied.).".
    Task failed: Insert Data Into Staging Table (Account)
    SSIS package "Package.dtsx" finished: Success.
    The program '[2496] Package.dtsx: DTS' has exited with code 0 (0x0).

    Hi Subramania
    From your error:
    Error: 0xC002F304 at Insert Data Into Staging Table (Account), Bulk Insert Task: An error occurred with the following error message: "Cannot bulk load because the file "\\msapbpcapq01\dim\entity.csv" could not be opened. Operating system error code 5(Access is denied.).".
    Let say, server A is where the file entity.csv is located
    Please check the Event Viewer -> Security log of server A at the time the SSIS package ran; there should be an entry with a logon failure showing which user was used to access the shared path.
    If your servers are not in a domain, create the user on server A with the same name and password and grant it read access to the shared folder.
    The other workaround is to grant read access to Everyone on the shared folder.
    Halomoan
    Edited by: Halomoan Zhou on Oct 6, 2008 4:23 AM

  • Bulk insert security

    I have an SSIS package on server A which calls a stored procedure on server B.  The stored procedure runs the bulk insert command to load an XML file.  The service account which SQL Server Services is running under on server A is a sysadmin on
    Server B, and has permissions to the file.  When I run the package I am getting "Cannot bulk load because the file could not be opened.  Operating system error code 5(Access is denied)."
    If I create the server B database on server A, so that the SSIS package and the database where I call the bulk insert command from are both on server A, and run the SSIS package it runs successfully.
    When everything is on the same server it works, so whatever account it is using has the necessary permissions.  But that account must not have the necessary permissions on server B.  What other accounts would need permissions on server B?
    Thanks for the help.

    If you are logged in with a Windows account, BULK INSERT impersonates that user, and that user's Windows permissions apply. If you are logged in with an SQL login, the permissions of the service account apply.
    Keep in mind that the login on server B is a new login. There could be a login mapping set up. Also, several levels of impersonation may not work out well.
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Bulk insert task issue

    I have a table containing 4 million records, and I want to load the data into a SQL Server table using the Bulk Insert task.
    How can I load the data when the Bulk Insert task supports only text sources?
    Thanks in Advance.

    If it's a SQL Server table-to-table transfer, you can use a Data Flow Task with an OLE DB source and destination. In the OLE DB destination, use
    the "table or view - fast load" option as the data access mode.
    Also, if the databases are on the same server, you can even use an Execute SQL Task with a statement like
    INSERT INTO DestTable
    SELECT *
    FROM SourceDB.dbo.SourceTable
    which will be set-based.
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • Bulk Insert CSV file in Oracle 9i

    Greetings all,
    Is there a way in Oracle, similar to the BULK INSERT command in SQL Server, to bulk load a CSV file?
    Something other than an external table or SQL*Loader.
    Sincerely
    Dan

    Here is an example:
    CREATE TABLE EXT_ASSC
       (     LEVEL_1 VARCHAR2(30),
         LEVEL_2 VARCHAR2(50),
         CATEGORY_CODE VARCHAR2(100),
         LEVEL_3 VARCHAR2(100)
       )
       ORGANIZATION EXTERNAL
        ( TYPE ORACLE_LOADER
          DEFAULT DIRECTORY some_directory
          ACCESS PARAMETERS
          ( RECORDS DELIMITED BY NEWLINE
              NODISCARDFILE
              NOLOGFILE
              BADFILE 'my_bad_file.bad'
            FIELDS
              TERMINATED BY ','
              LRTRIM
              MISSING FIELD VALUES ARE NULL
          )
          LOCATION
           ( 'some_file.csv' )
        )
       REJECT LIMIT UNLIMITED;

  • BULK INSERT

    I don't know if this has been posted before; I have looked around and could not find any similar question.
    I have written a stored proc that does a bulk load using the BULK INSERT command, but I am getting error msg 4861. Below are the full error message and my code. Can someone advise what I am doing wrong? The SQL Server engine is on a totally different server from
    my text file, but they are all on the same network.
    use test_sp
    go
    Declare @path nvarchar(max)
    declare @str nvarchar(1000) -- sp_executesql requires an nvarchar argument
    declare @Fulltblname varchar (1000)
    Set @path ='\\myservername\ShareName\Path\FileName.txt'
    Set @Fulltblname ='table1'
    --bulk load the table with raw data
    Set @str = 'BULK INSERT [dbo].['+@Fulltblname+'] 
    FROM ' + char(39) + @Path + Char(39) + '
    WITH (
    FIELDTERMINATOR = ''|'',
    FIRSTROW = 1,
    ROWTERMINATOR =''\n'',
    MAXERRORS = 0
    )'
    Exec sp_executesql @str
    The errors I get are below:
    Cannot bulk load because the file "\\myservername.domainname\ShareName\Path\FileName.txt" could not be opened. Operating system error code 5(Access is denied.).
    Msg 4861, Level 16, State 1, Line 1
    Cannot bulk load because the file "\\myservername.domainname\ShareName\Path\FileName.txt" could not be opened. Operating system error code 5(Access is denied.).
    Mail queued.

    Hi,
    Try the links below:
    http://blogs.msdn.com/b/dataaccesstechnologies/archive/2012/03/22/10082977.aspx
    http://blogs.msdn.com/b/jay_akhawri/archive/2009/02/16/resolving-operating-system-error-code-5-with-bulk-insert-a-different-perspective.aspx
    http://stackoverflow.com/questions/14555262/cannot-bulk-load-operating-system-error-code-5-access-is-denied
    sathya - www.allaboutmssql.com ** Mark as answered if my post solved your problem and Vote as helpful if my post was useful **.

  • SQL Server 2008 - RS - Loop of multiple Bulk Inserts

    Hi,
    I want to import multiple flat files into a table on SQL Server 2008 R2. However, I don't have access to Integration Services to use a ForEach loop, so I'm doing the process in T-SQL. At the moment I manually code which file to load into the
    tables. My code looks like this:
    CREATE TABLE #temporaryTable
    (
        [column1] [varchar](100) NOT NULL,
        [column2] [varchar](100) NOT NULL
    )
    BULK
    INSERT #temporaryTable
    FROM 'C:\Teste\testeFile01.txt' 
    WITH
    (
    FIELDTERMINATOR = ';',
    ROWTERMINATOR = '\n',
    FIRSTROW = 1
    )
    GO
    BULK
    INSERT #temporaryTable
    FROM 'C:\Teste\testeFile02.txt' 
    WITH
    (
    FIELDTERMINATOR = ';',
    ROWTERMINATOR = '\n',
    FIRSTROW = 1
    )
    GO
    INSERT INTO dbo.TESTE ( Col_1, Col_2)
    SELECT RTRIM(LTRIM([column1])), RTRIM(LTRIM([column2])) FROM #temporaryTable
    IF EXISTS(SELECT * FROM #temporaryTable) DROP TABLE #temporaryTable
    The problem is that I have 20 flat files to insert... Is there a loop solution in T-SQL to insert all the flat files into the same table?
    Thanks!

    Here is a working sample of a PowerShell script I adapted from the internet (I don't have the source handy now).
    Import-Module -Name 'SQLPS' -DisableNameChecking
    $workdir = "C:\temp\test\"
    $svrname = "MC\MySQL2014"
    Try {
        # Change the default timeout from 600 to unlimited
        $svr = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $svrname
        $svr.ConnectionContext.StatementTimeout = 0
        $table = "test1.dbo.myRegions"
        # Remove the filename column from the target table
        $q1 = "IF COL_LENGTH('dbo.myRegions','filename') IS NOT NULL BEGIN ALTER TABLE test1.dbo.myRegions DROP COLUMN filename; END"
        Invoke-Sqlcmd -ServerInstance $svr.Name -Database test1 -Query $q1
        $dt = (Get-Date).ToString("yyyyMMdd")
        $formatfilename = "$($table)_$($dt).xml"
        $destination_formatfilename = "$($workdir)$($formatfilename)"
        $cmdformatfile = "bcp $table format nul -c -x -f $($destination_formatfilename) -T -t\t -S $($svrname)"
        Invoke-Expression $cmdformatfile
        # Delay 1 second
        Start-Sleep -s 1
        # Add the filename column to the target table
        $q2 = "Alter table test1.dbo.myRegions Add filename varchar(500) Null;"
        Invoke-Sqlcmd -ServerInstance $svr.Name -Database master -Query $q2
        $files = Get-ChildItem $workdir
        $items = $files | Where-Object {$_.Extension -eq ".txt"}
        for ($i = 0; $i -lt $items.Count; $i++) {
            $strFileName = $items[$i].Name
            $strFileNameNoExtension = $items[$i].BaseName
            $query = "BULK INSERT test1.dbo.myRegions from '$($workdir)$($strFileName)' WITH (FIELDTERMINATOR = '\t', FIRSTROW = 2, FORMATFILE = '$($destination_formatfilename)');"
            Invoke-Sqlcmd -ServerInstance $svr.Name -Database master -Query $query -QueryTimeout 65534
            # Delay 10 seconds
            Start-Sleep -s 10
            # Update the filename column
            Invoke-Sqlcmd -ServerInstance $svr.Name -Database master -QueryTimeout 65534 -Query "Update test1.dbo.myRegions SET filename = '$($strFileName)' WHERE filename is null;"
            # Move the uploaded file to the archive
            If ((Test-Path "$($workdir)$($strFileName)") -eq $True) { Move-Item -Path "$($workdir)$($strFileName)" -Destination "$($workdir)Processed\$($strFileNameNoExtension)_$($dt).txt" }
        }
    }
    Catch [Exception] {
        Write-Host "--$strFileName " $_.Exception.Message
    }
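    The same per-file loop can also be driven from client code rather than PowerShell. As a sketch (the table name, directory, and WITH options mirror the question above, but the helper itself is hypothetical), enumerate the directory and build one BULK INSERT statement per file, each of which would then be executed through a JDBC Statement or sqlcmd:

```java
import java.io.File;

public class BulkInsertLoop {

    // Hypothetical helper: build one BULK INSERT statement per flat file,
    // using the delimiter/row options from the question above.
    static String buildBulkInsert(String table, String path) {
        return "BULK INSERT " + table + " FROM '" + path + "' "
             + "WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\\n', FIRSTROW = 1)";
    }

    public static void main(String[] args) {
        File dir = new File("C:\\Teste");   // directory from the question
        File[] files = dir.listFiles((d, name) -> name.endsWith(".txt"));
        if (files != null) {
            for (File f : files) {
                // Each statement would be run via a JDBC Statement.execute()
                System.out.println(buildBulkInsert("dbo.TESTE_staging", f.getAbsolutePath()));
            }
        }
    }
}
```

    Note the usual caveat from the other threads on this page: the path must be visible to the SQL Server service account, since BULK INSERT opens the file server-side.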

  • How to insert data from access to sql server ?

    How to insert data from access to sql server ?
    Please help me
    thanks

    phamtrungkien wrote:
    How to insert data from access to sql server by JAVA?
    The first four words of my last post:
    masijade wrote:
    JDBC with two connections
    Get a resultset from the JDBC-ODBC bridge Access connection, cycle through it, and add batch insert commands to the JDBC connection to SQL Server. Give it a try, and if the code has an error, post your code and ask a question.
    The real question, though, is why you think it absolutely necessary to use Java for this.

  • Exception while doing bulk insertion

    Hi,
    I am trying to do a bulk insert of records into a table using my application. I am using prepared statement to achieve this. I am getting the following exception while doing bulk insert.
    java.lang.NegativeArraySizeException
    I am using SQL Server driver version 2000.80.380.00 for this. The database type chosen is JDBC-ODBC.
    Your early response is appreciated.
    Regards
    Ramesh

    Looks like one of your arrays has a problem with its size, possibly a negative size!
    It could be a problem...
    somewhere...
    in your application...
    in the code...
    somewhere.
    Possibly at the line number indicated by the exception... just a wild guess!
    Thought about looking for it? That's what I'd do first.
    Or do you expect someone to say "Ahhhhh 2000.80.380.00 marvelous plumage, bugger with the bulk inserts"
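    Snark aside, the reply points at the right place: java.lang.NegativeArraySizeException is thrown when the application (or the bridge driver on its behalf) allocates an array with a negative length, typically a miscomputed batch or buffer size. A minimal reproduction, unrelated to the poster's actual code:

```java
public class NegativeArrayDemo {

    // Describes what happens when allocating an array of the given size;
    // a negative size throws NegativeArraySizeException at runtime.
    static String tryAllocate(int size) {
        try {
            Object[] batch = new Object[size];
            return "allocated " + batch.length;
        } catch (NegativeArraySizeException e) {
            return "caught " + e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryAllocate(3));   // allocated 3
        System.out.println(tryAllocate(-1));  // caught NegativeArraySizeException
    }
}
```

    So the fix is to trace where the batch/array size is computed before it reaches the driver, starting at the line number in the stack trace.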

  • BCP-style bulk insert from remote C++ ODBC Native client application

    I am trying to find documentation or sample code for performing bulk inserts into SQL Server 2012 from a remote client using the ODBC Native Client driver from Linux.  We currently perform INSERT statements on blocks of data, wrapping them in BEGIN/COMMIT,
    and achieve roughly half the throughput of bcp reading from a delimited text file.  While there are many web pages talking about bulk inserts via the native driver, this page (http://technet.microsoft.com/en-us/library/ms130792.aspx) seems closest to
    what I'm after but doesn't go into any detail or give API calls.  The referenced header file is just a bunch of options and constants, so presumably one gains access to the bulk functions via the standard ODBC mechanism; the question is how.
    For clarity, I am NOT interested in:
    BULK INSERT: because it requires a server-side data file or a UNC path with appropriate permissions (doesn't work from Linux)
    INSERT ... SELECT * FROM OPENROWSET(BULK...): same problem as above
    IRowsetFastload: OLEDB, but I need ODBC on Linux.
    Basically, I want to emulate BCP.  I don't want to *run* BCP because it requires landing data to disk. 
    Thanks
    john
    John Lilley Chief Architect RedPoint Global Inc.

    Other than block inserts within BEGIN/COMMIT transaction blocks or running bcp, is there anything else that can be done on Linux?
    No other option from Linux that I am aware of.  The SQL Server Native Client ODBC driver also supports table-valued-parameters, which can be used to stream data but the Linux ODBC driver API doesn't have a way to do that either.  That said, I would
    still expect file-based BCP to significantly outperform inserts with large batches.  I've seen a rate of 100K/sec. with this technique, including the file create overhead but much depends on the particulars of your use case.
    Consider voting for this on Connect.  BCP is on the roadmap but no date yet: 
    https://connect.microsoft.com/SQLServer/SearchResults.aspx?SearchQuery=linux+odbc+bcp
    Also, I filed a Connect item for TVP support:
    https://connect.microsoft.com/SQLServer/feedback/details/874616/add-tvp-support-to-sql-server-odbc-driver-for-linux
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com

  • Proper array size to be used in bulk insert

    What is the proper array size to be used in bulk insert?
    I have around 1 million records. Should I insert them all at a time or distribute them over many iterations.

    I'd generally expect external tables to be more efficient than SQL*Loader if only because you don't have to spend cycles loading data into a staging table. Depending on the file, Oracle may be able to access the data in parallel via the external table interface.
    From a pure efficiency standpoint, it would be best if the processing could be encapsulated into a single multi-table insert statement. Whether that is realistic, of course, depends on your logic, how complicated that SQL statement would be, etc. If you have to resort to PL/SQL, bulk processing is surely the way to go. It doesn't matter too much what LIMIT size you choose (something like 100 or 1000 is generally appropriate), and the marginal difference is pretty small. On a load of only 1 million rows, it may not be particularly easy to measure.
    Justin

  • Sub-SELECT in Bulk INSERT- Performance Clarification

    I have 2 tables- emp_new & emp_old. I need to load all data from emp_old to emp_new. There is a transaction_id column in emp_new whose value needs to be fetched from a main_transaction table which also includes a Region Code column. Something like -
    TRANSACTION_ID REGION_CODE
    100 US
    101 AMER
    102 APAC
    My bulk insert query looks like this -
    INSERT INTO emp_new
    (col1,
    col2,
    transaction_id)
    SELECT
    col1,
    col2,
    (SELECT transaction_id FROM main_transaction WHERE region_code = 'US')
    FROM emp_old
    There would be millions of rows which need to be loaded in this way. I would like to know if the sub-SELECT to fetch the transaction_id would be re-executed for every row, which would be very costly; I'm looking for a way to avoid this. The main_transaction table is pre-loaded and its values are not going to change. Is there a way (via some HINT) to indicate that the sub-SELECT should not be re-executed for every row?
    On a different note, the execution plan of the above bulk INSERT looks like -
    | Id | Operation | Name | Rows | Bytes | Cost (%CPU)|
    | 0 | INSERT STATEMENT | | 11M| 54M| 6124 (4)|
    | 1 | INDEX FAST FULL SCAN| EMPO_IE2_IDX | 11M| 54M| 6124 (4)|
    EMPO_IE2_IDX -> Index on emp_old
    I'm surprised to see that the table main_transaction does not feature in the execution plan at all. Does this mean that the sub-SELECT will not be re-executed for every row? At least for the first read, however, I would expect the table to appear in the plan.
    Can someone help me in understanding this ?

    Dear,
    From 10.2, AUTOTRACE uses DBMS_XPLAN anyway
    Yes, but with the remark that it uses the estimated part of DBMS_XPLAN, i.e. explain plan for + select * from table(dbms_xplan.display);
    Isn't it?
    mhouri> cl scr
    mhouri> desc t
    Name                    Null?    Type
    ID                               VARCHAR2(10)
    NAME                             VARCHAR2(100)
    mhouri> set linesize 150
    mhouri> var x number
    mhouri> exec :x:=99999
    PL/SQL procedure successfully completed.
    mhouri> explain plan for
      2  select sum(length(name)) from t where id >  :x;
    Explained.
    mhouri> select * from table(dbms_xplan.display);
    PLAN_TABLE_OUTPUT                                                                                                                                    
    Plan hash value: 1188118800                                                                                                                          
    | Id  | Operation                    | Name | Rows  | Bytes | Cost (%CPU)| Time     |                                                                
    |   0 | SELECT STATEMENT             |      |     1 |    23 |     4   (0)| 00:00:01 |                                                                
    |   1 |  SORT AGGREGATE              |      |     1 |    23 |            |          |                                                                
    |   2 |   TABLE ACCESS BY INDEX ROWID| T    |    58 |  1334 |     4   (0)| 00:00:01 |                                                                
    |*  3 |    INDEX RANGE SCAN          | I    |    11 |       |     2   (0)| 00:00:01 |                                                                
    PLAN_TABLE_OUTPUT                                                                                                                                    
    Predicate Information (identified by operation id):                                                                                                  
       3 - access("ID">:X)                                                                                                                               
    15 rows selected.
    mhouri> set autotrace on
    mhouri> select sum(length(name)) from t where id >  :x;
    SUM(LENGTH(NAME))                                                                                                                                    
                10146                                                                                                                                    
    Execution Plan
    Plan hash value: 1188118800                                                                                                                          
    | Id  | Operation                    | Name | Rows  | Bytes | Cost (%CPU)| Time     |                                                                
    |   0 | SELECT STATEMENT             |      |     1 |    23 |     4   (0)| 00:00:01 |                                                                
    |   1 |  SORT AGGREGATE              |      |     1 |    23 |            |          |                                                                
    |   2 |   TABLE ACCESS BY INDEX ROWID| T    |    58 |  1334 |     4   (0)| 00:00:01 |                                                                
    |*  3 |    INDEX RANGE SCAN          | I    |    11 |       |     2   (0)| 00:00:01 |                                                                
    Predicate Information (identified by operation id):                                                                                                  
       3 - access("ID">:X)                                                                                                                               
    Statistics
              0  recursive calls                                                                                                                         
              0  db block gets                                                                                                                           
             15  consistent gets                                                                                                                         
              0  physical reads                                                                                                                          
              0  redo size                                                                                                                               
            232  bytes sent via SQL*Net to client                                                                                                        
            243  bytes received via SQL*Net from client                                                                                                  
              2  SQL*Net roundtrips to/from client                                                                                                       
              0  sorts (memory)                                                                                                                          
              0  sorts (disk)                                                                                                                            
              1  rows processed                                                                                                                          
    mhouri> set autotrace off
    mhouri> select sum(length(name)) from t where id >  :x;
    SUM(LENGTH(NAME))                                                                                                                                    
                10146                                                                                                                                    
    mhouri> select * from table(dbms_xplan.display_cursor);
    PLAN_TABLE_OUTPUT                                                                                                                                    
    SQL_ID  7zm570j6kj597, child number 0                                                                                                                
    select sum(length(name)) from t where id >  :x                                                                                                       
    Plan hash value: 1842905362                                                                                                                          
    | Id  | Operation          | Name | Rows  | Bytes | Cost (%CPU)| Time     |                                                                          
    |   0 | SELECT STATEMENT   |      |       |       |     5 (100)|          |                                                                          
    |   1 |  SORT AGGREGATE    |      |     1 |    23 |            |          |                                                                          
    |*  2 |   TABLE ACCESS FULL| T    |    59 |  1357 |     5   (0)| 00:00:01 |                                                                          
    Predicate Information (identified by operation id):                                                                                                  
       2 - filter(TO_NUMBER("ID")>:X)                                                                                                                    
    19 rows selected.
    mhouri> spool off
    Best regards
    Mohamed Houri
