Database calling BLOB

folks,
I am querying a Sybase database on a "TEXT" datatype. Here is the code for it:
public String getCaseNotes(String caseNumber) {
     ResultSet resultSet;
     String byteString = "";
     try {
          resultSet = (ResultSet) caseQuery.getCaseNotes(caseNumber);
          if (resultSet.next()) {
               ByteArrayOutputStream oStm = new ByteArrayOutputStream(1024);
               BufferedInputStream bStm = new BufferedInputStream(resultSet
                                              .getBinaryStream("case_history"));
               byte[] buffer = new byte[1024];
               int bytesRead = 0;
               if (!resultSet.wasNull()) {
                    while ((bytesRead = bStm.read(buffer, 0, buffer.length)) != -1) {
                         oStm.write(buffer, 0, bytesRead);
                    }
                    byteString = oStm.toString();
               }
               bStm.close();
          }
     } catch (Exception ex) {
          System.out.println("exception occurred while calling blob objects="
                          + ex.getMessage());
     }
     return byteString;
}
Now when I call this method I get the following error:
"JZ0TE: Attempted conversion between an illegal pair of types. Valid database datatypes are: 'image, binary, long binary, varbinary'"
Does anyone know about this? Please let me know if I am doing anything wrong.
thanks
KM

The major problem is that you are using getBinaryStream on a field defined as a TEXT field in the database.
@OP: change the field type from text to image. I believe image is a BLOB field in Sybase.
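Alternatively, if the column has to stay TEXT, a character-stream read usually avoids the JZ0TE conversion error. A minimal sketch (the `readAll` helper and class name are mine, not from the thread; the `case_history` column is from the original code):

```java
import java.io.IOException;
import java.io.Reader;
import java.sql.ResultSet;
import java.sql.SQLException;

public class TextColumnReader {

    // Drains any Reader into a String, 1 KB at a time.
    public static String readAll(Reader reader) throws IOException {
        StringBuilder sb = new StringBuilder();
        char[] buffer = new char[1024];
        int charsRead;
        while ((charsRead = reader.read(buffer)) != -1) {
            sb.append(buffer, 0, charsRead);
        }
        return sb.toString();
    }

    // For a TEXT column, ask the driver for a character stream
    // instead of getBinaryStream.
    public static String readTextColumn(ResultSet rs, String column)
            throws SQLException, IOException {
        Reader reader = rs.getCharacterStream(column); // e.g. "case_history"
        if (reader == null || rs.wasNull()) {
            return "";
        }
        try {
            return readAll(reader);
        } finally {
            reader.close();
        }
    }
}
```

With jConnect, `getString` on a TEXT column often works too; the streaming version just avoids holding very large notes twice in memory.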
Sincerely,
Slappy

Similar Messages

  • [XSQL] Displaying PDF stored in database as BLOB

    Dear Sirs,
    we have PDF stored in the database as BLOB.
    I have written a PLSQL function which I call accordingly
    <xsql:ref-cursor-function >
    digitallibrary.getdocs('{@resource_seqid}')
    </xsql:ref-cursor-function>
    however what I get is the following
    <page>
    <rowset>
    <row num="1">
    <document>255044462D312E320A...</document>
    </row>
    </rowset>
    </page>
    instead of having the customer's browser starting up Acrobat.
    It looks to me as if the data is the HEX of the BLOB:
    25 %
    50 P
    44 D
    46 F
    2D -
    31 1
    2E .
    32 2
    0A Line Feed
    This coincides with what previously reported
    "BLOB data is serialized as hex bytes in XSU."
    I understand that "XSQL has no built-in support for BLOB data, but it is extensible via user-written action handlers or user-written serializers."
    I have the feeling that I should use an appropriate serializer for this purpose but it is not entirely clear to me how ...
    Any suggestion is appreciated.
Looking forward to your advice.
    Regards and thanks
    Luca

The following serializer:
public void serialize(Document document, XSQLPageRequest xsqlpagerequest)
        throws java.lang.Throwable {
    NodeList nodelist = document.getElementsByTagName("*");
    xsqlpagerequest.setContentType("application/pdf");
    OutputStream outputstream = xsqlpagerequest.getOutputStream();
    String DocString = new String();
    for (int loopIndex = 0; loopIndex <= 10; loopIndex++) {
        display(nodelist.item(loopIndex), "");
        if (loopIndex == 6) {
            DocString = displayStrings[loopIndex];
        }
    }
    byte abyte0[] = oracle.xml.sql.core.OracleXMLConvert.convertHexStringToByte(DocString);
    outputstream.write(abyte0, 0, abyte0.length);
    outputstream.flush();
}
works fine; however, Internet Explorer 5.5 displays the native PDF code instead of firing the plugin... any suggestion?

  • How to optimize Database Calls to improve performance of an application

    Hi,
I have a performance issue with my applications. It takes a lot of time to load, as it makes several calls to the database. Moreover, the result set returns more than 2000 records. I need to know the best way to improve performance.
1. What is the solution to optimize the database calls so that I can improve the performance of my application and also improve the turnaround time to load the web pages?
    2. Stored procedures are a good way to get the data from the result set iteratively. How can I implement this solution in Java?
    This is very important, and any help is greatly appreciated.
    Thanks in Advance,
    Sailatha

    latha_kaps wrote:
    I have a performance issue with my applications. It takes a lot of time to load, as it is making several calls to the database. And, moreover, the resultset has more than 2000 records that are returned. I need to know what is the better way to improve the performance
    1. What is the solution to optimize the database calls so that I can improve the performance of my application and also improve on the trun around time to load the web pages.
    2. Stored procedures are a good way to get the data from the result set iteratively. How can I implement this solution in Java?
This is very important, and any help is greatly appreciated.
1. 2000 records inside a resultset are not a big number.
    2. Which RDBMS you use?
    Concerning the answer to 2. you have different possibilities. The best thing is always to handle as many transactions as possible inside the database. Therefore a stored procedure is the best approach imho.
    Below there is an example for an Oracle RDBMS.
    Assumption #1 you have created an object (demo_obj) in your Oracle database:
    create type demo_obj as object( val1 number, val2 number, val3 number);
create type demo_array as table of demo_obj;
/
Assumption #2: you've created a stored function to get the values of the array in your database:
    create or replace function f_demo ( p_num number )
    return demo_array
    as
        l_array demo_array := demo_array();
    begin
        select demo_obj(round(dbms_random.value(1,2000)),round(dbms_random.value(2000,3000)),round(dbms_random.value(3000,4000)))
        bulk collect into l_array
          from all_objects
         where rownum <= p_num;
        return l_array;
    end;
/
For getting the data out of the database, use the following Java program (please watch the comments):
    import java.sql.*;
    import java.io.*;
    import oracle.sql.*;
    import oracle.jdbc.*;
    public class VarrayDemo {
         public static void main(String args[]) throws IOException, SQLException {
              DriverManager.registerDriver(new oracle.jdbc.driver.OracleDriver());
              Connection conn = DriverManager.getConnection(
                        "jdbc:oracle:oci:@TNS_ENTRY_OF_YOUR_DB", "scott", "tiger"); // I am using OCI driver here, but one can use thin driver as well
              conn.setAutoCommit(false);
              Integer numRows = new Integer(args[0]); // variable to accept the number of rows to return (passed at runtime)
              Object attributes[] = new Object[3]; // "attributes" of the "demo_obj" in the database
              // the object demo_obj in the db has 3 fields, all numeric
              // create an array of objects which has 3 attributes
              // we are building a template of that db object
              // the values i pass below are just generic numbers, 1,2,3 mean nothing really
              attributes[0] = new Integer(1);
              attributes[1] = new Integer(2);
              attributes[2] = new Integer(3);
              // this will represent the data type DEMO_OBJ in the database
              Object demo_obj[] = new Object[1];
              // make the connection between oracle <-> jdbc type
              demo_obj[0] = new oracle.sql.STRUCT(new oracle.sql.StructDescriptor(
                        "DEMO_OBJ", conn), conn, attributes);
              // the function returns an array (collection) of the demo_obj
              // make the connection between that array(demo_array) and a jdbc array
              oracle.sql.ARRAY demo_array = new oracle.sql.ARRAY(
                        new oracle.sql.ArrayDescriptor("DEMO_ARRAY", conn), conn,
                        demo_obj);
              // call the plsql function
              OracleCallableStatement cs =
                   (OracleCallableStatement) conn.prepareCall("BEGIN ? := F_DEMO(?);END;");
              // bind variables
              cs.registerOutParameter(1, OracleTypes.ARRAY, "DEMO_ARRAY");
              cs.setInt(2, numRows.intValue());
              cs.execute();
              // get the results of the oracle array into a local jdbc array
              oracle.sql.ARRAY results = (oracle.sql.ARRAY) cs.getArray(1);
              // flip it into a result set
              ResultSet rs = results.getResultSet();
              // process the result set
          while (rs.next()) {
               // since it's an array of objects, get and display the value of the underlying object
               oracle.sql.STRUCT obj = (STRUCT) rs.getObject(2);
               Object vals[] = obj.getAttributes();
               System.out.println(vals[0] + " " + vals[1] + " " + vals[2]);
          }
          // cleanup
          cs.close();
          conn.close();
     }
}
For selecting 20,000 records it takes only a few seconds.
    Hth

  • Are mutliple database calls really significant with a network call for a web API?

    At one of my employers, we worked on a REST (but it also applies to SOAP) API. The client, which is the application UI, would make calls over the web (LAN in typical production deployments) to the API. The API would make calls to the database.
    One theme that recurs in our discussions is performance: some people on the team believe that you should not have multiple database calls (usually reads) from a single API call because of performance; you should optimize them so that each API call has only
    (exactly) one database call.
    But is that really important? Consider that the UI has to make a network call to the API; that's pretty big (order of magnitude of milliseconds). Databases are optimized to keep things in memory and execute reads very, very quickly (eg. SQL Server loads and
    keeps everything in RAM and consumes almost all your free RAM if it can).
    TLDR: Is it really significant to worry about multiple database calls when we are already making a network call over the LAN? If so, why?
    To be clear, I'm talking about order of magnitude -- I know that it depends on specifics (machine hardware, choice of API and DB, etc.) If I have a call that takes O(milliseconds), does optimizing for DB calls that take an order of magnitude less, actually
    matter? Or is there more to the problem than this?
    Edit: for posterity, I think it's quite ridiculous to make claims that we need to improve performance by combining database calls under these circumstances -- especially
    with a lack of profiling. However, it's not my decision whether we do this or not; I want to know what the rationale is behind thinking this is a correct way of optimizing web API calls.

    But is that really important? Consider that the UI has to make a network call to the API; that's pretty big (order of magnitude of milliseconds). Databases are optimized to keep things in memory
    and execute reads very, very quickly (eg. SQL Server loads and keeps everything in RAM and consumes almost all your free RAM if it can).
    The Logic
    In theory, you are correct. However, there are a few flaws with this rationale:
    From what you stated, it's unclear if you actually tested / profiled your app. In other words, do you actually know that
    the network transfers from the app to the API are the slowest component? Because that is intuitive, it is easy to assume that it is. However, when discussing performance, you should never assume. At my employer, I am the performance lead. When I first joined,
    people kept talking about CDN's, replication, etc. based on intuition about what the bottlenecks must be. Turns out, our biggest performance problems were poorly performing database queries.
    You are saying that because databases are good at retrieving data, that the database is necessarily running at peak performance, is being used optimally, and there is nothing that can be done
    to improve it. In other words, databases are designed to be fast, so I should never have to worry about it. Another dangerous line of thinking. That's like saying a car is meant to move quickly, so I don't need to change the oil.
    This way of thinking assumes a single process at a time, or put another way, no concurrency. It assumes that one request cannot influence another request's performance. Resources are shared,
    such as disk I/O, network bandwidth, connection pools, memory, CPU cycles, etc. Therefore, reducing one database call's use of a shared resource can prevent it from causing other requests to slow down. When I first joined my current employer, management believed
    that tuning a 3 second database query was a waste of time. 3 seconds is so little, why waste time on it? Wouldn't we be better off with a CDN or compression or something else? But if I can make a 3 second query run in 1 second, say by adding an index, that
    is 2/3 less blocking, 2/3 less time spent occupying a thread, and more importantly, less data read from disk, which means less data flushed out of the in-RAM cache.
    The Theory
    There is a common conception that software performance is simply about speed.
    From a purely speed perspective, you are right. A system is only as fast as its slowest component. If you have profiled your code and found that the Internet is the slowest component, then everything else is obviously not the slowest part.
    However, given the above, I hope you can see how resource contention, lack of indexing, poorly written code, etc. can create surprising differences in performance.
    The Assumptions
    One last thing. You mentioned that a database call should be cheap compared to a network call from the app to the API. But you also mentioned that the app and the API servers are in the same LAN. Therefore, aren't both of them comparable as network calls? In
    other words, why are you assuming that the API transfer is orders of magnitude slower than the database transfer when they both have the same available bandwidth? Of course the protocols and data structures are different, I get that, but I dispute the assumption
    that they are orders of magnitude different.
Where it gets murky
    This whole question is about "multiple" versus "single" database calls. But it's unclear how many are multiple. Because of what I said above, as a general rule of thumb, I recommend making as few database calls as necessary. But that is
    only a rule of thumb.
    Here is why:
    Databases are great at reading data. They are storage engines. However, your business logic lives in your application. If you make a rule that every API call results in exactly one database call, then your business logic may end up in the database. Maybe that
    is ok. A lot of systems do that. But some don't. It's about flexibility.
    Sometimes to achieve good decoupling, you want to have 2 database calls separated. For example, perhaps every HTTP request is routed through a generic security filter which validates from the DB that the user has the right access rights. If they do, proceed
    to execute the appropriate function for that URL. That function may interact with the database.
    Calling the database in a loop. This is why I asked how many is multiple. In the example above, you would have 2 database calls. 2 is fine. 3 may be fine. N is not fine. If you call the database in a loop, you have now made performance linear, which means it
    will take longer the more that is in the loop's input. So categorically saying that the API network time is the slowest completely overlooks anomalies like 1% of your traffic taking a long time due to a not-yet-discovered loop that calls the database 10,000
    times.
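As an illustration of collapsing such a loop: the N per-row queries can usually be replaced by one query with an IN list. A sketch of building the placeholder string for a prepared statement (the table and column names are invented for the example):

```java
import java.util.List;

public class InClauseBuilder {

    // Builds "?, ?, ?" for n bind variables.
    public static String placeholders(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            if (i > 0) {
                sb.append(", ");
            }
            sb.append('?');
        }
        return sb.toString();
    }

    // Instead of one SELECT per id inside a loop, issue a single query
    // and bind each id to its placeholder.
    public static String userQuery(List<Long> ids) {
        return "SELECT id, name FROM users WHERE id IN ("
                + placeholders(ids.size()) + ")";
    }
}
```

This turns O(N) round trips into one, which is exactly the anomaly the loop example above would otherwise hide until production traffic finds it.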
    Sometimes there are things your app is better at, like some complex calculations. You may need to read some data from the database, do some calculations, then based on the results, pass a parameter to a second database call (maybe to write some results). If
    you combine those into a single call (like a stored procedure) just for the sake of only calling the database once, you have forced yourself to use the database for something which the app server might be better at.
    Load balancing: You have 1 database (presumably) and multiple load balanced application servers. Therefore, the more work the app does and the less the database does, the easier it is to scale because it's generally easier to add an app server than setup database
    replication. Based on the previous bullet point, it may make sense to run a SQL query, then do all the calculations in the application, which is distributed across multiple servers, and then write the results when finished. This could give better throughput
    (even if the overall transaction time is the same).
    TL;DR
    TLDR: Is it really significant to worry about multiple database calls when we are already making a network call over the LAN? If so, why?
    Yes, but only to a certain extent. You should try to minimize the number of database calls when practical, but don't combine calls which have nothing to do with each other just for the sake of combining them. Also, avoid calling the database in a loop at all
    costs.

  • Move pictures in Jpg from a server to SAP (Oracle database) in BLOB

    Hi SAP Friends,
    We would like to know if this is possible.
We have pics in jpg/jpeg format on one server. We want to move these pics automatically once every hour into the SAP Oracle database in BLOB format. The server has the capability to push these jpg files into the BLOB format of an Oracle database. We need to know if it is possible to send them to SAP's Oracle database and store them in BLOB format. If so, please let us know how.
    Niranjan

    You are facing two things here:
    a) Licensing issue
    Check Note 581312 - Oracle database: licensing restrictions:
    As of point 3, it follows that direct access to the Oracle database is only allowed for tools from the areas of system administration and monitoring. If other software is used, the following actions, among other things, are therefore forbidden at database level:
    Creating database users
    Create database objects
    Querying/changing/creating data in the database
    Using ODBC or other SAP external access methods
    This depends on the contract and where you bought which licenses for Oracle.
    b) Technical issues
It's not a good idea to insert data into an SAP database without using SAP tools, even if it's a separate table(space) or "isolated" in the SAP sense. You never know how upgrades will behave with that table, and you may see the table flagged in SAP DDIC consistency checks (table without DDIC reference), etc.
If you want to insert JPEGs easily, you can use transaction CSADMIN, create a repository, and write a small program that uploads the data to the database using SAP standard interfaces. This will ensure data integrity and make sure the data is accessible even after database/SAP system upgrades.
    Markus

  • What's database call and user call?

There is no statistic named "database call" in v$statname, but there is one named "user call". Are they the same? And what is the meaning of "database call"? In a 10046 raw trace file there are entries like parse, fetch, exec, and wait; are all of them "database calls"? Someone said a "database call" is some OCI function. Can anyone provide more information?
    Thanks,
    Daniel

    You can read about database calls in this great book and one of the user call definitions is here

  • Database call in human task

    Hello,
Is it possible to make a database call to save some data from a human task into the database, and later take this task and submit it as done to the process?
What I want to achieve: when a user gets a task, he/she opens it, enters some information, and saves it into the database; later he/she returns, reviews the information entered, and submits it to the process (ends the task).
    Any suggestions appreciated.

Yes, it is possible to assign a group as participant of a human task, passing the group name as a parameter.
    I have tested just now.
    It works pretty well in SOA 11.1.1.4 (BPEL or BPM).
Make sure to add a data parameter in your human task definition and pass a valid group name to it.
On the Assignment tab, in the participants' list, add a group, data type by expression, and set the value to the right XPath expression for the corresponding parameter.
For example: /task:task/task:payload/task:group
If it is not working, look at the SOA log files; you'll probably find some information about the error there. Maybe there is some problem with your jazn.com configuration.
You can also test whether something is wrong with the group name by trying to transfer a task to the same group via the worklist.

  • Database calls, please suggest

    Hi,
I am working on a J2EE project and have a requirement like this:
I need to insert data into one master table and one detail table. There will be one header record and multiple detail records, and the inserts must succeed in both tables. If any insert fails, I should roll back all of the inserts. How can I achieve this? Should I use a user transaction, an entity bean, or a stored procedure? If it is a stored procedure, I can have only one database call and can manage rollback and commit in the procedure itself, but the drawback is that I cannot insert a variable number of detail records; I will have to use a limited number of detail records (i.e. 20 etc., as I have an Oracle 7.3.4 database).
Another option is a user transaction, but I will have to make one database call per insert, i.e. one for the header and one for each detail line. I really want a minimum number of database calls, as our application server is 700 miles away from the database server.
The third suggestion is an entity bean; I don't know much about it but am ready to explore.
Please suggest the best possible solution.
    Thanks
    Pramod

I don't know if this is the best possible, but it's the best I know:
    - use stateless session beans as far as possible (assuming high user load)
    - transactions are managed outside the ejb code and how you do it in the DD is container specific
    - in general, avoid entity beans unless you really need them, if you use them use CMP for your container
    - don't use BMP unless you are sure you know exactly what you are doing
    Not a bible just some ideas,
    /k1
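To make the header/detail rollback concrete, here is a plain-JDBC sketch of an atomic insert with batching to keep round trips down (the `orders`/`order_lines` table names are invented; only the `chunk` helper runs without a database):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class MasterDetailInsert {

    // Splits the detail rows into batches of at most batchSize rows.
    public static <T> List<List<T>> chunk(List<T> rows, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += batchSize) {
            batches.add(rows.subList(i, Math.min(i + batchSize, rows.size())));
        }
        return batches;
    }

    // Inserts one header and all its detail lines in a single transaction;
    // JDBC batching sends many detail rows per round trip.
    public static void insert(Connection conn, long headerId,
                              List<String> lines) throws SQLException {
        boolean oldAutoCommit = conn.getAutoCommit();
        conn.setAutoCommit(false);
        try (PreparedStatement header = conn.prepareStatement(
                     "INSERT INTO orders (id) VALUES (?)");
             PreparedStatement detail = conn.prepareStatement(
                     "INSERT INTO order_lines (order_id, text) VALUES (?, ?)")) {
            header.setLong(1, headerId);
            header.executeUpdate();
            for (List<String> batch : chunk(lines, 100)) {
                for (String line : batch) {
                    detail.setLong(1, headerId);
                    detail.setString(2, line);
                    detail.addBatch();
                }
                detail.executeBatch();
            }
            conn.commit();
        } catch (SQLException e) {
            conn.rollback();  // all-or-nothing: undo header and details
            throw e;
        } finally {
            conn.setAutoCommit(oldAutoCommit);
        }
    }
}
```

This handles a variable number of detail records (the stored-procedure limitation the original poster mentioned) while keeping the number of network round trips close to one per hundred rows.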

  • Capturing database calls in JDeveloper

    Hi,
We have an ADF Swing application and we want to capture the database calls for each event in our application. We are using Toad to monitor the database queries while running the application; however, Toad only shows the list of queries executed for a particular session, not which queries are executed for each action. Is there a way to track database calls per action through JDeveloper, so that it works like a profiler, showing the database queries executed as we perform various actions in our application?
    Thanks,
    Makrand

    Try the -Djbo.debugoutput=console option:
    http://www.oracle.com/technology/products/jdev/tips/muench/debugger/index.html

  • Direct Database calls from FLEX client

    Hi all,
is there any way to make direct database calls to a MySQL database (or any other database) from a Flex client, rather than going through a service? Simply put, I need to remove the middle tier.
    Thanks in advance.
    SajKK

AIR only supports SQLite for now; there may be more in the future.
http://www.adobe.com/devnet/air/ajax/quickstart/sync_simple_sql_database.html
http://www.insideria.com/2008/03/air-api-querying-a-local-datab.html
http://ntt.cc/2008/07/08/sqlite-example-for-adobe-air-working-with-local-sql-databases-with-source-code.html

  • Database call fired twice when using actionListener in dataTable

    Hi all,
    I have a question regarding the request bean lifecylce in the current use case (using Sun JSF 1.2)
    I have a managed bean in request scope that contains an ArrayList which is used as the data provider in a dataTable on a faces page.
    The bean contains an init() method to populate the ArrayList using a database call.
    The dataTable also contains a column with a commandLink that calls a method via actionListener inside the managed bean to delete the current row.
    When I click the link the action gets called and deletes the row from the database. I also reload the data from the database and assign it to my ArrayList.
    However, the init Method is also called before the action is executed. So the database call is fired twice when hitting the link:
    - First time in the init() method of the bean
    - Second time in the actionListener method when reloading the data
    I can not remove the call from the actionListener, because the data has not deleted yet.
    Question:*
    How can I make sure the database call is fired once only? (and also making sure the ArrayList is populated appropriate)
    Maybe I am doing something wrong here? Thanks in advance for any help.
    Maik
    This is the request scope bean:
    public class UserBean implements Serializable {

        private List all;
        private Long userId = null;

        @PostConstruct
        public void init() {
            if (all == null) {
                all = new ArrayList();
                loadUserList();
            }
        }

        /** Constructor */
        public UserBean() {
            super();
        }

        /** @return the userId */
        public Long getUserId() {
            return userId;
        }

        /** @param userId the userId to set */
        public void setUserId(Long userId) {
            this.userId = userId;
        }

        /** @param all the all to set */
        public void setAll(List all) {
            this.all = all;
        }

        public List getAll() throws GeneralModelException {
            return all;
        }

        public void loadUserList() {
            EntityManager em = Contexts.getEntityManager();
            Query q = em.createNamedQuery("user.findAll");
            all = q.getResultList();
        }

        public void deleteAction(ActionEvent ae) {
            EntityManager em = Contexts.getEntityManager();
            Query q = em.createNamedQuery("user.byId");
            q.setParameter("id", userId);
            try {
                User user = (User) q.getSingleResult();
                if (user != null) {
                    em.remove(user);
                    loadUserList();
                }
            } catch (NoResultException e) {
                // TODO
            }
        }
    }

    No, I do not call the init() method.
    Basically the init() is called before the deleteAction() so the ArrayList still contains the old value, unless a second database call is triggered after the entity has been deleted.
    Maybe I am missing something here...
    See also here (JSF 1.2 RI - Bean Instantiation and Annotation)
    [http://weblogs.java.net/blog/jhook/archive/2007/05/jsf_12_ri_backi.html]
    Here is the init() call stack trace
    Daemon Thread [http-8080-2] (Suspended (breakpoint at line 32 in UserBean))     
         UserBean.init() line: 32     
         NativeMethodAccessorImpl.invoke0(Method, Object, Object[]) line: not available [native method]     
         NativeMethodAccessorImpl.invoke(Object, Object[]) line: not available     
         DelegatingMethodAccessorImpl.invoke(Object, Object[]) line: not available     
         Method.invoke(Object, Object...) line: not available     
         DefaultAnnotationProcessor.postConstruct(Object) line: 79     
         Tomcat6InjectionProvider.invokePostConstruct(Object) line: 118     
         ManagedBeanBuilder(BeanBuilder).invokePostConstruct(Object, InjectionProvider) line: 223     
         ManagedBeanBuilder(BeanBuilder).build(InjectionProvider, FacesContext) line: 108     
         BeanManager.createAndPush(String, BeanBuilder, ELUtils$Scope, FacesContext) line: 368     
         BeanManager.create(String, FacesContext) line: 230     
         ManagedBeanELResolver.getValue(ELContext, Object, Object) line: 88     
         FacesCompositeELResolver(CompositeELResolver).getValue(ELContext, Object, Object) line: 53     
         FacesCompositeELResolver.getValue(ELContext, Object, Object) line: 72     
         AstIdentifier.getValue(EvaluationContext) line: 61     
         AstValue.getTarget(EvaluationContext) line: 59     
         AstValue.setValue(EvaluationContext, Object) line: 129     
         ValueExpressionImpl.setValue(ELContext, Object) line: 249     
         JspValueExpression.setValue(ELContext, Object) line: 85     
         RestoreViewPhase.doPerComponentActions(FacesContext, UIComponent) line: 240     
         RestoreViewPhase.doPerComponentActions(FacesContext, UIComponent) line: 245     
         RestoreViewPhase.doPerComponentActions(FacesContext, UIComponent) line: 245     
         RestoreViewPhase.execute(FacesContext) line: 195     
         RestoreViewPhase(Phase).doPhase(FacesContext, Lifecycle, ListIterator<PhaseListener>) line: 100     
         RestoreViewPhase.doPhase(FacesContext, Lifecycle, ListIterator<PhaseListener>) line: 104     
         LifecycleImpl.execute(FacesContext) line: 118     
         FacesServlet.service(ServletRequest, ServletResponse) line: 265     
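A common workaround for this double-load pattern (an idiom sketch, not an answer given in this thread) is to drop the eager @PostConstruct load and populate the list lazily in the getter, letting action methods invalidate the cache instead of reloading immediately. Stripped of JSF and JPA for illustration (the `Supplier` stands in for the named query; `loadCount` exists only to demonstrate the behaviour):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

public class LazyUserList {

    private final Supplier<List<String>> loader; // stands in for the JPA query
    private List<String> all;
    private int loadCount = 0;                   // demonstration only

    public LazyUserList(Supplier<List<String>> loader) {
        this.loader = loader;
    }

    // Loads on first access only; repeated getter calls reuse the cache.
    public List<String> getAll() {
        if (all == null) {
            all = new ArrayList<>(loader.get());
            loadCount++;
        }
        return all;
    }

    // Action methods invalidate the cache instead of reloading eagerly,
    // so the next getAll() in the render phase fetches fresh data once.
    public void deleteAction(String user) {
        // ... em.remove(user) would go here ...
        all = null;
    }

    public int getLoadCount() {
        return loadCount;
    }
}
```

With this shape, the restore-view phase no longer forces a query before the action runs: one request produces at most one load, after the delete has happened.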

  • Network or database calls are made when joining more than one table

    Hi Friends,
could anybody please let me know how many network or database calls are made when joining more than one table?
    Thanks
    Rinky

    Hi Rinky,
  Normally, when a JOIN between two database tables is made, the following steps occur:
1) Control goes to the database. Based on the JOIN and WHERE conditions, an internal table is created and filled in the database itself, so the computation is done at the database level.
2) Once the internal table is filled at the database level, it is sent back to the application level.
A JOIN operation normally minimizes round trips to the database, as most of the computation is done at the database level and only the results are sent back to the application layer.
<b>Thus a simple JOIN operation makes a single database call.</b>
    NOTE: If you are satisfied with the explanation, then please reward points
               accordingly :).
    Thanks and regards,
    Ravi .

  • Help:- Store Word file in database using blob

Hi all,
I want to store a Word document in a database and retrieve it from the database, but I don't have any idea how. I heard about BLOBs but don't know how to use them. If anyone has source code or an example, please send it to me.
    email: [email protected]

    Hello,
    First of all, you have to be granted the privilege to read and write to the file system; this is a must for BFILE read access.
    The global function bfilename, which returns a BFILE given a directory alias and a file name, is needed to load a BLOB through dbms_lob.loadfromfile.
    A very helpful example I found on orafaq follows. Create a directory alias pointing to where the file to load into the BLOB is
    (connect as system, then grant the read privilege to your user):
    create or replace directory blob_dir as 'C:\Documents';
    Then customize the following procedure, which I copied from a thread (a table blob_test with columns id and blob_file is assumed):
    create or replace procedure blob_ins(p_id in number, p_filename in varchar2) as
      l_bfile bfile;
      l_blob  blob;
    begin
      insert into blob_test (id, blob_file)
      values (p_id, empty_blob())
      returning blob_file into l_blob;
      l_bfile := bfilename('BLOB_DIR', p_filename);
      dbms_lob.fileopen(l_bfile);
      dbms_lob.loadfromfile(l_blob, l_bfile, dbms_lob.getlength(l_bfile));
      dbms_lob.fileclose(l_bfile);
      commit;
    end blob_ins;
    You can use the UTL_FILE package another way, but this approach is much safer.
    Have Fun
    Hossam Al Din
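The same load can also be done from the Java side with plain JDBC instead of a server-side BFILE. A minimal sketch, assuming a hypothetical table docs(id number, content blob) - adjust names to your schema. Only the file-reading helper runs without a database; the insert shows the standard PreparedStatement.setBinaryStream pattern.

```java
import java.io.*;
import java.nio.file.*;
import java.sql.*;

public class DocBlobDao {
    // Read the whole document into memory. For very large files you would
    // stream straight from the InputStream instead of buffering it all.
    static byte[] readDoc(Path path) throws IOException {
        return Files.readAllBytes(path);
    }

    // Insert the file content into a BLOB column of the hypothetical
    // docs(id, content) table via a binary stream.
    static void storeDoc(Connection con, long id, Path path) throws Exception {
        try (PreparedStatement ps =
                 con.prepareStatement("insert into docs (id, content) values (?, ?)")) {
            ps.setLong(1, id);
            try (InputStream in = Files.newInputStream(path)) {
                ps.setBinaryStream(2, in, (int) Files.size(path));
            }
            ps.executeUpdate();
        }
    }
}
```

Retrieval is the mirror image: getBinaryStream on the BLOB column, copied out to a file.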

  • ADF BC and direct database call from ActionListener Method

    Hi
    I have an ADF BC application which generates a simple form. In the submit button I am calling a Java method which makes a direct database connection. The application hangs when the method reaches executeUpdate():
    public void onSubmit(ActionEvent actionEvent) {
        DBSequence prdSequence = (DBSequence) resolveExpression("#{bindings.ProductId.inputValue}");
        int productId = Integer.parseInt(prdSequence.toString());
        String productName = (String) resolveExpression("#{bindings.ProductName.inputValue}");
        update(productName, productId);
    }
    public void update(String productName, int productId) {
        String str = "update products set product_name = ? where product_id = ?";
        Connection con = null;
        PreparedStatement pstmt = null;
        try {
            con = new DatabaseConnection().getConnection();
            pstmt = con.prepareStatement(str);
            pstmt.setString(1, productName);
            pstmt.setInt(2, productId);
            pstmt.executeUpdate();
        } catch (SQLException e) {
            e.printStackTrace();
        } finally {
            try {
                if (pstmt != null) pstmt.close();
                if (con != null) con.close();
            } catch (SQLException sqle) {
                sqle.printStackTrace();
            }
        }
    }
    When I submit the form, the update() method hangs at executeUpdate().
    If I run the update() method from a standalone Java program it works fine. Can anybody tell me what the issue could be?
    Thanks
    Suneesh
    Edited by: Suneesh Raman on Aug 18, 2010 10:14 AM

    I am using JDeveloper Studio Edition Version 11.1.1.2.0. In fact I need to call a PL/SQL API from the onSubmit method, which I set as the action listener of the form's submit button.
    My getConnection method is:
    public Connection getConnection() throws SQLException {
        OracleDataSource ods = new OracleDataSource();
        ods.setUser("rme_pbkr");
        ods.setPassword("rme_pbkr");
        ods.setURL("jdbc:oracle:thin:@localhost:1521:XE");
        return ods.getConnection();
    }
    Does that mean we cannot make a db connection from the action listener implementation method?
    How do I put the method in the AM implementation class? Just write a public void method there and call it from the onSubmit action listener through some binding?
    Thanks
    Suneesh

  • Inconsistent results from database calls

    Hi all. I was wondering if anyone could come up with some possible ideas as to why my application is inconsistently retrieving data from a database. I have an application that is pulling information from a database, and these calls can be made repeatedly, pulling the same information over and over again. The quandary I am faced with is why the application pulls the right data most of the time, but occasionally does not pull the correct values.
    I have considered the following:
    Blocking - all of the calls are read-only and scroll-insensitive, so I doubt this is the case.
    Limited number of connections - seems like a possibility, but then I would expect an error to be thrown.
    Actions happening too fast - I was thinking perhaps the cycle is completing before everything has had a chance to finish the last set; is that a possibility?
    I can't post the code, so I am just looking for any general thoughts or ideas on the topic. If nothing, thanks anyway.

    Thank you both for your thoughts. A transaction problem is definitely not a possibility (at least as I understand it), as this method only reads information; there are no commits or updates being performed by it.
    I will continue to hunt down this little bugger. The biggest issue is that it is one behemoth of a method that should not exist, but unfortunately does.
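For what it's worth, one classic cause of this symptom (right answer on most runs, wrong answer occasionally, no error thrown) is shared mutable state - for example, a Connection, Statement, or instance field reused across concurrent requests - rather than the database itself. This is an assumption about the unposted code, not a diagnosis. A minimal sketch of the thread-safe shape, where the atomic counter stands in for whatever state the requests share:

```java
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

public class SharedStateDemo {
    // Thread-safe shared state gives the same answer on every run, just as
    // each request should use its own Connection, Statement and ResultSet
    // rather than a shared instance field.
    static int countWithAtomic(int threads, int perThread) throws Exception {
        AtomicInteger counter = new AtomicInteger();
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) {
            pool.submit(() -> {
                for (int i = 0; i < perThread; i++) counter.incrementAndGet();
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS); // wait for all workers
        return counter.get();
    }
}
```

With a plain int field in place of the AtomicInteger, the same code would return the correct total most of the time and a smaller total occasionally - exactly the "usually right, sometimes wrong" pattern described above.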
