Error removing object from cache with write behind

We have a cache with a DB for a backing store. The cache has a write-behind delay of about 10 seconds.
We see an error when we:
- Write a new object to the cache
- Remove the object from the cache before it gets written to the cache store (because we're still within the 10-second delay and the object has not made it to the DB yet).
At first I was thinking "Coherence should know whether the object is in the DB or not, and do the right thing", but I guess that's not the case?

Similar Messages

  • Local Cache with write-behind backing map

    Hi there,
    I am a Coherence newb, so I hope my question isn't too naive. I have been experimenting with Coherence using a write-behind JPA backing map, but I have only been able to make it work with a distributed cache. Because of my specific database RAC architecture, I need to ensure that entries written to the database from a given WLS node are restricted to a specific RAC node. In my view, using a local cache rather than a distributed cache should solve my problem, although if there is a way of configuring my cache so this works I'd appreciate the info.
    So, the short form of the question: can I back a local cache with a write-behind JPA map?
    Cheers,
    Ron

    Hi Ron,
    The configuration for <local-scheme> allows you to add a cache store but you cannot use write-behind, only write-through.
    Presumably you do not want the data to be shared by the different WLS nodes, i.e. if one node puts data in the cache and that is eventually written to a RAC node, that data cannot be seen in the cache by other WLS nodes. Or do you want all the WLS nodes to share the data but just write it to different RAC nodes?
    If you use a local-scheme then the data will only be local to that WLS node and not shared.
    I can think of a possible way to do what you want but it depends on the answer to the above question.
    JK

  • Can a db slowdown with write-behind cause a slowdown in cache operations?

    If we have a Coherence cluster, and one cache configured with write-behind is having trouble writing to the DB (i.e., it's slow), and we keep adding objects to the cache that exceed the ability of the DB to consume them, will flow control kick in and cause the writes to the cache to block or slow down? I.e., the classic producer-consumer problem, where we are adding objects to the cache faster than the cache store can consume them.
    What happens in this case? Will flow control kick in and block writes to the cache? Will an internal buffer just keep growing? Are there any knobs to tweak this behavior (e.g., in the case of spikes, where temporarily the producer is producing faster than the consumer can consume for a brief period of time, but then things go back to normal)?

    user9222505 wrote:
    I believe we discovered that the same thread pool is used for all requests to the cache, including gets, puts and calls into the cachestore. So if the writes are slow within the cachestore, then it uses up all of the threads and slows everything down.
    Hi,
    This is not really correct.
    If a cache in a service is configured to use write-behind then a separate thread for that service is started, which deals with write-behind store and storeAll operations.
    The remove operations need to be handled synchronously to avoid corruption of the data-set in the scenario of reading an entry from the cache immediately after removing it (if it were not synchronously deleted from the backing storage, then reading it back could give an incorrect non-null value). Therefore remove operations are handled synchronously on the service / worker thread, and not delayed on the write-behind thread.
    Gets are also synchronously handled, so they again are served on the service / worker thread.
    So if the puts are slow and wait too long, that may delay other queued puts but should not contend with other threads. If the puts are computation-intensive, then obviously they hinder other threads by consuming the same CPU resource, and not simply because they execute.
    Best regards,
    Robert
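    To make the division of work concrete, here is a minimal CacheStore sketch in Java (the class and the in-memory database stand-in are hypothetical; only the com.tangosol.net.cache.CacheStore interface and its method signatures are taken from Coherence). Under write-behind, only store/storeAll are deferred to the dedicated write-behind thread; load and erase run synchronously on the service / worker thread, as described above.

    import com.tangosol.net.cache.CacheStore;
    import java.util.Collection;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class OrderCacheStore implements CacheStore {

        // Stand-in for the real database, so the sketch is self-contained.
        private final Map fakeDatabase = new ConcurrentHashMap();

        // Called synchronously on a cache miss (read-through).
        public Object load(Object key) {
            return fakeDatabase.get(key);
        }

        public Map loadAll(Collection keys) {
            Map result = new HashMap();
            for (Object key : keys) {
                Object value = load(key);
                if (value != null) {
                    result.put(key, value);
                }
            }
            return result;
        }

        // With <write-delay> configured, store/storeAll are invoked later,
        // on the write-behind thread, and may be batched.
        public void store(Object key, Object value) {
            fakeDatabase.put(key, value);
        }

        public void storeAll(Map entries) {
            fakeDatabase.putAll(entries);
        }

        // erase/eraseAll are NOT deferred: removes are written through
        // synchronously on the service / worker thread.
        public void erase(Object key) {
            fakeDatabase.remove(key);
        }

        public void eraseAll(Collection keys) {
            for (Object key : keys) {
                erase(key);
            }
        }
    }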

  • Net.sf.jasperreports.engine.JRException: Error loading object from file : C

    I started using JasperReports for my web application report generation. I'm using JSPs for web development.
    I created a .jrxml file using iReport and used the following code to generate the report.
    try {
        JasperDesign jasperDesign = JRXmlLoader.load("C:\\tomcat\\webapps\\web\\JSP\\reports\\samples\\pmm-final.jrxml");
        JasperReport jasperReport = JasperCompileManager.compileReport(jasperDesign);
        // Second, create a map of parameters to pass to the report.
        Map parameters = new HashMap();
        parameters.put("Title", "JasperReport");
        // Third, get a database connection
        Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
        Connection conn = DriverManager.getConnection("jdbc:odbc:driver={Microsoft Access Driver (*.mdb)};DBQ=C:/tomcat/webapps/db1/db1.mdb");
        // Fourth, create JasperPrint using the fillReport() method
        JasperPrint jasperPrint = JasperFillManager.fillReport(jasperReport, parameters, conn);
        // You can use JasperPrint to create a PDF
        //JasperExportManager.exportReportToPdfFile(jasperPrint, "C:\\tomcat\\webapps\\web\\JSP\\reports\\TestReport.pdf");
        JasperExportManager.exportReportToHtmlFile(jasperPrint, "C:\\tomcat\\webapps\\web\\JSP\\reports\\TestPMM.html");
        JRXlsExporter exporter = new JRXlsExporter();
        exporter.setParameter(JRExporterParameter.JASPER_PRINT, jasperPrint);
        exporter.setParameter(JRExporterParameter.OUTPUT_FILE_NAME, "C:\\tomcat\\webapps\\web\\JSP\\reports\\TestPMM.xls");
        exporter.exportReport();
        // Or view the report in the JasperViewer
        //JasperViewer.viewReport(jasperPrint);
    } catch (JRException e) {
        e.printStackTrace();
    } catch (SQLException e) {
        e.printStackTrace();
    } catch (ClassNotFoundException e) {
        e.printStackTrace();
    }
    The above pmm-final.jrxml uses a subreport 'top.jasper'. The error is thrown while loading the top.jasper file. The error is as follows.
    java.io.InvalidClassException: net.sf.jasperreports.engine.base.JRBaseReport; local class incompatible: stream classdesc serialVersionUID = 604, local class serialVersionUID = 606
    at java.io.ObjectStreamClass.initNonProxy(Unknown Source)
    at java.io.ObjectInputStream.readNonProxyDesc(Unknown Source)
    at java.io.ObjectInputStream.readClassDesc(Unknown Source)
    at java.io.ObjectInputStream.readNonProxyDesc(Unknown Source)
    at java.io.ObjectInputStream.readClassDesc(Unknown Source)
    at java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
    at java.io.ObjectInputStream.readObject0(Unknown Source)
    at java.io.ObjectInputStream.readObject(Unknown Source)
    at net.sf.jasperreports.engine.util.JRLoader.loadObject(JRLoader.java:86)
    at net.sf.jasperreports.engine.util.JRLoader.loadObjectFromLocation(JRLoader.java:236)
    at net.sf.jasperreports.engine.fill.JRFillSubreport.evaluate(JRFillSubreport.java:295)
    at net.sf.jasperreports.engine.fill.JRFillBand.evaluate(JRFillBand.java:340)
    at net.sf.jasperreports.engine.fill.JRVerticalFiller.fillPageBand(JRVerticalFiller.java:1224)
    at net.sf.jasperreports.engine.fill.JRVerticalFiller.fillPageHeader(JRVerticalFiller.java:353)
    at net.sf.jasperreports.engine.fill.JRVerticalFiller.fillReportStart(JRVerticalFiller.java:205)
    at net.sf.jasperreports.engine.fill.JRVerticalFiller.fillReport(JRVerticalFiller.java:119)
    at net.sf.jasperreports.engine.fill.JRBaseFiller.fill(JRBaseFiller.java:613)
    at net.sf.jasperreports.engine.fill.JRBaseFiller.fill(JRBaseFiller.java:483)
    at net.sf.jasperreports.engine.fill.JRFiller.fillReport(JRFiller.java:77)
    at net.sf.jasperreports.engine.JasperFillManager.fillReport(JasperFillManager.java:248)
    at org.apache.jsp.JSP.UserGuide_jsp._jspService(org.apache.jsp.JSP.UserGuide_jsp:74)
    at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:99)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
    at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:325)
    at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:295)
    at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:245)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:252)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:214)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:178)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:126)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:105)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:107)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:148)
    at org.apache.jk.server.JkCoyoteHandler.invoke(JkCoyoteHandler.java:306)
    at org.apache.jk.common.HandlerRequest.invoke(HandlerRequest.java:385)
    at org.apache.jk.common.ChannelSocket.invoke(ChannelSocket.java:745)
    at org.apache.jk.common.ChannelSocket.processConnection(ChannelSocket.java:675)
    at org.apache.jk.common.SocketConnection.runIt(ChannelSocket.java:868)
    at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:684)
    at java.lang.Thread.run(Unknown Source)
    NESTED BY :
    net.sf.jasperreports.engine.JRException: Error loading object from file : C:\tomcat\webapps\web\JSP\reports\samples\top.jasper
    at net.sf.jasperreports.engine.util.JRLoader.loadObject(JRLoader.java:90)
    at net.sf.jasperreports.engine.util.JRLoader.loadObjectFromLocation(JRLoader.java:236)
    at net.sf.jasperreports.engine.fill.JRFillSubreport.evaluate(JRFillSubreport.java:295)
    at net.sf.jasperreports.engine.fill.JRFillBand.evaluate(JRFillBand.java:340)
    at net.sf.jasperreports.engine.fill.JRVerticalFiller.fillPageBand(JRVerticalFiller.java:1224)
    at net.sf.jasperreports.engine.fill.JRVerticalFiller.fillPageHeader(JRVerticalFiller.java:353)
    at net.sf.jasperreports.engine.fill.JRVerticalFiller.fillReportStart(JRVerticalFiller.java:205)
    at net.sf.jasperreports.engine.fill.JRVerticalFiller.fillReport(JRVerticalFiller.java:119)
    at net.sf.jasperreports.engine.fill.JRBaseFiller.fill(JRBaseFiller.java:613)
    at net.sf.jasperreports.engine.fill.JRBaseFiller.fill(JRBaseFiller.java:483)
    at net.sf.jasperreports.engine.fill.JRFiller.fillReport(JRFiller.java:77)
    at net.sf.jasperreports.engine.JasperFillManager.fillReport(JasperFillManager.java:248)
    at org.apache.jsp.JSP.UserGuide_jsp._jspService(org.apache.jsp.JSP.UserGuide_jsp:74)
    at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:99)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
    at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:325)
    at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:295)
    at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:245)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:252)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:214)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:178)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:126)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:105)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:107)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:148)
    at org.apache.jk.server.JkCoyoteHandler.invoke(JkCoyoteHandler.java:306)
    at org.apache.jk.common.HandlerRequest.invoke(HandlerRequest.java:385)
    at org.apache.jk.common.ChannelSocket.invoke(ChannelSocket.java:745)
    at org.apache.jk.common.ChannelSocket.processConnection(ChannelSocket.java:675)
    at org.apache.jk.common.SocketConnection.runIt(ChannelSocket.java:868)
    at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:684)
    at java.lang.Thread.run(Unknown Source)
    Caused by: java.io.InvalidClassException: net.sf.jasperreports.engine.base.JRBaseReport; local class incompatible: stream classdesc serialVersionUID = 604, local class serialVersionUID = 606
    at java.io.ObjectStreamClass.initNonProxy(Unknown Source)
    at java.io.ObjectInputStream.readNonProxyDesc(Unknown Source)
    at java.io.ObjectInputStream.readClassDesc(Unknown Source)
    at java.io.ObjectInputStream.readNonProxyDesc(Unknown Source)
    at java.io.ObjectInputStream.readClassDesc(Unknown Source)
    at java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
    at java.io.ObjectInputStream.readObject0(Unknown Source)
    at java.io.ObjectInputStream.readObject(Unknown Source)
    at net.sf.jasperreports.engine.util.JRLoader.loadObject(JRLoader.java:86)
    ... 33 more
    Can anyone please help? It's a bit urgent. Thanks.

    I was using iReport v0.5.0, which uses jasperreports-0.6.7.jar (v0.6.7) of JasperReports. I compiled and deployed my application on BEA WebLogic server and got the error listed below. Only after I saw your response explaining that iReport was the issue did I check the iReport lib directory and find this version of the jasperreports jar.
    iReport creates a Java source file which is compiled into a .jasper file.
    iReport will link in the v0.6.7 version of JasperReports. When you deploy your web application it will recognize this version through the compiled .jasper file and give you the InvalidClassException, even though you only have one jasperreports jar deployed with your war file.
    The way I fixed this problem was to build my web application with the jasperreports jar that comes with iReport.
    Thanks for mentioning iReport.
    Christopher
    Error:
    Caused by: java.io.InvalidClassException: net.sf.jasperreports.engine.base.JRBaseReport; local class incompatible: stream classdesc serialVersionUID = 607, local class serialVersionUID = 10002
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:463)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1521)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1435)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1521)
    ...
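    In other words, the .jasper file was serialized with a different version of JasperReports than the one loading it. A minimal sketch of the usual remedy is to recompile the subreport source against the same jasperreports jar that the web application deploys, so the serialized JRBaseReport has a matching serialVersionUID (the top.jrxml source file name is an assumption here; the post only mentions the compiled top.jasper):

    import net.sf.jasperreports.engine.JRException;
    import net.sf.jasperreports.engine.JasperCompileManager;

    public class RecompileSubreport {
        public static void main(String[] args) throws JRException {
            // Run this with the SAME jasperreports jar that the web app uses at
            // runtime, overwriting the stale .jasper produced by iReport.
            JasperCompileManager.compileReportToFile(
                    "C:\\tomcat\\webapps\\web\\JSP\\reports\\samples\\top.jrxml",
                    "C:\\tomcat\\webapps\\web\\JSP\\reports\\samples\\top.jasper");
        }
    }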

  • How to read appended objects from file with ObjectInputStream?

    Hi to everyone. I'm new to Java so my question may look really stupid to most of you, but I couldn't find a solution by myself... I wanted to make an application, something like an address book, that stores information about different people. So I decided to make a class that will hold the information for each person (for example: nickname, name, e-mail, web address and so on), and then save the information to a file using ObjectOutputStream. If I want to add a new record for a new person I'll simply append it to the already existing file. So far so good, but soon I discovered that I cannot read the appended objects using ObjectInputStream.
    What I mean is that if I create a new file and then in one session save several objects to it using ObjectOutputStream, they all will be read with no problem by ObjectInputStream. But if in a new session I append new objects, they won't be read. The ObjectInputStream will read the objects from the first session, after which an IOException is thrown and the reading stops just before the appended objects from the second session.
    The following is just a simple test; it's not actual code from the program I was talking about. Instead of objects containing different kinds of information I'm using only strings here. To try the program, use as console arguments "w" to create a new file, followed by the file name and the strings you want saved to the file (as objects). Example: "w TestFile.obj Thats Just A Test". Then to read it use "r" (for reading), followed by the file name. Example: "r TestFile.obj". As a result you'll see that all the strings that are saved in the file can be successfully read back. Then do the same again: "w TestFile.obj Thats Second Test" and read again with "r TestFile.obj". What will happen is that only the strings from the first session will be read and the ones from the second session will not.
    I am sorry for making this that long but I couldn't explain it more simply. If someone can give me a solution I'll be happy to hear it! ^.^ I'll also be glad if someone proposes a different approach to the problem! Here is the code:
    import java.io.*;
    class Fio {
         public static void main(String[] args) {
              try {
                   if (args[0].equals("w")) {
                        FileOutputStream fos = new FileOutputStream(args[1], true);
                        ObjectOutputStream oos = new ObjectOutputStream(fos);
                        for (int i = 2; i < args.length; i++) {
                             oos.writeObject(args[i]);
                        }
                        oos.close();
                   } else if (args[0].equals("r")) {
                        FileInputStream fis = new FileInputStream(args[1]);
                        ObjectInputStream ois = new ObjectInputStream(fis);
                        for (int i = 0; i < fis.available(); i++) {
                             System.out.println((String) ois.readObject());
                        }
                        ois.close();
                   } else {
                        System.out.println("Wrong args!");
                   }
              } catch (IndexOutOfBoundsException exc) {
                   System.out.println("You must use \"w\" or \"r\" followed by the file name as args!");
              } catch (IOException exc) {
                   System.out.println("I/O exception appeared!");
              } catch (ClassNotFoundException exc) {
                   System.out.println("Can not find the needed class");
              }
         }
    }

    How to read appended objects from file with ObjectInputStream? The short answer is you can't.
    The long answer is you can if you put some work into it. The general outline would be to create a file with a format that will allow the storage of multiple streams within it. If you use a RandomAccessFile, you can create a header containing the length. If you use streams, you'll have to use a block protocol. The reason for this is that I don't think ObjectInputStream is guaranteed to read the same number of bytes ObjectOutputStream writes to it (e.g., it could skip ending padding or such).
    Next, you'll need to create an object that can return more InputStream objects, one per stream written to the file.
    Not trivial, but that's how you'd do it.
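    For what it's worth, here is a rough, self-contained sketch of that "multiple streams in one file" idea (the class and method names are made up for illustration): each append session is serialized into its own byte block, written with a length prefix, and later read back through its own ObjectInputStream so every stream header is consumed correctly.

    import java.io.*;
    import java.util.ArrayList;
    import java.util.List;

    public class BlockObjectFile {

        // Append one session's objects to the file as a single length-prefixed block.
        public static void appendSession(File file, List<?> objects) throws IOException {
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            ObjectOutputStream oos = new ObjectOutputStream(buffer);
            for (Object o : objects) {
                oos.writeObject(o);
            }
            oos.close();
            byte[] block = buffer.toByteArray();

            DataOutputStream out = new DataOutputStream(new FileOutputStream(file, true));
            out.writeInt(block.length); // length prefix
            out.write(block);           // complete object stream, header included
            out.close();
        }

        // Read back every object from every appended session.
        public static List<Object> readAll(File file) throws IOException, ClassNotFoundException {
            List<Object> result = new ArrayList<Object>();
            DataInputStream in = new DataInputStream(new FileInputStream(file));
            try {
                while (true) {
                    int length;
                    try {
                        length = in.readInt(); // next block, or end of file
                    } catch (EOFException endOfFile) {
                        break;
                    }
                    byte[] block = new byte[length];
                    in.readFully(block);
                    // Each block gets its own ObjectInputStream, so each stream
                    // header is read exactly once, at the start of its block.
                    ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(block));
                    try {
                        while (true) {
                            result.add(ois.readObject());
                        }
                    } catch (EOFException endOfBlock) {
                        // no more objects in this session's block
                    }
                }
            } finally {
                in.close();
            }
            return result;
        }
    }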

  • How can I fill a table of objects from cursor with select * bulk collect???

    Hi All, I have a TYPE as OBJECT
    create or replace type dept2_o as object (
    deptno NUMBER(2),
    dname VARCHAR2(14),
    loc VARCHAR2(13));
    I can fill a table of objects from a cursor without select * bulk collect, row by row:
    declare
    TYPE dept2_t IS TABLE of dept2_o;
    dept_o_tab dept2_t:=dept2_t();
    i integer;
    begin
    i:=0;
    dept_o_tab.extend(20);
    for rec in (select * from dept) loop
    i:=i+1;
    dept_o_tab(i):=dept2_o(
    deptno => rec.deptno,
    dname => rec.dname,
    loc => rec.loc);
    end loop;
    for k IN 1..i loop
    dbms_output.put_line(dept_o_tab(k).deptno||' '||dept_o_tab(k).dname||' '||dept_o_tab(k).loc);
    end loop;
    end;
    RESULT
    10 ACCOUNTING NEW YORK
    20 RESEARCH DALLAS
    30 SALES CHICAGO
    40 OPERATIONS BOSTON
    But I can't fill a table of objects from a cursor with the select * bulk collect construction...
    declare
    TYPE dept2_t IS TABLE of dept2_o;
    dept_o_tab dept2_t:=dept2_t();
    begin
    dept_o_tab.extend(20);
    select * bulk collect into dept_o_tab from dept;
    end;
    RESULT
    ORA-06550: line 6, column 39;
    PL/SQL: ORA-00947: not enough values ....
    How can I fill a table of objects from cursor with select * bulk collect???

    create or replace type dept_ot as object (
    deptno NUMBER(2),
    dname VARCHAR2(14),
    loc VARCHAR2(13));
    create table dept
    (deptno number
    ,dname varchar2(14)
    ,loc varchar2(13)
    );
    insert into dept values (10, 'x', 'xx');
    insert into dept values (20, 'y', 'yy');
    insert into dept values (30, 'z', 'zz');
    select dept_ot (deptno, dname, loc)
      from dept;
    create type dept_nt is table of dept_ot;
    declare
       l_depts dept_nt;
    begin
       select dept_ot (deptno, dname, loc)
         bulk collect
         into l_depts
         from dept;
       for i in l_depts.first .. l_depts.last
       loop
          dbms_output.put_line (l_depts(i).deptno);
          dbms_output.put_line (l_depts(i).dname);
          dbms_output.put_line (l_depts(i).loc);    
       end loop;
    end;
    /

  • Unable to prepare project for publishing the project could not be prepared for publishing because an error occurred. (file already open with write permission)

    Get this message when I try to publish my iMovie project.  "unable to prepare project for publishing the project could not be prepared for publishing because an error occurred. (file already open with write permission)"

    See the discussion here:
    https://discussions.apple.com/message/16784714#16784714

  • Modify Application: Error Message:: Object Variable or With Block variable

    Hi Experts,
    We just installed 5.1 on our server and the application is installed.
    When I try to modify the application I get this error at the end.
    These are the steps the modify app performs:
    Check environment information
    Drop Fact table index.
    Modify fact table.
    Update application information.
    Set Fact table index.
    Create stored procedures and comment table.
    Increase application version and make legal consolidation table.
    Make OLAP database and Journal/Audit reports. / Validate dimension formulas.
    Error Message:: Object Variable or With Block variable
    Does anyone have an answer? I will appreciate your help on this.
    Thanks

    Please check if Reporting Services is working properly.
    Just open IE and type http://nameofserver/reports
    If you receive any error then you have to fix it by verifying the configuration of RS.
    If RS is working, make sure you add the right information into Server Manager's Server Options for RS and the Report Server virtual directory.
    Also please check that tblappsetinfo has all the fields completed (regarding servers).
    Regards
    Sorin Radulescu

  • Run time error '9' object variable or with block variable not set in WAD

    I was able to open the Web Application Designer and see some of the web templates I had created, but now when I try to open any of the web templates it pops up the message "Internal Error: No template standard property. Run time error '9': object variable or with block variable not set".
    What setting needs to be changed?
    Priya

    It got solved.
    Wait until all the web items get loaded, then open your web template.

  • UnsupportedOperationException Error while removing object from List

    Hi,
    I need to remove a set of objects from a List if it contains the same objects as another collection.
    Following is a code snippet of the above scenario...
    List selectedPaxList = new ArrayList();
    TreeSet seatAssignedPaxlist= new TreeSet();
    selectedPaxList.add("1");
    selectedPaxList.add("2");
    selectedPaxList.add("3");
    seatAssignedPaxlist.add("1");
    seatAssignedPaxlist.add("2");
    if(selectedPaxList.containsAll(seatAssignedPaxlist))
    selectedPaxList.removeAll(seatAssignedPaxlist);
    If I do this in a Java program it executes fine without any error. But if I do the same in a JSP I get the following error:
    java.lang.UnsupportedOperationException
    at java.util.AbstractList.remove(AbstractList.java:167)
    at java.util.AbstractList$Itr.remove(AbstractList.java:432)
    at java.util.AbstractCollection.removeAll(AbstractCollection.java:351)
    Please help me to resolve the issue.

    java.lang.UnsupportedOperationException
    at java.util.AbstractList.remove(AbstractList.java:167)
    at java.util.AbstractList$Itr.remove(AbstractList.java:432)
    at java.util.AbstractCollection.removeAll(AbstractCollection.java:351)
    That stack trace looks wrong to me. ArrayList overrides the remove method, so it shouldn't be using the method in AbstractList.
    That's what I thought, too. There is either something missing or it's not an ArrayList. In fact the javadoc of AbstractList.remove says:
    "This implementation always throws an UnsupportedOperationException."
    So it really looks like another subclass is used.
    Also, the object it is trying to remove is a list (from your example it should be a String).
    I could be wrong. These are just the code references, not the parameters, I think.
    -Puce
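    One common way to get exactly this stack trace is when the List in the JSP is not actually a java.util.ArrayList but a fixed-size or unmodifiable implementation, for example the one returned by Arrays.asList; that is only a guess at the JSP case, since the post does not show how the list is built there. A small standalone demonstration, with the fix of copying into a real ArrayList:

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;
    import java.util.TreeSet;

    public class RemoveAllDemo {
        public static void main(String[] args) {
            // Arrays.asList returns a fixed-size list; its iterator's remove()
            // ends up in AbstractList.remove -> UnsupportedOperationException,
            // matching the trace above.
            List<String> selectedPaxList = Arrays.asList("1", "2", "3");
            TreeSet<String> seatAssignedPaxlist = new TreeSet<String>(Arrays.asList("1", "2"));

            try {
                selectedPaxList.removeAll(seatAssignedPaxlist);
            } catch (UnsupportedOperationException e) {
                System.out.println("fixed-size list: " + e);
            }

            // Copying into a real ArrayList restores removeAll support.
            List<String> copy = new ArrayList<String>(selectedPaxList);
            if (copy.containsAll(seatAssignedPaxlist)) {
                copy.removeAll(seatAssignedPaxlist);
            }
            System.out.println(copy); // prints [3]
        }
    }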

  • How to remove an object from session with JSF 2.0 + Faceletes

    hi all,
    I have a Facelets page which calls a backing bean of session scope. Now whenever I click on this page I want the existing bean object to be removed from the session. In my existing JSP I have logic something like this to remove the object from the session:
         <% if (request.getParameter("newid") != null) {
              request.getSession().removeAttribute("manageuserscontroller");
    %>
    Now I have to refactor my JSP to use Facelets and I should not be using scriptlets anymore. I did try JSTL, but the <c:remove> tag is not supported by Facelets.
    Can someone help me refactor this code to work with Facelets?
    I really appreciate your help in advance
    Thank you

    r035198x wrote:
    Redesign things so that the remove is done in a backing bean method rather than in a view page.
    Exactly that. I tend to clean up session variables at the start and at the end of a page flow; generally the end is some sort of save or cancel action being invoked through a button, but that is application specific.
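    A minimal sketch of that suggestion, using the attribute and parameter names from the post (the bean class, its scope and the navigation outcome are assumptions; only the standard JSF 2.0 FacesContext/ExternalContext calls are taken as given):

    import java.util.Map;
    import javax.faces.bean.ManagedBean;
    import javax.faces.bean.RequestScoped;
    import javax.faces.context.FacesContext;

    @ManagedBean
    @RequestScoped
    public class NavigationBean {

        // Called from an action component on the page instead of the old scriptlet.
        public String reset() {
            FacesContext ctx = FacesContext.getCurrentInstance();
            Map<String, String> params = ctx.getExternalContext().getRequestParameterMap();

            // Same condition as the scriptlet: only drop the bean when "newid" is present.
            if (params.get("newid") != null) {
                ctx.getExternalContext().getSessionMap().remove("manageuserscontroller");
            }
            return "manageusers"; // assumed navigation outcome
        }
    }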

  • Removing object from kodo datacache doesnt always work.

    Hi,
    We're using a very old version of kodo - 3.4.1 but it seems to work well.
    We seem to have come across an intermittent issue where, if we remove an object from the datacache using the DataCache.commit call, the object doesn't actually seem to get removed. So the application continues operating with the out-of-date cached object. We get no exception or error though.
    Has anyone seen anything like this before? I can't seem to find any available known-issues list or bug database for Kodo; I guess this isn't available?
    Thanks,
    Dan

    The size will refer to individual DataCache size.
    KodoPersistenceManager.getDataStoreCache() or getDataStoreCache("myCache") will return the default or named datacaches respectively.
    You can evict the L2 cache content on the returned DataCache instance from the previous methods.

  • FDM schema upgrade error 91-Object variable or With block variable not set

    Hi,
    I am trying to upgrade the FDM schema to 11.1.2.1 using the FDM Workbench Client.
    From the log it looks like we get through the first 4 steps fine.
    When it comes to metadata (step 5) FDM gives the same error:
    Step 5: Updating Metadata Failed!
    91-Object variable or With block variable not set
    The user that logs into the app is a domain admin.
    I just checked that the shared folder (FDM folder) has write permission for this user.
    The DB user set in the connection properties is the DB owner.
    Under Dcomcnfg, the identity is set to the right domain user for the following:
    HFMServer
    HFMService
    HSXServer
    HSVDataSource
    Any idea what is wrong?
    regards
    Karol

    The idea looks good.
    I completely forgot to install Excel, but we have to organize a 64-bit version.
    For now 5 points as this is helpful :) will check tomorrow morning.
    thanks
    Karol

  • Preventing write-coalescing with write-behind

    Dear all,
    I am very interested in the write-behind feature, but I would like to disable the write-coalescing optimization so that the cache store sees each individual change.
    Is there any way to do this?
    I feel that the fact that CacheStore.storeAll(Map entries) works on cache-key/cache-value pairs makes it impossible to have several cache values for the same key :-( Not coalescing the consecutive changes on a given entry would require a method like CacheStore.storeAll(List<Change>), with Change holding the modification (insert/update/delete), the key and the value.
    Thanks,
    Cyrille
    Cyrille Le Clerc
    [email protected]
    http://blog.xebia.fr

    Thanks for the feedback, Robert.
    I implemented this "batch processing without coalescing" thanks to a "command queue" colocated with my data.
    Sample: perform async processing on each modification, without coalescing, of MyEntity (identified by MyEntityKey) stored in "my-entity-cache".
    In my agent/entry-processor, I simultaneously modify my data "MyEntity" and put an entry MyEntityCommand (stored in "my-entity-command-cache"); MyEntityCommand holds enough information to do my async processing. This processing is done asynchronously by MyEntityCommandCacheStore.
    MyEntityCommand is associated with a key MyEntityCommandKey, which is composed of MyEntityKey + a sequence number; MyEntityCommandKey has a KeyAssociation with MyEntityKey to ensure colocation.
    Benefits:
    * There is no coalescing because MyEntityCommandKey contains a unique sequence number.
    * The overhead of this "command queue" is limited because the command object only contains the limited piece of data I need for my async processing (a foreign key on the entity + a few things), and the queue is self-purging thanks to an <expiry-delay> on "my-entity-command-cache".
    Here is a configuration extract
    <distributed-scheme>
      <scheme-name>entity-partitionned</scheme-name>
      <service-name>EntityDistributedCache</service-name>
      <serializer>
      </serializer>
      <backing-map-scheme>
        <local-scheme>
          <scheme-ref>unlimited-backing-map</scheme-ref>
        </local-scheme>
      </backing-map-scheme>
      <thread-count>10</thread-count>
      <autostart>true</autostart>
    </distributed-scheme>
    <distributed-scheme>
      <scheme-name>entity-command-partitionned</scheme-name>
      <service-name>EntityDistributedCache</service-name>
      <serializer>
      </serializer>
      <backing-map-scheme>
        <read-write-backing-map-scheme>
          <cachestore-scheme>
            <class-scheme>
              <class-name>... EntityCommandStore</class-name>
            </class-scheme>
          </cachestore-scheme>
          <internal-cache-scheme>
            <local-scheme>
              <expiry-delay>30s</expiry-delay>
            </local-scheme>
          </internal-cache-scheme>
          <write-delay>10s</write-delay>
          <write-batch-factor>0.5</write-batch-factor>
        </read-write-backing-map-scheme>
      </backing-map-scheme>
      <thread-count>10</thread-count>
      <autostart>true</autostart>
    </distributed-scheme>
    Please note that I had to play with <write-batch-factor> to increase the batching factor (i.e. the number of entries in each CacheStore.store()/storeAll() invocation). Under very high write load, the <write-batch-factor> default value of 0 gave me an average of 1.2 entries per CacheStore.store()/storeAll() invocation. My first try of a 0.5 <write-batch-factor> largely increased this average, probably to hundreds (my stats are done on an underlying layer, I don't have the exact average).
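    For illustration, a rough sketch of what such a command key could look like in Java (the class itself is made up from the description above; only the com.tangosol.net.cache.KeyAssociation interface and its getAssociatedKey() method are taken from Coherence). The unique sequence number prevents coalescing, while getAssociatedKey() keeps each command in the same partition as its entity:

    import com.tangosol.net.cache.KeyAssociation;
    import java.io.Serializable;

    public class MyEntityCommandKey implements KeyAssociation, Serializable {

        private final Object entityKey;    // the MyEntityKey of the entity being modified
        private final long sequenceNumber; // unique per command, so commands never share a key

        public MyEntityCommandKey(Object entityKey, long sequenceNumber) {
            this.entityKey = entityKey;
            this.sequenceNumber = sequenceNumber;
        }

        // Co-locates the command with its entity, so the entry processor can
        // update both in the same partition.
        public Object getAssociatedKey() {
            return entityKey;
        }

        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof MyEntityCommandKey)) return false;
            MyEntityCommandKey other = (MyEntityCommandKey) o;
            return sequenceNumber == other.sequenceNumber && entityKey.equals(other.entityKey);
        }

        public int hashCode() {
            return 31 * entityKey.hashCode() + (int) (sequenceNumber ^ (sequenceNumber >>> 32));
        }
    }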
    Cyrille
    Cyrille Le Clerc
    [email protected]
    http://blog.xebia.fr

  • How to remove objects from pics using LR...

    I am not a professional photographer and I do not have a whole lot of photo editing knowledge. I started out several years ago using Photoshop elements 5 and eventually learned how to do several things like moving objects around or removing them from a photo, or moving someone's face into a different picture if their eyes were closed and lots of nice editing features. I loved it and still love it. My laptop was getting really old and slow so I just recently purchased a new computer which has windows 8 on it. I loaded my photoshop elements and cannot get it to open or work properly and am wondering if it's because of the windows 8 operating system. So I then decided to purchase a new photo editing software. I decided (quickly.. too quickly) to purchase adobe lightroom just because it had very good reviews online. Now I have several pictures that need to be edited for Christmas and I cannot find any "lasso tool" or any thing similar right now.. I was so used to using the layers and features from the PE and now my heart is broken because I feel so lost and maybe broke at the same time!! This program is soooo different than what I was used to using but I don't even know if I can do the same things with it. EXAMPLE: there is an air conditioner unit in one of the photos I've taken and I was planning on removing it from the picture. Also I was aiming on moving a face over from another picture because of some closed eyes.. Someone please tell me, is this program capable of doing these things or can I only play with the lighting in this? Will I have to spend more of my $$ to get another Photoshop elements to get the features I use?

    These are tasks for Photoshop or Photoshop Elements, not for Lightroom, which is more of an image database.
