Avoiding MONSTROUS overkill in serialization?

I have a JTextPane that I need to serialize. All I really need to save is the x, y, height, width, and the marked-up text from the field (font, size, color, bold, italic). A test case I set up should have taken about 100 to 120 bytes to capture every bit of data I need. Yet when I serialize the instance it ends up writing a file that's 18K long and filled with mountains, foothills, and gently rolling fields of irrelevant data.
That would not be such a problem, except that my actual application has dozens and dozens of JTextPanes in the same application that all need to be saved together into a single XML file. Is there an easy way just to extract the marked-up text from a JTextPane without the truckloads of superfluous garbage that serialize insists on including?
What I would really dearly love to have is a StyledDocument-to-HTML conversion, but no such thing seems to exist, so I need to figure out how to grab the data and write such a conversion myself.
Thanks,
--gary

Maybe you can use XMLEncoder to store the data as XML. But obviously you will not be able to select which properties are saved and which are not, so if you need more control you will need to implement something of your own.
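For instance, just the geometry the original poster mentions (x, y, width, height) round-trips through XMLEncoder compactly; this is a minimal sketch using java.awt.Rectangle as a stand-in bean, not the actual JTextPane:

```java
import java.awt.Rectangle;
import java.beans.XMLEncoder;
import java.io.ByteArrayOutputStream;

public class XmlEncoderDemo {
    public static void main(String[] args) {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        // XMLEncoder persists public JavaBeans properties as XML and,
        // unlike binary serialization, skips the component's internals.
        try (XMLEncoder enc = new XMLEncoder(buf)) {
            enc.writeObject(new Rectangle(10, 20, 300, 200));
        }
        System.out.println(buf);
    }
}
```

The resulting XML is a few hundred bytes and can be read back with XMLDecoder.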
Something like this:

public class ConfigUtil {
   public void saveConfiguration(JTextPane jtp, String id) {
        // Save the settings that you want to keep somewhere, keyed by the given ID
   }

   public void configure(JTextPane jtp, String id) {
      // Fetch the settings using the ID and apply them to the given jtp
   }
}
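As for the wished-for StyledDocument-to-HTML conversion: HTMLEditorKit.write() falls back to MinimalHTMLWriter when the document is a plain StyledDocument rather than an HTMLDocument, which may be close to what the original poster wants. A minimal sketch (the helper class name is invented):

```java
import javax.swing.text.DefaultStyledDocument;
import javax.swing.text.SimpleAttributeSet;
import javax.swing.text.StyleConstants;
import javax.swing.text.StyledDocument;
import javax.swing.text.html.HTMLEditorKit;
import java.io.StringWriter;

public class StyledDocToHtml {
    public static String toHtml(StyledDocument doc) throws Exception {
        StringWriter out = new StringWriter();
        // For a non-HTML StyledDocument, HTMLEditorKit.write() delegates
        // to MinimalHTMLWriter, which emits simple styled HTML.
        new HTMLEditorKit().write(out, doc, 0, doc.getLength());
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        StyledDocument doc = new DefaultStyledDocument();
        SimpleAttributeSet bold = new SimpleAttributeSet();
        StyleConstants.setBold(bold, true);
        doc.insertString(0, "hello", bold);
        System.out.println(toHtml(doc));
    }
}
```

With a JTextPane you would pass pane.getStyledDocument() instead of building the document by hand.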

Similar Messages

  • Need help: Analytic Report with Group By

    Good morning,
    I am trying to create a report with a subtotal and a grand total, which of course calls for a GROUP BY clause with ROLLUP, CUBE, GROUPING, etc. I'd like to use ROLLUP, but then some columns in the SELECT list have to be put into the GROUP BY clause that are not supposed to be grouped. So I had to wrap those columns in one of the SUM, AVG, MIN or MAX functions to make them aggregated, which is wrong.
    Another alternative I tried is to use CUBE and GROUPING_ID as the filter. However, that is still very cumbersome and error-prone, and the display order is completely out of control.
    I am trying hard to stick with the first option, ROLLUP, since its result is very close to what I want, but to avoid the aggregation functions. For example, if I want to display column A, which should not be grouped: other than using those aggregation functions, what can I do?
    Thanks in advance.

    Luc,
    this is a simple and a good reference for analytic functions:
    http://www.orafaq.com/node/55
    It takes some time to understand how they work and also it takes some time to understand how to utilize them. I have solved some issues in reporting using them, avoiding the overkill of aggregates.
    Denes Kubicek

  • Try to give reasons for these Q's

    1. Why should the clone() method be defined as 'final'? What is the security reason involved in this?
    2. Why should interfaces not be serializable? What is the security reason involved in this?

    I want to have the security reason behind them: "If you don't define clone() as final, what is the problem?" I want this reason.
    There is no security benefit to declaring clone() final; it simply means that a subclass can't override clone(). It doesn't stop the subclass from declaring a method bypassClone() that does the same thing. It does break the idiom that clone() is the way to make a copy of an object.
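    A quick sketch of that point (all class names invented): final blocks overriding clone(), but nothing stops a subclass from exposing a copy under another name.

```java
// Base and Sub are invented names to illustrate the point above.
class Base implements Cloneable {
    int value;
    Base(int value) { this.value = value; }

    // final: subclasses cannot override clone()...
    @Override
    public final Object clone() {
        try {
            return super.clone();
        } catch (CloneNotSupportedException e) {
            throw new AssertionError(e);
        }
    }
}

class Sub extends Base {
    Sub(int value) { super(value); }

    // ...but nothing stops a copy method under another name
    Sub bypassClone() {
        return new Sub(value);
    }
}

public class CloneDemo {
    public static void main(String[] args) {
        Sub copy = new Sub(42).bypassClone();
        System.out.println(copy.value);
    }
}
```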
    Likewise, regarding interfaces: you can instantiate classes that implement an interface that extends Serializable. Now, what is the security reason behind saying 'Avoid making your interfaces Serializable'?
    I have to restate this as "avoid making an interface that extends Serializable" for it to make sense. I suppose that there's a security consideration in that any class that implements Serializable -- directly or indirectly -- can be written to persistent storage. But the real reason is that it's sloppy coding: interfaces describe a capability of a class, and there's seldom a reason to make some capability dependent on serializability.
    In general, "security" issues vis a vis the contents of Java classes are somewhat silly. With the right tools, you can access anything. The Java security framework limits your ability to actually do anything with what you've examined.
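    To illustrate the restated rule (the interface name is invented): once an interface extends Serializable, every implementation is serializable by fiat, whether or not that capability belongs in the type.

```java
import java.io.Serializable;

public class SerializableInterfaceDemo {
    // Shape is hypothetical: because it extends Serializable, every
    // implementation -- even this lambda -- is a Serializable by type.
    interface Shape extends Serializable {
        double area();
    }

    public static void main(String[] args) {
        Shape unit = () -> 1.0;
        // the capability came from the interface, not from the class design
        System.out.println(unit instanceof Serializable);
    }
}
```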

  • EJB3 CMP serializable, avoid phantom problem

    Hi everyone,
    does anyone have a clue how to avoid the phantom problem in container managed persistence in ejb3? All I really need is a "LOCK TABLE" statement.
    How do I tell the entity manager to do such a thing?
    thanks
    basti

    by the way:
    I am trying to use ejb3 out of container - as described in the article http://www.oracle.com/technology/tech/java/oc4j/ejb3/howtos-ejb3/howtooutofcontainer/doc/how-to-ejb30-out-of-container.html

  • Serialize BAPI/RFC executions to avoid locking issue

    We have an XI interface that calls and executes a BAPI/RFC to create an invoice receipt against a purchase order in R/3. 
    While the BAPI/RFC is running, it locks the purchase order.  If another XI call to the BAPI/RFC is initiated to create an invoice receipt against the same purchase order, the second BAPI/RFC call will fail in R/3 due to locking. 
    I have the following question:
    In XI, can we serialize the execution of the BAPI/RFC so the second call will start only after the execution of the first one is complete in R/3?

    Hi,
    No and yes.
    No: not in the standard.
    Yes: there are workarounds - you can wrap the BAPI in an RFC in which you can control it (I used this solution and it works).
    Regards,
    michal

  • Can a low MTU (i.e. 320) for VoIP over Frame Relay be used to avoid serialization delay for large data packets?

    In order to provide voice quality for VoIP, is FR fragmentation equivalent to lowering the MTU size for the serial subinterface of a Frame Relay subinterface?
    Are there any issues, such as stopped communications, when using a low MTU?

    If the router belongs to any of the platforms listed below, then use FRF.12 for your fragmentation. Lowering the MTU size also works, but this can cause high overhead since it can't be specified on a per-DLCI level. With multiple DLCIs, i.e. subinterfaces, use per-DLCI fragmentation; this helps reduce the overhead of changing the MTU size of the physical interface.
    Config example snippet (must be configured on both sides of the termination):
    PHONE 3333312---ROUTERA ----DLCI 100----ROUTERB ---PHONE 2111123
    ROUTER A
    dial-peer voice 1 voip
    destination-pattern 2T
    session target ipv4:10.10.10.2
    int ser 0/0
    encap frame-relay
    frame-relay traffic-shaping
    no ip address
    interface serial0/0.1 point-to-point
    ip add 10.10.10.1 255.255.255.252
    frame-relay interface-dlci 100
    class voice
    map-class frame-relay voice
    frame cir 64000
    frame bc 640
    frame mincir 64000
    frame-relay ip rtp priority 16384 16383 48
    frame fragment 80
    frame fair-queue 64 256 0
    ROUTER B
    dial-peer voice 3 voip
    destination-pattern 3T
    session target ipv4:10.10.10.1
    int ser 0/0
    encap frame-relay
    frame-relay traffic-shaping
    no ip address
    interface serial0/0.1 point-to-point
    ip add 10.10.10.2 255.255.255.252
    frame-relay interface-dlci 100
    class voice
    map-class frame-relay voice
    frame cir 64000
    frame bc 640
    frame mincir 64000
    frame-relay ip rtp priority 16384 16383 48
    frame fragment 80
    frame fair-queue 64 256 0
    This should help if your router is a c2600, c3600, mc3810, c7200 or c1750, all running the right level of IOS; 12.1(5)T and above should work well.

  • XML Serializer....!

    Hi guys...!
    I'm writing a program in Java that prints out the content of an XML file. I imported different classes, but some of the classes that I imported do not work, and I don't know why.
    Since some of the classes do not work, I can't compile my Java program.
    Here are the classes that give me error....
    import org.apache.xml.serialize; .============> ERROR
    import org.apache.xml.serialize.OutputFormat; ===========> ERROR
    import org.apache.xml.serialize.Serializer; ===============>ERROR
    import org.apache.xml.serialize.DOMSerializer; ===========>ERROR
    import org.apache.xml.serialize.SerializerFactory; =========>ERROR
    import org.apache.xml.serialize.XMLSerializer; ===========>ERROR
    and here are the codes that don't compile within my java program..
    OutputFormat format = new OutputFormat (doc); ==========>ERROR
    StringWriter stringOut = new StringWriter ();
    XMLSerializer serial = new XMLSerializer (stringOut, format); ==>ERROR
    serial.serialize(doc);
    System.out.println(stringOut.toString());
    Can you guys please help me find out what is going on in my code?
    Any help will be appreciated...
    Thanks...
    --- Spirit_Away

    While this problem is simple to someone who's battled classpath problems many, many times, it's frustrating as hell to those who haven't.
    You have a run time classpath and a compile time classpath.
    A compile-time classpath is used by javac or your IDE to find the classes that you're referencing when your code is compiled.
    If you compile a simple class from the command-line with javac, you'd use the -classpath option to specify to the Java compiler where it needs to look to find the classes you've referenced in your class.
    If you use an IDE, you need to tell your IDE where to find the classes you're referencing in your class -- each IDE does this slightly differently. In Eclipse, it's referred to as your "build path" and it contains either folders of classes or JAR files. As you add JARs or folders of classes to your build path, you make those classes available at compile time to your classes that you're compiling.
    If you manage to include those properly, you'll end up with YourClass.class -- a compiled version of your class.
    Now, to invoke your class, you need to make sure that the same classes/JARs that you put in your build path are also available to the Java Virtual Machine when you execute your class -- this is done through the CLI with the -classpath option to the java command.
    There are a few ways to get those classes in your runtime classpath.
    You can:
    a) set an environment variable called CLASSPATH which includes the JARs/class folders so that every time the jvm is invoked (e.g. prompt$ java MyClass ) those classes are included in the runtime classpath
    b) throw all of your dependent JARs/classes into $JAVA_HOME/jre/lib/ext so that anytime anyone on that machine executes the JVM, those dependent classes are included in the system-wide classpath
    c) include the JARs/class folders in the classpath argument when invoking the JVM (e.g. prompt$ java -classpath ".:HelperClasses.jar" MyClass )
    Personally, I find a) and b) to be bad ideas. Especially if you build Java apps to distribute. It makes it far too easy to forget to include dependent JARs and classes, because while they work for you on your system, they won't work on another system, unless that system also has the dependent JARs/classes installed properly in the classpath -- either for that system or the executing user.
    Another drawback is that by silently including classes in your runtime classpath, you can unknowingly create class conflicts as you can end up accidentally loading multiple copies of the same class -- sometimes different versions -- and you don't realize that the one you THINK is executing is actually not the one that ACTUALLY is.
    For this reason, I strongly suggest keeping your system classpath and CLASSPATH variables empty and using option c) -- at least until you have a strong handle on how classpaths work.
    In your case, you'd need to include the JAR files as options to the -classpath argument when you invoke the JVM.

  • Question concerning java Serialization of a complex internal field variable.

    Not everything in the J2SE implements Serializable by default -- BufferedImage, say. I have learned from an online article that absolutely all fields to be serialized must implement Serializable, including internal ("global") class fields; see
    http://javarevisited.blogspot.com.au/2011/04/top-10-java-serialization-interview.html
    (point 5).
    However, for my purposes I cannot re-implement (extend) and recompile each and every Java class I would want to serialize, e.g. anything that is a "complex" sub-field of a class I do want to serialize and that doesn't occur on the Java 1.6 list of default-serializable classes:
    http://docs.oracle.com/javase/6/docs/api/java/io/Serializable.html
    Also: "If member variables of a serializable object reference a non-serializable object, the code will compile but a RuntimeException will be thrown."
    Understanding there are implications for the java security model, is there anything to be done for these non-serializable-implementing class fields?
    Is there a Serialization add on kit for Java or something?

    Indeed; however, given a distinct lack of success with the instrumentation API, my request is twofold now.
    I do understand the restrictions explained here concerning more complex classes and the Serializable API.
    I have found that a command-line option needs to be supplied when running an Instrumentation premain agent class. However, I would like to avoid that by calling System.setProperty(a, b) appropriately.
    Why won't the following just work, without any further supplied input parameters?
       import static java.lang.System.out;
       import java.io.PrintStream;
       import java.lang.instrument.Instrumentation;

       public class InstrumentationAgent {
          private static volatile Instrumentation globalInstrumentation;

          public static void premain(String paramString, Instrumentation paramInstrumentation) {
             System.out.println("premain...");
             globalInstrumentation = paramInstrumentation;
          }

          public static void agentmain(String paramString, Instrumentation paramInstrumentation) {
             System.out.println("agentmain...");
             globalInstrumentation = paramInstrumentation;
          }

          public static void main(String... args) {
             out.println("!");
             String data = new String("data");
             long size = getObjectSize(data);
             out.println(size);
          }

          public static long getObjectSize(Object paramObject) {
             if (globalInstrumentation == null) {
                throw new IllegalStateException("Agent not initialized.");
             }
             return globalInstrumentation.getObjectSize(paramObject);
          }
       }
    I am still curious about the DataFlavor approach I have outlined above. Does it accomplish a shallow or a deep conversion? Are there other limitations that restrict what's put to the underlying ByteOutputStream?
    And why won't my agent class execute, regardless?

  • Domain design causing full object graph serialization

    Hello,
    I work on a project based on a 3-layer architecture. We have a rich client developed in Swing, an application server, and we persist our domain objects in the database.
    We map the domain objects using Hibernate and send those same objects (i.e. we use POJOs, not DTOs) via Spring from the application server to the client, and vice versa when the objects are to be saved, deleted, etc.
    We use an object identity store on the client side, in which all already loaded objects are stored.
    My challenge is as follows:
    - Since an object is stored in the object store, the same certain object is used in every GUI module. Imagine, we have customer, account and a list of transactions per account. In practice, let's say:
    1 customer
    2 accounts
    1000 transactions per account
    - In one of the modules, the user loaded the full list of transactions. So at that moment 1 customer object, 2 account objects and 2000 transaction objects are bound to each other via Java references (an object graph).
    - In another module, the user wants to edit a very small attribute of the customer object, say he corrects his birth date and saves it.
    - The save operation sends the customer object to the server. The problem is that a very big object graph hangs off this one little customer object and is serialized to the server as well, although only service.save(customer); is called, and this costs our application an extreme amount of time.
    SOLUTION: We solved the problem with a hack, using serialization via the XStream framework from ThoughtWorks in XML format. We could set a cut-off point in the serialization and serialize only the customer object for that use case.
    So the problem for the save use case was solved. This has a disadvantage: the loading time (server-client (de-)serialization) using XML is longer than with standard Java serialization, which is especially noticeable when loading a big number of objects.
    I've read many articles about enterprise architectures related to domain objects, DTOs and serialization. It is not uncommon to avoid DTOs and use POJOs on the server and the client alike, if the domain objects don't contain business logic. So I just wonder how the problem above should be solved smartly if the non-DTO (i.e. POJO) alternative is chosen.
    I appreciate your opinions.
    I hope my question is not too long. Thanks in advance,
    A sample code:
    (The concrete case in our application requires the navigability from customer to account)
    public class Customer implements Serializable {
         private Set<Account> accounts = new HashSet<Account>();
         // Getters, setters
    }

    public class Account implements Serializable {
         private Set<AccTransaction> transactions = new HashSet<AccTransaction>();
         // Getters, setters
    }

    public class AccTransaction implements Serializable {
         private String operation;
         private Timestamp timestamp;
         // Getters, setters
    }

    Alper
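    One common way to get such a cut-off without XStream is to break the reference chain in the Java serialized form itself, e.g. by marking the child collection transient. A minimal sketch with invented names (not the project's real classes):

```java
import java.io.*;
import java.util.*;

// SlimCustomer is a hypothetical stand-in: because the collection is
// transient, serializing a customer no longer drags the whole graph of
// accounts and transactions along with it.
class SlimCustomer implements Serializable {
    String name;
    transient Set<String> accounts = new HashSet<>();
    SlimCustomer(String name) { this.name = name; }
}

public class TransientDemo {
    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            SlimCustomer c = new SlimCustomer("Ford");
            c.accounts.add("Fiesta");
            out.writeObject(c);  // only name is written; accounts is skipped
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            SlimCustomer back = (SlimCustomer) in.readObject();
            System.out.println(back.name + " " + back.accounts);
        }
    }
}
```

The trade-off is that the transient field comes back as null and must be reloaded on demand, which is essentially what lazy loading on the client would give you.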

    acelik wrote:
    Thanks for the quick answer.
    The numbers are just wild guesses, in order to keep the sample simple.
    Actually we develop a system for the automobile industry. So instead of a customer you may think of a company, like Ford. An account would then be a Fiesta. Instead of AccTransaction, think of a huge number of car parts (climate control, motor, radio, etc.).
    So hierarchically we have
    Company (Ford, Opel, etc)
    Model (Fiesta, Astra, etc.)
    Piece (1000 pieces per model, so Fiesta and Astra each have 1000 different pieces)
    The problem is: I loaded Ford -> Fiesta -> 1000 pieces for one view and then switch to another view. In the second view I rename Fiesta to Fiesta II, and want to save this renaming operation using:
    What exactly does that represent?
    If you want to rename an existing car then you need logic to do exactly that. You do not copy the entire tree.
    If however you do want to copy the entire tree then that is in fact what you must do, although I seriously doubt that there is a valid business requirement for that. In terms of efficiency, you could write a proc that does it rather than propagating the tree.
    Further, if you do in fact have two cars which use the same inventory item(s) then, again, you do not copy the entire tree, because you are not creating a brand-new inventory item. Instead you should only be pointing to an existing item.
    "I wonder how this problem is solved smartly, if you guys share domain objects between the client and server, and avoid DTOs." The first step is to start with actual business requirements.

  • What is the serialization concept in ALE/IDOC?

    what is the serialization concept in ALE/IDOC?

    Hi Srinu ,
    IDoc Serialization means, sending/posting the idocs in sequence.
    We serialize IDocs in the following cases:
    · If you want the Integration Server to process the corresponding IDoc XML messages in the same sequence that it receives them from the IDoc adapter at the inbound channel.
    · If you want the receiver to receive the IDocs in the same sequence that the IDoc adapter sends them at the Integration Server outbound channel.
    The sequence at the Integration Server inbound or outbound channel can only be guaranteed if only IDocs are processed, and not if different protocols (for example, IDocs and proxies) are processed together.
    Do not confuse IDoc serialization using the IDoc adapter with the ALE serialization of IDocs.
    Prerequisites
    · The quality of service EOIO (Exactly Once In Order) must be specified in the message header.
    · The receiver system or the sender system must be based on SAP Web Application Server 6.40 or higher. If this is not the case, the quality of service is automatically changed to EO for compatibility reasons and the message is processed accordingly.
    Procedure
    If you want the Integration Server to process the IDoc XML messages created by the IDoc adapter in the same sequence that the IDocs are sent by your application, proceed as follows:
    · Enter a queue name in your application. You can use 16 alphanumeric characters. The prefix SAP_ALE_ is then added.
    The IDoc adapter checks the prefix and replaces it with the prefix of the corresponding Integration Server inbound queue (for example, XBQI0000).
    If you want the receiver to receive the IDocs in the same sequence that they are sent by the Integration Server using the IDoc adapter, proceed as follows:
    · In the communication channel, select the check box Queue processing for the receiver.
    The IDoc adapter replaces the prefix of the outbound queue (XBQO) with the prefix SAP_ALE_.
    You can display the individual messages in the qRFC monitor of the outbound queue. To do this, do one of the following:
    · Use the queue ID in the list of displayed messages in the monitor for processed XML messages.
    · Use the transaction ID in the list of displayed XML messages in the IDoc adapter.
    · Call the transaction qRFC Monitor (Outbound Queue) (SMQ1).
    To navigate directly to the display of messages in the IDoc adapter, double click the transaction ID of a message in the outbound queue.
    To do this, you must have registered the display program IDX_SHOW_MESSAGE for the outbound queue in the qRFC administration (transaction SMQE) beforehand.
    In both cases, the function module IDOC_INBOUND_IN_QUEUE is called, which enables EOIO processing of the messages. The processing sequence is determined by the sequence of the function module calls.
    Unlike the other function modules (interface versions from the communication channel), with this function module you have to transfer segment types rather than segment names in the data records.
    Serialization of Messages
    Use
    Serialization plays an important role in distributing interdependent objects, especially when master data is being distributed.
    IDocs can be created, sent and posted in a specified order by distributing message types serially.
    Errors can then be avoided when processing inbound IDocs.
    Interdependent messages can be serially distributed in the following ways:
    Serialization by Object Type
    Serialization by Message Type
    Serialization at IDoc Level
    (not for IDocs from generated BAPI-ALE interfaces)
    Serialization at IDoc Level
    Use
    Delays in transferring IDocs may result in an IDoc containing data belonging to a specific object arriving at its destination before an "older" IDoc that contains different data belonging to the same object. Applications can use the ALE Serialization API to specify the order IDocs of the same message type are processed in and to prevent old IDocs from being posted if processing is repeated.
    SAP recommends that you regularly schedule program RBDSRCLR to clean up table BDSER (old time stamp).
    Prerequisites
    IDocs generated by BAPI interfaces cannot be serialized at IDoc level because the function module for inbound processing does not use the ALE Serialization API.
    Features
    ALE provides two function modules to serialize IDocs which the posting function module has to invoke:
    · IDOC_SERIALIZATION_CHECK
    checks the time stamps in the serialization field of the IDoc header.
    · IDOC_SERIAL_POST
    updates the serialization table.
    Check the following link:
    http://help.sap.com/saphelp_nw04s/helpdata/en/0b/2a66d6507d11d18ee90000e8366fc2/frameset.htm
    http://help.sap.com/saphelp_nw04s/helpdata/en/78/2175a751ce11d189570000e829fbbd/frameset.htm
    Ex: ADRMAS, DEBMAS(customer)
    ADRMAS, CREMAS(Vendor)
    In this case, Before posting Customer Data or Vendor Data it requires Address Data.
    Rgds
    Sree m

  • Known Issue: Serializable nested types result in an internal compiler error in "Release" built C#/VB UWP apps (Windows 10 Insider Preview SDK and tools, April 2015 release)

    If you have a type nested inside of a type deriving from a WinRT type, and the nested type is marked Serializable (such as with [DataContract]), your build will fail.
    This code will cause this failure.
    public sealed partial class MainPage : Page
    {
        public MainPage()
        {
            this.InitializeComponent();
            var d = new DataContractSerializer(typeof(MyData));
        }

        [DataContract]
        class MyData { }
    }

    Avoid serializable types nested inside of WinRT types.

  • Trying to serialize a simple object

    Hi,
    I can't seem to create an ObjectInputStream. The error is noted in the code below. I also noticed that the output file, to which I seemingly wrote the object without problems, is empty.
    import java.io.*;

    //Class from which an object is serialized:
    class MyClass implements Serializable
    {
         private int num;
         private String str;

         public MyClass(int num, String str)
         {
              this.num = num;
              this.str = str;
         }

         public MyClass()
         {
              num = 0;
              str = null;
         }

         public void show()
         {
              System.out.println(num);
              System.out.println(str);
         }
    }

    public class DemoSerialization
    {
         public static void main(String[] args)
         {
              MyClass A = new MyClass(50, "hello serialization");
              A.show();

    /*Create output file:*****************/
              File myFile = new File("C:\\TestData\\javaIO.txt");
              try
              {
                   myFile.createNewFile();
              }
              catch(IOException exc)
              {
                   System.out.println("couldn't create output file");
                   System.exit(1);
              }

    /*Create output stream:*************/
              ObjectOutputStream out = null;
              FileDescriptor fd = null;     //FileDescriptor represents a connection to a file.  Then,
                                            //you don't have to do all the file checking, etc, when
                                            //you move the data in the other direction (see below)
              try
              {
                   FileOutputStream fos = new FileOutputStream(myFile);
                   fd = fos.getFD();
                   out = new ObjectOutputStream(
                             new BufferedOutputStream(fos));
              }
              catch(FileNotFoundException exc)
              {
                   System.out.println("couldn't find the file during creation of file output stream");
                   System.exit(1);
              }
              catch(IOException exc)
              {
                   System.out.println("io error creating output file stream");
              }
    /**/      System.out.println("test1: " + fd); // "java.io.FileDescriptor@df6ccd"

    /*Serialize the object:***************/
              try
              {
                   out.writeObject(A);
              }
              catch(InvalidClassException exc)
              {
                   System.out.println("class definition of object being written to a file has a problem");
              }
              catch(NotSerializableException exc)
              {
                   System.out.println("class of the object did not implement Serializable");
              }
              catch(IOException exc)
              {
                   System.out.println("general file output error occurred");
              }

    /*Create input file stream:***********/
              FileInputStream fis = new FileInputStream(fd);
              BufferedInputStream bis = new BufferedInputStream(fis);
    /**/      System.out.println("test2: " + bis);  // "java.io.BufferedInputStream@b89838"
              ObjectInputStream in = null;
              try
              {
                   in = new ObjectInputStream(bis);      /*****ERROR--throws exception*****/
              }
              catch(IOException exc)
              {
                   System.out.println("ObjectInputStream error");
              }
    /**/      System.out.println("test3: " + in);  // "null"

    /*Read in the object:****************/
              MyClass B = null;
              try
              {
                   B = (MyClass) in.readObject();  /***ERROR--NullPointerException (see above)***/
              }
              catch(ClassNotFoundException exc)
              {
                   System.out.println("no class definition exists in this program for the object read in");
              }
              catch(InvalidClassException exc)
              {
                   System.out.println("class def is present but there is something wrong with it");
              }
              catch(StreamCorruptedException exc)
              {
                   System.out.println("control info in stream is corrupt");
              }
              catch(OptionalDataException exc)
              {
                   System.out.println("attempted to read in a basic data type--not an object");
              }
              catch(IOException exc)
              {
                   System.out.println("general stream error occurred during read");
              }
              B.show();
         }
    }

    Your original source did not include the call to the close method of ObjectOutputStream, which is why your file was created but empty. I assumed you added it after the call to writeObject.
    Well, that isn't the case. I posted that I added flush().
    Read the API docs on the FilterDescriptor class.
    This class basically represents an opened file (or
    r other open streams) in either an inputstream OR
    outputstream. So when you close the outputstream.
    The file is not open anymore and thus the
    e FileDescriptor object is no longer valid because it
    represents and open stream.
    I didn't have to read the API docs because to me that seemed like common sense: if I have an object that represents a connection to a file, and I close the connection, it makes sense to me that the object would no longer be valid. Therefore, I didn't use close() in my program.
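    The point about the descriptor dying with its stream is easy to verify with FileDescriptor.valid(). A minimal sketch (the class name and temp file are invented for illustration):

    ```java
    import java.io.*;

    public class DescriptorValidity {
        public static void main(String[] args) throws IOException {
            File f = File.createTempFile("fd-demo", ".tmp");

            FileOutputStream out = new FileOutputStream(f);
            FileDescriptor fd = out.getFD();
            System.out.println(fd.valid());  // true: the owning stream is still open

            out.close();
            System.out.println(fd.valid());  // false: closing the stream invalidated it
        }
    }
    ```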
    >
    As far as avoiding lots of try blocks, why not wrap the whole program in a try block? I assume you're using all those System.out.println commands to help debug. It'd be more helpful if you called printStackTrace on the thrown exception when an error occurs.
    >
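    That suggestion might look like the following sketch. The class name, the roundTrip helper, and the temp file are invented for illustration; the key points are the single try block, the explicit close() after writeObject, and printStackTrace in the catch:

    ```java
    import java.io.*;

    public class SerializationDemo {

        // Serialize an object to a file, then read it straight back.
        static Object roundTrip(Serializable obj, File file) throws Exception {
            ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(file));
            out.writeObject(obj);
            out.close();                      // close() flushes the stream header and data

            ObjectInputStream in = new ObjectInputStream(new FileInputStream(file));
            Object back = in.readObject();
            in.close();
            return back;
        }

        public static void main(String[] args) {
            // One try block around everything; printStackTrace pinpoints any failure.
            try {
                File file = File.createTempFile("demo", ".ser");
                System.out.println(roundTrip("hello", file));  // prints "hello"
            } catch (Exception exc) {
                exc.printStackTrace();
            }
        }
    }
    ```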
    Ok, thanks.
    I've done some more tests, and the problem isn't specific to serialization. I'm also unable to use a FileDescriptor when reading a char from a file. I get the same "access denied" error:
    import java.io.*;

    class DemoFileDescriptor {
        public static void main(String[] args) {
            // Create a file:
            File myfile = new File("C:\\TestData\\javaIO.txt");
            try {
                myfile.createNewFile();
            } catch (IOException exc) {
                System.out.println(exc);
                System.out.println("io exception createNewFile()");
            }
            // Create a FileOutputStream:
            FileOutputStream out = null;
            try {
                out = new FileOutputStream(myfile);
            } catch (FileNotFoundException exc) {
                System.out.println(exc);
                System.out.println("error creating FileOutputStream");
            }
            // Get the FileDescriptor:
            FileDescriptor fd = null;
            try {
                fd = out.getFD(); // a file connection
            } catch (IOException exc) {
                System.out.println(exc);
                System.out.println("trouble getting FileDescriptor");
            }
            // Output something to the file:
            char ch = 'a';
            try {
                out.write((int) ch);
            } catch (IOException exc) {
                System.out.println(exc);
                System.out.println("io error writing to file");
            }
            // Create a FileInputStream using the FileDescriptor:
            FileInputStream in = new FileInputStream(fd);
            // Read back from the file:
            char input = ' ';
            try {
                input = (char) in.read();  /* ERROR -- access denied */
            } catch (IOException exc) {
                System.out.println(exc);
                System.out.println("problem reading");
                exc.printStackTrace();
            }
            System.out.println(input);
        }
    }

    Here is the output and stack trace:
    ---------- java ----------
    java.io.IOException: Access is denied
    problem reading
    java.io.IOException: Access is denied
         at java.io.FileInputStream.read(Native Method)
         at DemoFileDescriptor.main(DemoFileDescriptor.java:69)
    Output completed (2 sec consumed) - Normal Termination
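    For what it's worth, the descriptor returned by FileOutputStream.getFD() is opened for writing only, so a FileInputStream built on it cannot read; on Windows this surfaces as "Access is denied". A sketch of the usual workaround, opening a separate read stream on the same File (a temp file stands in for C:\TestData\javaIO.txt here):

    ```java
    import java.io.*;

    public class FileDescriptorFix {
        public static void main(String[] args) throws IOException {
            File myfile = File.createTempFile("javaIO", ".txt");

            // Write one character and close; this stream's descriptor is
            // write-only and cannot be reused for reading.
            FileOutputStream out = new FileOutputStream(myfile);
            out.write('a');
            out.close();

            // Open an independent read descriptor on the same file instead.
            FileInputStream in = new FileInputStream(myfile);
            char input = (char) in.read();
            in.close();

            System.out.println(input);  // prints "a"
        }
    }
    ```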

  • Serialization error in OIM 11g

    Hi,
    I have a custom jar file in OIM 11g which is causing this issue:
    All session objects should be serializable to replicate. Check the objects in your session. Failed to replicate non-serializable object.
    java.rmi.UnmarshalException: error unmarshalling arguments; nested exception is:
    java.lang.ClassNotFoundException: Failed to load class ...
    Please suggest what should be done to avoid this.
    M

    User,
    please, please, please...
    I don't like having to extract each piece of information over at least two posts.
    Is your custom class serializable? Does the class implement the Serializable interface? If not, implement this interface (just add "implements Serializable" to your class definition) and try again.
    If this doesn't help, you should give more detailed info.
    Timo
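    As a minimal sketch of that fix; "UserSession" and its field are invented stand-ins for whatever object ends up in the session, and note that every field of the class must itself be serializable, or the same error reappears:

    ```java
    import java.io.*;

    public class UserSession implements Serializable {
        private static final long serialVersionUID = 1L;  // stable version id
        private final String userId;

        public UserSession(String userId) { this.userId = userId; }
        public String getUserId() { return userId; }

        public static void main(String[] args) throws Exception {
            // Round-trip through a byte array to prove the class serializes.
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            ObjectOutputStream out = new ObjectOutputStream(bytes);
            out.writeObject(new UserSession("gary"));
            out.close();

            ObjectInputStream in = new ObjectInputStream(
                    new ByteArrayInputStream(bytes.toByteArray()));
            UserSession back = (UserSession) in.readObject();
            System.out.println(back.getUserId());  // prints "gary"
        }
    }
    ```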

  • How to avoid deadlock - ORA-00060 in trigger

    Hi,
    We are using 11.2.0.3 and have a trigger which acts upon a mutex table.
    We are trying to catch when another program has changed the locked value from 0 to 1 to indicate it has finished.
    Once it has finished we need to lock the mutex table by updating the locked value.
    The code below gives error ORA-00060 - deadlock.
    Any idea how to avoid this, but still allow us to update the mutex table in the trigger at the start to say we are locking and at the end to indicate we are finished?
    Thanks
    CREATE OR REPLACE TRIGGER example_trg AFTER UPDATE
    ON por_zic_flags
    FOR EACH ROW
    WHEN (new.locked = '0' AND old.locked = '1')
    DECLARE
      PRAGMA AUTONOMOUS_TRANSACTION;
      i        INTEGER;
      v_locked por_zic_flags.locked%TYPE;
    BEGIN
      SELECT locked
        INTO v_locked
        FROM por_zic_flags
        -- WHERE tabname = 'table name'
         FOR UPDATE;
      UPDATE por_zic_flags SET locked = '2'; --, guid = '100'; -- unique seq#
      COMMIT;
      -- call procedure to do processing on other tables
      -- Now indicate finished processing
      SELECT locked
        INTO v_locked
        FROM por_zic_flags
         FOR UPDATE;
      UPDATE por_zic_flags SET locked = '0'; --, guid = '100'; -- unique seq#
      COMMIT;
    END;

    >
    We are trying to catch when another program has changed the locked value from 0 to 1 to indicate it has finished.
    Once it has finished we need to lock the mutex table by updating the locked value.
    The code below gives error ORA-00060 - deadlock.
    Any idea how to avoid this, but still allow us to update the mutex table in the trigger at the start to say we are locking and at the end to indicate we are finished?
    >
    Triggers should not be used for non-transactional purposes like this.
    A simple SELECT ... WHERE (myLock = 'UNLOCKED') FOR UPDATE can be used to query the record and lock it if the lock is not already set. Then your code can update the lock value, and that will prevent the same query by other users from succeeding.
    But you haven't really provided any information about what you are trying to do.
    If you want to prevent two processes from executing at the same time you can use Oracle's LOCK functionality.
    See this thread for how to use DBMS_LOCK to create your own locks that can serialize access to your procedures.
    Re: possible to lock stored procedure so only one session may run it at a time?

  • Inbound scenario using tRFC - serialization

    Hi Guys,
    I have the following scenario:
    An external System is communicating with SAP using tRFC queue.
    So from the SAP perspective this is an inbound scenario.
    I am wondering whether the sequence of the IDocs that are attached to the tRFC queue by that external system
    is kept while they get processed on the SAP side.
    To say it in other words: will SAP process the IDocs in the same sequence as they were attached to the RFC queue?
    And if there is a communication error, will the sequence be kept after communication is established again?
    Or do I have to build the timestamp serialization myself? I want to avoid that, as the ALE input function module doesn't provide it.
    Thanks a lot.
    Achim

    Hi Dheeraj,
    the challenge with the "map the TLOG and create WPUUMS IDoc" approach is that inside the TLOG you are getting receipt-based data, but for the WPUUMS IDoc you'd be expecting to see aggregated information.
    The better and recommended way to integrate SAP POS with SAP Retail is to use the delivered standard integration with the following two data streams:
    1. SAP Retail master data (e.g. articles, prices, promotions) to SAP PI to SAP POS
    2. SAP POS - Transnet - SAP PI - SAP POS DM - SAP BW/SAP Retail
    The advantage would be that aggregation is handled in POS DM and you have all the detail available in SAP BW and that you will be able to utilize the data collected for other processes as well.
    Please check the Wiki for POS Integration which will give you a good overview about the integration and covered scenarios:
    https://wiki.sdn.sap.com/wiki/display/CK/ExchangeInfrastructure-SAPPOS+Integration
    For SAP POS DM you will find a wiki here:
    https://wiki.sdn.sap.com/wiki/display/Retail/SAPPOSDM%28SAPPoint-of-SaleDataManagement%29
    Additionally, there are two courses focusing on these topics:
    Integration: W26TGI
    POS Data Management: W26POS
    Kind regards,
    Stefan
