Serialization format

Hi,
Does anyone know a good reference source describing how Java encodes objects? My problem is that I need to be able to write a program in Perl to decode a simple object created by Java. I can't use XML; I have to read the native serialization format. I've tried searching in various places but have yet to find something that actually describes the format.
Thanks

Thanks! I don't know why I couldn't find that - it's exactly what I needed, and I can now do what I need.
I wonder if there are any open source projects designed to decode serialized objects into Perl. Probably not, but if anyone knows otherwise...
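
For anyone poking at the format by hand, here is a minimal sketch (the Point class is purely illustrative) that serializes a trivial object and hex-dumps the resulting bytes, so the stream structure - it starts with the magic bytes AC ED followed by the stream version 00 05 - can be compared against whatever the Perl decoder sees:

import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class DumpSerialization {

    // Illustrative class to serialize; any Serializable class works.
    static class Point implements Serializable {
        int x = 1;
        int y = 2;
    }

    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new Point());
        }
        // Hex-dump the stream; the first four bytes are AC ED 00 05 (magic + version).
        int count = 0;
        for (byte b : bytes.toByteArray()) {
            System.out.printf("%02x ", b);
            if (++count % 16 == 0) {
                System.out.println();
            }
        }
        System.out.println();
    }
}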

Similar Messages

  • Enum serialization

    Has anyone ever tried to use the RemoteClass metatag for
    serializing between Java and AS3 implementations of an enum? The
    guide gives a good example of how to fake an enum in AS3, but I'm
    assuming I would have to develop custom serialization for enums.
    True?

    Here's the best way I can think of to serialize an enum.
    First, an example of an AS3 enum:
    public class MyEnum {
        public static var APPLES:MyEnum = new MyEnum("APPLES");
        public static var ORANGES:MyEnum = new MyEnum("ORANGES");
        private var name:String;

        public function MyEnum(str:String) {
            name = str;
        }

        public function toString():String {
            return name;
        }
    }
    Unfortunately, implementing the IExternalizable interface for
    the enum class is pretty much useless. By the time readExternal()
    is called on your AS3 enum object, it has already been created and
    you have no hope of limiting the instances of your enum class to
    those defined as static members. AS3 serialization does not allow a
    class to define how it is created, only how it populates its own
    fields after it has been created, which happens mysteriously at the
    top of the call stack for the main thread.
    Without any way of controlling the creation of the enum
    object such as registering a Factory or implementing readResolve()
    as in Java, you have to implement IExternalizable in an enclosing
    class such as:
    public class MyEnclosure implements IExternalizable {
        var myEnum:MyEnum;

        public function readExternal(input:IDataInput):void {
            myEnum = MyEnum[input.readUTF()];
        }

        public function writeExternal(output:IDataOutput):void {
            output.writeUTF(myEnum.toString());
        }
    }
    Of course, the Java implementation must have an analogous
    implementation of java.io.Externalizable. If your Java class does not
    implement Externalizable, your AS3 readExternal() method will not
    be called, even though the standard Java enum serialization format
    is just fine for our purposes.
    Worse, if your enum is a member of multiple enclosing value
    objects, you have to externalize the serialization for all of these
    classes.
    Even worse, the default AS3 deserialization (unfortunately) does
    do something for a Java enum when the IExternalizable interface
    is not implemented: it creates a new instance of your enum and
    apparently throws away the "name" that is included in the Java
    serialization stream.
    Why?? At the Flex framework level, implementing serialization
    using a scheme I have defined here would be trivial. However, if
    that gets implemented now, it would break backward compatibility. The
    only options for framework support of deserialization of enums
    would be to formally introduce enums into the language (not
    happening), or to create a [RemoteEnum] metatag that would enforce
    the semantics I describe, or to offer something analogous to the
    Java readResolve() method.
    The upshot is that every class that aggregates an enum must
    implement IExternalizable, instead of the more intuitive approach
    of implementing IExternalizable for the enum class itself.
    For completeness, here is the Java implementation of the
    enclosing class:
    public class MyEnclosure implements java.io.Externalizable {
        private MyEnum myEnum;

        public void writeExternal(ObjectOutput output) throws IOException {
            output.writeUTF(myEnum.name());
        }

        public void readExternal(ObjectInput input) throws IOException, ClassNotFoundException {
            myEnum = Enum.valueOf(MyEnum.class, input.readUTF());
        }
    }
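    For reference, the readResolve() hook mentioned above is the piece of Java serialization that lets a pre-Java-5 "typesafe enum" keep its singleton constants across deserialization. A minimal sketch, with an illustrative class name:

    public class Fruit implements java.io.Serializable {
        public static final Fruit APPLES = new Fruit("APPLES");
        public static final Fruit ORANGES = new Fruit("ORANGES");
        private final String name;

        private Fruit(String name) {
            this.name = name;
        }

        // Called by the deserialization machinery after the object has been read;
        // returning the canonical constant preserves the singleton property.
        private Object readResolve() throws java.io.ObjectStreamException {
            return "ORANGES".equals(name) ? ORANGES : APPLES;
        }
    }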

  • Custom serialization - could you do this?

    Hi
    I'll describe a partially hypothetical situation to allow me to ask the questions I want with some context.
    Let's say someone gave you the responsibility of implementing the following API with the specified contract:
    /**
     * Convert <code>object</code> to a serialized format and save in the file
     * <code>saveDestination</code>. You are free to choose the serialization format.
     */
    public void serializeAndSave(java.io.Serializable object, java.io.File saveDestination);

    /**
     * Deserialize objects serialized by the <code>serializeAndSave()</code> method.
     */
    public java.io.Serializable deserialize(java.io.File file);
    Additionally, let's say the following conditions and restrictions were specified by the person asking you to implement these methods:
    * The version of the class for <code>object</code>, or any of the super-classes in its class hierarchy, or the type of any fields will never change.
    * The fields of <code>object</code> will probably be private. Fields must be serialized consistently with the java.io.Serializable contract (for instance, transient fields do not need to be serialized).
    * You will never have any information about how any declared methods in <code>object</code> relate to declared fields, so you cannot rely on declared methods for getting field values when serializing or setting field values when deserializing.
    * You will never have information about how parameters in declared constructors relate to declared fields, so you cannot rely on declared constructors for setting field values when deserializing.
    * Besides the java.io.Serializable interface and the contract specified by Java when implementing the Serializable interface (eg. empty constructor), <code>object</code> will not necessarily conform to any interface or API that you will have prior knowledge of.
    * <code>object</code> may have an inheritance hierarchy to any depth. Super-classes in the class hierarchy may have fields too.
    * Super-classes in the object hierarchy, and their fields, are subject to the same conditions and constraints described here for <code>object</code>.
    OK, there's a description of the hypothetical task someone has asked you to complete.
    Luckily, this is very easy to do. You can simply rely on Java's inbuilt serialization mechanisms. You can use java.io.ObjectOutputStream and java.io.ObjectInputStream - it's all very easy. Java's inbuilt serialization/deserialization has no problem with private fields and no problems with super-class fields, even if they are private.
    Now let's say I changed your task by changing the contract for the serializeAndSave() method as follows:
    /**
     * Convert <code>object</code> to an XML serialization format and save in the file
     * <code>saveDestination</code>.
     */
    public void serializeAndSave(java.io.Serializable object, java.io.File saveDestination);
    Notice that the class must now be serialized in XML format.
    How would you implement the serializeAndSave() and deserialize() methods now? In particular:
    * In serializeAndSave(), how would you obtain the values of private fields in any of the super-classes in the object hierarchy?
    * In deserialize(), assuming you successfully implemented the serializeAndSave() method, how would you set the value of any of the private fields in <code>object</code> or any of the super-classes in <code>object</code>'s class hierarchy?
    Cheers.

    >>>
    What you're proposing is a feeble solution, but if you insist on using it, it's genuinely laughably trivial to implement. What have you been doing for the last 15 years?
    <<<
    Yeah been getting that a lot. Funny thing is, no-one can back it up.
    Here's your opportunity to show that you are right and I am wrong. Provide the implementation for showValueOfA(), setValueOfA(), setValueOfB(). I'll go out on a limb right now and tell you that you cannot do it, despite you trying to make out that you have a complete understanding of the technical issue at hand.
    package com;

    import java.lang.reflect.Field;

    public final class DisplayUnderstanding {

        public static void showFields(Data data) {
            Class clazz = data.getClass();
            Field[] fields = clazz.getDeclaredFields();
            for (int i = 0, n = fields.length; i < n; ++i) {
                Field field = fields[i];
                System.out.println(field.getName() + ": " + field.getType());
            }
        }

        /**
         * Will write to <code>System.out</code> the value of the field <code>a</code>
         * in the instance <code>data</code>.
         */
        public static void showValueOfA(Data data) {
            /* IMPLEMENTATION REQUIRED */
        }

        /**
         * Will change the value of the field <code>a</code> in the instance
         * <code>data</code> to <code>value</code>.
         */
        public static void setValueOfA(Data data, int value) {
            /* IMPLEMENTATION REQUIRED */
        }

        /**
         * Will change the value of the field <code>b</code> in the instance
         * <code>data</code> to <code>value</code>.
         */
        public static void setValueOfB(Data data, int value) {
            /* IMPLEMENTATION REQUIRED */
        }

        public static class Data extends AbstractData {
            private int b = 0;

            public Data(int a, int b) {
                super(a);
                this.b = b;
            }
        }

        public static abstract class AbstractData {
            private int a = 0;

            public AbstractData(int a) {
                this.a = a;
            }
        }
    }
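    For readers following along: the reflection-based answer usually offered in reply looks roughly like the sketch below. It is meant to drop into DisplayUnderstanding above, it relies on Field.setAccessible(true) (which a SecurityManager may refuse), and it walks up the class hierarchy so that the private field declared in AbstractData can be reached as well.

    // Finds a declared field by name in the class or one of its super-classes
    // and makes it accessible despite the private modifier.
    private static Field accessibleField(Class<?> clazz, String name) {
        for (Class<?> c = clazz; c != null; c = c.getSuperclass()) {
            try {
                Field f = c.getDeclaredField(name);
                f.setAccessible(true);
                return f;
            } catch (NoSuchFieldException ignored) {
                // not declared here; keep searching the super-class
            }
        }
        throw new IllegalArgumentException("No field named " + name);
    }

    public static void showValueOfA(Data data) {
        try {
            // 'a' is declared in AbstractData, so the super-class walk is what finds it
            System.out.println("a: " + accessibleField(data.getClass(), "a").getInt(data));
        } catch (IllegalAccessException e) {
            throw new RuntimeException(e);
        }
    }

    public static void setValueOfA(Data data, int value) {
        try {
            accessibleField(data.getClass(), "a").setInt(data, value);
        } catch (IllegalAccessException e) {
            throw new RuntimeException(e);
        }
    }

    public static void setValueOfB(Data data, int value) {
        try {
            accessibleField(data.getClass(), "b").setInt(data, value);
        } catch (IllegalAccessException e) {
            throw new RuntimeException(e);
        }
    }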

  • How to use Avro Serialization for ASA input

    Hi There,
    I am trying to use Avro serialization format for Streaming Analytics input data. I am using Event Hub as my data source. Based on the answer to a similar question in this forum about CSV format, it seems like ASA expects header/schema information to
    be included with each event, rather than defined externally. So, each of my input events is a proper Avro Data File, with schema included followed by multiple binary records. I have tried to verify this format using the "Test" button in the Azure
    Streaming Portal. When I upload a file containing my Avro event, the portal returns "Unable to parse JSON" error. I have double checked that my input is indeed marked as Avro serialization and not JSON.
    Can you let me know what the expected Avro data format is? 
    Thanks,
    Dave
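
    For comparison, a minimal sketch of producing a single event as an Avro container file with the schema embedded, using the Apache Avro Java library; the schema and field names here are illustrative, not the real ones:

    import java.io.ByteArrayOutputStream;
    import org.apache.avro.Schema;
    import org.apache.avro.file.DataFileWriter;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;

    public class AvroEventSketch {
        public static byte[] buildEvent() throws Exception {
            // Illustrative schema; the real one would declare all the fields the job expects.
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Reading\",\"fields\":["
                + "{\"name\":\"groupId\",\"type\":\"string\"},"
                + "{\"name\":\"avgTemp\",\"type\":\"double\"}]}");

            GenericRecord record = new GenericData.Record(schema);
            record.put("groupId", "g1");
            record.put("avgTemp", 21.5);

            ByteArrayOutputStream out = new ByteArrayOutputStream();
            // DataFileWriter produces the container format: schema header followed by binary records.
            try (DataFileWriter<GenericRecord> writer =
                     new DataFileWriter<GenericRecord>(new GenericDatumWriter<GenericRecord>(schema))) {
                writer.create(schema, out);
                writer.append(record);
            }
            return out.toByteArray();   // send this byte array as the Event Hub message body
        }
    }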

    Hi Mahith,
    I am now able to reproduce the FieldsMismatchError. I am attaching the log. Could you help debug this issue?
    Correlation ID:
    000593f0-4c05-412b-b096-2cf818bf6e9f
    Error:
    Message:
    Missing fields specified in create table. Fields expected: avgLight, avgOffLight, avgHum, avgLrTemp, avgBrLight, avgBrTemp, avgLrLight, avgOffHum, avgLrHum, avgOffTemp, avgBrHum, avgTemp, groupId, ts. Fields found: avgLight, avgOffLight, avgHum, avgLrTemp, avgBrLight, avgBrTemp, avgLrLight, avgOffHum, avgLrHum, avgOffTemp, avgBrHum, avgTemp, groupId, ts.
    Message Time:
    2015-01-15 00:28:59Z
    Microsoft.Resources/EventNameV2:
    sharedNode92F920DE-290E-4B4C-861A-F85A4EC01D82.hvac-input_0_c76f7247_25b7_4ca6_a3b6_c7bf192ba44a#0.output
    Microsoft.Resources/Operation:
    Information
    Microsoft.Resources/ResourceUri:
    /subscriptions/d5c15d24-d2ef-4443-ba7e-8389a86591ff/resourceGroups/StreamAnalytics-Default-Central-US/providers/Microsoft.StreamAnalytics/streamingjobs/hvac
    Type:
    FieldsMismatchError

  • FXD and scene serialization

    FXD is a format that is used to import a graph of nodes into a scene.
    I want to do the reverse. I want to save a graph of nodes to FXD and all I have is a scene.
    I want to build a simple JavaFX designer, and FXD seems to be the perfect serialization format.
    Is this possible? Or will it be possible?

    Greg Brown wrote:
    That's not entirely accurate. Depending on your use case, you might be able to use FXMLLoader for this. It won't write a scene graph out, but it can be used to read one in. So, for example, you could generate FXML on a server and deserialize it on the client using FXMLLoader.

    I'll try it, thanks.
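
    A minimal sketch of the loading side mentioned above, assuming the FXML produced elsewhere is reachable as a classpath resource; the file name is illustrative:

    import javafx.application.Application;
    import javafx.fxml.FXMLLoader;
    import javafx.scene.Parent;
    import javafx.scene.Scene;
    import javafx.stage.Stage;

    public class LoadSketch extends Application {
        @Override
        public void start(Stage stage) throws Exception {
            // Deserialize a scene graph description produced elsewhere (e.g. on a server).
            Parent root = FXMLLoader.load(getClass().getResource("designer-output.fxml"));
            stage.setScene(new Scene(root));
            stage.show();
        }

        public static void main(String[] args) {
            launch(args);
        }
    }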

  • Rendering of Swing components

    Hi,
    As the renderer is meant to be client independent, would it be suitable to send back to the client a serialized view of Swing components (using the new XML serialization format)?
    This could allow server-side management of the client's look-and-feel and total move of the business on the server.
    It might sound weird, and I'm not totally clear on it myself anyway.
    I think a little "daemon" should handle the requests to the server and the demarshalling step of the server response.
    Any point of view on this?
    (This is just a thought about an alternative use of the renderer capabilities, because today it only sends back HTML tags.)
    Ionel

    Hi,
    personally, I do not have too much experience with Swing.
    There is a project on SourceForge that implements your idea, but it is not based on JavaServer Faces.
    It's called SwingML:
    swingml.sourceforge.net

  • Import process flow in the Process Composer

    Hello BPX'ers
    I've got a question about the Process Composer: is it possible to import workflows that were created in tools like, for example, Bizagi? It is a lite BPM tool (takes about 5 minutes to install) and does not need any heavy PC requirements.
    It would be nice to export Process Flows from such a lite tool, and import them with Netweaver.
    Or are there future plans to develop a lite version of the Process Composer?
    Because if I want to model a process with a customer, the first step is to validate the process, not to deploy or build it (yet), but just to draw it so that I get a nice overview; I don't want the whole NetWeaver CE environment to be installed on my laptop for that.
    But it would be nice if these modeled processes could be imported into NetWeaver CE.
    Are there future plans for this?
    Thanks,

    As far as I know, the support of import/export interfaces depends on the adoption of BPMN 2.0, as this version of the standard will introduce a serialization format. BPMN 2.0 is to be finalized in 2009 by the OMG group, and for this reason can be adopted by SAP NetWeaver BPM only beyond the release in 2009.
    Unfortunately, I didn't find any externally available additional information.
    Regards, Preslav

  • Pass-through optimization

    I just upgraded our env from 3.3.1 to 3.4.2-patch05.
    I was intrigued by the following message in my log file.
    INFO | jvm 1 | 2009/06/19 23:43:02 | [INFO ] 2009-06-19 23:43:02.053/7571.909 Oracle Coherence GE 3.4.2/411p5 <Info> (thread=Proxy:ExtendTcpProxyService33:TcpAcceptorWorker:1, member=10): The cache "XYZ" does not support pass-through optimization for objects in internal format. If possible, consider using a different cache topology.
    I have no idea what it means...
    Any information about this appreciated.
    Thanks
    Sumax

    Hi Sumax,
    starting with 3.4.0 it is possible to configure your cache services and invocation services to store the data internally and communicate over the network in the POF format instead of the old Java serialization/ExternalizableLite format (EL from now on).
    Since, from the start, the Coherence Extend clients have used the POF format to communicate with the proxy, if EL was used as the serialization format within the service in the cluster, then the proxy node had to convert the data from EL to POF and vice versa. This required the deserialization of most data passing through the cluster (except for types that are known to both EL and POF, e.g. Strings and Java primitives). This was an expensive step in both CPU and memory terms.
    On the other hand, if you configure your service to use POF as its serialization format, then this conversion step is no longer necessary, and the proxy can pass the data through from the service to the client in a streaming manner, so the CPU cost is negligible and the memory cost is just a much smaller buffer. This, along with the related effects of POF being used within the TCMP cluster as a storage and communication format, is one of the greatest performance improvements the 3.4 release introduced.
    Obviously, if you just dropped in 3.4 jars instead of 3.3, then you did not configure the services to use POF.
    To configure this, you first have to ensure that all classes which are sent over the network are properly configured for POF serialization (they implement PortableObject or have a corresponding serializer, and the user type is properly registered in pof-config.xml); a sketch of such a class follows below. Then you can enable POF serialization on a service-by-service basis (using the <serializer> element in the service configuration within coherence-cache-config.xml), or you can enable the POF format for all services with a single Java property (I don't know it off the top of my head), but it is safer to go service by service.
    Best regards,
    Robert
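
    For reference, a minimal sketch of what "properly configured for POF serialization" can look like for a value class; the class and its properties are illustrative, and the type still has to be registered with a type id in pof-config.xml:

    import java.io.IOException;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;

    public class Trade implements PortableObject {
        private String symbol;
        private int quantity;

        public Trade() {
            // POF deserialization needs a public no-argument constructor
        }

        public void readExternal(PofReader in) throws IOException {
            // Property indexes must match writeExternal and stay stable across versions
            symbol = in.readString(0);
            quantity = in.readInt(1);
        }

        public void writeExternal(PofWriter out) throws IOException {
            out.writeString(0, symbol);
            out.writeInt(1, quantity);
        }
    }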

  • Modify objects in file

    Hi guys... I am new to Java and I'm facing a problem with an assignment. I am totally stuck on the part where I have to modify objects in a file. Well, I know how to write objects and I know how to read them, but I have no clue how to write new objects to the same file without overwriting it. Can you give me some help? I would appreciate it.

    I'm not sure exactly what it is you're trying to do. However, the objects in a serialization stream depend on each other; you can't just snip an object out of it or meddle with it freely without incurring side effects. It sounds like the simplest way to do what you want is to open the stream, read objects and write them to a new file, and change the object you want to change when it comes along.
    If you wanted to have a really delightful time of it, you could learn to interpret the serialization format used by ObjectOutputStream and manipulate it yourself using a RandomAccessFile or whatever... but that sounds like it's beyond the scope of your assignment.
    Does that help? If not, what exactly is it you need to do?
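
    A minimal sketch of the copy-while-rewriting approach described above; the Record class, the file names, and the condition for picking out the object to change are all illustrative:

    import java.io.EOFException;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;

    public class RewriteObjectsSketch {

        // Illustrative record type; any Serializable class works the same way.
        public static class Record implements Serializable {
            int id;
            String name;
        }

        public static void main(String[] args) throws Exception {
            try (ObjectInputStream in = new ObjectInputStream(new FileInputStream("records.ser"));
                 ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("records-new.ser"))) {
                while (true) {
                    Object obj;
                    try {
                        obj = in.readObject();           // next object from the old file
                    } catch (EOFException endOfStream) {
                        break;                           // no more objects
                    }
                    if (obj instanceof Record && ((Record) obj).id == 42) {
                        ((Record) obj).name = "updated"; // change the one you care about
                    }
                    out.writeObject(obj);                // copy it (possibly modified) to the new file
                }
            }
            // Afterwards, replace records.ser with records-new.ser.
        }
    }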

  • Go beyond the blob data model?

    Is it possible to build a richer data structure for the value?
    Right now it's a blob/byte array. Is it possible to provide a hook to the insert() code so that when we call
    insert(key, some_value)
    we do not simply write "some_value" over the old value, but instead provide an operator
    insert(key, op)
    so that after the node corresponding to the key is found, the operator is carried out with the old blob value being fed into it, i.e. op(old_value); op() could update the old value in place.
    would this model be useful to other users?
    The reason I am bringing this up is that I experimented with abstracting out the storage engine of Apache Cassandra. It currently uses its own storage implementation, which allows partial updates to a value to be done in place, so that if you have a lot of operations such as adding a delta to a value, like adding a user's action to a list of past action history, you do not need to serialize and deserialize the entire record every time.
    Thanks a lot
    Yang

    The first step would be for you to come up with a serialization format such that you can quickly calculate, for a given field, what bytes the field occupies and therefore what bytes to replace, without having to deserialize the entire record. If you have that, you can use JE's DatabaseEntry.setPartial feature to replace those bytes (see the sketch below).
    This avoids deserialization and serialization. It does not avoid having to read the entire record (the whole byte array) and write it again, when the record is read from disk into cache and written from cache to disk. There is no way to avoid that with JE.
    --mark
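
    A minimal sketch of the DatabaseEntry.setPartial idea, assuming a fixed-offset record layout so that the byte range of a field is computable; the offsets and the key handling are illustrative:

    import com.sleepycat.je.Database;
    import com.sleepycat.je.DatabaseEntry;

    public class PartialUpdateSketch {

        // Illustrative fixed layout: bytes [8, 16) of every record hold one 8-byte counter field.
        private static final int COUNTER_OFFSET = 8;
        private static final int COUNTER_LENGTH = 8;

        // Replaces only the counter bytes of the record stored under keyBytes.
        public static void updateCounter(Database db, byte[] keyBytes, byte[] newCounterBytes) {
            DatabaseEntry key = new DatabaseEntry(keyBytes);
            DatabaseEntry data = new DatabaseEntry(newCounterBytes);
            // With the partial flag set, the put replaces just this byte range of the stored record.
            data.setPartial(COUNTER_OFFSET, COUNTER_LENGTH, true);
            db.put(null, key, data);
        }
    }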

  • Oracle Coherence implementation for JDBC result set

    Instead of ORM, does Oracle Coherence support accessing the data using normal JDBC calls?
    If it supports JDBC, what is the CacheStore implementation for this?
    Can you provide an example of how to cache JDBC ResultSet values and retrieve them back from the cache?

    Hi,
    I think you mix up several concepts.
    user13266701 wrote:
    Instead of ORM, does Oracle Coherence support accessing the data using normal JDBC calls?

    At the moment (up to 3.5.x), there is no way to query Coherence via JDBC out-of-the-box. There were a couple of initiatives to provide such features, but they were not part of the product, only initiatives in the Incubator, and I believe they have been dropped.
    Upcoming releases may bring a replacement, although I don't think it would be outright JDBC compatible, as that would imply converting object-oriented data to a result set just so you can convert it back, which would in effect waste CPU resources.

    If it supports JDBC, what is the CacheStore implementation for this?

    CacheStores have nothing to do with how you access the cache itself. Those are internal operations invoked by Coherence itself. Don't look for a relationship between querying the cache and doing anything with the cache store. The only relationship between them is that key-based operations on the cache may lead to operations on the cache store, but whether they happen or not depends on the cache content, too.

    Can you provide an example of how to cache JDBC ResultSet values and retrieve them back from the cache?

    If I understand correctly, that is yet a third concept. If you indeed want to cache JDBC ResultSets obtained from the database, then I have to disappoint you: it is not directly possible, as the ResultSet object is an abstraction over an open database cursor, and hence it holds a database resource. Active resources cannot be cached in Coherence. You may acquire a RowSet from the JDBC driver, which would be possible to cache (see the sketch below), although it may not be the most efficient thing to do due to the possibly suboptimal serialization format the RowSet uses to serialize itself.
    So if I did not answer the questions you wanted to ask, would you please explain in more detail what exactly you would like to do?
    Best regards,
    Robert
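
    A minimal sketch of the RowSet idea mentioned above: a CachedRowSet is disconnected, so unlike a ResultSet it can be put into a NamedCache. The cache name, key, and query are illustrative, and whether this is efficient depends on how the RowSet serializes itself (RowSetProvider is the Java 7+ way to obtain one):

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import javax.sql.rowset.CachedRowSet;
    import javax.sql.rowset.RowSetProvider;
    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;

    public class RowSetCachingSketch {
        public static void cacheQueryResult(Connection con, String cacheKey) throws Exception {
            try (Statement stmt = con.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT id, name FROM accounts")) {
                // Copy the open cursor into a disconnected, serializable row set.
                CachedRowSet rows = RowSetProvider.newFactory().createCachedRowSet();
                rows.populate(rs);

                NamedCache cache = CacheFactory.getCache("query-results");
                cache.put(cacheKey, rows);
            }
        }
    }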

  • ClassNotFound Message when trying to deserialize data from an applet

    Hi,
    I have an applet that reads data from a web-server. The data was created by serializing a custom class (Test.java). The serialization format is not the standard one, but XML. This is achieved by using the JSX library (http://www.jsx2.com).
    My HTML-file (Test.html) looks like that:
    <html>
      <applet width="400"
              height="200"
              code="Test.class"
              codebase="./"
              archive="JSX2.0.9.4.jar">
      </applet>
    </html>
    Both the Test.class and JSX2.0.9.4.jar are stored in the same directory. The applet reads the serialized data of "Test" instances and tries to deserialize them. But when started, the following error message appears:
    Java VM version: 1.4.1_02
    Java VM vendor: Sun Microsystems Inc.
    java.lang.ClassNotFoundException: Test
    at java.net.URLClassLoader$1.run(URLClassLoader.java:198)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:186)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:299)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:265)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:255)
    at JSX.ObjectReader$Cache.getClass(ObjectReader.java:1503)
    at JSX.ObjectReader$Cache.getClassCache(ObjectReader.java:1511)
    at JSX.ObjectReader.object(ObjectReader.java:796)
    at JSX.ObjectReader.readObject(ObjectReader.java:381)
    at JSX.ObjectReader.readObjectOverride(ObjectReader.java:342)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:318)
    at Test.read(Test.java:49)
    at Test.start(Test.java:72)
    at org.kde.kjas.server.KJASAppletStub$1.run(KJASAppletStub.java:102)
    at java.lang.Thread.run(Thread.java:536)
    When reading serialized data of classes included in the JRE, e.g. an ArrayList, everything works fine.
    I think it is a kind of CLASSPATH problem, because "Test.class" cannot be found although it is in the same directory as the Test.html file. So how can I fix this problem?
    Thanks in advance
    Markus

    Make sure your 'CLASSPATH' contains '.'. However, it should, because you are compiling your applets fine. Also try specifying a path to your class when loading. If it's in a package, you have to specify that, or include the class directory in your CLASSPATH.

    Hey Devyn, it can't be a classpath problem. It's an applet; if run in the browser, there is no way to specify a classpath short of entering a jar in the archive parameter.
    How are you deserializing the applet?

  • Serializing a JButton

    I tried serializing a JButton and restoring it during the next run, but it doesn't seem to work.
    Are there issues with deserializing a JButton?

    Just a warning: serialization of GUI components like JButton is not meant for long term storage, nor for exchange between systems that may have different versions of the JVM (since the serialization format is not guaranteed to be stable between versions).

    Use XMLEncoder and XMLDecoder to serialize and store it if you're worried about different formats between JVMs.
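
    A minimal sketch of the XMLEncoder/XMLDecoder route; the file name is illustrative:

    import java.beans.XMLDecoder;
    import java.beans.XMLEncoder;
    import java.io.BufferedInputStream;
    import java.io.BufferedOutputStream;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import javax.swing.JButton;

    public class ButtonPersistenceSketch {

        public static void save(JButton button) throws Exception {
            XMLEncoder enc = new XMLEncoder(
                new BufferedOutputStream(new FileOutputStream("button.xml")));
            enc.writeObject(button);       // writes the long-term JavaBeans XML format
            enc.close();                   // close() flushes and writes the closing tags
        }

        public static JButton load() throws Exception {
            XMLDecoder dec = new XMLDecoder(
                new BufferedInputStream(new FileInputStream("button.xml")));
            JButton button = (JButton) dec.readObject();
            dec.close();
            return button;
        }
    }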

  • ReplicatedCache resynchronization possible in multiple threads?

    Hi all,
    Is there a way to speed up the initial resynchronization of ReplicatedCache(s) when a new storage node joins the cluster? We have about 80 replicated NamedCache(s) occupying initially about 10GB of heap. The resynchronization to a new storage node takes about 8 minutes, and watching the CPU/network usage it seems that it is running in a single thread. We have T2 machines (8 cores, 64 threads) and a super fast (InfiniBand) network. Parallelizing the initial resynchronization would make a difference.
    We would like to speed up frequent rolling upgrades of the cluster (application) while the serialization format of cache entries stays unchanged.
    Regards, Peter
    Edited by: PeLe on Mar 6, 2012 3:39 PM

    I don't think this is correct. In the same thread dump, there are several threads that are 'waiting' to get a lock on the same object. I don't understand why the two threads in my original post are the only ones that show 'locked' on the same object. See the following stack in the same dump as a representative of about 9 other threads that are properly 'waiting'. I would attach the whole thread dump here but don't see how to attach a file to the forum thread.
    This is what I would expect for the DwgCmdExecutionThread:UXPRD150:16975 (from my original post). But what I see is 'locked' instead.
    "DwgCmdExecutionThread:UXPRD150:16829" daemon prio=10 tid=0x08cdd000 nid=0x588b waiting for monitor entry [0x8c932000..0x8c932ea0]
    java.lang.Thread.State: BLOCKED (on object monitor)
         at com.sunopsis.sql.SnpsConnection.getConnectionListElement(Unknown Source)
         - waiting to lock <0x94fb1580> (a java.util.Vector)
         at com.sunopsis.sql.SnpsConnection.getdBConnection(Unknown Source)
         at com.sunopsis.sql.SnpsConnection.connect(Unknown Source)
         at com.sunopsis.sql.SnpsQuery.updateExecStatement(Unknown Source)
         at com.sunopsis.sql.SnpsQuery.executeUpdate(Unknown Source)
         at com.sunopsis.dwg.dbobj.SnpSessTask.updateTask(Unknown Source)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskPreTrt(Unknown Source)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(Unknown Source)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(Unknown Source)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(Unknown Source)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(Unknown Source)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(Unknown Source)
         at com.sunopsis.dwg.cmd.DwgCmd.k(Unknown Source)
         at com.sunopsis.dwg.cmd.g.z(Unknown Source)
         at com.sunopsis.dwg.cmd.DwgCmd.run(Unknown Source)
         at java.lang.Thread.run(Thread.java:619)

  • Downsides of using Proxy servers as a storage enabled node

    Hello,
    We are doing some investigation on proxy server configuration. I read that "Oracle Coherence recommends it's better to use the proxy server as storage disabled".
    Can anyone explain the downside of using a proxy server as a storage-enabled node?
    Thanks
    Prab

    It seems that I was wrong with my original answer. The proxy uses a binary pass-through mode, so if the proxy and the cache service are using the same serialization format, (de)serialization is largely avoided.
    However, there is other overhead associated with managing potentially unpredictable client workloads, so using the proxy server as a storage-enabled node is still discouraged.
    Thanks,
    Wei
