Date serialization

Hi,
I have found an interesting (?) problem with Date object serialization: a Date object serialized on a Sun JVM can't be correctly deserialized on an IBM JVM!
Try running this example on a Sun JVM:
import java.io.*;
import java.util.Date;
public class Write {
     public static void main(String[] args) throws IOException {
          Date date = new Date(45, 8, 1); // deprecated Date(year-1900, month, day): 1945-09-01
          System.out.println("JVM Vendor ["+System.getProperty("java.vm.vendor")+"] Version ["+System.getProperty("java.vm.version")+"]");
          System.out.println("Date ["+date+"], Date time ["+date.getTime()+"]");
          System.out.println("Date timeZoneOffset:"+date.getTimezoneOffset());
          //Serialize the date
          ByteArrayOutputStream bout = new ByteArrayOutputStream();
          ObjectOutputStream ous = new ObjectOutputStream(bout);
          ous.writeObject(date);
          ous.flush();
          //Write the bytes to a file
          File outFile = new File("date.bin");
          FileOutputStream fous = new FileOutputStream(outFile);
          fous.write(bout.toByteArray());
          fous.close();
     }
}
The result should be:
JVM Vendor [Sun Microsystems Inc.] Version [1.4.2_05-b04]
Date [Sat Sep 01 00:00:00 CEST 1945], Date time [-767930400000]
Date timeZoneOffset:-120
And the serialized date object should be written to the date.bin file.
Next, run this code on an IBM JVM:
import java.io.*;
import java.util.Date;
public class Read {
     public static void main(String[] args) throws Exception {
          File inFile = new File("date.bin");
          FileInputStream fis = new FileInputStream(inFile);
          ObjectInputStream ois = new ObjectInputStream(fis);
          Date date = (Date) ois.readObject();
          ois.close();
          fis.close();
          System.out.println("JVM Vendor ["+System.getProperty("java.vm.vendor")+"] Version ["+System.getProperty("java.vm.version")+"]");
          System.out.println("Date ["+date+"], Date time ["+date.getTime()+"]");
          System.out.println("Date timeZoneOffset:"+date.getTimezoneOffset());
     }
}
On my IBM JVM the result is:
JVM Vendor [IBM Corporation] Version [1.4.1]
Date [Fri Aug 31 23:00:00 CET 1945], Date time [-767930400000]
Date timeZoneOffset:-60
Why does the deserialized date object have a different value than it did before serialization??
The magic date is 1945-09-16: if I set a date after it, everything is OK; if I set a date before it, something is wrong.

Hi
"I have found an interesting (?) problem with Date object serialization. A Date object serialized on a Sun JVM can't be correctly deserialized on an IBM JVM!"
So what? Serialization was not meant to be portable across JVM implementations.
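For what it's worth, the two outputs above already show that the Date round-trips intact: getTime() is -767930400000 on both JVMs. Only the rendering differs, because the two vendors shipped different historical time-zone tables for 1945 (whether summer time was still in force before 1945-09-16 or not). A small sketch of the distinction, using Europe/Warsaw purely as an illustrative CET zone:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class OffsetDemo {
    public static void main(String[] args) {
        // The serialized state of java.util.Date is just this long value,
        // so both JVMs deserialize it to exactly the same instant.
        long millis = -767930400000L;
        Date date = new Date(millis);

        // What differs is each JVM's *historical* time-zone table: one says
        // summer time (CEST, +2h) still applied on 1945-09-01, the other
        // says standard time (CET, +1h) did.
        TimeZone zone = TimeZone.getTimeZone("Europe/Warsaw");
        SimpleDateFormat fmt = new SimpleDateFormat("EEE MMM dd HH:mm:ss zzz yyyy");
        fmt.setTimeZone(zone);

        System.out.println("millis = " + date.getTime());                        // identical on both JVMs
        System.out.println("offset = " + zone.getOffset(millis) / 60000 + " min"); // depends on the JVM's tz tables
        System.out.println("text   = " + fmt.format(date));                      // ditto
    }
}
```

So the stream itself is portable here; it is the zone data used for display that is not.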

Similar Messages

  • [svn] 1543: Bug: BLZ-152-lcds custom Date serialization issue - need to add java.io.Externalizable as the first type tested in AMF writeObject() functions

    Revision: 1543
    Author: [email protected]
    Date: 2008-05-02 15:32:59 -0700 (Fri, 02 May 2008)
    Log Message:
    Bug: BLZ-152-lcds custom Date serialization issue - need to add java.io.Externalizable as the first type tested in AMF writeObject() functions
    QA: Yes - please check that the fix is working with AMF3 and AMFX and you can turn on/off the fix with the config option.
    Doc: No
    Checkintests: Pass
    Details: The problem in this case was that MyDate.as was serialized to MyDate.java on the server but on the way back, MyDate.java was serialized back to Date.as. As the bug suggests, added an Externalizable check in AMF writeObject functions. However, I didn't do this for AMF0Output as AMF0 does not support Externalizable. To be on the safe side, I also added legacy-externalizable option which is false by default but when it's true, it restores the current behavior.
    Ticket Links:
    http://bugs.adobe.com/jira/browse/BLZ-152
    Modified Paths:
    blazeds/branches/3.0.x/modules/core/src/java/flex/messaging/endpoints/AbstractEndpoint.java
    blazeds/branches/3.0.x/modules/core/src/java/flex/messaging/io/SerializationContext.java
    blazeds/branches/3.0.x/modules/core/src/java/flex/messaging/io/amf/Amf3Output.java
    blazeds/branches/3.0.x/modules/core/src/java/flex/messaging/io/amfx/AmfxOutput.java
    blazeds/branches/3.0.x/resources/config/services-config.xml

  • HR Master data serialization error

    Hi,
    I activated serialization on sender and receiver system:
    (Transaction SALE -> Modelling and Implementing Business Processes -> Configure Predefined ALE Business Processes -> Human Resource -> Master Data Distribution - > Serialize HR Master Data)
    After processing, some iDocs have status 51 - Serialization error for object &. Expected counter 000001 < 000002 in IDoc.
    I understand that when I get an IDoc with a too-big serialization counter, the IDoc should have status 66 (IDoc is waiting for preceding IDoc (serialization)), and I should be able to reprocess it once the IDocs with the missing serialization counters arrive. Am I right?
    So why do they have status 51 and not 66?
    Thanks for answer in advance.

    There is some inconsistency in the receiver system.
    For example: the sender system's company code is different from the receiver system's, etc.
    Edited by: Ashok Kumar Reddy N on Feb 6, 2008 1:46 PM

  • RemoteObject Data Serialization - Boolean

    I have the problem that a PHP boolean is retrieved from a MySQL boolean field as "0" or "1" instead of "true" or "false".
    As far as I understand, AS3 only recognizes "true" or "false" as boolean values; anything else will be interpreted as true, except NULL, which is false.
    How can I get around this problem? I'm using Zend Framework, if someone knows of a way to implement this fix in the PHP value object.
    Thanks!

    This is just a trial-and-error suggestion:
    Why not use the 0 and 1 in the UI? Change the Boolean to a String in the value object; when the data is passed from PHP to the UI, the string will appear as-is and you can use it however you want.

  • Idoc Serialization for Transactional data

    Hi All,
    1. Please let me know if you have done IDoc serialization for transactional data. If so, please let me know the steps.
    2. How do we use serialization using object types? If you have done this, please let me know the steps for this too.
    Thanks for your help.
    Srikanth.

    Hi Srikanth,
    Follow the steps below to set up serialization using object types:
    1. In the SAP menu choose IDoc Interface/ALE -> Development -> BAPI -> Serialization -> Serialization Using Business Objects -> Determine Supported Business Objects (transaction BD105). Enter all the business object types relevant for serialization.
    2. In the SAP menu choose IDoc Interface/ALE -> Development -> BAPI -> Serialization -> Serialization Using Business Objects -> Assign Message Type to a Business Object (transaction BD104). Assign the message types relevant for serialization to each business object type.
    3. In Customizing (IMG), activate the serialized distribution in both the sending and receiving systems:
    ALE Implementation Guide (transaction SALE)
    Modeling and Implementing Business Processes
    Master Data Distribution
    Serialization for Sending and Receiving Data 
    Serialization Using Business Objects
    Execute activities Activate Outbound Business Objects and Activate Inbound Business Objects. Set the Serialization flag for the required business object types.
    If you want to do serialization by message type, then:
    1. Go to BD44, create a serialization group, and assign the messages and a serial number to each.
    2. Run the program RBDSER01.
    Award points if useful,
    Aleem.

  • Java.util.Date badly serialized to java.sql.Timestamp Coherence 3.5.2

    Hi all,
    I'm running into this odd behaviour.
    I serialize java.util.Date objects to cache and when I read them back from cache, they appear to be java.sql.Timestamp types.
    I've isolated a junit test for that.
    Do you know why Coherence changes the type in the middle?
    Regards
    Harry.
    import java.util.Date;
    import org.junit.Assert;
    import org.junit.Test;
    import com.tangosol.io.Serializer;
    import com.tangosol.io.pof.ConfigurablePofContext;
    import com.tangosol.util.ExternalizableHelper;
    public class DatePofSerialTest {
         @Test
         public void testCobdate() throws Exception {
              Date date = new Date();
              Serializer serial = new ConfigurablePofContext(); //"coherence-pof-config.xml"
              Date date2 = (Date) ExternalizableHelper.fromBinary(ExternalizableHelper.toBinary(date, serial), serial);
              System.out.println(serial + " -- Date to serialize [" + date.getClass() + "]");
              System.out.println(serial + " -- Date from deserialize [" + date2.getClass() + "]");
              Assert.assertEquals(date, date2); /* Of course this passes, as both refer to the same time (long) */
         }
    }
    This gives as output:
    log4j:WARN No appenders could be found for logger (Coherence).
    log4j:WARN Please initialize the log4j system properly.
    com.tangosol.io.pof.ConfigurablePofContext {location=coherence-pof-config.xml} -- Date to serialize [class java.util.Date]
    com.tangosol.io.pof.ConfigurablePofContext {location=coherence-pof-config.xml} -- Date from deserialize [class java.sql.Timestamp]

    Hi Harry,
    It looks like the same issue as ...
    PofExtractor with java.util.Date results in ClassCastException
    It was fixed in version 3.5.4.
    Thanks
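A side note on the assertEquals in the test above: it passes only in that argument order. Date.equals compares just the millisecond value, while java.sql.Timestamp.equals(Object) returns false for anything that isn't itself a Timestamp, so swapping the arguments makes the assertion fail. A quick illustration:

```java
import java.sql.Timestamp;
import java.util.Date;

public class EqualsAsymmetry {
    public static void main(String[] args) {
        long t = 1234567890123L;
        Date date = new Date(t);
        Timestamp ts = new Timestamp(t);

        System.out.println(date.equals(ts)); // true:  Date.equals compares getTime() only
        System.out.println(ts.equals(date)); // false: Timestamp.equals(Object) requires a Timestamp
    }
}
```

That asymmetry is one more reason a silent Date-to-Timestamp conversion during deserialization can bite later.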

  • Too Big to Serialize??

    I have run into the same problem someone else here mentioned before, namely that I get a StackOverflowError when I try to deserialize an object I successfully serialized. I wondered if the Java serialization routine (the default, since I didn't override any readObject or writeObject methods) had a bug and had worked itself into an infinite loop, but I think my object may simply be too large.
    The data structure I'm trying to serialize contains hundreds of thousands of objects and has a complex pointer structure (used to sort the elements in more than one direction at once and also as a hash map) plus several thousand collection objects referencing small subsets of the set of objects in the large data structure. This means there are cycles in the object graph--serialization is supposed to be able to handle that, right?
    Trying to serialize my object (using the writeObject method) initially resulted in a StackOverflowError, but I got results when I increased the stack size. Unfortunately, I still can't read them back in, even with a Java stack size of 10G and a native stack size of 8M.
    Has anyone overcome a problem like this by writing your own serialization routines? I don't see how I'd get around having the same problem the default versions seem to be having--I think they're simply going into deep recursion on the pointers before coming back up.
    Here is a snippet of the stack trace:
    java.lang.StackOverflowError
    at java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2163)
    at java.io.ObjectInputStream$BlockDataInputStream.readInt(ObjectInputStream.java:2658)
    at java.io.ObjectInputStream.readHandle(ObjectInputStream.java:1373)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1429)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1626)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1274)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1845)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1769)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1646)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1274)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1845)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1769)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1646)
    [The last four lines are repeated over and over.]
    By the way, I had the same problem I have seen reported (but not solved) here earlier with StackOverflowError not printing a stack trace, even when printStackTrace() is explicitly called on it, in Java 1.4.2. To get the stack trace above, I had to use Java 1.4.1_05.

    I've discovered that serializing the object with writeObject to a FileOutputStream also caused a StackOverflowError, even though it produced a file. That could explain why reading the file back fails, or at least it is probably the same problem in both cases.
    I get a very analogous error trying to write the object:
    java.lang.StackOverflowError
    at java.io.FileOutputStream.write(FileOutputStream.java:257)
    at java.io.ObjectOutputStream$BlockDataOutputStream.drain(ObjectOutputStream.java:1637)
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1601)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1323)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1302)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1245)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1052)
    [The last four lines are repeated over and over.]
    As I mentioned, I didn't override the writeObject method for any class, and I only make one call to writeObject on my main object--that one call results in this problem.
    I wonder if the default serialization mechanism could be going about things in a dumb way that could cause problems for large, multiply-linked data structures like mine.
    For example, suppose the pseudocode for writeObject looked like this:
    writeObject(Object o) {
         writeObject(first field);
         writeObject(second field);
         writeObject(last field);
    }
    Then suppose I had a doubly-linked list of 100,000 ListNodes like this:
    class ListNode {
         DataClass data;
         ListNode next;
         ListNode previous;
    }
    Then suppose I tried to serialize the first ListNode.
    The serialization would behave like this:
    write data;
    serialize next:
    ---write data;
    ---serialize next:
    ------write data;
    ------serialize next:
    ---------write data;
    ---------serialize next:
    There would be 100,000 function calls on the stack before it got to the end of the list, and it still wouldn't have finished serializing the first ListNode. Then when it got to the end of the list, it would continue serializing that object by proceeding through the previous pointers, so there would be 200,000 function calls on the stack before it finished processing the last ListNode. When you consider that my DataClass objects are themselves complex objects with pointers pointing, among other things, back to the ListNodes (I'm simplifying my structure for the purposes of discussion), it's easy for me to imagine that a serialization routine written in a simple way would simply fill up the stack before ever returning from its first function call. How many function calls would that take??
    The only way I see around that problem would be to rewrite all the serialization routines for my objects in some way so they wouldn't be recursive, however I could do that. Am I missing something here?
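One standard way around the deep recursion, and roughly what java.util.LinkedList itself does, is to mark the link fields transient and write only the payloads iteratively from custom writeObject/readObject methods, rebuilding the links on the way back in. A minimal sketch, assuming a String payload in place of DataClass (class and field names are illustrative):

```java
import java.io.*;

// A doubly-linked list whose nodes are never written to the stream
// directly, so deserialization never recurses node-by-node.
public class IterativeList implements Serializable {
    private static final long serialVersionUID = 1L;

    static class Node {          // not Serializable: rebuilt by readObject
        String data;
        Node next, previous;
        Node(String data) { this.data = data; }
    }

    transient Node head, tail;   // transient: excluded from default serialization
    transient int size;

    public void add(String data) {
        Node n = new Node(data);
        if (tail == null) { head = tail = n; }
        else { tail.next = n; n.previous = tail; tail = n; }
        size++;
    }

    // Write only a count and the flat payloads; stack depth stays constant.
    private void writeObject(ObjectOutputStream out) throws IOException {
        out.defaultWriteObject();
        out.writeInt(size);
        for (Node n = head; n != null; n = n.next) {
            out.writeObject(n.data);
        }
    }

    // Read the payloads back and relink them iteratively.
    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        int count = in.readInt();
        for (int i = 0; i < count; i++) {
            add((String) in.readObject());
        }
    }
}
```

The same pattern extends to real DataClass payloads, but if the payload objects themselves point back into the node structure, those back-pointers also have to be rebuilt in readObject rather than serialized.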

  • Storing data in server's database

    Hi, I am new to Java...
    I am developing a client-server program in Java. The client and the server will run on the same PC (it's a student project).
    I need to know about the available methods to store data in the server's database.
    I've got an idea about using arrays, but if there is any better way to do the storage, PLEASE HELP!
    Thanks in advance for your help.

    For a simple single-user application you could make the classes you use to hold data Serializable, so their content can be written to disk straight away and later read back and turned into objects again.
    For larger tasks, use a relational database like MySQL, Postgres, Sybase, Oracle, etc. and connect to it using JDBC.
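The Serializable-to-disk approach can be sketched in a few lines. The Student record class, its fields, and the file handling below are all made up for illustration:

```java
import java.io.*;
import java.util.ArrayList;
import java.util.List;

// Hypothetical record class: any class whose fields are themselves
// serializable can be written straight to disk.
class Student implements Serializable {
    private static final long serialVersionUID = 1L;
    String name;
    int id;
    Student(String name, int id) { this.name = name; this.id = id; }
}

public class SimpleStore {
    // Write the whole collection in one call; ArrayList is Serializable.
    static void save(List<Student> data, File file) throws IOException {
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(file))) {
            out.writeObject(new ArrayList<>(data));
        }
    }

    // Read it back and cast to the expected collection type.
    @SuppressWarnings("unchecked")
    static List<Student> load(File file) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(file))) {
            return (List<Student>) in.readObject();
        }
    }
}
```

This is fine for a single-user student project; once several clients write concurrently, a real database with JDBC is the safer choice.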

  • Serialization error in inbound idoc

    Hi Friends
    I am getting inbound IDocs with this error: Serialization error for object 01,S, 34343434,Expected counter 000001<
    The error detail says:
    The expected serialization counter has the value 000001. However, the serialization counter in the IDoc has the value 000002 and is therefore too big. There are therefore older IDocs with this HR object with serialization counter values in between. These IDocs have either not yet been posted, or have been posted incorrectly.
    The interesting part is that, as per the client's suggestion, I selected the check box in:
    IMG -> SAP NetWeaver -> Application Server -> IDoc Interface / Application Link Enabling (ALE) -> Modelling and Implementing Business Processes -> Configure Predefined ALE Business Processes -> Human Resource -> Master Data Distribution -> Serialize HR Master Data -> (Serialize HR Master Data in ALE inbound processing)
    With that setting I was getting this error, but when I take the check back (deselect the flag), the IDoc posts successfully. Now I am confused about whether I should keep this setting or not, because I am also getting data from different countries.
    What should I do about this? A few experts said I need to do something for serialization so that the system takes care of this automatically.
    Need your support on this.
    Regards
    Meeta

    I have been working with IDocs in a serialization-active HR-ALE system and have learned a few things along the way...
    - Serialization must be active in both sending/originating and receiving/destination systems.  (Turn on flag in customizing, as you have already done).
    - Both systems maintain a registry of HR objects that includes in each record the object type, object id, and serial counter.  The serial counter is increased by one in the sending system every time the object is sent via ALE and increased by one when the object is correctly processed in the receiving system.  The serial counter is passed in the E1PLOGI record of each object in the IDoc.
    - When a similar serialization error is received (where the counter received is lower than the expected counter), the error message gives a little more detail:   "Decide what to do based on your own particular situation. Responses could range from ignoring or editing the IDoc, to manually adjusting the serialization counter in the registry of the receiving system to the expected value."
    Options are therefore:
      - Edit IDoc - can use IDoc test tool (WE19) to create and edit a copy for processing.
      - Use transaction RE_RHALE_HRMDRGIN to maintain the serial counters
      - Use transaction RE_RHALE_RGIN2IDOC to adjust the serial counters for the offending IDoc
    I strongly recommend, however, that you take some time to further understand the serialization process before updating the counters manually.  If you've turned on, then off, then on the serialization flag(s) through customizing, you might consider initializing the serial counters in both sending and receiving systems and starting from scratch.

  • Doubt in ALE/IDOC's

    Hi Friends,
    I have 2 inbound IDocs:
    - the first IDoc posts the document in SAP,
    - the second one confirms that document.
    My problem is that if the second IDoc arrives first and tries to confirm before the document is posted, it fails. So I want the confirmation to be done only after the document is posted. How can I achieve this?
    Thanks& Regards,
    Naren.

    hi,
    there is an excellent concept in ALE called <b>SERIALIZATION</b>.
    1. goto transaction code SALE.
    2. MODELLING AND IMPLEMENTING BUSINESS PROCESS -> MASTER DATA DISTRIBUTION -> SERIALIZATION FOR SENDING AND RECEIVING DATA -> SERIALIZATION USING MESSAGE TYPES
    3. What you actually do is, for
    <b>example:</b>
    let us say IDoc1 is of message type CLFMAS and IDoc2 is of message type MATMAS. You configure in the path given above that MATMAS should be received first and CLFMAS next. This is known even to the partner.
    This is how you receive the IDocs in the correct sequence.
    hope it helps.
    reward if useful...

  • Windows Phone 8: Can't update a ListBox item in an XML file after using the hardware Back button?

    Hi all, I have a ListBox and I am saving the ListBox's selected item in an XML file. That works fine. The problem is that when I close my app, reopen it, and add more values to the ListBox, my previous value is removed from the XML file and from the ListBox. How can I save both my newly added value and my previous value in the XML file? I am using the following code:
    <Grid x:Name="ContentPanel" Grid.Row="2" Margin="15,10,15,0">
    <ListBox Name="list_location" Tap="list_location_Tap" Foreground="Black">
    <ListBox.ItemTemplate>
    <DataTemplate>
    <TextBlock x:Name="item_name" Text="{Binding description, Mode=OneWay}" Padding="5,15,5,15" TextWrapping="Wrap" FontSize="{StaticResource PhoneFontSizeLarge}"/>
    </DataTemplate>
    </ListBox.ItemTemplate>
    </ListBox>
    <ListBox Name="list_locationAdd" Background="Red" Foreground="Black" Visibility="Collapsed">
    <ListBox.ItemTemplate>
    <DataTemplate>
    <TextBlock x:Name="item_name" Text="{Binding description, Mode=OneWay}" Padding="5,15,5,15" TextWrapping="Wrap" FontSize="{StaticResource PhoneFontSizeLarge}"/>
    </DataTemplate>
    </ListBox.ItemTemplate>
    </ListBox>
    </Grid>
    And my back-end code is as follows:
    XmlWriterSettings x_W_Settings = new XmlWriterSettings();
    x_W_Settings.Indent = true;
    using (IsolatedStorageFile ISF = IsolatedStorageFile.GetUserStoreForApplication())
    using (IsolatedStorageFileStream stream = ISF.OpenFile(filename, FileMode.Create))
    {
        XmlSerializer serializer = new XmlSerializer(typeof(ObservableCollection<Prediction>));
        using (XmlWriter xmlWriter = XmlWriter.Create(stream, x_W_Settings))
        {
            data.Add(new Prediction() { description = App.professionalId });
            list_locationAdd.ItemsSource = data;
            serializer.Serialize(xmlWriter, data);
        }
    }
    protected override void OnNavigatedTo(NavigationEventArgs e)
    {
        try
        {
            if (list_locationAdd != null)
            {
                using (IsolatedStorageFile ISF = IsolatedStorageFile.GetUserStoreForApplication())
                using (IsolatedStorageFileStream str = ISF.OpenFile(filename, FileMode.Open))
                {
                    XmlSerializer serializer = new XmlSerializer(typeof(ObservableCollection<Prediction>));
                    ObservableCollection<Prediction> data = (ObservableCollection<Prediction>)serializer.Deserialize(str);
                    this.list_locationAdd.ItemsSource = data;
                    list_locationAdd.Visibility = Visibility.Visible;
                }
            }
        }
        catch (Exception ex)
        {
        }
    }

    Can you provide a working sample?  Upload to Onedrive and share it with us.
    Matt Small - Microsoft Escalation Engineer - Forum Moderator
    If my reply answers your question, please mark this post as answered.
    NOTE: If I ask for code, please provide something that I can drop directly into a project and run (including XAML), or an actual application project. I'm trying to help a lot of people, so I don't have time to figure out weird snippets with undefined objects and unknown namespaces.

  • How to set User Name in session?

    Can anyone tell me if there is a user name variable already stored in the session object to which I can assign the user's name? I usually do this by storing a variable in the session to hold that name. When I print the session object (I am using WebSphere) I get the following. You will notice that there is a user name field with the value "anonymous". How can I change that to store the actual user's name?
    Thanks in advance,
    jk.
    Session Object Internals:
    id : 1M12TXAPPYUZAJJJ4SS5IVY
    hashCode : 586456410
    create time : Sun Jun 30 15:17:38 MDT 2002
    last access : Sun Jun 30 15:17:40 MDT 2002
    max inactive interval : 1800
    user name : anonymous
    valid session : true
    new session : false
    session active : true
    overflowed : false
    session application parameters : com.ibm.servlet.personalization.sessiontracking.SessionApplicationParameters@385b1d5b
    session tracking pmi app data : com.ibm.servlet.personalization.sessiontracking.SessionTrackingPMIApplicationData@38581d5b
    enable pmi : true
    non-serializable app specific session data : {}
    serializable app specific session data : {}
    session data list : Session Data List -> id : 1M12TXAPPYUZAJJJ4SS5IVY next : LRU prev : MRU

    OK, I did some more reading of the WebSphere literature and came to understand that the user name indicated there is really set as part of an authenticated request from a secured page; if the request comes from an unsecured page, WebSphere automatically assigns "anonymous" to it.
    Security integration rules for HTTP sessions
    Sessions in unsecured pages are treated as accesses by "anonymous" users.
    Sessions created in unsecured pages are created under the identity of that "anonymous" user.
    Sessions in secured pages are treated as accesses by the authenticated user.
    Sessions created in secured pages are created under the identity of the authenticated user. They can only be accessed in other secured pages by the same user. To protect these sessions from use by unauthorized users, they cannot be accessed from an insecure page.

  • Queries with offline mode working

    Hi,
    I am pretty new to MI. I created a generic sync application in which I invoke a BAPI to create a sales order.
    I fail to understand:
    If I am offline and I create 2 orders using the client, where are the orders stored in the MI client? Are they stored automatically, or do I need to store them explicitly through code?
    When I go online and sync, will I get 2 separate inbound containers for each order?
    Your replies will be very helpful.
    Thanks in advance.
    Regards,
    Nakul

    hello nakul,
    what the generic sync can offer is the abstraction of
    the synchronization process by providing apis. other than
    that, you have to create it by yourself. for example, your
    data persistence and delta handling should be handled in
    your application.
    for your questions:
    >If I am offline and I create 2 orders using the client,
    >where are the orders stored in MI client? Are the stored
    >automatically? or I need to store them explicitely
    >through code?
    as i had described above, you have to persist your own
    data. the simplest is to make your data serializable and
    use the object streams for your persistence... the timing
    of creating your outbound container is usually during the
    sync.started event. your application should also have a
    way of managing your data states; i.e. which data have
    already been sent and which data wasn't; which data is the
    same state as in the table. and so on...
    >When I go online and sync will I get 2 separate Inbound
    >containers for each order?
    generally, when you send 1 outbound container, you will
    get a corresponding inbound container. this inbound container
    will be handed to your inbound processor according to the
    method you registered. inbound containers must be processed
    by you and persist your data locally for your application
    use. the framework temporarily saves the raw container files
    but deletes them after the inbound processors were notified.
    so if you did not process the inbound container during the
    notification, the incoming data in that container will be lost.
    hope this clears your doubt.
    regards
    jo

  • A simple DBMS and SQL engine parser

    hi guys,
    i'm about to start an interesting c++ project described below... what i need is that you give me links, advice and resources that could help me with my project... thanks for helping...
    here is the description:
    Assignment:
    In a nutshell, this assignment asks you to create a parse-tree for sample test SQL statements and actually create the underlying data structures to hold the data i.e. you are creating an extremely rudimentary SQL Engine (front-end parser) and a DBMS (back-end data structures). Our SQL Engine can be thought of as a bare-bone compiler. To test the SQL Engine and the DBMS, you will create a sample database. Finally (non-programming part), you will read up on the ODBC driver.
    Note that a real compiler and a DBMS have functionality that is way beyond what I am going to ask you to implement. In fact, your DBMS software creation may disregard issues regarding efficiency (those that couple the DBMS to the OS, i.e. page buffering, blocking, concurrency, etc.). Ours will be a data-centric view, i.e. we will only focus on how best to save the user data. Treat this as DBMS-101 (it is more like DBMS-1.01).
    Functionality:
    Your SQL-Engine/DBMS software must offer the following capability-
    Data: You must design data structures that will allow me to create tables and delete (drop) tables, insert tuples into tables, delete tuples from tables and modify tuple values in tables, possibly based on some criteria. Querying is of course the most important and most frequently performed activity, but do NOT worry about joining tables or complicated queries. Data serialization is of course implied by retrieval.
    Meta-Data/Data Dictionary: In addition, I should be able to look up names of all tables that exist or the table definitions that exist and so on.
    Future Requirements: I am being vague on this so that I may or may not add some small functionality later.
    Software Components:
    SQL Engine: Create a basic parser. Your parser should be able to parse the limited SQL statements that have been provided in Appendix A.
    - Include the test cases you used for the parser. Software Architecture and Implementation documents should also be attached.
    - Have the parser create parse trees for the sample SQL statements in accordance with your proposed grammar. The input could come from the command line or a stored file.
    - Print the parse trees (in-order and pre-order) for sample input to present proof of correctness.
    - That completes the front-end of our SQL engine.
    DBMS: Propose a data structure which will house the actual database. The back-end of the program will "walk" the parse tree and create/fill in the data structures in the back-end DBMS accordingly.
    - Data Dictionary - meta-data details, i.e. lists details about tables and fields.
    - Data Definition - creation of tables.
    - Data Manipulation - update/modify, storage and retrieval of records.
    - Disregard issues regarding efficiency - those that couple the DBMS to the OS, i.e. page buffering, blocking, etc.
    Database: Create a sample database to test your SQL/DBMS software. You must attach database schemas, tables with inserted values in your final submission.
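    The pre-order and in-order printouts asked for above are just the standard tree traversals applied to the parse tree. A minimal C++ sketch, using a hypothetical node type for a clause such as "age > 30" (the node layout and token names are illustrative, not part of the assignment):

    ```cpp
    #include <iostream>
    #include <memory>
    #include <string>

    // Hypothetical parse-tree node: an operator or operand token
    // with optional left/right children.
    struct Node {
        std::string token;
        std::unique_ptr<Node> left, right;
        explicit Node(std::string t, std::unique_ptr<Node> l = nullptr,
                      std::unique_ptr<Node> r = nullptr)
            : token(std::move(t)), left(std::move(l)), right(std::move(r)) {}
    };

    // Pre-order: node first, then left subtree, then right subtree.
    std::string preOrder(const Node* n) {
        if (!n) return "";
        return n->token + " " + preOrder(n->left.get()) + preOrder(n->right.get());
    }

    // In-order: left subtree, then node, then right subtree.
    std::string inOrder(const Node* n) {
        if (!n) return "";
        return inOrder(n->left.get()) + n->token + " " + inOrder(n->right.get());
    }

    int main() {
        // Tree for the WHERE clause "age > 30": '>' at the root, operands as leaves.
        auto tree = std::make_unique<Node>(">",
            std::make_unique<Node>("age"),
            std::make_unique<Node>("30"));
        std::cout << "pre-order: " << preOrder(tree.get()) << '\n';  // > age 30
        std::cout << "in-order:  " << inOrder(tree.get()) << '\n';   // age > 30
    }
    ```

    Note that the in-order walk of an expression subtree reproduces the clause as written, which is a handy sanity check for the "proof of correctness" printouts.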

    I have no comments on the parsing of the SQL language. Without looking at Appendix A, I can't tell what you are being asked to do. However, I do have a suggestion for a rather bogus back-end storage scheme that seems to fit the spirit of the assignment.
    Create a directory named DB and in it keep files that are named things like
    int_Employee_ID
    string10_Employee_Name
    where you follow the naming convention
    DataType_TableName_FieldName
    What this does for you is this:
    To create a table, just create files with the names and the data types that you want.
    To delete a table, just delete the files that have that table name as the second component.
    The data dictionary is essentially a dump of the filenames in the directory.
    Given that the data type is encoded in the file name, it is easy to locate the ith element in any file for reading or writing.
    Yes, it is bogus. It requires that you make dozens of disk reads to pick up all the pieces of a single record. Good thing efficiency was not listed as a requirement. You would NOT want this structure if you were looking for speed, or for making sure that no transaction gets only halfway done during a power outage.
    But since the assignment is NOT about building databases but rather about parsing SQL statements, this layout gives you a fairly transparent way to represent your data and allows you to do nice bogus things like creating your initial data with a simple text editor.
    Since there are no joins to be requested of your system, all your SQL queries will essentially boil down to building filters that run over one or more of these files and essentially produce a list of indices for records in a single table.
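    For concreteness, here is a minimal C++17 sketch of that file-per-column layout (the DB directory name and the Employee table and field names are illustrative). Using fixed-width values means record i simply starts at byte offset i * width:

    ```cpp
    #include <filesystem>
    #include <fstream>
    #include <iostream>
    #include <string>
    #include <vector>

    namespace fs = std::filesystem;

    // One file per column, named DataType_TableName_FieldName, inside ./DB.
    const fs::path kDbDir = "DB";

    // Creating a column of a table = creating an empty file with that name.
    void createColumn(const std::string& type, const std::string& table,
                      const std::string& field) {
        fs::create_directories(kDbDir);
        std::ofstream(kDbDir / (type + "_" + table + "_" + field)).close();
    }

    // Append a value padded to the fixed width encoded in the data type
    // (e.g. string10 => 10 bytes per record).
    void appendValue(const fs::path& file, const std::string& value, size_t width) {
        std::string padded = value;
        padded.resize(width, ' ');
        std::ofstream out(file, std::ios::app | std::ios::binary);
        out.write(padded.data(), static_cast<std::streamsize>(width));
    }

    // Record i lives at offset i * width, so reads are a single seek.
    std::string readValue(const fs::path& file, size_t i, size_t width) {
        std::ifstream in(file, std::ios::binary);
        in.seekg(static_cast<std::streamoff>(i * width));
        std::string buf(width, '\0');
        in.read(buf.data(), static_cast<std::streamsize>(width));
        return buf;
    }

    // The data dictionary is just a listing of the directory.
    std::vector<std::string> dataDictionary() {
        std::vector<std::string> names;
        for (const auto& e : fs::directory_iterator(kDbDir))
            names.push_back(e.path().filename().string());
        return names;
    }

    int main() {
        createColumn("string10", "Employee", "Name");
        fs::path col = kDbDir / "string10_Employee_Name";
        appendValue(col, "Alice", 10);
        appendValue(col, "Bob", 10);
        std::cout << "record 1: [" << readValue(col, 1, 10) << "]\n";
        for (const auto& n : dataDictionary()) std::cout << n << '\n';
        // Dropping a table would just delete every file whose second
        // underscore-separated component matches the table name.
    }
    ```

    As the post says, this is deliberately bogus - one disk read per field per record - but every operation the assignment asks for maps onto a plain file operation you can inspect with a text editor.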

  • Compiling iTunes U Usage Reports?

    Has someone come up with a good way to compile the iTunes U Reports into something useful? I appreciate Apple's "iTunes U: Usage report" for knowing what tracks are getting hits, but putting the data together week by week is a pain. I was wondering if someone had figured out a better way? Thanks for any help or suggestions.

    I used good ol' fashioned Perl to create a web app for distributing stats to our various internal players.
    I run GetDailyReportLogs from a cron job once a night to get the stats, then use Perl to parse them into meaningful data by year, month, series, episode, etc. Then, my favorite part, I use Perl's Data::Serializer to save all this data to files.
    When a user loads the app, I just pull up the already-processed data file and pass it to HTML::Template files. Because the data is already in the correct format for my various line charts, bar graphs and tables, the page loads very quickly, I'm not reprocessing the same data over and over, and I'm using AJAX to pull in the template for each chart individually.
    So, using the iTunes U web services and Perl, I was able to pretty easily create a web app that lets users dig through their own usage stats. They can view top series, total downloads, most viewed episodes for the year, month, series, etc.
    Perl might be old-fashioned, but I still love it.
    -Chris
