Annotation (metadata) working...

At least @Overrides is working OK in beta-1. Pretty cool. Mr. J. Bloch would probably update his book "Effective Java" to advise adding @Overrides to all methods that are intended to override a concrete method declaration in a superclass.
import java.util.*;
public class Test {
     @Overrides public boolean equals(Test that) {
          return true;
     }
     public static int myMethod(List<?> value) {
          return value.size();
     }
     public static void main(String[] args) {
          System.out.println(myMethod(new ArrayList<Integer>()));
     }
}
The expected compilation error follows (equals must have a parameter of type Object, not Test):
C>javac -source 1.5 Test.java
Test.java:4: method does not override a method from its superclass
        @Overrides public boolean equals(Test that) {
         ^

Reading an annotation reflectively requires RetentionPolicy.RUNTIME. The default is CLASS (present only in the class file, not readable at runtime), and @Overrides is SOURCE (present only in the source file; it is a pragma, an instruction to the compiler).
import java.lang.annotation.*;
import java.lang.reflect.*;
import java.util.*;
@Retention(RetentionPolicy.RUNTIME)
@interface Copyright {
     String value();
}
@Copyright("©2004 edsonw") public class Test {
     @Copyright("©2003 edsonw") @Overrides public boolean equals(Object that) {
          return true;
     }
     @Copyright("©2002 edsonw") public static void main(String[] args) {
          for (Annotation ann : Test.class.getAnnotations()) {
               System.out.println("Class :" + ann);
          }
          for (Method meth : Test.class.getMethods()) {
               for (Annotation ann : meth.getAnnotations()) {
                    System.out.println("Method " + meth + ": " + ann);
               }
          }
     }
}
Class :@Copyright(value=©2004 edsonw)
Method public static void Test.main(java.lang.String[]): @Copyright(value=©2002 edsonw)
Method public boolean Test.equals(java.lang.Object): @Copyright(value=©2003 edsonw)
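To see the two retention policies side by side, here is a small self-contained sketch (the annotation and class names are made up for illustration): a SOURCE-retained annotation, like @Overrides, never shows up in a reflective scan, while a RUNTIME-retained one does.

```java
import java.lang.annotation.*;

public class RetentionDemo {
    @Retention(RetentionPolicy.RUNTIME)
    @interface KeptAtRuntime {}

    // SOURCE retention: discarded by the compiler, just like @Overrides
    @Retention(RetentionPolicy.SOURCE)
    @interface SourceOnly {}

    @KeptAtRuntime @SourceOnly
    static class Target {}

    public static void main(String[] args) {
        // Only the RUNTIME-retained annotation is visible reflectively
        System.out.println(Target.class.getAnnotations().length); // prints 1
    }
}
```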

Similar Messages

  • Photoshop Elements 11 and Metadata Working Group Guidelines

Could you please let me know whether Adobe Photoshop Elements 11 is fully compliant with the Metadata Working Group Guidelines?
    Thank you

    PSE11 supports the IPTC data fields but captions (description) and keywords are stored in the catalog unless metadata is written to files. GPS co-ordinates written by a digital camera or device can also be read by PSE11.
    http://www.adobe.com/products/xmp/

  • H.264 + Annotations/Metadata = No go

    I am using QuickTime 7 Pro to convert some DV footage into MPEG-4 H.264 encoded files. I use the Movie Properties menu to add metadata (or "annotations") such as Artist, Author, Comments, etc. The problem I'm having is, once the video is exported out to the MP4 file, all of the metadata save Copyright is gone.
I refuse to believe that MPEG-4 files cannot handle metadata tags, so what is the problem here? Is it a problem with QuickTime 7? Any advice for getting full metadata tags in there?

    That doesn't work. I'm having the same problem. In my case I'm exporting to iPod format (.m4v).
    The original .mov movie has all the annotations added as the last thing before saving it.
    When I export it to an .m4v file for ipod all the annotations save the copyright are lost.
    If I open the .m4v file and add the annotations again, a "Save" or "Save As" tries to save it as a .mov file again. The only way to get a .m4v file is to do the export again, and that doesn't preserve the annotations.
    I'm trying to add a second episode to a video podcast. In the case of the first episode iTunes got the title and other annotations from the RSS feed. I'm guessing if the second episode doesn't have its own annotations it will end up showing the same description on iTunes. Not a good thing.
Can any Apple engineers confirm if this is a bug or otherwise?
    TIA

  • How does metadata work with Bridge and Premiere?

We are trying to figure out how to start using metadata with Bridge, but it doesn't seem to work. I can load an AVI file into Premiere and change the metadata fields for description or scene or log or whatever; when I load it into Premiere I see that data, but when I look at it in Bridge I don't see any of the metadata.
I can go into Bridge and edit the metadata for a file, and when I look at it in Bridge it is always there. But looking at that same file in Premiere, there is no metadata.
We have 8 Premiere workstations on a Fibre Channel network solution we edit with. We want to have about 5TB of stock video clips available with metadata that every machine can search and use in projects.

I know nothing about Premiere, but perhaps they use different methods to store data. Bridge puts the metadata on the file, while some programs like PS Elements put the metadata in a table. With PSE one can select "write metadata to file" and solve the problem. Check with Premiere to see how the data is stored.
If you are entering keywords, a good check is to look with Windows Explorer and see if there are any "tags" listed after entering them in Premiere. WE and Bridge read the same.

  • Lens Exif metadata worked for KM but no more for Sony cams

    Playing with the metadata panel in Aperture 2.1, I was surprised to see the information for the lens manufacturer/model for some shots I did with my old Konica-Minolta 5D DSLR.
    Sony took over the Konica-Minolta DSLR system about 2 years ago and as many others I have since moved to a Sony A100 body and recently to an A700 body.
    Using the very same lenses, that give me detailed lens Exif metadata with the old Konica-Minolta 5D, the metadata field is empty with Sony A100 and A700 files.
A bit annoying that the 4-year-old Konica-Minolta 5D is supported in Aperture, while the follow-up models from Sony are not.
    Hope this will be fixed in the next update.
    Peter

    Steve,
you are probably right that Sony is encoding the lens data in a somewhat non-standard way.
    But as far as I know, in the EXIF of the Konica-Minolta files, this was done as well in a non standard way, by storing a binary lens code (at least I don't find any plaintext lens data in the image file, when I peek into it with a low level file-editor like Hexedit). As the lens information is displayed in clear text for the Konica-Minolta files within Aperture, Apple must have already integrated a lens database holding data for Konica-Minolta and Sony lenses. Interestingly even new Sony lenses like the Carl-Zeiss 16-80 lens show up, when used on an old Konica-Minolta DSLR.
    For me it looks like Sony might store the same lens-code now in a different proprietary EXIF field but Aperture is still looking into the old Konica-Minolta EXIF field, which is non existent or empty.
So yes, Sony probably broke it, but there was enough time for Apple to fix it, especially as they have updated the lens database all the time. It is weird to add all recent Sony lenses to Aperture's lens database, but make the data available only if you use the lenses on a 5-year-old Konica-Minolta body.
    So I repeat my request for Apple to fix this in Aperture soon.
    Peter

  • Metadata work Flow

    Dear all,
    Can I have some words of wisdom please.
    I import into Lightroom, then do my stuff.
    Now I then go to the image folders and using RoboGeo add GPS to selected images.
    I then ask lightroom to read metadata from file to update the GPS in Lightroom.
    Am I doing right ?
Does Lightroom overwrite data with blank fields in this case, or just add what isn't blank?
I thought this was a nice idea, but I am beginning to wonder whether the GPS should be done before bringing images into LR.
    Many thanks to all in advance,
    simon
    (LR1.3 on a 1.66 dual core, 2GB memory, XP home 32, AVG, Happy.....)

    Simon
    I'm not quite sure what it is you're asking. Are you concerned that any metadata that you currently have in your files will be lost by adding metadata from RoboGeo? If so, I don't THINK this would be the case but why not try a little test first. Right click and make a copy of an image that already has metadata attached without using RoboGeo and stick it some place handy - the desktop maybe. Now add the geotagging data to this image using RoboGeo followed by importing it into LR. If all your original metadata remains intact and only the geotagging data has been added then I think it's a safe bet that you could then go ahead and geotag the remaining images already in LR.
    I guess the simplest way of doing this would be to leave your images exactly where they are and point RoboGeo in their direction. Do the business with the programme and then open up LR. After a few seconds those images should appear with a badge in the top right hand corner saying that metadata has been changed in another application. Click on the badge and update the metadata. Either that or update the metadata from the metadata menu on the menu bar.
    If you have a look in the right hand panel in the Library for the metadata on an image that HASN'T been geotagged you'll notice that there isn't even a blank place to enter any geotagging data. This extra field is only added in LR when it detects EXTRA metadata - in this case geographical coordinates. So it's a pretty safe guess that RoboGeo is only adding to the metadata (EXIF) rather than taking anything away from it, and that LR is now only reading and adding the extra metadata it has been supplied with.

  • @EJB annotation in JSF managed beans not working

    Hi all,
I've been trying to get the @EJB annotation to work in a JSF managed bean without success.
    The EJB interface is extremely simple:
    package model;
    import javax.ejb.Local;
    @Local
public interface myEJBLocal {
    String getHelloWorld();
    void setHelloWorld(String helloWorld);
}
    and the bean code is simply:
    package model;
    import javax.ejb.Stateless;
    @Stateless
public class myEJBBean implements myEJBLocal {
    public String helloWorld;
    public myEJBBean() {
        setHelloWorld("Hello World from myEJBBean!");
    }
    public String getHelloWorld() {
        return helloWorld;
    }
    public void setHelloWorld(String helloWorld) {
        this.helloWorld = helloWorld;
    }
}
    When I try to use the above EJB in a managed bean, I only get a NullPointerException when oc4j tries to instantiate my managed bean. The managed bean looks like:
    package view.backing;
    import javax.ejb.EJB;
    import model.myEJBLocal;
    import model.myEJBBean;
public class Hello {
    @EJB
    private static myEJBLocal myBean;
    private String helloWorld;
    private String helloWorldFromBean;
    public Hello() {
        helloWorld = "Hello from view.backing.Hello!";
        helloWorldFromBean = myBean.getHelloWorld();
    }
    public String getHelloWorld() {
        return helloWorld;
    }
    public void setHelloWorld(String helloWorld) {
        this.helloWorld = helloWorld;
    }
    public String getHelloWorldFromBean() {
        return helloWorldFromBean;
    }
}
Am I missing something fundamental here? Aren't you supposed to be able to use an EJB from a JSF managed bean?
    Thanks,
    Erik

    Well, the more I research this issue, the more confused I get. There have been a couple of threads discussing this already, and in this one Debu Panda states that:
    "Support of injection in JSF managed bean is part of JSF 1.1 and OC4J 10.1.3.1 does not support JSF 1.1"
    10.1.3.1 Looking up a session EJB with DI from the Web tier
But if you look in the release notes for Oracle Application Server 10g R3, it is explicitly stated that JSF 1.1 is supported. So I'm not sure what to believe.
    I've also tried changing the version in web.xml as described here:
    http://forums.java.net/jive/thread.jspa?threadID=2117
    but that didn't help either.
    I've filed a SR on Metalink for this, but haven't got any response yet.
    Regards,
    Erik
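A side note on the NullPointerException above, sketched under my own assumptions rather than from Erik's deployment: containers populate @EJB fields only after the constructor returns (that is what @PostConstruct is for), so reading myBean inside the Hello() constructor fails regardless of server support, and @EJB on a static field is also non-portable. A plain-Java stand-in for the timing problem (the Hello class here is illustrative, not the original code):

```java
public class InjectionTiming {
    static class Hello {
        // In a container, @EJB would populate this AFTER construction;
        // during the constructor it is still null.
        String myBean;

        Hello() {
            // Dereferencing the not-yet-injected field throws
            // NullPointerException, matching the failure described above.
            System.out.println(myBean.toUpperCase());
        }
    }

    public static void main(String[] args) {
        try {
            new Hello();
        } catch (NullPointerException e) {
            System.out.println("NPE: field not yet injected during construction");
        }
    }
}
```

Moving the `myBean.getHelloWorld()` call out of the constructor and into a @PostConstruct-annotated method (and dropping `static` from the field) is the usual fix.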

  • Custom Annotation :: inject object at runtime

I would like to develop a custom annotation to inject an object at runtime. The annotation will work as follows:
1. Develop a ValidatorClass annotation.
2. If a field is annotated, an object of the named class will be created at runtime and injected into it; e.g. in class A it will inject a validatorimpl object into the vald field.
    Please let me know how to implement that.
package com.debopam.validate;
public interface validator {
    public String validate();
}

package com.debopam.validate;
public class validatorimpl implements validator {
    public String validate() {
        return "true";
    }
}

package com.debopam;
public class A {
    @ValidatorClass(name="com.debopam.validate.validatorimpl")
    validator vald;
}

    yogeshd,
    It might be that the .class file for the annotation that you are compiling against is the one you have given us the source code for above, but when you run the program, the class file for the annotation is not the same, and is one that was compiled before you added the static field.
    This can happen if your classpath is wrong. (and would explain why the problem only occurs sometimes - because it is only failing when the classpath is wrong).
    If you run the following program with the Fully qualified name of your annotation as the argument, and with the same classpath as you run your program, it can tell you where it is finding the annotation class file. Check that this is the same one that you are compiling to.
class WhereLoaded {
    public static void main(String[] args) {
        Class theClass = null;
        if (args.length > 0) {
            try {
                theClass = Class.forName(args[0]);
            } catch (Exception e) {
                e.printStackTrace();
                theClass = Object.class;
            }
        } else {
            System.out.println(
                "The first argument should be a fully qualified class name," +
                " using java.lang.Object instead");
            theClass = Object.class;
        }
        String name = theClass.getName();
        name = name.substring(name.lastIndexOf(".") + 1) + ".class";
        System.out.println(name + " loaded from " + theClass.getResource(name));
    }
}
regards
    Bruce
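As for the original question, the runtime injection Debopam describes can be sketched with a few lines of reflection. This is a minimal illustration, not production code: it uses simple class names in the default package instead of the com.debopam packages, and the annotation definition here is my assumption about what @ValidatorClass would look like.

```java
import java.lang.annotation.*;
import java.lang.reflect.Field;

// Assumed definition of the poster's @ValidatorClass annotation
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface ValidatorClass {
    String name();
}

interface Validator {
    String validate();
}

class ValidatorImpl implements Validator {
    public String validate() { return "true"; }
}

class A {
    @ValidatorClass(name = "ValidatorImpl")
    Validator vald;
}

public class Injector {
    // For every field annotated with @ValidatorClass, load the named class,
    // instantiate it, and assign the instance to the field.
    static void inject(Object target) throws Exception {
        for (Field f : target.getClass().getDeclaredFields()) {
            ValidatorClass ann = f.getAnnotation(ValidatorClass.class);
            if (ann != null) {
                f.setAccessible(true);
                f.set(target,
                      Class.forName(ann.name()).getDeclaredConstructor().newInstance());
            }
        }
    }

    public static void main(String[] args) throws Exception {
        A a = new A();
        inject(a);
        System.out.println(a.vald.validate()); // prints "true"
    }
}
```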

  • How to change the timezone in photo metadata?

After returning from a recent trip out west, I forgot to change my camera's timezone to the local time.  As a result, hundreds of photos from a long weekend trip were recorded with the wrong timezone.  I imported them to Lightroom 5 and discovered the problem.  Of course, I used the Edit Capture Time feature as described in the online help: https://helpx.adobe.com/lightroom/help/metadata-basics-actions.html#change_the_photo_capture_time
    But that changes the time of each photo's metadata, rather than the timezone.  There's a subtle difference; the better solution would be to change the timezone so that all future interpretations and exported times would be correct.  For example, here's the original metadata from a RAW image:
    DateTimeOriginal                : 2014:08:23 09:13:45
    Timezone                        : -07:00
    and here's what appears on an exported JPG file, after the capture time was edited:
    DateTimeOriginal                : 2014:08:23 11:13:45
    TimeCreated                     : 11:13:45
    DigitalCreationTime             : 09:13:45
    DateTimeCreated                 : 2014:08:23 11:13:45
    DigitalCreationDateTime         : 2014:08:23 09:13:45
    [These metadata reports were produced by exiftool.]  Notice how the RAW image includes Timezone (though sadly the JPG export does not).  More to the point, look at the variety of timestamps produced for the JPG file, none of which mention the timezone and two of which have the wrong time (9:13am instead of 11:13am).  Without the timezone information the latter is hard to interpret and looks inconsistent.  Is this a bug in LR5?  or is it a missing feature (to not record the timezone on export)?

    There's not a simple explanation of the behavior you're observing.  It's not a bug -- LR is doing a reasonable job of following imperfect industry standards. 
    The photo/camera industry in general does not make it easy to deal with time zones. LR's general approach is to preserve whatever time zones are present in the metadata but otherwise ignore them -- it doesn't change them, and it doesn't display them. As you discovered, the Edit Capture Time command does not let you change time zone, but it does preserve whatever time zones are present in the metadata  (but see below for one exception).   LR's handling of time zones conforms with Metadata Working Group's Guidelines For Handling Image Metadata 2.0, an industry standard supported by Adobe, Apple, Canon, Microsoft, Nokia, and Sony.
    The EXIF metadata inserted by cameras does not provide an official field to record time zone (folklore among metadata geeks is that one of the technical leaders of the spec, when challenged about that, asked why anyone would want time zones).  Some cameras insert the non-industry-standard EXIF field TimeZoneOffset.  I don't know the entire history of this field, but it's not defined in the EXIF 2.2 or 2.3 specs, nor mentioned in the Metadata Working Group's Guidelines For Handling Image Metadata 2.0.  The field is not well-defined -- it only allows offsets of whole hours, whereas over a billion people live in a time zone whose offset is +5:30 (India). 
    LR preserves EXIF:TimeZoneOffset in your original photo, but it otherwise ignores the field, and it doesn't get copied to exported photos.  This is not unreasonable, given that the field is non-standard.
    The XMP metadata fields, typically inserted by software applications, do support time zones, but including a time zone is optional.  LR preserves XMP time zones that are present when the image is imported, even if you change the date/time with Edit Capture Time.  Your original photo did not contain any XMP fields (cameras typically don't insert XMP), so there were no time zones there for LR to preserve.
    The Digital Creation date/time fields you observed are intended to record when an image was digitized.  With digital cameras, that's generally the same as capture time, but with scanned images, the date/time of digitization (scanning) is different from the date/time the image was originally captured.  LR's handling of these two different date/times isn't great, but it's adequate for most users.  LR will insert the digitization date/time when an image is first imported (if it doesn't already exist), but Edit Capture Time changes just the capture time, not the digitization date/time.  (A few years ago Adobe changed Edit Capture Time in a point release to also change digitization date/time, on the theory that LR is designed for images from digital cameras, and the two date/times should always be the same; but a number of users who also used LR to manage scanned images complained loudly that LR was overwriting precious metadata, and Adobe backed out that change in the next release.)
    So Edit Capture Time will leave photos taken by digital cameras with an incorrect digitization date/time.  But very few users care about that -- it's only users with scanned images who care about the difference, and at least LR will avoid overwriting digitization date/time.
A suggestion for using Exiftool to examine image metadata: use the "-a -G" options, which cause Exiftool to show which section of the metadata each field comes from.  Given the complexity of the legacy metadata industry standards, this will help you better understand what your camera and software tools are doing.  For example:
    [EXIF]          Modify Date                     : 2014:08:31 10:01:45
    [EXIF]          Date/Time Original              : 2014:08:13 11:53:44
    [EXIF]          Create Date                     : 2014:08:13 08:53:44
    [IPTC]          Date Created                    : 2014:08:13
    [IPTC]          Digital Creation Date           : 2014:08:13
    [XMP]           Modify Date                     : 2014:08:31 10:01:45-07:00
    [XMP]           Create Date                     : 2014:08:13 08:53:44
    [XMP]           Metadata Date                   : 2014:08:31 10:01:45-07:00
    [XMP]           Date Created                    : 2014:08:13 11:53:44
    [Composite]     Date/Time Created               : 2014:08:13 11:53:44
    [Composite]     Digital Creation Date/Time      : 2014:08:13 08:53:44

  • Is there a way to tag faces in Lightroom that is exportable as metadata?

    I'm scanning a bunch of old family photos and I want to be able to tag faces of relatives and export those names and the corresponding faces as metadata. Ideally this metadata would be in some universal format that is then readable by everything- Windows, Mac OSX, Photoshop, Powerpoint, Facebook, Instagram, etc.
    Is that possible via XMP?
    Searchable keywords are great, but they don't go far enough. In a photo of a family reunion from 40 years ago, I need to attach each name to a specific face and know that some future family member will be able to decipher those tags 40 more years into the future.
    I think everyone could benefit from this. Architectural photographers could tag each piece of furniture with the name of the designer, sports photogs could tag several athletes within each pic, wedding photographers could tag the wedding party and family members.
    Is there a way to do that now? If not, is this ability coming soon?

    So. I've done a lot of reading since last night, including some papers from the Metadata Working Group.
    Here's what I think I've learned:
    the feature I want, that I've been calling "face tagging", is called "Image Region Metadata" by software companies
    Adobe created support for image region metadata within XMP
    Adobe, Apple, Canon, Microsoft, Nokia and Sony founded something called the Metadata Working Group to standardize the use of increasingly complex metadata
    The Metadata Working Group published a set of guidelines in Nov 2010 that included guidelines for image region metadata
    As of now, only Google Picasa and Windows Photo Gallery actually support any kind of XMP image region metadata (or what I called "face tagging")
    Did I get all that correct?

  • Author metadata separated by semi-colons truncated in file properties and "Get Info"

    I'm using Acrobat Professional 9.0 (CS3) for Mac to edit the metadata for a collection of PDFs to be made available on the web. When I enter the data, I am inputting a list of authors separated by commas, like this: Smith J, Watson C, Brown J. If I click on "Additional metadata", the data I've already entered is transposed into the various XMP fields. And the commas separating the author names are changed to semi-colons. I gather that this happens because XMP wants to separate multiple authors with semi-colons, and Acrobat wants the metadata in XMP fields to match the metadata stored for the file properties. Fine.
    However, if I save such a PDF and then use Get Info on my Mac (OSX 10.4) to look at the file properties, the list of authors is now truncated where the first semi-colon appears. The list is also truncated in Windows XP if I right-click and select properties. The list is also truncated when I look at the file properties in Preview on my Mac, or if I look at file properties using FoxIT, or using Adobe Acrobat Reader 7 or earlier. The only way a site visitor will actually be able to view the full list of authors in a file saved this way is to use Adobe Reader version 8 or later.
    I would like to preserve XMP/Dublin Core/etc metadata in the proper format in the XMP code, but would also like users of standard, popular file viewers to be able to access the full list of authors. Is there a way to do this with Acrobat 9?
    Also, once I've saved a file and the XMP metadata has been altered, Acrobat seems to permanently change the way that the authors are listed in the file properties. I cannot manually change those settings any longer without Acrobat overriding my changes and converting commas to semi-colons, or surrounding the entire list of authors in quotation marks. Is there a way to get around these Acrobat overrides and manually take control of my metadata again?
    Does Windows Vista read the authors list correctly in the file properties if it is separated by semi-colons?
    It seems to me that in an attempt to get XMP metadata working smoothly across the entire CS line, Adobe has jumped the gun somewhat and is now forcing Acrobat users to use "file properties" metadata that is really only fully compatible with Adobe products. Is there a way I can get some backwards compatibility on this?
    Thanks for any suggestions or insight anyone can provide to this vexing issue.
    Phil.

    Bridge has some pretty powerful and helpful features. However, I am unable to figure out how to access the non-XMP "file properties" fields through Bridge, and if I add metadata via Bridge, then I run into the same problem regarding the use of semi-colons to separate authors.
    If I had more time, or a larger set of files I might investigate the use of ExtendScript to import all my metadata from an Excel file (where it already exists) into the PDF file properties and XMP metadata.
The best solution for my case though appears to be to use Acrobat 9 and to do the double-edit process for each file. I should be able to just cut-and-paste the metadata from the Excel file, and then if I save the Authors list for last, I can simply paste it once into the XMP field (through the Advanced metadata button) and then return to the regular file properties page and paste it again there, where Acrobat will add quotes around it.
    Lastly, if anyone else happens to find this post and is looking for similar information, I would recommend searching in the Bridge forum as well as the Acrobat forum.
    Phil.

  • Updating/sharing keywords and metadata in Lightroom 4

    I will be using an intern to help add keywords & metadata to an extremely large group of photos in Lightroom 4.
    The intern will have their own workstation so we'll be passing files back and forth.
I need to add their work to mine, keeping all of the keyword/metadata work tied to the virtual photos which already have some work done to them (some keywords, metadata and photo enhancements already in place on some photos).
    step-by-step appreciated
    What is the best way to do this?
    I'm still learning LR myself.
    suggestions welcomed!
    thank you

Lightroom is not a multi-user application, so you can achieve the result you want via tedious manipulation of files, but I should also note that if you make one mistake, you potentially have a real mess on your hands.
    The best solution (in my opinion) is to put the catalog file and all photos on a server that you can both access. The drawback to doing this is that only one person can be working with a specific Lightroom catalog at a time. And given my usual advice to use only one catalog, this may not meet your needs.
    If that's not a good solution, then the intern can create her own temporary catalog of the portion of the photos to be worked on today, add keywords and other metadata, and then pass the catalog file to you to merge with your main catalog using the command File->Import from another catalog.
    As another alternative, if the only task the intern will carry out is adding keywords and other metadata, there are many freeware solutions that your intern can use for this, and then when she is done in a particular folder, you can in Lightroom select all the photos in that folder, and then read the metadata from the photo files. For Windows, two such freeware programs are GeoSetter and ExifToolGUI.

  • What Metadata Standard is used for Face Regions?

    Nice to see that LR6 now has support for Face Recognition and Face Regions. However, nowhere in the online help is it stated explicitly what metadata standards are used for the image regions.
    I would hope that as Adobe is a member of the Metadata Working Group, then LR6 is using the metadata standard specified by the MWG for image regions.

    It's the MWG.
    Within Lightroom itself, the face regions are recorded inside the catalogue and this data is associated with keywords.
When you export a photo with the option Remove Person Info unchecked, the names are also written to the IPTC Extension Person Shown field and to the MWG Regions.
I suggest you export a photo as above and examine its metadata in the Advanced tab of Bridge's File Info dialog box, or in some other metadata tool.

  • Problem about annotation @WebServlet

    Hi everybody.
    I'm working with Tomcat 7 and I'm developing a servlet named Booklet.
    I want to use annotations instead of web.xml.
    So, I annotate my servlet class:
    @WebServlet(urlPatterns="/Answers",displayName="Booklet Servlet")
    The servlet works fine, the URL is just the one I wrote in the annotation, but I can't see the display name in Tomcat Manager.
    If I add a simple web.xml containing <display-name>Booklet Servlet</display-name>, leaving the annotation unchanged, the display name shows up !
It is as though the annotation works for urlPatterns but not for displayName.
    Any suggestions ?
    thanks,
    Virgilio

I tried the name attribute, but nothing changed; the display name column in Tomcat Manager remained blank.
I agree it isn't a showstopper bug, but it's a bit annoying not to see the display name, 'cause my customer wants to see it (he was the one who decided the string to show in Tomcat Manager), so I must keep the web.xml file ONLY for that (I hoped I could get rid of it).
    well, I can go on with web.xml....

  • Annotations missing from toplink.jar

    The download of Toplink 11g includes the toplink.jar along with a few other jar files and a source .zip file. The toplink.jar does not include the oracle.toplink.annotations package which is necessary to use the @Convert and @Converter annotations for working with datatypes Toplink does not already understand. These classes are included in the source distribution. Am I missing something? Do I need to build the jar file from source to get the annotations package or is this available in another jar file?
    Thanks,
    Scott Akins

    The TopLink 11g release has a toplink.jar which delivers native ORM/OXM/EIS functionality focused on backwards compatibility with previous releases (uses the oracle.toplink.* packaging).
    The release also includes the EclipseLink libraries where all of the annotation support is delivered focused on implementing JPA with annotations and extended XML configuration. If you want to use JPA and our advanced annotations you should be using the included EclipseLink library.
    Doug
