Pageless Managed beans and operation bindings, design question

Hi,
I am using ADF Faces and ADF BC and JDev 10.1.3.
I want to declaratively execute an operation binding from a managed bean which does not have a page definition.
How can I create a page definition for a managed bean which will allow me to define and call operation bindings in the same way that you can in a backing bean?
In the past I have used the binding layer to get references to the AM and then execute the client methods directly.
How can I do this declaratively from a managed bean?
regards,
Brenden

Hi Steve,
a good example would be a managed bean which is used by many pages to provide dynamic page forwarding (using global forwards).
The page forwarding is based on what data the user has selected, and the entry task list will have pages added and removed based on the user's entered information.
At the moment the managed bean gets the binding context by evaluating #{data}, obtains a reference to the AM and directly calls a client method, which calls a PL/SQL procedure and returns a global forward string. The AM is imported into the managed bean and its methods are called directly.
This is just one example, but probably the best I can come up with for now.
In a backing bean I can use an operation binding to call a method and get a result declaratively like this:
        BindingContainer bindings  = getBindings();
        OperationBinding operation = bindings.getOperationBinding("getUserByEmail");
        operation.getParamsMap().put("email", email);
        String userName = (String)operation.execute();
whereas in a managed bean I have to get direct references to the AM to call the method directly.
If there is a way to do this in the same way as in backing beans, it would seem a more abstracted and cleaner approach?
regards,
Brenden
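For illustration, here is a minimal plain-Java sketch of the call shape Brenden is after: a pageless bean looks up a named operation in a shared binding container and executes it by name, instead of importing the AM class directly. The `StubOperationBinding`/`StubBindingContainer` types are hand-written stand-ins for the real `oracle.binding` interfaces (the actual wiring needs a page definition), so this shows only the pattern, not the ADF plumbing:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal stand-ins for oracle.binding.OperationBinding / BindingContainer --
// hypothetical stubs, only to show the call shape from a pageless bean.
class StubOperationBinding {
    private final Map<String, Object> params = new HashMap<>();
    private final Function<Map<String, Object>, Object> impl;
    StubOperationBinding(Function<Map<String, Object>, Object> impl) { this.impl = impl; }
    Map<String, Object> getParamsMap() { return params; }
    Object execute() { return impl.apply(params); }
}

class StubBindingContainer {
    private final Map<String, StubOperationBinding> ops = new HashMap<>();
    void register(String name, Function<Map<String, Object>, Object> impl) {
        ops.put(name, new StubOperationBinding(impl));
    }
    StubOperationBinding getOperationBinding(String name) { return ops.get(name); }
}

// A pageless bean that only knows operation names, never the AM class itself.
class Navigator {
    private final StubBindingContainer bindings;
    Navigator(StubBindingContainer bindings) { this.bindings = bindings; }

    String forwardFor(String userId) {
        StubOperationBinding op = bindings.getOperationBinding("getGlobalForward");
        op.getParamsMap().put("userId", userId);
        return (String) op.execute();   // result is the global forward name
    }

    public static void main(String[] args) {
        StubBindingContainer bc = new StubBindingContainer();
        // Stub for the AM client method that would call the PL/SQL procedure.
        bc.register("getGlobalForward",
                    p -> "user1".equals(p.get("userId")) ? "globalHome" : "globalError");
        System.out.println(new Navigator(bc).forwardFor("user1")); // prints globalHome
    }
}
```

The point of the abstraction is that the bean depends only on an operation name and a parameter map, so swapping the AM implementation never touches the bean.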

Similar Messages

  • Managed Beans and Data Access Object

    I have a question / need help understanding how to configure backing bean and model objects so that memory and object creation/deletion is done as efficiently as possible.
    1. I have a .jsf page with a form and a commandbutton that submits the form inputs to a backing bean (enrollispbean is backing bean)
    <h:commandButton value="Enter" action="#{enrollispbean.insert}"/>
    2. The backing bean is used for form handling - the insert() method is used to read the data fields from the form and create a SQL string that will be submitted to a model object, DbInsert, that is used as a generic data access object that connects to the database and insert the SQL string:
    public class EnrollIspBean {
        private String beanvar1 = "";
        private String beanvar2 = "";
        // DbInsert is the data access object
        private DbInsert dbinsert = new DbInsert();
        public String insert() {
            String sqlstmt;
            sqlstmt = "INSERT INTO ispmain VALUES('" + beanvar1 + "', '" + beanvar2 + "', ...)";
            dbinsert.insert(sqlstmt);
            return "success";
        }
    }
    3. DbInsert is the data access object that contains a method, insert(), that accepts a sql string to insert into the database. This method contains the code to obtain a connection from the database connection pool and then execute the sql statement (note: error checking code not shown):
    public class DbInsert {
        public void insert(String sqlstmt) throws SQLException {
            Connection conn = null;
            GetDBConnection getdbconnection = new GetDBConnection();
            PreparedStatement stmt = null;
            conn = getdbconnection.getdbconn();
            stmt = conn.prepareStatement(sqlstmt);
            stmt.executeUpdate();
            stmt.close();
            conn.close();
            return;
        }
    }
    Where I need help is understanding how to set up the scope for the managed beans and the data access object. Currently, I have the backing bean in session scope (via the faces-config.xml file). My main question is how to set up the scope for the data access object - currently I do not have it as a managed bean within faces-config.xml. Instead I am creating a new instance within the backing bean:
    private DbInsert dbinsert = new DbInsert();
    Is this the best way to do this? Will the DbInsert object now be tied to the session scope of the backing bean (i.e., when the backing bean is deleted, will the DbInsert object be deleted from session scope as well)?
    Ideally I would like the data access object to be available as a shared object throughout the life of the application. When I was programming using a servlet approach, I would have created a servlet that loads on startup. Now that I'm using JavaServer Faces, I'm confused about the scope / how to efficiently set up a data access object that is available to all backing beans in the application.
    Thanks for any help understanding this.
    Tom

    I was thinking about setting the data access object to application scope so that it can be used by any backing bean to execute SQL statements.
    If I do set it as application scope, however, do I still need to declare a new instance of the object from within each bean that uses it?
    For example, do I need to declare a new instance of the data access object from within the bean? Or should I assume that there is always an instance available in application scope, and if so, how do I reference it from within the bean?
    Bean Code:
    public class EnrollIspBean {
    // DbInsert is data access object
    private DbInsert dbinsert = new DbInsert();
    Finally, I understand performance may be an issue if I have one instance of the data access object available in the application scope - is there a way to make multiple instances available in the application scope?
    thanks
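    One plain-Java way to get the "one shared DAO for the whole application" behaviour Tom is after, independent of JSF scopes, is a lazily-initialized holder; each backing bean then asks for the shared instance instead of doing `new DbInsert()`. This is a sketch under the assumption that `DbInsert` is stateless apart from borrowing pooled connections; the holder class is hypothetical:

```java
// Hypothetical stand-in for the DbInsert DAO from the post; assumed stateless
// apart from borrowing pooled connections inside insert().
class DbInsert {
    void insert(String sqlstmt) {
        // would borrow a pooled connection and execute the statement here
    }
}

// Initialization-on-demand holder: exactly one DbInsert for the whole
// application, created thread-safely on first use.
final class DbInsertHolder {
    private DbInsertHolder() {}
    private static class Holder {
        static final DbInsert INSTANCE = new DbInsert();
    }
    static DbInsert get() { return Holder.INSTANCE; }
}
```

    A backing bean would then call `DbInsertHolder.get().insert(sql)`. Alternatively, declaring DbInsert as an application-scoped managed bean in faces-config.xml and injecting it via managed-property achieves the same sharing with JSF doing the lifecycle work.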

  • What is difference between Managed Bean and Backing Bean?

    What is difference between Managed Bean and Backing Bean? Please guide me how to create them and when to use them?
    Please post sample for both beans.

    Hi,
    managed beans and backing beans are quite the same in that the Java object is managed by the JavaServer Faces framework. "Managed" in this respect means instantiation. The difference is that backing beans contain component "binding" references, which managed beans usually don't. So backing beans are page-specific versions of managed beans.
    Managed beans are configured either in the faces-config.xml file or, when using ADF Faces and ADFc, in the adfc-config.xml file.
    Frank
    Edited by: Frank Nimphius on Jan 31, 2011 8:49 AM

  • Difference between a Managed Bean and Backing Bean

    hi
    I am new to JSF and made my first application today ...
    I couldn't clearly understand the difference between a managed bean and a backing bean. Does anybody know the difference?
    Regards
    dsdsf

    These are two terms that mean the same thing ... A backing bean is a general web term; in JSF specifically it is termed a managed bean, as the backing bean's configuration in faces-config.xml is written within 'managed-bean' tags.
    Cutting a long story short, it's a bean class. Period :)

  • JSF managed beans and xmlbeans

    Hi,
    I have a JSF application where I get data from a back-end configuration service, which returns data in the form of XML, for which we already have compiled XMLBeans. On our side we receive the response and parse it. My question is: is it advisable to use these XMLBeans within the JSF managed beans, or do we need to create POJOs from the XMLBeans before handing them to the managed beans? The reason I ask is that the XMLBeans are tightly coupled to the backend.
    Thanks
    -Bibin

    I would suggest having POJOs. Below are my points:
    1) You may not need everything from the XMLBean; most of the time you may need very little of it. You can transfer just the required information to the POJO. This would also save you network overhead in case the backend is running on a separate machine.
    2) Any changes to the backend code will not have any impact on your UI code.
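    A minimal sketch of the "copy only what the UI needs into a POJO" suggestion; the generated-bean type here is a hand-written stand-in for a real XMLBeans class, and all names are hypothetical:

```java
// Hand-written stand-in for a generated XMLBeans class (hypothetical names);
// a real one would expose many more backend-specific accessors.
class UserConfigDocument {
    String getDisplayName() { return "Bibin"; }
    String getTheme() { return "dark"; }
    String getInternalRoutingKey() { return "backend-only detail"; }
}

// The POJO the JSF managed bean actually holds: only the UI-relevant fields.
class UserView {
    final String displayName;
    final String theme;

    UserView(String displayName, String theme) {
        this.displayName = displayName;
        this.theme = theme;
    }

    // The transfer step: copies just what the UI needs, so UI code never
    // depends on the generated backend type.
    static UserView from(UserConfigDocument doc) {
        return new UserView(doc.getDisplayName(), doc.getTheme());
    }
}
```

    Changing the backend schema then only affects the `from` mapper, not every page that reads the bean.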

  • How to use remote managed bean and JPA in JSF

    Hi All,
    I am familiar with referencing backing beans and JPA properties where Glassfish and MySQL are running locally. However, is it possible to look up these same properties using JNDI if they reside on remote servers? If so, what change is needed?
    I would like to distribute the Java EE 5 application load, including the database, by running Glassfish and MySQL on separate servers. This would put the JSF (presentation-tier) components on their own server, while a secondary system handles the middle-tier processing, leaving the database activities to be carried out on another server. I'm not sure whether this is the right approach, though. The hardware would run on both Solaris and Windows platforms.
    Unfortunately, buying faster hardware is not an option.
    Any assistance would be appreciated,
    Jack

    Hi Faissal,
    Is your suggestion below:
        //Lookup an EJB and use it
        YourRemoteBean bean = (YourRemoteBean) ServiceLocator.findRemoteObject(jndiName); // ServiceLocator is a class that looks up the remote object
    equivalent to the following lines?
        Properties props = new Properties();
        props.setProperty("java.naming.factory.initial", "com.sun.enterprise.naming.SerialInitContextFactory");
        props.setProperty("java.naming.factory.url.pkgs", "com.sun.enterprise.naming");
        props.setProperty("java.naming.factory.state", "com.sun.corba.ee.impl.presentation.rmi.JNDIStateFactoryImpl");
        // optional. Defaults to localhost. Only needed if the web server is running
        // on a different host than the appserver
        // props.setProperty("org.omg.CORBA.ORBInitialHost", "localhost");
        props.setProperty("org.omg.CORBA.ORBInitialHost", "remoteServer");
        // optional. Defaults to 3700. Only needed if the target ORB port is not 3700.
        // props.setProperty("org.omg.CORBA.ORBInitialPort", "3700");
        InitialContext jndiContext = new InitialContext(props);
        YourRemoteBean bean = (YourRemoteBean) jndiContext.lookup("ejb.YourRemoteBean");
    Thanks,
    Jack

  • ECC6 and PI integration design question

    Hello experts,
    We have a customer that is going for ECC6 and is going to use PI to integrate with external systems. We are designing the logic for distributing materials, so two options are on the table, and I wanted to ask which one you consider the best and whether you can point to documentation on the topic. The options are:
    Option one: to distribute master data like materials, the materials are classified under a class, using the classification for each of the external systems, and these classes are then used in the distribution model. Whenever a material needs to go to an external system, it is classified in the corresponding class. The main con we see with this is that we generate an idoc for each of the external systems.
    Option two: all materials are sent to PI using a single idoc and we put the distribution logic in PI; using data like plant, sales organization, material type and so on, it is possible to derive the systems they need to be distributed to. This way only one idoc per material is sent to PI, and PI sends the messages wherever they are needed.
    I hope I have explained it ok
    What do you guys think about it?
    Best regards,

    hi,
    as we don't know your context and your complete business scenario, it's difficult to say "THE" solution is ...
    But from what you say, I would prefer option 1, because:
    - you can easily manage different scheduled jobs (time of extraction) depending on your different legacy needs (perhaps your legacies are located in different countries, so different timezones). You can use the same extraction program with different variants.
    - you limit the risk to a single legacy if something is wrong with your idoc. If you have only one big idoc to split, a PI mapping dump puts all your legacies at risk. Is that in compliance with your business?
    - if you have only one idoc and do the split in PI based on plant, sales areas, material type, etc., that means when you change any of this structural data, you (in fact the business or functional side) also have to remember that this data is hard-coded in the PI mapping and/or in a receiver determination condition, so somewhere on the 'technical' side (i.e. the middleware). That could be a real nightmare, with a near certainty that you will have an issue. And, per the previous point, the issue could be global.
    - with only one idoc split into several messages for several legacies, when you add a new legacy or have to remove one, you will also have to change the splitting mapping, so in theory you should run regression tests for ALL your legacies, not only the new one.
    there are probably other points of view, but here mine with what you explained us.
    Regards.
    Mickael

  • X201 HD and Operating Sysytem RAM questions

    Hello - I appreciate any insights as I am about to pull the trigger on an X201 purchase; however, I had a few last-minute configuration questions:
    1. Hard drives
    Can I add a solid state drive later (in addition to the current HD drive)?
    Not sure if I should go for a 250GB FDE drive or a 500GB non-FDE drive. I am sure 250GB would be enough space; however, I was not sure how useful FDE would be for me.
    Can I add a spinning HD drive later to a system with a solid state drive (or vice versa)?
    2. Operating system(s)
    Can I install Windows XP and Windows 7 (i.e. a dual boot machine)?
    Windows 7 Professional 64 needs 2GB RAM and 20GB of HD space. I was going to get 4GB of RAM (2 DIMMs). Is the RAM sufficient?
    Thanks!

    1. The X201 can only hold one hdd in its bay, so you have to choose either the SSD or the platter hdd.
    2. Unless you work with sensitive documents or have a high tendency to lose your personal possessions, I don't think you would need the FDE drives. Just getting the regular 500 gig would be fine.
    3. You can only use one drive at a time.
    4. 4 gigs of RAM should be more than enough.
    Regards,
    Jin Li
    May this year, be the year of 'DO'!
    I am a volunteer, and not a paid staff of Lenovo or Microsoft

  • Best practices for Manage beans and Navigation flow

    Hi,
    I have the following scenarios along with the possible solutions. I need confirmation from JSF experts, which is recommended way of doing it.
    Scenario
    I need to show a list of item from database/config file to the user. User will select the show details from the menu and the details page will be shown with list of items.
    User - > Menu - >Show Details -> Details Page.
    Components
    DetailsBean.java
    menu.jsp
    Details.jsp
    Solution 1:
    a) When the user clicks show details, call DetailsBean.populate().
    b) DetailsBean.populate() populates the list from the DB, stores it as a property and returns the String for the next flow.
    c) The user is navigated to Details.jsp using the <from-outcome> from menu.jsp.
    d) Details.jsp reads the property from DetailsBean to show the list of items on the page.
    Solution 2
    a) When the user clicks show details, navigate the user to Details.jsp.
    b) Details.jsp is tied to the DetailsBean.populate() method; it will get the list and render the page.
    I am a little confused about which one to use and when. Are there any scenarios where we specifically need solution 1, and others where we need solution 2?
    In Brief
    *" Should we populate bean first and let the jsp page, just draw what ever is there in the bean, without thinking much about how the bean get populated"*
    or
    *"Should we let JSP to call methods on bean, get contents directly and show to user, no need to store in bean."*
    Let me know if you need more information about my question.
    Expecting a quick reply on this.
    Thanks,
    Sudhir

    In the real world, I never use navigation cases. I always declare action methods void (or return null, if you want). I just show the results on the same page, if necessary with help of the 'rendered' attribute. For plain page-to-page navigation I just use GET all the way, using h:outputLink or plain HTML <a> elements, not commandLinks. Finally, it's all much better for the user experience.
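The "action method returns null, results shown via rendered" idea can be sketched as a plain bean. The JSF wiring (`action="#{detailsBean.show}"` on the button and `rendered="#{detailsBean.showResults}"` on the table) is implied, and the item list is hard-coded here purely for illustration:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

class DetailsBean {
    private List<String> items = new ArrayList<>();
    private boolean showResults = false;

    // Action method: returning null keeps JSF on the same view; the results
    // table is then shown or hidden via the rendered flag below.
    public String show() {
        items = loadItems();   // a real bean would load this from the DB
        showResults = true;
        return null;           // no navigation case needed
    }

    private List<String> loadItems() {
        return Arrays.asList("item-1", "item-2");  // hard-coded for the sketch
    }

    public boolean isShowResults() { return showResults; }
    public List<String> getItems() { return items; }
}
```

This keeps the page self-contained: one view handles both the empty state and the result state, with no entry in the navigation rules.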

  • N7k N2K and n5k N2K design questions

    Data center with dual N7Ks; we want to add N2Ks for top-of-rack server access. I understand the model where I can set my servers up with a vPC EtherChannel to a pair of 2Ks, with one 2K attached to each 7K. This model appears to meet all of the no-single-attached-device rules of vPC. The question I have is how to attach traditional server connectivity that doesn't support EtherChannel. In this model, if I attach to a single 2K, or team (without EtherChannel) to both 2Ks, haven't I violated the no-single-attached-device rule of vPC?
    I have a similar issue with the 2232 and N5K model. In order to support FCoE on a 2232, the only supported configuration is to attach the 2232 to a single 5K. In all of the design models I have seen, the server is depicted with a vPC EtherChannel to the two 2232s that are attached to their respective 5Ks. In this design, if my server doesn't support EtherChannel and I am forced to use traditional teaming, I have broken the no-single-attached-device rule of vPC again.
    This no-single-attached-device rule of vPC makes it really hard to utilize the FEX in either of these scenarios. What is the recommendation for connecting servers that don't support EtherChannel in these two models?

    Hi,
    Take a look at the document Nexus 7000 Fex Supported/Not Supported Topologies and you'll see what options you currently have.
    As I mentioned, up to and including NX-OS 6.2, a single FEX can only be connected to a single Nexus 7000. This is what is shown as Figures 8, 9 and 10 in the unsupported topologies section.
    A server can be connected to two different FEX, and those two FEX could be connected to a single Nexus 7000 or two different Nexus 7000. The options are shown in Figures 6 and 7.
    Regards

  • How to execute multiple methods of application module from managed bean using operation binding

    I'm using JDev 11.1.2.3.
    I'm getting the following error in my page:
    java.lang.NullPointerException
    ADF_FACES-60097:For more information, please see the server's error log for an entry beginning with: ADF_FACES-60096:Server Exception during PPR, #1
    and weblogic log as below
    RegistrationConfigurator> <handleError> ADF_FACES-60096:Server Exception during PPR, #1
    javax.servlet.ServletException
        at javax.faces.webapp.FacesServlet.service(FacesServlet.java:521)
        at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
        at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
        at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
        at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at oracle.adf.model.servlet.ADFBindingFilter.doFilter(ADFBindingFilter.java:173)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:125)
        at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:468)
        at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
        at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:468)
        at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:293)
        at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:199)
        at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at oracle.adf.library.webapp.LibraryFilter.doFilter(LibraryFilter.java:180)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
        at java.security.AccessController.doPrivileged(Native Method)
        at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:315)
        at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:442)
        at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
        at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
        at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:139)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
        at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
        at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
        at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
        at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
        at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
        at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
        at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
        at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    Caused by: java.lang.NullPointerException
        at view.com.pof.admin.users.POFAdminUser.createPasswordHistory(POFAdminUser.java:64)
        at view.com.pof.admin.users.POFAdminUser.performOperationBinding(POFAdminUser.java:49)
        at view.com.pof.admin.users.POFAdminUser.saveData(POFAdminUser.java:28)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at com.sun.el.parser.AstValue.invoke(Unknown Source)
        at com.sun.el.MethodExpressionImpl.invoke(Unknown Source)
        at org.apache.myfaces.trinidadinternal.taglib.util.MethodExpressionMethodBinding.invoke(MethodExpressionMethodBinding.java:53)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.broadcastToMethodBinding(UIXComponentBase.java:1545)
        at org.apache.myfaces.trinidad.component.UIXCommand.broadcast(UIXCommand.java:183)
        at oracle.adf.view.rich.component.fragment.UIXRegion.broadcast(UIXRegion.java:159)
        at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.broadcastEvents(LifecycleImpl.java:1137)
        at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:361)
        at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:202)
        at javax.faces.webapp.FacesServlet.service(FacesServlet.java:508)
        ... 38 more

    User, I fail to understand what your header (the title of the question) has to do with the stack trace. You get an NPE in your code
    view.com.pof.admin.users.POFAdminUser.createPasswordHistory(...)
    This is the point where I would start my investigation.
    Timo

  • Managing files and other newbie type questions

    I'm trying to decide between FCPX and Premiere Elements. I manage most of my media with Aperture. I've learned a whole lot already in a few solid hours of use of the trial. The integration with Aperture/iPhoto is very nice and keeps me out of Adobe Organizer, which I do not care for. I have a significant time investment in Aperture libraries, so the extra $$ for FCPX might be worth it -- if I can learn it.
    It appears that FCPX currently will not directly import AVCHD (from Sony) with 5.1 -- it is brought in as stereo. Or am I missing something? I can get it in by re-wrapping it with ClipWrap, but that adds $50 to an already expensive piece of software (for me). I have an older mini-DVD camcorder (also Sony). Actually the camcorder is toast, but I have the mini-DVDs. Can I import these and keep the 5.1, or do they need an intermediate step?
    I can't figure out how to go from FCPX to Aperture, if that's possible. If I import with FCPX, do I have to turn around and re-import into Aperture? (We really like having it all in one app for browsing the photos/videos.)
    If I import with FCPX and do not optimize, I understand I can do this later on the fly if I choose to use the clip, right? The unoptimized video quality of the imported clips is not worse than the source, correct? This will save me a ton of disk space if I can use this fast, unoptimized import. I don't edit 80% of the source.
    I've figured out how to do a Picture-in-Picture type deal but I can't find in help or elsewhere how to resize/transition the Picture-in-Picture. Here's what I'm doing: We're filming a high school band. The primary video is the big picture, wide view of the band. Then when a soloist performs, I overlay a separate video of them moving into position in "picture-in-picture" type of window. When they actually perform, I would like to put in a transition where this picture in picture "explodes" or zooms out to fill the screen. I can't figure this out. When the soloist is done, I want to reverse it. This whole process looks like a perfect example of multi-cam edits, but I'm still trying to figure that out.
    Thanks for any pointers.
    -Jack

    Boerne wrote:
    Ok, multi-cam is not the way to go for this. Thanks.
    Keywords... that's what I'm discovering now. Very nice organizational stuff in there I've been missing with iMovie/PrE. I don't have that many clips yet, but eventually this will be very handy.
    I'm going away from the PIP deal. We tested it last night and even on a 55" screen, it's really just kinda gee-whiz. You don't get enough detail to see what he's doing. So I'm just transitioning now from the main clip to soloists and back. The wide screen shot was totally useless in a little PIP window anyway. Since the whole show is only 10 minutes, I'll just give them two videos: one edited with soloists/visuals, one raw video of the whole band for marching critique.
    Actually, if you decide that the PIP stuff does not work, this brings multicam back into the process.
    PIP was the only reason why I advised against going with multicam for this.
    It would be nice if the end user could pick the camera angle they wanted while watching.
    Thanks for the inputs. I will read up on keyframes as you suggested.
    -Jack
    You can't offer that kind of interactivity to the user, but with multicam you can offer your best choice of angles - and can even easily produce two or more different versions.

  • About managed beans and

    I'm trying a JSF code similar to:
    <li>Number:<h:inputText value="#{contract.pk.number}" required="true" id="pk_number" /></li>
    <li>Alias:<h:inputText value="#{contract.pk.alias}" required="true" id="pk_alias" /></li>
    The contract bean is similar to:
    public class Contract implements Serializable {
        @EmbeddedId
        private Contract.PK pk;

        @Embeddable
        public static class PK implements Serializable {
            private String number;
            private String alias;
            public String getNumber() { return this.number; }
            public void setNumber(String sNumber) {...}
            public String getAlias() { return this.alias; }
            public void setAlias(String sAlias) {...}
        }
    }
    The Java code is generated automatically by Dali (http://www.eclipse.org/dali/) and I can modify it as needed, while the database schema is out of my control.
    I get stuck with next problem:
    Whatever I input into my form a "conversion error" is returned for the pk members.
    Thanks in advance for any help, hint or link!

    I checked get/setPk were correctly defined. Still the problem persisted.
    After tracing the error I found that getPk returns null, so I just added a dummy initializer for pk, after which it all works correctly (a MyFaces implementation bug, probably???). Just for anyone interested, the working code looks like:
    public class Contract implements Serializable {
        @EmbeddedId
        private Contract.PK pk = new Contract.PK();  // <- new Contract.PK() avoids the conversion error (null pointer exception).

        @Embeddable
        public static class PK implements Serializable {
            private String number;
            private String alias;
            public String getNumber() { return this.number; }
            public void setNumber(String sNumber) {...}
            public String getAlias() { return this.alias; }
            public void setAlias(String sAlias) {...}
        }
    }

  • Operation not found error while calling AM methods from managed bean

    Hi,
    I get an "operation not found" error while calling AM methods from a managed bean.
    I have written a method with two parameters in the AM,
    exposed the method in the AM client interface,
    and in the page bindings added the method as a methodAction, leaving the value fields of the parameters empty.
    I am calling the method from the managed bean like below:
    String userNameVal = (String)userName.getValue();
    String passwordVal = (String)password.getValue();
    OperationBinding operationBinding =
    ADFUtils.findOperation("verifyLogin");
    operationBinding.getParamsMap().put("userName",userNameVal);
    operationBinding.getParamsMap().put("password",passwordVal);
    operationBinding.execute();
    I am getting an "operation verifyLogin not found" error. Please suggest something I can do.
    Thanks
    Satya

    Hi vlsn,
    Can you try with the below code
    // in your backing bean
    OperationBinding operation = bindings.getOperationBinding("verifyLogin");
    //Put your both parameters here
    operation.getParamsMap().put("parameter_name1", parameterValue1);
    operation.getParamsMap().put("parameter_name2", parameterValue2);
    operation.execute();
    if (operation.getResult() != null) {
        Boolean result = (Boolean) operation.getResult();
    }
    and share the result.
    regards,
    Rajan
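    An "operation not found" error usually means the methodAction is missing from (or misnamed in) the page definition of the page that is current when the bean runs. For reference, a methodAction entry typically looks like the sketch below; the DataControl name, instance path and parameter types here are assumptions, not taken from the original post:

```xml
<!-- Sketch of a pageDef methodAction for verifyLogin; names are assumed. -->
<methodAction id="verifyLogin"
              InstanceName="AppModuleDataControl.dataProvider"
              DataControl="AppModuleDataControl"
              MethodName="verifyLogin"
              RequiresUpdateModel="true"
              Action="invokeMethod">
  <NamedData NDName="userName" NDType="java.lang.String"/>
  <NamedData NDName="password" NDType="java.lang.String"/>
</methodAction>
```

    If the bean runs in the context of a different page or task flow than the one whose pageDef holds this entry, getOperationBinding("verifyLogin") will return null.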

  • ADF Faces and BC: Scope problem with managed bean

    Hi,
    I am using JDev 10.1.3 and ADF Faces with ADF BC.
    I have created a managed bean which needs to interact with the binding layer and also receive actions from the web pages. I have a managed property on the bean which is defined as follows:
    <managed-bean>
        <managed-bean-name>navigator</managed-bean-name>
        <managed-bean-class>ecu.ethics.view.managed.Navigator</managed-bean-class>
        <managed-bean-scope>session</managed-bean-scope>
        <managed-property>
          <property-name>bindings</property-name>
          <value>#{bindings}</value>
        </managed-property>
      </managed-bean>
    I need the bean to be session scope because it needs to keep previous and next pages to navigate the user through their proposal. If I use session scope (as above) I get the following error when I click on a command link which references a method in the above bean: #{navigator.forwardNext_action}, which returns a global forward.
    this is the exception:
    javax.faces.FacesException: #{navigator.forwardNext_action}:
    javax.faces.el.EvaluationException: javax.faces.FacesException:
    javax.faces.FacesException: The scope of the referenced object: '#{bindings}' is shorter than the referring object     at
    com.sun.faces.application.ActionListenerImpl.processAction(ActionListenerImpl.java:78)     at
    oracle.adf.view.faces.component.UIXCommand.broadcast(UIXCommand.java:211) at
    javax.faces.component.UIViewRoot.broadcastEvents(UIViewRoot.java:267)at
    javax.faces.component.UIViewRoot.processApplication(UIViewRoot.java:381)     at
    com.sun.faces.lifecycle.InvokeApplicationPhase.execute(InvokeApplicationPhase.java:75)     at
    com.sun.faces.lifecycle.LifecycleImpl.phase(LifecycleImpl.java:200)     
    at com.sun.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:90)     at
    javax.faces.webapp.FacesServlet.service(FacesServlet.java:197)
    How can I get around this?
    Brenden

    Hi pp,
    you need to create a managed (not backing) bean set to session scope.
    You can call/reference the managed bean from your page.
    A backing bean is designed around a page lifecycle, which is request-oriented in its design.
    This is a simple managed bean from faces-config.xml
    <managed-bean>
        <managed-bean-name>UserInfo</managed-bean-name>
        <managed-bean-class>ecu.ethics.admin.view.managed.UserInfo</managed-bean-class>
        <managed-bean-scope>session</managed-bean-scope>
          <managed-property>
          <property-name>bindings</property-name>
          <property-class>oracle.adf.model.BindingContext</property-class>
          <value>#{data}</value>
        </managed-property>
      </managed-bean>
    and the getters and setters for bindings in your session-scoped managed bean:
        public void setBindings(BindingContext bindings) {
            this._bindings = bindings;
        }
        public BindingContext getBindings() {
            return _bindings;
        }
    You can access the model from the managed bean using the BindingContext if needed.
    Also have a look at JSFUtils from the ADF BC SRDemo application; there are methods in this class, such as resolveExpression, which demonstrate how to get the values of items on your page programmatically using expression language. You can use this in your managed bean to get values from your pages.
    regards,
    Brenden
