ADF BC: Active data model question

Hello,
Can somebody confirm that the row iterator of a detail view object obtained from the master view object through a view link is different from the row iterator used by the active data model (getDetails() on the application module)?
My problem is:
I have a detail view object with multiple updateable entities. When I create a new row like this:
   Row row = masterViewRow.getDetails().createRow();
   masterViewRow.getDetails().insertRow(row);
I don't see this row on the web page, probably because a view object with multiple updateable entities does not notify other row iterators about changes by default.
I have to use
   Row row = appModule.getDetails().createRow();
   appModule.getDetails().insertRow(row);
to make it work.
Can I omit the application module and use only the master view object to work with the details in the model, while still having the changes visible in the layout?
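To make the comparison concrete, here is a minimal sketch of the two variants (the accessor attribute name "DetailsView", the instance name "DetailsView1" and the helper class itself are placeholders, not the generated names):

   import oracle.jbo.ApplicationModule;
   import oracle.jbo.Row;
   import oracle.jbo.RowIterator;
   import oracle.jbo.ViewObject;

   public class DetailRowHelper {

       // Variant 1: the row iterator returned by the view link accessor.
       // Rows created here go into the accessor row set, which is not
       // necessarily the row set the page's bound iterator listens to.
       public Row createViaAccessor(Row masterViewRow) {
           RowIterator details = (RowIterator) masterViewRow.getAttribute("DetailsView");
           Row row = details.createRow();
           details.insertRow(row);
           return row;
       }

       // Variant 2: the detail view object instance from the application
       // module's data model. Rows created here land in its default row set,
       // the one the ADF bindings iterate over, so they show up on the page.
       public Row createViaDataModel(ApplicationModule am) {
           ViewObject details = am.findViewObject("DetailsView1");
           Row row = details.createRow();
           details.insertRow(row);
           return row;
       }
   }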

Nobody can answer this.
Where are all the ADF experts?
Please help solve this issue.

Similar Messages

  • Active data model problem

    Hi all!
    I have a problem working with ADF BC; I think it's caused by the Active Data Model.
    I have a jspx with an adf:table based on a ViewObject (with an underlying EntityObject) that queries for all employees that match: attributeName = '0'.
    Then I navigate to another jspx where I can create employees. So I create a new employee with a value on that attribute different from '0', for example '1'.
    Finally I navigate back to the first page and... THE NEW EMPLOYEE APPEARS IN THE ADF:TABLE!!! Although its attribute value is not '0'!
    Any suggestions? Is it a problem related to the active data model?
    Thanks in advance!

    Hi,
    the filter is applied to the query, so you will have to re-execute the query to stop seeing the newly entered value. The query filter is not applied to new records added to the current collection.
    So add the user, commit it and re-query the VO.
    Frank
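    For reference, a minimal sketch of the commit-then-requery sequence Frank describes (the view object instance name "EmployeesView1" is an assumption):
        import oracle.jbo.ApplicationModule;
        import oracle.jbo.ViewObject;

        public class RequeryAfterCommit {
            public void commitAndRequery(ApplicationModule am) {
                // commit the newly created employee first
                am.getTransaction().commit();
                // then re-execute the query so the WHERE clause filter
                // (attributeName = '0') is applied again and the non-matching row drops out
                ViewObject employees = am.findViewObject("EmployeesView1");
                employees.executeQuery();
            }
        }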

  • How Active Data Model Works?

    Can someone explain to me how the Active Data Model works?
    Is all data managed in memory? When do the changes go to the database? What are the benefits of using it?
    Thanks in advance.

    See if this overview helps:
    http://docs.oracle.com/cd/E16162_01/web.1112/e16182/bcintro.htm#sm0061
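    As a rough illustration of the "when do the changes go to the database" part: in ADF BC, new and changed rows sit in the application module's entity cache until the transaction is committed. A minimal sketch (the names "DeptView1" and "Dname" are assumptions):
        import oracle.jbo.ApplicationModule;
        import oracle.jbo.Row;
        import oracle.jbo.ViewObject;

        public class CommitTiming {
            public void createAndCommit(ApplicationModule am) {
                ViewObject depts = am.findViewObject("DeptView1");
                Row row = depts.createRow();
                row.setAttribute("Dname", "RESEARCH"); // held in the entity cache, not yet in the DB
                depts.insertRow(row);
                am.getTransaction().commit();          // DML is posted and committed here
            }
        }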

  • Data Modeler: Relational data model questions

    1. Can a different notation be specified for relational data models' constraints? Specifically, I'd like crow's feet. BTW, the docs show crow's feet and parent pointer (with the arrowhead), but there's no such thing in the actual modeler.
    2. Is there any way to manually route FK constraints lines?
    3. When forward engineering from logical, is there any way to indicate the preferred name for keys and indexes (primary, unique, foreign)?
    4. Mandatory/optional indicator on tables: what exactly does 'N' or 'A' stand for? I can understand 'N' meaning "Not optional", but 'A'? Wouldn't it be simpler to use '*' and 'o' like in the logical?
    Man, do I ever miss Designer!
    Thanks,
    Patrick

    Here's one more question:
    I've transformed several super/sub entities to relational, and some of the tables do not allow me to open Properties (on the table). I can use the navigator to open column properties, but cannot open table properties (neither from diagrammer nor from navigator). Some of the tables are two or three subtype levels deep, and I haven't figured out why some open and some don't.

  • SQL Query Data Model Question

    Hi All,
    I'm new to XML Publisher, so I'll try to explain as best I can the problem I'm having creating a data model for my report.
    I'm using Database XE and have installed XML Publisher. As XE is extremely flexible for creating apps, I'm having huge trouble finding a reporting solution for my application. I have created an app to track and control time and expenses, from which I want to generate timesheets, expense sheets, invoices, etc.
    Having done some reading, I think XML is the best approach for me; however, I'm having huge trouble with my data model.
    EXAMPLE:
    I have 2 tables to start with, first table holds client information and the second table holds project information. One client can have many projects.
    When I create the data model using SQL, I get the obvious problem that when a client has more than one project my XML structure is incorrect.
    SELECT gc.name AS CLIENT,
           gp.name AS PROJECT
    FROM   gte_client gc,
           gte_project gp
    WHERE  gc.client_id = gp.client_id(+)
    The above SQL creates the following XML
    <ROWSET>
      <ROW>
        <NAME>Symatrix Ltd</NAME>
        <NAME>Symatrix Pre-Sales</NAME>
      </ROW>
      <ROW>
        <NAME>Aston Carter</NAME>
        <NAME>MOD</NAME>
      </ROW>
      <ROW>
        <NAME>Symatrix Ltd</NAME>
        <NAME>Fujitsu</NAME>
      </ROW>
    </ROWSET>
    The obvious problem here is that the client Symatrix Ltd appears twice because it has two projects.
    I have tried to write SQL/XML, and I think I'm beginning to understand the structure of the XML functions; however, XML Publisher doesn't appear to understand the syntax when creating a SQL query for the data model.
    I hope all this makes sense.
    Kind Regards
    Glen

    Hi All,
    OK, so with a little searching through documents and articles on the web I found this information:
    http://blogs.oracle.com/xmlpublisher/2006/05/05#a38
    This details the exact thing I'm trying to achieve, so anyone else who searches this forum with a similar problem should check it out.
    Regards
    Glen
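    One common way to get a nested structure out of a single SQL query (which may or may not be exactly what the blog entry above shows) is a CURSOR expression, so that each client row carries its own child rowset of projects and the client name appears only once. A sketch, reusing the table and column names from the query above:
    SELECT gc.name AS client,
           CURSOR (SELECT gp.name AS project
                   FROM   gte_project gp
                   WHERE  gp.client_id = gc.client_id) AS projects
    FROM   gte_client gc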

  • Data Model Question

    I am new to XML Publisher. What I'm still not getting is how to define a data model containing a parent-child relationship (e.g. customers - orders - order_lines). Do I have to use some tool to create it?
    Denes Kubicek

    Hi. In my opinion, option A is the most straightforward solution.
    One thing to keep in mind is how the % figures will roll up, say over the entity hierarchy. The % accounts will simply sum as you roll up the entity hierarchy, which is generally incorrect. So at a parent level you will need to back-calculate the % figure. I hope this makes sense. I am pointing this out because I am in the process of building a similar solution and clean forgot about this; I have had to re-engineer the solution slightly to correct it.
    Hope this helps.
    Sean

  • Active Data Guard Question

    Hi All,
    I'm wondering whether Oracle Active Data Guard can enable "multiple" physical standby databases to be opened for read-only access while Redo Apply is active.
    From the documents I have read, it seems that only "one" physical standby database can be enabled.
    Let's say I have one production database and two or more physical standby DBs (RAC or standalone) in different countries:
    can both of these physical standbys have Active Data Guard enabled for read-only access?
    Any idea? Thanks in advance.
    Best Regards,
    hau

    Dear klnghau,
    If your project is on hold now, then you should have time to read some more!
    http://www.oracle.com/us/products/database/options/active-data-guard/index.htm
    Please also read the following thread;
    More than One Physical Standby Database
    Regards.
    Ogan

  • Canonical data model Question?

    How can XI support a canonical data model? We don't want point-to-point mappings; we always want to map to and from a central message/data model, which will be based on the canonical data model.
    regards,
    tony

    You can support that by using a sequence of message mappings within an interface mapping. With that you can canonicalize your message mappings: you map from the legacy model to the canonical model and from the canonical model to the other legacy model. Your interface mapping, however, would still be point-to-point, containing a sequence of mappings to and from the canonical model.
    best regards
    christine

  • Header/Item extractor data modelling question

    Hi,
    I have installed the Business Content for CRM Activities. In the Business Content they have a separate extractor and cube for header and item level.
    However, for our reporting requirements we would like to use one cube. We don't consider a MultiCube a valid solution, since the item and header cubes of course do not have all fields in common, which leads to the famous "not assigned" behaviour in BEx.
    Therefore I'm trying to enrich each item line with all header fields. I don't want to write a user exit to enhance the item extractor on the DataSource side, since this would be a lot of work and I am also not sure it would be delta-compatible. I have bad experiences with this from a previous project.
    So I would like to find a solution on the BW (data modelling) side, but at the moment I don't see any good alternatives. The only thing I can think of is to build one ODS/master data object (keyed on transaction number) that contains all header information and to read this ODS/master data object while loading the item lines from the ODS to the cube. This way I can copy all header fields to the item lines. But this would force me to do FULL loads between the item ODS and the "global" cube, which of course I would like to avoid (if a header field changes, I want it changed in all item lines as well, of course).
    Does anyone have a better solution?
    Thanks

    What I finally did is the following :
    1) Create a new cube with all characteristics and keyfigures from header/item level.
    2) Load Header ODS to new cube
    3) Load Item ODS to new cube but via a startroutine that enriches the item datasources with all header characteristics that are not yet available on item level.
    4) Make a query and put constant selection on a restricted keyfigure that contains the following :
    a) Product (item level) ==> CONSTANT SELECTION enabled
    b) Nr. of activities (header level)
    In BEx everything now looks fine for the "Nr. of activities" header key figure, even when I filter on a material that only exists on item level, thanks to the constant selection. But my result row is a little disappointing. I hope that can be solved.
    Let me give an example : (i'm even excluding the header line in this query)
    1) all fine
       Transaction nr ; Nr. Of activities
       6000           ; 1
       ====================================
    Resultrow           1
    2) drilldown to material
       Transaction nr ; Material ; Nr of activities
       6000           ; 12D      ; 1
       6000           ; 13D      ; 1
    =====================================================
    Resultrow                      2
    ==> As you can see the resultrow changes to 2 after doing the drill-down.
    Note that I am using "calculate result as summation". If I use "nothing defined" the result is fine in this case, but then it is not correct when I filter on e.g. a material number: it then shows the number of activities for the WHOLE query.
    Is there anything I can do to get the resultrow right in all circumstances using constant selection ?
    Thanks
    Message was edited by: Double U

  • Complex data model question regarding master data

    Hi Experts,
    I have a requirement to add master data retail price and master data cost into BW.
    Both amounts are based on plant and material.  How do I model this?
    Here is how the records are coming in via the datasource:
    Plant  Material  UofM  Cost   CostFrom  CostTo  Retail  RetailFrom  RetailTo
    0001   1000321   EA    $1.00  5/1/07    5/9/07  $2.99   6/6/07      6/9/07
    0001   1000321   CS    $8.00  5/5/07    5/7/07  $9.99   8/6/07      9/9/07
    Here's what I am thinking....
    1.  Add UofM, Cost, and Retail prices as attributes of 0MAT_PLANT infoObject
        Also make those attributes time dependent.
    2.  Use ABAP to split up the cost info and the retail info and use the valid dates above for the time-dependent from/to dates in 0MAT_PLANT
    Example:
    0MAT_PLANT attributes:
    Plant  Material  UofM  Cost   ValidityFrom  ValidityTo  Retail
    0001   1000321   EA    $1.00  5/1/07        5/9/07
    0001   1000321   CS    $8.00  5/5/07        5/7/07
    0001   1000321   EA           6/6/07        6/9/07      $2.99
    0001   1000321   CS           8/6/07        9/9/07      $9.99
    After looking at the above, I need to compound UofM with 0MAT_PLANT somehow.  How would I do that?
    Is this the best way to model this?
    Thanks,
    Chris

    Chris,
    I wouldn't include those amounts as attributes for 0MAT_PLANT. In general it doesn't make too much sense, except in some very specific cases, to make a Key Figure an Attribute of a Characteristic.
    Nor would I modify the 0MAT_PLANT key to compound it with UoM... not a good idea, in my opinion.
    In your case, those values can change over time, depending on the validity period. So you could have several records with different validity periods for the same Material and Plant combination.
    I'd rather create an ODS with these values. The key fields would be 0MAT_PLANT and the validity dates and the data fields would be the amounts.
    You could include this ODS in any MultiCube, or get the values based on the validity periods by using ABAP routines if you need to.
    Another thing, as SAP recommends, an InfoObject with more than 500,000 records shouldn't be modeled as Master Data. And 0MAT_PLANT is a perfect candidate for this situation.
    So I'd advise to go with the ODS solution.
    Hope this helps.
    Regards,
    Luis

  • Dimension table design Data modeling question

    Hi Experts,
    Sorry if I am putting my question in the wrong forum; please suggest a more appropriate forum if so.
    I need your opinion on the existing design of our 10-year-old data warehouse.
    There is one dimension table with a structure like the following:
    Dimension Table
      dimension_key        NUMBER    (THIS IS NOT A PRIMARY KEY)
      natural_key          NUMBER    (from source)
      source_name          CHARACTER
      current_record_ind   CHAR(1)
      from_date            DATE
      to_date              DATE
      ... many other columns; if any of them change, a new current record is created and the previous one is marked as 'H' (historical)
    Data is stored in the dimension table like this:
    Dimension_key  Natural_key  Source_Name  Current_record_ind  From_date    To_date
    1              10001        Source1      H                   1-jan-2005   31-may-2005
    1              10001        Source1      H                   1-jun-20005  12-dec-2011
    1              10001        Source1      C                   13-dec-2011  NULL
    2              20002        Source1      H                   1-jun-20001  12-dec-2011
    2              20002        Source1      C                   13-dec-2011  NULL
    The problem I see in this design is that there is no surrogate key: if any attribute changes, the new record is inserted by first looking up the dimension key based on (natural_key, source_name, current_record_ind).
    Shouldn't it be stored like the following, based on data warehousing principles?
    Dimension_key  Natural_key  Source_Name  Current_record_ind  From_date    To_date
    1              10001        Source1      H                   1-jan-2005   31-may-2005
    2              10001        Source1      H                   1-jun-20005  12-dec-2011
    3              10001        Source1      C                   13-dec-2011  NULL
    4              20002        Source1      H                   1-jun-20001  12-dec-2011
    5              20002        Source1      C                   13-dec-2011  NULL
    Please let me know the pros and cons of the current design.

    And what if you have both features, something like this:
    Lineno  Dimension_key  Natural_key  Source_Name  Current_record_ind  From_date    To_date
    1       1              10001        Source1      H                   1-jan-2005   31-may-2005
    2       1              10001        Source1      H                   1-jun-20005  12-dec-2011
    3       1              10001        Source1      C                   13-dec-2011  NULL
    I mean, just add a new column and populate it with the required ORDER BY clause. Because, as far as I can tell, in your second example you just added a new column which is something like a line number.
    Regards
    Girish Sharma
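    A sketch of that extra line-number idea in SQL, in case it is useful (column names follow the example above; the table name dimension_table is an assumption): an analytic ROW_NUMBER() can populate the new column per natural key without changing the existing dimension key.
    SELECT dimension_key,
           natural_key,
           source_name,
           current_record_ind,
           from_date,
           to_date,
           ROW_NUMBER() OVER (PARTITION BY natural_key, source_name
                              ORDER BY from_date) AS lineno
    FROM   dimension_table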

  • ADF BC and the Active Data Service

    hi
    The OFM Fusion Developer's Guide for Oracle ADF 11g Release 1 (B31974-05) has a section "42 Using the Active Data Service"
    at http://download.oracle.com/docs/cd/E15523_01/web.1111/b31974/adv_ads.htm
    that says "... If you want your components to update based on events passed into ADF Business Components, then you need to use the Active Data Proxy. ..."
    but it does not seem to explain how to use ADF BC and the Active Data Service.
    I have been able to create this example application ...
    http://www.consideringred.com/files/oracle/2010/ActiveDataServiceADFBCApp-v0.01.zip
    ... that does not have an af:poll component (but has moved polling into a managed bean).
      <managed-bean>
        <managed-bean-name>sumSalBean</managed-bean-name>
        <managed-bean-class>activedataserviceadfbcapp.view.SumSalBean</managed-bean-class>
        <managed-bean-scope>session</managed-bean-scope>
        <managed-property>
          <property-name>empSumSalVO</property-name>
          <value>#{data.activedataserviceadfbcapp_view_sumSalPagePageDef.SumSal.viewObject}</value>
        </managed-property>
      </managed-bean>
    This is some code in the SumSalBean class:
    package activedataserviceadfbcapp.view;
    // also based on code found in "ADF’s Active Data Service and scalar data (like activeOutputText)" by Matthias Wessendorf
    // at http://matthiaswessendorf.wordpress.com/2010/01/07/adf%E2%80%99s-active-data-service-and-scalar-data-like-activeoutputtext/
    public class SumSalBean
         extends BaseActiveDataModel
    {
         protected static final String SUM_SAL_NAME = "sumSal";
         protected final AtomicInteger fCurrentChangeCount = new AtomicInteger(0);
         protected long fActiveDataUpdateEventTime;
         protected EmpSumSalVO fEmpSumSalVO = null;

         @PostConstruct
         public void setupActiveData()
         {
              // register this bean as the active data source for the "sumSal" value
              ActiveModelContext vActiveModelContext =
                   ActiveModelContext.getActiveModelContext();
              Object[] vKeyPath = new String[0];
              vActiveModelContext.addActiveModelInfo(this, vKeyPath, SUM_SAL_NAME);
              // poll on the server side and push an update when the data has changed
              ScheduledExecutorService vSEService = Executors.newScheduledThreadPool(1);
              vSEService.scheduleAtFixedRate(new Runnable()
                   {
                        public void run()
                        {
                             if (hasDataChanged())
                             {
                                  triggerActiveDataUpdateEvent();
                             }
                        }
                   },
                   3, // let's wait some seconds
                   2, // period between the updates
                   TimeUnit.SECONDS);
         }

         public void triggerActiveDataUpdateEvent()
         {
              setActiveDataUpdateEventTime(System.currentTimeMillis());
              incrementCurrentChangeCount();
              ActiveDataUpdateEvent vEvent =
                   ActiveDataEventUtil.buildActiveDataUpdateEvent(
                        ActiveDataEntry.ChangeType.UPDATE,
                        getCurrentChangeCount(), new String[0], null,
                        new String[] { SUM_SAL_NAME },
                        new Object[] { getSumSal() });
              fireActiveDataUpdate(vEvent);
         }

         public String getSumSal()
         {
              EmpSumSalVO vEmpSumSalVO = getEmpSumSalVO();
              return "" + vEmpSumSalVO.getFirstSumSal();
         }

         protected void startActiveData(Collection<Object> rowKeys,
              int startChangeCount)
         { }

         protected void stopActiveData(Collection<Object> rowKeys)
         { }

         public int getCurrentChangeCount()
         {
              return fCurrentChangeCount.get();
         }

         protected boolean hasDataChanged()
         {
              EmpSumSalVO vEmpSumSalVO = getEmpSumSalVO();
              return vEmpSumSalVO.hasDataChanged(getActiveDataUpdateEventTime());
         }

         public void setEmpSumSalVO(EmpSumSalVO pEmpSumSalVO)
         {
              fEmpSumSalVO = pEmpSumSalVO;
         }

         // imports and the remaining trivial accessors (e.g. getEmpSumSalVO(),
         // getActiveDataUpdateEventTime(), incrementCurrentChangeCount()) are
         // omitted in this excerpt
    }
    How all this behaves at runtime can be seen in this screencast
    at http://www.screentoaster.com/watch/stUEpQSkxIR19aSV9YW1NRVF9W/activedataserviceadfbcapp_v0_01_zip_demo
    I would welcome comments on how the example application in ActiveDataServiceADFBCApp-v0.01.zip can be improved, or references to information on how this should be done properly.
    question
    (q1) Where can I find some example code that does use ADF BC and the Active Data Service?
    many thanks
    Jan Vervecken

    Jan,
    ADF BC does not natively support ADS yet. It's planned for a future release. The only data control that supports ADS out of the box is BAM. To use ADF BC with e.g. database change notifications, you
    - create a shared AM
    - Configure the VO to respond to database changes (check box)
    - Configure the database to broadcast changes
    - Use an af:poll component for the refresh because the update would be on the model layer only
    So what is in the documentation is a doc bug. In the current release you can use ADS best with a POJO model (that you use directly for dashboard use cases). You can, though, use a POJO data control, but at the current stage this would just act as a pass-through for the data access.
    See example 156 on http://blogs.oracle.com/smuenchadf/examples/ for how to do it with ADF BC
    Frank
    Ps.: Of course, the plan is to make everything working out of the box with no developer action required.
    Edited by: Frank Nimphius on Feb 12, 2010 6:56 AM
    Re-read your post. Maybe I need to revise my comment. Are you accessing the AM directly or via the ADF binding layer? If the latter - I did not yet look at your sample - then this may work as long as you don't release the AM you access directly (it may not scale well).

  • How to auto-scroll a table to the last row when using the Active Data Service? 11g

    Hi all,
    Has anyone got any experience with the ADF Active Data Service?
    I am using an af:table in combination with the ADF Active Data Service and I want the table to scroll to the latest row when new data arrives.
    It is basically a simple setup:
    The value attribute of my table points to a bean that extends CollectionModel and implements ActiveDataModel. This works perfectly: as soon as I receive new data in my bean, the table is automatically updated (e.g. a new row is inserted). I use JDev version 11.1.1.3.
    The problem is that I cannot get the scroll bar to move down to the last row automatically when new data arrives. The auto-scroll only works for the client that actually performs the submit. In other words, the person who submits the new data will see their scrollbar move down; this is because I added the following code:
    RowKeySet rowkeysset = chatTableBinding.getSelectedRowKeys();
    rowkeysset.clear();
    rowkeysset.add(getRowKey());
    AdfFacesContext.getCurrentInstance().addPartialTarget(this.chatTableBinding);
    This does not work for the clients that receive the new data via the Active data model. Apparently the active data service partially refreshes the table, but not in such a way that it sets the selected row to the last row?
    My table definition:
    <af:table value="#{pageFlowScope.chatBean.instantMessagingChatwindowBean}" var="row"
    id="t1" columnStretching="last" rows="10"
    styleClass="AFStretchWidth" autoHeightRows="10" contentDelivery="immediate"
    horizontalGridVisible="false"
    verticalGridVisible="false"
    disableColumnReordering="true" displayRow="last"
    rowSelection="single"
    binding="#{pageFlowScope.chatBean.instantMessagingChatwindowBean.chatTableBinding}"
    inlineStyle="height:80px;">
    p.s.
    The table is used to show incoming chat messages.
    I am building a chat client (task flow) in a WebCenter application. The concept is similar to the one presented by Lucas Jellema in his blog post on building a Google Talk client: http://www.oracle.com/technetwork/articles/jellema-googletalk-094343.html

    Dan,
    Thanks for posting your findings. You saved me quite some time.
    <af:selectBooleanCheckbox id="showOnlyActiveSubscriptions" selected="true"
                              label="Show only active services:"
                              value="#{servicePortfolioBean.showOnlyActiveSubscriptions}"
                              autoSubmit="true" />
    <af:table binding="#{servicePortfolioBean.serviceTable}" ...
              partialTriggers="showOnlyActiveSubscriptions">
    Of course, when I add a "ValueChangeListener" to the selectBooleanCheckbox, the PPR stops working. :( I wonder why?

  • Data Modeling with BPC

    Hello, does anybody know of a paper that explains the data modeling process in BPC?
    Thanks
    Sergio

    Hi Sergio-
    Key to the data modeling question is the recognition that BPC uses a true account-based data model. This means only one key figure is available in your modeling design.
    The properties of the individual dimensions are key for much of the BPC-delivered functionality. For example, Currency Translation requires several specific properties across multiple applications to work correctly.
    Also, there is functionality to move data from application to application (SAP will soon be posting a How-To guide for this!), but currently there is no capability to move data (using the available delivered BPC tools) from AppSet to AppSet. However, if you consider the creation of custom process chains and the power of NetWeaver BW tools, you are essentially able to manipulate records from/to anywhere.
    Regards,
    Sheldon

  • SQL Developer Data Modeling - import from Oracle Designer Model

    Hi,
    I want to try to import a model from a Designer repository.
    The first step is to create a connection to the repository (9i version).
    I have created one; however, any attempt to test the connection or to go to the next step simply finishes with error "4", without any valuable message.
    Any idea or suggestion ?
    Thanx.

    A clarification of the Data Modeling feedback application and this forum. The developers are part of the SQL Developer development team, but as with all our features, each developer has a focus area, so the Data Modeling developers will tend to answer those questions. There will not be a separate Data Modeling forum, so once we're in production all SQL Developer Data Modeling questions will be on this forum. We want to encourage all users to get into the habit of tagging posts. Having posts tagged means that users can skip them if they're not in their focus area, and users can search using the tags. The more the tags are used in all the forums, the better. You may well find the answers to your questions outside this forum!
    As SQL Developer Data Modeling is in its first early adopter phase, we have provided a feedback application where users can provide feedback, log bugs and ask questions. We did the same for SQL Developer before its initial production release and found this to be very useful for the product.
    I'm happy for "How To" questions to be on this forum, as the answers benefit the broader community.
    Sue Harper
