BPC FACT Tables

Dear all,
Please tell me the purpose of the following BPC FACT tables in which data is stored, and how they differ from each other.
1) dbo.tblfactFinance
2) dbo.tblfac2Finance
3) dbo.tblWBFinance
PS: As an observation, dbo.tblfac2Finance did not contain any BUDGET category entries.
Regards,
Ankush

As per the tuning doc:
WB – real-time data input (ROLAP partition)
This is the most current data sent to the system. Data submitted via BPC for Excel data sends and Investigator browser data sends is placed in real-time storage.
FAC2 – short-term storage and Data Manager imports (MOLAP partition)
This is data that is not real-time data, but is also not in long-term storage yet. When you load data via Data Manager (automatic data load from external data sources), it loads the data to short-term storage so that the loaded data does not affect system performance. Only the cube partition associated with this table is processed, so the system is not taken offline.
FACT – long-term history (MOLAP partition)
This is the main data storage. All data eventually resides in long-term storage. Data that is not accessed very often remains in long-term storage so that the system maintains performance.
This structure allows SAP BPC to maintain the same performance over time even when there is a large increase in data volumes.
Periodically clearing real-time data greatly optimizes the performance of the system, and an "Optimization" process is provided for this (it can be scheduled automatically based on parameters such as a record-count threshold).
Lite Optimization:
- Clears real-time data storage (WRITEBACK) and moves it to short-term data storage (FAC2). This option doesn't take the system offline and can be scheduled during normal business activity.
Incremental Optimization:
- Clears both real-time and short-term data storage (WB and FAC2) and moves both to long-term data storage (FACT).
- This option does not take the system offline, but it is heavier than a lite optimization, so it should be run during off-peak periods of activity.
Full Process Optimization:
- Clears both real-time and short-term data storage and processes the dimensions.
- This option takes the system offline and takes longer to run than the incremental optimization.
- It is best scheduled during down-time periods, for example after a month-end close.
The Compress Database option is available to rationalize the fact tables. "Compress" sums multiple entries for the same CurrentView into one entry so that data storage space is minimized. Compressed databases also process more quickly.
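As an illustration only (a minimal sketch; it assumes the default dbo.tblFact/tblFac2/tblWB naming of a "Finance" application, as in the original question), a quick way to see how records are distributed across the three storage areas is a count per table:
SELECT 'WB (real-time)' AS StorageArea, COUNT(*) AS RecordCount FROM dbo.tblWBFinance
UNION ALL
SELECT 'FAC2 (short-term)', COUNT(*) FROM dbo.tblfac2Finance
UNION ALL
SELECT 'FACT (long-term)', COUNT(*) FROM dbo.tblfactFinance;
A large and steadily growing WB count is usually the sign that a lite optimization is due.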

Similar Messages

  • Add primary key column to fact tables?

    Our data warehouse folks asked me if they could add a primary key column to the BPC fact tables. Does anyone know if this is possible?

    Well, based on my experience, it is not possible.
    When you run the admin console, it will sometimes return an error message.
    I tried it before, but you can try it again. Maybe the developers have changed its behavior.
    Thanks.

  • Duplicate records in Fact Tables

    Hi,
    We are using BPC 7.0 MS SP7. BPC created duplicate records in the WB and Fac2 tables. We faced a similar issue before and the solution was to reboot the server and clean up the additional data created. I do not think it is an issue with the script logic files we have. We had the issue across all applications. Data is fine now after the server reboot and running the same logic files. I want to know if anyone has faced this issue and whether there is any solution other than a reboot. I appreciate your help.
    Thanks
    Raj

    Hi Sorin,
    I know this thread is rather old, but I have a problem which is closely related to it and would appreciate your assistance. I have a client running on 7.0 MS who has been using it for the past 3 years.
    It is a heavily customized system with many batch files running daily to update dimensions, copy data and sort. And yes, we do use custom packages that incorporate stored procedures.
    Recently, with no change in the environment, we found our factwb table ballooning out of nowhere. The fact table only contains less than 1 GB of data, but factwb has 200 GB of data and has practically paralyzed the system. There is also an equivalent 300 GB increase in the log files.
    We have not been able to find out what caused this, or whether the 200 GB of records in WB are even valid records that were duplicated. Is there a way to troubleshoot this?
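    For what it is worth, one way to see whether the extra volume really consists of duplicated records is to group the write-back table by its dimension columns (a hedged sketch only; the table and column names below are placeholders and must match your application's actual dimensions):
    SELECT [ACCOUNT], [CATEGORY], [ENTITY], [TIMEID], COUNT(*) AS Copies, SUM([SIGNEDDATA]) AS Total
    FROM dbo.tblWBFinance
    GROUP BY [ACCOUNT], [CATEGORY], [ENTITY], [TIMEID]
    HAVING COUNT(*) > 1;
    Groups with more than one row point at genuine duplicates rather than ordinary data growth.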

  • Import SQL into Fact Table - Conversion Error

    Hi All,
    I am new to BPC MS and do not know much about MS SQL.
    I am facing a conversion error while converting between unicode and non-unicode string data types.
    We are using SAP BPC 10.0 MS SP13, EPM-Addin SP18 on .Net 4
    The below is the error log
    Total Step: 4
    SQLToTxt: Failed in 0 seconds
    Import SQL into Fact Table: Failed in 0 seconds
    [Selection]
    DB = BPC
    TABLE = GlBalanceOpening
    COLUMNS = Account,Auxiliary,Curr,BalanceDateTime,Amount
    TRANSFORMATION = \JMG\LIQ_JMG\DataManager\TransformationFiles\GLOpeningBalances.xls
    CLEARDATA = No
    RUNLOGIC = No
    PROCESSCUBE = Yes
    CHECKLCK = No
    [Message]
    An error occurred while executing a package.
    Package Error Events: 
    ErrorCode = -1071636234
    Source = SQLToTxt
    SubComponent= OLE DB Source [68]
    Description = Column "Account" cannot convert between unicode and non-unicode string data types.
    IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
    Package Error Events: 
    ErrorCode = -1071636234
    Source = SQLToTxt
    SubComponent= OLE DB Source [68]
    Description = Column "AccountName" cannot convert between unicode and non-unicode string data types.
    IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
    Package Error Events: 
    ErrorCode = -1071636234
    Source = SQLToTxt
    SubComponent= OLE DB Source [68]
    Description = Column "Auxiliary" cannot convert between unicode and non-unicode string data types.
    IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
    Package Error Events: 
    ErrorCode = -1071636234
    Source = SQLToTxt
    SubComponent= OLE DB Source [68]
    Description = Column "AuxiliaryName" cannot convert between unicode and non-unicode string data types.
    IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
    Package Error Events: 
    ErrorCode = -1071636234
    Source = SQLToTxt
    SubComponent= OLE DB Source [68]
    Description = Column "Curr" cannot convert between unicode and non-unicode string data types.
    IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
    Package Error Events: 
    ErrorCode = -1071636234
    Source = SQLToTxt
    SubComponent= OLE DB Source [68]
    Description = Column "Dr" cannot convert between unicode and non-unicode string data types.
    IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
    Package Error Events: 
    ErrorCode = -1071636234
    Source = SQLToTxt
    SubComponent= OLE DB Source [68]
    Description = Column "Cr" cannot convert between unicode and non-unicode string data types.
    IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
    Package Error Events: 
    ErrorCode = -1073450901
    Source = SQLToTxt
    SubComponent= SSIS.Pipeline
    Description = "component "OLE DB Source" (68)" failed validation and returned validation status "VS_ISBROKEN".
    IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
    Package Error Events: 
    ErrorCode = -1073450996
    Source = SQLToTxt
    SubComponent= SSIS.Pipeline
    Description = One or more component failed validation.
    IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
    Package Error Events: 
    ErrorCode = -1073594105
    Source = SQLToTxt
    SubComponent=
    Description = There were errors during task validation.
    IDOfInterfaceWithError= {B4E78907-3D9C-4229-9DB9-6A311E45C779}
    Thanks and Regards,
    Raj

    Hi Raj,
    You have to modify the script of your package; please see note 1629737 - Error in ImportSQL Data Manager Package.
    Regards
         Roberto
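    For context, the error means one side of the pipeline is non-unicode (varchar) while the other expects unicode (nvarchar). Independently of what the note prescribes, one common workaround when you control the source query is to cast the offending columns to NVARCHAR in the SELECT that feeds the package, along these lines (a sketch only; the column lengths are assumptions):
    SELECT CAST(Account AS NVARCHAR(50)) AS Account,
           CAST(Auxiliary AS NVARCHAR(50)) AS Auxiliary,
           CAST(Curr AS NVARCHAR(3)) AS Curr,
           BalanceDateTime,
           Amount
    FROM dbo.GlBalanceOpening;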

  • Update DIM data in Fact Table

    All -
    I have TSR dimension data (1 of 12 dimensional attributes) in our forecast fact table which gets updated monthly in our source system, and we have to reflect this in our BPC system. What is the best method of updating this data in BPC?
    1. Can I update the TSR directly in the forecast fact table and reprocess the application to reflect this in the cube?
    2. Do I have to negate that entire fact table row with the old TSR and insert a new row with the new TSR and process the cube?
    3. Any other method?
    Thanks for your help.

    You need to explain your requirement a bit more. Why do you want to add a type or rejection reason to the fact table?
    In general you should avoid adding textual data to a fact table.
    Fact tables are only supposed to store keys and measures. They can hold textual data in the case of a degenerate dimension (i.e. if the number of records is the same in the dimension as in the fact).
    Type or rejection reason are going to have far fewer records. You can create a new dimension for them, holding perhaps 20 or 30 records (a minimal sketch follows below).
    The disadvantage of adding textual data to a fact is that once the fact has millions of records, fact table updates are going to take a huge amount of time and your reports will also be slow.
    If you have a rejection reason dimension with some 20 records, you can use the rejection reason in prompts.
    But if the same rejection reason comes from the fact table, your prompt performance is going to be very slow, since it has to fetch 20 distinct values from millions of records.
    Consider these points before adding textual data to facts.
    Thanks
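    To make that concrete, here is a minimal sketch of such a small dimension (all table and column names are hypothetical placeholders):
    CREATE TABLE DimRejectionReason (
        RejectionReasonKey INT IDENTITY(1,1) PRIMARY KEY,
        RejectionReasonDesc VARCHAR(100) NOT NULL  -- the text that would otherwise sit in the fact
    );
    -- the fact then carries only the small integer key instead of repeating the text
    ALTER TABLE FactForecast ADD RejectionReasonKey INT;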

  • Why negative values are entering into the fact table

    Dear All,
    After the data file was uploaded, the fact table is showing negative values.
    Can anybody suggest a solution?
    Also, in Data Manager -> Financial Process -> FX Restatement (it runs successfully),
    the currency is not getting translated.
    Please help.
    Thanks,
    Satish.

    The reason the values are negative may be one of two things. The first is the default method by which BPC stores information; BPC is designed to store values based on the natural sign. So, if a Revenue account is positive in the data file, and the property "ACCTTYPE" is INC, the value is stored as a negative number. When an expense value is loaded to an account with "EXP", the value is stored as a positive. LEQ values are negative and AST are positive (credits and debits storage).
    BPC then uses tables and measures to report the account correctly based on type in the Excel interface. This is how BPC is designed.
    There is also the CREDITPOSITIVE = YES / NO setting that is part of the data transformation instructions, and if this is set up incorrectly, you may store the reverse signs from the data file that is loaded. So this may need to be reviewed as well.
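    To illustrate the natural-sign convention with a rough sketch (the staging table and column names below are assumptions about a typical BPC MS model, not a definitive implementation):
    -- a loaded value of +100 on an INC or LEQ account is stored as -100; +100 on EXP or AST stays +100
    SELECT s.Account,
           CASE WHEN a.ACCTTYPE IN ('INC', 'LEQ') THEN -1 ELSE 1 END * s.LoadedValue AS SignedData
    FROM StagingLoad s                         -- hypothetical staging table
    JOIN dbo.mbrAccount a ON a.ID = s.Account; -- assumed account dimension member table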
    As for the FX translation, I assume you have rates in the RATE cube to use in the translation process. You will need to verify that the RATETYPE property is filled in for the accounts you wish to translate. Did you process the FX script logic and the RATE script logic prior to running FX?
    Hope this helps.

  • Experiences of Partitioning FACT tables

    Running BPC 7.0 SP3 for MS
    We have two very large FACT tables (195 million records and 105 million records), and these are currently growing at a rate of 2m/5m records per month - we are running an incremental optimize twice per day.
    It has been suggested that we consider partitioning the tables to improve performance, but I have not been able to find any users/customers with experience of doing this.
    Specifically:
    1. Does it improve performance?
    2. What additional complexity does it add to regular maintenance?
    3. Have there been any problems encountered implementing partitioned tables?
    4. It would seem that partitioning based on time would make sense - historic data in one partition, current data in another. HOWEVER, many of our reports pull current year and prior year, so will this cause a reporting issue, or degrade report performance?

    I don't know if this is still an issue for you. You ask about FACT table partitioning specifically, but you need to be aware that it is possible to partition either the FACT tables or the fact table partition of the cube, or both. We have used (further) partitioning of the fact table partition in the cube with success, and it sounds as if this is what you are really asking about.
    The impacts are on
    1. Processing time: a full optimize without Compress only processes the partitions that have changed, thereby reducing the run time where there is a lot of unchanged data. You mention that you run incremental optimizes twice daily; this is currently reprocessing the whole database. I would have expected the lite optimize to be more effective, supported by an overnight full optimize, if you have an overnight window. You can also run the lite optimize more frequently.
    2. Query time: the filters defined in the partitions provide a more efficient path to the data in the reporting processes than the defaults, which have the potential to scan large parts of the database.
    Partitioning is not a panacea. You need to be specific about the areas of performance problem that you have and choose the performance improvement strategy to address these.  Looking at the indexing of the database is also an area where you can improve performance significantly.
    If you partition the cube, it is transparent to the usage of the application, from both the user and admin perspective. The greatest complexity comes in the definition of the partitions in the first place, but this is a normal DBA function. The trick is to ensure that the filter statements do not overlap, otherwise you might get a value duplicated in 2 partitions, and to define a catch-all partition to include anything not covered by the specific partitions. You should expect to revisit the partitioning from time to time. It is quite straightforward to repartition; you are not doing anything to the underlying data in the FACT tables.
    Time is a common dimension to partition on, and you may partition at different levels of granularity for different periods, e.g. current year by quarter or month, prior and future years by year. This reflects where the most frequent updates will be. It is also possible to define partitions based on combinations of dimensions; we use category and time, so that current year actuals has the most granular partitions and all historic years' budgets go into a single partition.
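    To picture the non-overlap plus catch-all idea, the partition source queries might look something like this (illustrative only; the table name and the yyyymm00-style TIMEID format are assumptions that would need to match your model):
    -- history: all periods before the current year
    SELECT * FROM dbo.tblFactFinance WHERE TIMEID < 20090100
    -- current year
    SELECT * FROM dbo.tblFactFinance WHERE TIMEID >= 20090100 AND TIMEID < 20100100
    -- catch-all: anything beyond, so no record can fall between partitions
    SELECT * FROM dbo.tblFactFinance WHERE TIMEID >= 20100100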

  • Logical level in Fact tables - best practice

    Hi all,
    I am currently working on a complex OBIEE project/solution where I am going straight to the production tables, so the fact (and dimension) tables are pretty complex, since I am using multiple sources in the logical tables to increase performance. Anyway, what I often struggle with is the Logical Levels (in the Content tab) where the level of each dimension is set. In a star schema (one-to-many) this is pretty straightforward and easy to set up, but when the business model (and physical model) gets more complex I sometimes struggle with the aggregates - getting them to work/appear with different dimensions. (Using the menu "More" - "Get Levels" does not always give the best solution...far from it.) I have some combinations of left and right outer joins as well, making it even more complicated for the BI Server.
    For instance - I have about 10-12 different dimensions - should all of them always be connected to each fact table, either at Detail or Total level? I can see the use of the logical levels when using aggregate fact tables (at quarter, month etc.), but is it better just to skip the logical level setup when no aggregate tables are used? Sometimes it seems like that is the easiest approach...
    Does anyone have a best practice concerning this issue? I have googled for this but I haven't found anything good yet. Any ideas/articles are highly appreciated.

    Hi User,
    "For instance - I have about 10-12 different dimensions - should all of them always be connected to each fact table? Either on Detail or Total level." It is not necessary to connect to all dimensions; it depends on the report you are creating. But as a best practice, when you define the join conditions in the physical layer, you should set them all at the Detail level.
    For example, for the sales table, if you want to report at the ProductDimension.Productname level then you should use the Detail level, else the Total level (at the Product, Employee level).
    "Get Levels (available only for fact tables): Changes aggregation content. If joins do not exist between fact table sources and dimension table sources (for example, if the same physical table is in both sources), the aggregation content determined by the Administration Tool will not include the aggregation content of this dimension."
    Source: Admin Guide (Get Levels definition)
    thanks,
    Saichand.v

  • Parent member values in Fact tables

    Hello,
    I want to understand something: as far as I know, we can only send data to base-level members, right?
    Then how come we find rows of data that have parent member values in the fact tables (assuming we do not play manually with the database, of course)? I thought this could be due to an import with the Data Manager; can this be right?

    nilanjan chatterjee wrote:
    Hi,
    >
    > The data for the parent members should be available in the SQL tables.
    > For example, 2011.TOTAL is parent member. You should not have any data for this member in your database. If it is there, it might have come somehow (may be an import). But this is not right. You might want to remove these records. But be sure that you dont delete the records for the base level members.
    >
    > Hope this helps.
    I guess you meant should not, right ?
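    If it helps, one way to check for such records is to join the fact table to the dimension member table and filter on calculated members (a rough sketch; it assumes the standard BPC MS naming of tblFactFinance / mbrAccount and a CALC property, which may differ in your environment):
    SELECT f.*
    FROM dbo.tblFactFinance f
    JOIN dbo.mbrAccount a ON a.ID = f.[ACCOUNT]
    WHERE a.CALC = 'Y';  -- rows posted against calculated (parent) accounts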

  • Null key in fact tables

    Hi all,
    I have one role-playing dimension with some null keys in the fact table. I would like to know whether this is good practice?
    thanks

    It depends on what you actually mean.
    When a dimension column contains NULL in your fact table, that is normally bad practice. Create an entry in your dimension table to represent this state.
    The problem is that NULL is a state which means missing or inapplicable information. This makes the row in the fact table semantically meaningless, because your fact table is no longer additive over this dimension.
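    A common pattern, sketched below with placeholder names, is to add an explicit "Unknown" member and point the orphaned fact rows at it:
    -- placeholder table and column names; adapt to your own model
    INSERT INTO DimCustomer (CustomerKey, CustomerName) VALUES (-1, 'Unknown');
    UPDATE FactSales SET CustomerKey = -1 WHERE CustomerKey IS NULL;
    That keeps the fact table additive over the dimension and lets reports group the "Unknown" rows explicitly.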

  • Help on setting logical levels in fact tables and on dimension tables

    Hi all
    Can anybody provide any blogs or any kind of material on what exactly levelling is?
    For example, after creating the dimensional hierarchies we need to set the logical levels for the LTS of the fact tables, right? So what is the difference between setting logical levels on fact tables and setting levelling on dimension tables?
    Any kind of help is appreciated
    Thanks
    Xavier.

    I have read these blogs, but my question is this:
    I understood setting the logical levels in the LTS of fact tables.
    But we can also set the logical levels for dimensions, right? I didn't understand why we set the logical levels for dimensions. Is there any reason why we go with levelling at dimensions?
    Thanks
    Xavier

  • Logical level for logical fact table sources

    It is clear that for fact aggregates, we should use the Content tab of the Logical Table Source dialog to assign the correct logical level to each dimension.
    The question is: is it mandatory to assign the logical level for each dimension even for non-aggregate fact tables (where it would normally be set to the most detailed level of each dimension)? Are there any known issues if the logical levels in the Content tab are not set?
    The reason I'm asking is a strange bug I have (which I'm not going to discuss here), where the only workaround seems to be NOT setting the logical levels (on the Content tab) for the logical fact table sources.
    thank you !

    If levels are not set, they are by default considered to be the lowest level.
    It should not matter whether you set them or not.
    Generally we set them for facts explicitly when we are using aggregate tables.
    Your current issue might be case by case; I would suggest checking for an implicit fact column, any table mapped to the source to force a join, etc.
    Mark as helpful if this helps.
    Let me know how it goes.
    Any updates on this?

  • Best way to combine multiple fact tables in single mart

    Hi, quick question that I think I know the answer to, just wanted to bounce it off everyone here to make sure I'm on the right track.
    I have a HR datamart that contains several different fact tables. Some of the facts are additive across time (i.e. compensation - people get paid on different days, when I look at a month I want to see the total of all pay dates within that month). The other type of fact is more "status over a set of time" - i.e. a record saying that I'm employed in job X with a salary of Y from a given start date to a given end date.
    For the "status over time" type facts, if I choose January 2009 (month level) in the time dimension, what I'd really like to see is the fact records that were in place "as of" the last day of the month - i.e. all records where the start date is on or before 1/1/2009, and whose end date is on or after 1/1/2009. Note that my time dimension does go down to the day level (so you could look at a person "as of" the middle of the month, etc. if you're browsing on a day-by-day basis)
    I've set up the join between the time dimension and the fact table as a complex join in the physical layer, with a clause like "DIM_DATE.DATE >= FACT.START_DATE AND DIM_DATE.DATE <= FACT.END_DATE". This seems to work perfectly at the day level - I have no problems at all finding the proper records for a person as of any given day.
    However, I'm not quite sure how to proceed at the month level. My initial thought is:
    a) create a new LTS for the fact table at the month level
    b) in the new LTS, add the join to the time dimension
    c) in the new LTS, add a where clause similar to LAST_DAY_IND = 'Y' (true for the last day of each month).
    Is this the proper way to do this?
    Thanks in advance!
    Scott

    Hi Scott,
    I think you're on the right track but I don't think you need the last part. Let me generalize the situation to the following tables
    DAILY_FACT (
    DAILY_FACT_KEY NUMBER, -- PRIMARY KEY
    START_DATE_KEY NUMBER, -- FOREIGN KEY TO DATE DIMENSION FOR START DATE
    END_DATE_KEY NUMBER, -- FOREIGN KEY TO DATE DIMENSION FOR END DATE
    DAILY_VALUE NUMBER); -- FACT MEASURE
    MONTHLY_FACT(
    MONTHLY_FACT_KEY NUMBER, -- PRIMARY KEY
    MONTH_DATE_KEY NUMBER, -- FOREIGN KEY TO DATE DIMENSION, POPULATED WITH THE KEY TO THE LAST DAY OF THE MONTH
    MONTHLY_VALUE NUMBER); -- FACT MEASURE at MONTH LEVEL. DATE_KEY is at END of MONTH
    DIM_DATE(
    DATE_KEY NUMBER,
    DATE_VALUE DATE,
    DATE_MONTH VARCHAR2(20),
    DATE_YEAR NUMBER(4));
    DIM_DATE_END (ALIAS OF DIM_DATE for END_DATE_KEY join)
    Step 1)
    Make the following three joins in the physical layer:
    a. DAILY_FACT.START_DATE_KEY = DIM_DATE.DATE_KEY
    b. DAILY_FACT.END_DATE_KEY = DIM_DATE_END.DATE_KEY
    c. MONTHLY_FACT.MONTH_DATE_KEY = DIM_DATE.DATE_KEY
    Note: The MONTHLY_FACT MONTH_DATE_KEY is joined to the same instance of the date dimension as the START_DATE_KEY of the DAILY_FACT table. This is because these are the dates you want to make sure are in the same month.
    Step 2)
    Create a business model and drag DIM_DATE, DAILY_FACT and DIM_DATE_END into it.
    Step 3)
    Drag the physical table MONTHLY_FACT into the logical table source of the logical table DAILY_FACT.
    Step 4)
    Set DAILY_VALUE and MONTHLY_VALUE to be aggregates with a "SUM" aggregation function
    Step 5)
    Drag all required reporting columns to the Presentation layer.
    Step 6)
    Create your report using the two different measures from the different fact tables.
    Step 7)
    Filter the report by the Month that joined to the Start Date/Monthly Date (not the one that joined to the end date).
    Step 8)
    You're done.
    The act of combining the two facts into one logical table allows you to report on them at the same time. The strategy of joining the START_DATE_KEY and the MONTH_DATE_KEY allows you to make sure that the daily measure start date will be in the same month as the monthly fact table.
    Hope that helps!
    -Joe

  • Content tab for a fact table

    Hi
    Please help me understand the use of the Content tab for a fact table in the OBIEE repository.
    Thanks.

    If you have multiple LTSs then you should set the content level appropriately, otherwise you can get errors during consistency checks. I am not able to find any link which talks only about the content level; see these links and let us know if you have any doubts:
    http://kr.forums.oracle.com/forums/thread.jspa?threadID=604637
    Content tab is also handy when you are using aggregate tables.
    Regards,
    Sandeep

  • Content Tab: None of the fact tables are compatible with the query request

    Hi All,
    One thing I am still not clear on, after all my years with OBIEE, is working with the Content tab in the BMM.
    I have made a rpd the joins in physical layer as shown below:
    https://picasaweb.google.com/114804305606242416264/OBIEEError#5663056545119428530
    And the BMM layer as:
    https://picasaweb.google.com/114804305606242416264/OBIEEError#5663056519553812930
    The error I am getting when I run a request with three columns from the selected three tables is:
    Dim - Comment Code Details
    Fact - Complaint
    Dim - Service Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 14020] None of the fact tables are compatible with the query request Sr Num:[DAggr(Fact - Complaint.Sr Num by [ Dim - Service Details.Sr Cat Type Cd, Dim - Comment Code Details.Cmtcode name] )]. (HY000).
    I get no errors during the consistency check. I have read everywhere that I need to set the appropriate aggregation levels in the LTS properties of the various dims and facts to help OBIEE understand our model, but how do I do that? How do I decide? How should I approach it, what should the aggregation level be, at what detail?
    When I click the More button I see different options: Copy, Copy From, Get Levels, Check Level. What do these mean?
    Aggregation content, group by - Logical Level or Column: which one should I choose, and how should I decide?
    Can anyone explain the Content tab in detail and from scratch, with some example, and why we get these errors? I know many people who are well versed in many other things related to the RPD, but not this. A little effort in explaining from you guys will really be appreciated.
    Thanks in advance,
    Dev

    Hi Deepak,
    Option 1:
    My tables in physical layer are joined as below:
    D1--> F1 <--D2--> F2 <--D3
    Same way i model it in BMM
    D1--> F1 <-- D2--> F2 <--D3
    Here D1 is a non-conformed dimension for F2, and D3 is a non-conformed dimension for F1. After creating the dimensional hierarchies, I tried setting up the content levels.
    In Sources > Content tab of fact F1 I set:
    Dimensions----------- Logical level
    D1---------------------- D1 Detail
    D2---------------------- D2 Detail
    D3---------------------- D3 Total
    Then, in Sources > Content tab of fact F2 I set:
    Dimensions----------- Logical level
    D1---------------------- D1 Total
    D2---------------------- D2 Detail
    D3---------------------- D3 Detail
    Then I also go into all the dimensions and set their content levels to Detail, but it still gives me errors, and I am not sure where I am going wrong in setting the content levels.
    I need to know whether the way I have modeled it in BMM is right,
    Option 2:
    I can combine the two facts into a single logical fact, or the above design should also work:
    (F1&F2) <-- D1, D2, D3 joined separately using complex logical joins.
    What would the Content tab details be in that case?
    Thanks,
    Dev
