Calculation Field to be mapped to different data object's column name in Oracle BAM 12c

Hi,
I am having a challenge enabling drill-down to a second-level report by passing a calculated field as a parameter.
As an alternative, I am thinking of pointing the calculated field at another data object's column and generating the report from that, so that I can pass the column as a parameter to the drill-down report view.
Is there any way to map a calculated field to a different data object's column name? Thanks in advance.
Regards
Amik Basu

1. Yes, you can.
SQL> create table ÜÝÞ( ßàá number(10));
Table created.
SQL> insert into ÜÝÞ values (10);
1 row created.
1.1, 1.2 and 2. You can choose a UTF character set as your database default. It allows the use of non-English characters in VARCHAR2 columns across the whole database; it is not set per tablespace.
SQL> create table ÜÝÞ( ßàá varchar2(100));
Table created.
SQL> insert into ÜÝÞ values ('âãäçìé');
1 row created.
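If you are unsure which character set the database currently uses, a quick check against the standard NLS_DATABASE_PARAMETERS dictionary view is a reasonable sketch; for full Unicode support the value is typically a UTF character set such as AL32UTF8:
SQL> -- check the database-wide character set (applies to the whole database, not per tablespace)
SQL> select value from nls_database_parameters where parameter = 'NLS_CHARACTERSET';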

Similar Messages

  • Reusing Human Tasks across BPM Processes with different Data Objects

    Hi
    JDeveloper 11.1.1.6, WLS 10.3.6, SOA/BPM 11.1.1.6
    I have defined 2 BPM Processes, P1 and P2, which has 2 different Process Data Objects O1 and O2.
    But I am reusing the same Human Task in both the processes.
    For the Human Task to support the activities in both the processes, I have added O1 and O2 Data Objects in the Human Task Definition.
    And hence the ADF Taskflow / page generated out of the Human Task will have both the Data Objects O1 and O2 as payload objects in the Page.
    When an instance of Process P1 is created, the payload O1 will have values, but O2 will be null.
    And when an instance of Process P2 is created, the payload O2 will have values, but O1 will be null.
    It works well like this, but I am a bit concerned about the performance from BPM Process and also ADF page.
    In ADF page, let us say, somehow I can control the rendering of the attributes or creation of the iterator bindings based on identifying which process is being executed.
    (By setting the iterator binding refresh property in page definition)
    In this way attribute bindings for O1 will not be created when P2 is being executed.
    But still when the Process instance is created, and when we see the payload structure of the human task in the EM - Audit Trail,
    I still see both data objects O1 and O2 are created in the Payload, but O1 will have nulls in process P2.
    So my question is, from performance point of view, is it advisable to define different Data Objects in reusable Human tasks ?
    Or should I have to define a separate Human Task definition which contains only the Data Objects related to the process it is being called from ?
    Thanks for any help
    Sameer 

    Martijn,
    You are correct in your assessment that the JAG in the current JHeadstart release cannot cope with multiple bc4j packages. There is no work around for this. Upgrading to 9051 will not help.
    In the JHeadstart-ADF release, this restriction has been lifted. You can place your EO, VO and AM objects in different packages, and you can group them in a separate project (Model project), while generating your JHeadstart application in a ViewController project.
    We have a number of customers that use the latest JHeadstart-ADF builds to build their ADF apps. If you are interested in joining this beta program, please send an e-mail to [email protected]
    Steven Davelaar,
    JHeadstart Team.

  • How to Select data using same column name from 3 remote database

    Hi,
    Can anyone help me on how to get data with same column names from 3 remote database and a single alias.
    Ex.
    SELECT *
    a.name, b.status, SUM(b.qty) qantity, MAX(b.date) date_as_of
    FROM
    *((table1@remotedatabase1, table1@remotedatabase2, table1@remotedatabase3)a,*
    *(table1@remotedatabase1, table1@remotedatabase2, table1@remotedatabase3)b)*
    WHERE b.dept = 'finance'
    AND a.position = 'admin'
    AND a.latest = 'Y' AND (b.status <> 'TRM') AND b.qty > 0;
    GROUP BY a.name, b.status ;
NOTE: the statements in bold are just an example of what I want to do, but I always get an error because of ambiguous columns.
Thanks in advance. :)
    Edited by: user12994685 on Jan 4, 2011 9:42 PM

user12994685 wrote:
Can anyone help me on how to get data with same column names from 3 remote database and a single alias.
Invalid. This does not make sense and breaks all scope resolution rules. And whether this is in a single database, or uses tables across databases, is irrelevant.
Each object must be uniquely identified. So you cannot do this:
select * from (table1@remotedatabase1, table1@remotedatabase2, table1@remotedatabase3) a
Three objects cannot share the same alias. Example:
SQL> select * from (dual, dual) d;
select * from (dual, dual) d
ERROR at line 1:
ORA-00907: missing right parenthesis
You need to combine the objects - using a join or union or similar. So it will need to be done as follows:
SQL> select * from (select * from dual d1, dual d2) d;
select * from (select * from dual d1, dual d2) d
ERROR at line 1:
ORA-00918: column ambiguously defined
However, we need to have unique column names in a SQL projection - so the join of the tables needs to project a unique set of columns. Thus:
SQL> select * from (select d1.dummy as dummy1, d2.dummy as dummy2 from dual d1, dual d2) d;
DUM DUM
X   X
SQL>
I suggest that you look closely at what scope is and how it applies in the SQL language - and ignore whether the objects referenced are local or remote, as it has no impact on the fundamentals of scope resolution.
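Applied to the original question, one possible sketch (the a/b split from the posted query is collapsed into a single pass, and column names such as pay_date are stand-ins for whatever the real table holds) is to give each remote table its own SELECT and stack them with UNION ALL, so every column has exactly one unambiguous source:
select name, status, sum(qty) as quantity, max(pay_date) as date_as_of
from (
      -- same projection from each remote copy of the table
      select name, status, qty, pay_date, dept, position, latest
      from   table1@remotedatabase1
      union all
      select name, status, qty, pay_date, dept, position, latest
      from   table1@remotedatabase2
      union all
      select name, status, qty, pay_date, dept, position, latest
      from   table1@remotedatabase3
     ) t
where  dept = 'finance'
and    position = 'admin'
and    latest = 'Y'
and    status <> 'TRM'
and    qty > 0
group  by name, status;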

  • Problem with childs nodes and automatic key mapping in a Data Object

    Hi experts!
    I'm doing the service order tutorial from the mobile help at [this link|http://help.sap.com/saphelp_nwmobile71/helpdata/en/21/9b5b924c3b434fba4767731794b029/frameset.htm] and I have a problem...
In the topic "Modeling the Equipment Data Object", it says you have to mark the "Automatic Key Mapping" checkbox. So when I had to create a third child node (the location node), the system raised an exception with the message "Deselect automatic key mapping flag for more than two-level nodes". I tried deselecting the flag and creating the location node, but when I want to mark the automatic key mapping flag again, it is disabled.
What can I do to solve this and create the three child nodes with the flag marked? Is it a configuration thing?
    Any help it's very welcome. Thanks in advance.
    Best regards,
    Simon.

The thing is: it's not allowed to use automatic key mapping if you have more than two levels. This is why the message showed up, and this is why it has been disabled.
What automatic key mapping does: it figures out automatically which child node belongs to which parent (by guessing, from the field names and types, which fields in the child correspond to which key fields of the parent).
On three levels, this becomes more complicated => it's disabled.
How to do key mapping yourself instead of having the DOE do it automatically: do 'Explicit keymapping' from each child to its parent. Explicit keymapping is done by clicking on the corresponding menu button in the child node. Here you need to associate child node fields (they need not be key fields of the child, but they are allowed to be that as well) with each of its parent node's key fields (so that each child can be associated with its parent).
    Cheers

  • Mapping errors between data object and automated activity

    Hi,
In an automated activity of a BPM process I try to call a web service method ("addBook") which expects an object called "bookWrapper". The object "bookWrapper" consists of three variables: "bid" (Integer), "author" (String), "title" (String).
When I try to directly map, for example, "title" out of an existing data object (filled by a human activity: "DO_CreateBook"), I get the error:
Expected: xsd:string
Found: list of 'com.sap.dictionary.string'
    Severity and Description     Path     Resource     Location     Creation Time     Id
    The literal: "Expression for target path 'DO_BookWrapper/title' has errors." do not have type, in expression: "#erroneous=true;sources=TaskOutput/UIResponse/start/title#TaskOutput/UIResponse/start/title".     LocalDevelopmentLocalDevelopmentbook_chapter_process~demo.sap.com/src/bpmn     create book and chapter.bpmn     Expression for target path 'DO_BookWrapper/title' has errors. [Galaxy, Rule, Literal]     1260878200220     4720
I do not know why this error occurs, because both types are strings. Is it due to the fact that I have individual variables instead of an object as input parameters? I used the automatically generated data types as data objects.
Can anybody help me please? Can I use casts?
    Regards

    Hi Martin,
From the error that you get, it looks like you are trying to map a list to a single variable.
    If you want to do so, please see the documentation http://help.sap.com/saphelp_nwce711/helpdata/en/47/63604e42bf6831e10000000a421937/frameset.htm where it is said:
    If you map a node, whose content is a list or a node, with one or more parents that are lists, to a single-valued node, you get an error. Depending on your needs you could provide custom expression, for example to aggregate the list to a single value, or you could do parent mapping between the multi-valued parent nodes.
    So you need to create an EJB function that does the conversion between a list and single value type. Here is the help for functions: http://help.sap.com/saphelp_nwce711/helpdata/en/47/63604e42bf6831e10000000a421937/frameset.htm and especially for EJB functions http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/609c9982-39af-2b10-768b-e5bd8813c1f3.
    I hope this helps you.
    Best regards,
    Radost

  • Bulk Load data without specifying column names in control file

    I am totally new to the world of Oracle databases and am having some trouble doing some basic data imports from a csv file. I have a large amount of data in an excel spreadsheet (a couple hundred columns by varying numbers of rows from 2- several thousand) which I want to import into my Oracle 11g Database. Currently I am trying to use sql loader and specify the import syntax in the control file. From the examples I've seen so far, it seems you must specify the columns individually for what you are importing. This is simply not reasonable for several hundred columns of data.
    Is there sql loader syntax (or a better method all together) to import a comma delimited csv or excel spreadsheet into an oracle database where it automatically generates a table and columns based on the data in the target import file? Even if I must first create the table in the database, my key issue is being able to generate the column names during import to match the column headers in the import file. I appreciate any help you could provide.
    FYI here is my first meager attempt at importing a comma delimited csv, here is my ctl file:
    load data
    infile 'C:\filepath\mock_data.csv'
    into table allData
    fields terminated by "," optionally enclosed by '"'
    It says Syntax error at line 4. Expecting "(", found end of file.
    I suspect there is more at issue here than a missing "("

    You have to create the control file to map to the table.
    Since you would already have created the tables, you could generate the controlfiles based on the table structures.
    See the "unloader.zip" file under the Files link at http://asktom.oracle.com
    -- it has scripts to create unload a table to a flat file and to create the SQLLoader controlfile to reload that data to another target database. It would be the second set of scripts that you can use.
    Hemant K Chitale
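As a lighter-weight alternative, a sketch (assuming the target table is the ALLDATA table from the control file above and Oracle 11gR2 or later for LISTAGG) that lets the data dictionary build the column list for you instead of typing several hundred names by hand:
-- generate the "(col1, col2, ...)" clause for the SQL*Loader control file
select '(' || listagg(column_name, ', ')
              within group (order by column_id) || ')' as ctl_column_list
from   user_tab_columns
where  table_name = 'ALLDATA';
For a very wide table the concatenated list can exceed the VARCHAR2 limit, in which case simply selecting the column names one per row and pasting them works just as well.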

  • Data object is not being populated in BAM

We are putting the data in a JMS queue. We have created an EMS (Enterprise Message Source) in BAM. It is picking the data up from the JMS queue, but the data object is not being populated. The request we are using here is given below:
    Request:
    <?xml version="1.0" encoding="UTF-8"?>
    <Test xmlns="http://education.com/OrderImport">
    <OrderLineReferenceNumber>abcad</OrderLineReferenceNumber>
    <WebOrderNumber>cccccg</WebOrderNumber>
    <OrderLineCount>gn</OrderLineCount>
    <LineNumber>4</LineNumber>
    <ISBN10>46564574</ISBN10>
    <ISBM13>6767</ISBM13>
    <ID>67</ID>
    <Status>78768</Status>
    </Test>
Does anybody have any idea about that?
    Regards,
    Chandrachur.

    Hi Arun,
    Thanks for your prompt reply.
We are not using any transformation here. We have created the data object as per the message type in the request.
Please find below the data object details we are using:
Field name                Field ID                   Field type  Max length  Scale  Nullable  Public  Lookup  Calculated  Tip Text
OrderLineReferenceNumber  _OrderLineReferenceNumber  string      100         -      No        Yes     -       -           -
WebOrderNumber            _WebOrderNumber            string      100         -      Yes       Yes     -       -           -
OrderLineCount            _OrderLineCount            integer     -           -      Yes       Yes     -       -           -
LineNumber                _LineNumber                integer     -           -      Yes       Yes     -       -           -
ISBN10                    _ISBN10                    string      100         -      Yes       Yes     -       -           -
ISBM13                    _ISBM13                    string      100         -      Yes       Yes     -       -           -
ID                        _ID                        string      100         -      Yes       Yes     -       -           -
Status                    _Status                    string      100         -      Yes       Yes     -       -           -
    Thanks again,
    Chandrachur.

  • Where statement on two different tables with same column name

    Hello,
    I have 2 financial tables:
    tblincome, tblexpenses
in each table I have a column named "monthPayed" (holding values for all the months of the year).
I would like to create a balance view that will show me the financial status for Q1 (for example).
I have columns totalIncome and totalExpended respectively.
I've created a view that shows me the financial balance at the moment, but I want it divided into quarters of the year...
    Regards

     
Please post DDL, so that people do not have to guess what the keys, constraints, Declarative Referential Integrity, data types, etc. in your schema are. Learn how to follow ISO-11179 data element naming conventions and formatting rules. Temporal data should use ISO-8601 formats. Code should be in Standard SQL as much as possible and not local dialect.
    This is minimal polite behavior on SQL forums. Where is the code you already tried? Or are you so lazy, so rude or so privileged that you did not do anything for yourself? 
    In fact that silly “tbl-” prefix is so bad that it has a name! It is called Tibbling and Phil Factor even wrote a humor article on bad programmers that use it. 
    Your narrative is vague; it sounds like this is the DDL: 
CREATE TABLE Incomes
(payment_month CHAR(10) NOT NULL PRIMARY KEY,
 income_amt DECIMAL(12,2) NOT NULL
  CHECK (income_amt > 0.00));
CREATE TABLE Expenses
(payment_month CHAR(10) NOT NULL PRIMARY KEY,
 expense_amt DECIMAL(12,2) NOT NULL
  CHECK (expense_amt > 0.00));
    >> in each table I have column name "monthPayed" (have values of all the months of the year). <<
    I hope not! 2014 has not gotten to September, November or December yet! And the ISO-11179 data element name should be “payment_month”; but it is still wrong! An expense is not a payment! An income is not a payment! 
    >> I would like to create a balance view table that will show me the financial status for the Q1 (for example). <<
    Report Period Table
    Since SQL is a database language, we prefer to do look ups and not calculations. They can be optimized while temporal math messes up optimization. A useful idiom is a report period calendar that everyone uses so there is no way to get disagreements in the DML.
    The report period table gives a name to a range of dates that is common to the entire enterprise. 
    CREATE TABLE Something_Report_Periods
    (something_report_name CHAR(10) NOT NULL PRIMARY KEY
       CHECK (something_report_name LIKE <pattern>),
     something_report_start_date DATE NOT NULL,
     something_report_end_date DATE NOT NULL,
      CONSTRAINT date_ordering
        CHECK (something_report_start_date <= something_report_end_date),
    etc);
These report periods can overlap or have gaps. I like the MySQL convention of using double zeroes for months and years, that is, 'yyyy-mm-00' for a month within a year and 'yyyy-00-00' for the whole year. The advantages are that it will sort with the ISO-8601 date format required by Standard SQL and it is language independent. The pattern for validation is '[12][0-9][0-9][0-9]-00-00' and '[12][0-9][0-9][0-9]-[01][0-9]-00'.
    Now figure it out for yourself, show us what effort you put into this, then we will help you. 
--CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice / Data, Measurements and Standards in SQL / SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking in Sets / Trees and Hierarchies in SQL
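Putting those pieces together, a minimal sketch of the quarterly balance (assuming the Incomes/Expenses tables above and a report period table whose start/end columns hold the same 'yyyy-mm-00' month strings as payment_month; all names, including 'Q1-2014', are illustrative):
SELECT P.something_report_name,
       (SELECT COALESCE(SUM(I.income_amt), 0.00)
          FROM Incomes I
         WHERE I.payment_month BETWEEN P.something_report_start_date
                                   AND P.something_report_end_date)
     - (SELECT COALESCE(SUM(E.expense_amt), 0.00)
          FROM Expenses E
         WHERE E.payment_month BETWEEN P.something_report_start_date
                                   AND P.something_report_end_date)
       AS balance_amt
  FROM Something_Report_Periods P
 WHERE P.something_report_name = 'Q1-2014';  -- pick the period you want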

  • 11g bi publisher data model generic columns names for obiee data source?

    I'm just trying to figure out if this is a bug, expected behavior or installation problem. When I select an Oracle BI Analysis to add to the data model diagram the column names are not copied over. Instead they are all renamed as Column0-ColumnNN.
    Thanks
    Edited by: user13087422 on Apr 7, 2011 9:49 AM

I'm facing this same issue. Were you able to solve it? What is the problem here?
    Regards,
    Ariel

  • Using different data source and connection name

    Hi,
We have a requirement where the data source for our application is already configured on the server with a name like MyDBDataSource, but the database connection name in our application is My_DB, so the application module uses a data source name like My_DBDS. If I change the data source name in the application module configuration manually to MyDBDataSource, then it fails at runtime. This has been identified as a bug, as mentioned in this post:
    http://andrejusb.blogspot.in/2009/01/practical-adf-security-deployment-on.html
Is this still valid? I am using version 11.1.1.6. I also want to know if there is any workaround for this. If I change the name in web.xml at deployment time using a script, will it work?
    Thanks

    Hi Ami,
Has your issue been solved?
I would like to mention a few things about the database connection in the application and the data source in WebLogic.
1. When you create a database connection in the application (for example, a connection named hr), then in the Application Module Database Configurations in the bc4j.xcfg file, Custom JDBCDataSource will be set as <Custom JDBCDataSource="java:comp/env/jdbc/hrDS"/> for both the shared and local application module configurations. So when you create the data source in WebLogic it is better to give it a JNDI name like jdbc/hrDS (i.e. between the '/' and the 'DS' you have to specify the name of the database connection you created in the application). The data source name itself does not matter.
2. If you change the JNDI name of the data source later, for example to jdbc/hrconnectionDS, then it is better to delete the old hr connection from the application and recreate a connection named hrconnection. After that you have to edit the bc4j.xcfg file under Application Module --> Configurations and set Custom JDBCDataSource as below for both the local and shared configurations.
    <Custom JDBCDataSource="java:comp/env/jdbc/hrconnectionDS"/>
    I hope this will help you.
    Gijith.

  • Need to get different data from same column in same query

I need party_name from the table hz_parties once to get bank_name and again to get branch_name, in the same query. How can I do that?

Thanks, but it's not fetching what I need. What I need exactly is the following:
    select distinct party_name -- branch_name or bank_name
    from hz_parties hz
    , hz_code_assignments bankca
    , hz_code_assignments branchca
    where
    -- bankca.class_category = 'BANK_INSTITUTION_TYPE'
    -- AND bankca.class_code IN ('BANK', 'CLEARINGHOUSE')
    branchca.class_category = 'BANK_INSTITUTION_TYPE'
    and branchca.class_code IN ('BANK_BRANCH', 'CLEARINGHOUSE_BRANCH')
    AND branchca.owner_table_id = hz.party_id
    -- AND bankca.owner_table_id = hz.party_id
if class_code from table x = branch, then select party_name from table y and call it branch;
else if class_code from table x = bank, then select party_name from table y and call it bank.
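A minimal sketch of that pattern (table names and class codes are taken from the query above; the role labels are illustrative), using a single pass over hz_parties with CASE deciding whether the party_name is reported as a bank or a branch:
select hz.party_name,
       case
         when ca.class_code in ('BANK', 'CLEARINGHOUSE') then 'BANK'
         else 'BRANCH'
       end as party_role
from   hz_parties hz,
       hz_code_assignments ca
where  ca.class_category = 'BANK_INSTITUTION_TYPE'
and    ca.class_code in ('BANK', 'CLEARINGHOUSE',
                         'BANK_BRANCH', 'CLEARINGHOUSE_BRANCH')
and    ca.owner_table_id = hz.party_id;
If bank_name and branch_name are needed as two columns on the same row, the usual approach is to reference hz_parties twice under different aliases, one joined through the bank classification and one through the branch classification, linked by whatever bank-to-branch relationship your data model provides.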

  • Mapping in interconnect between different Business Objects

I want to know how to do transformation and mapping between different Business Objects in InterConnect.
We always have very complex SQL when we do integration with Oracle InterConnect. We use the DB Adapter or JDBC Adapter, but the complex SQL has to be executed in the source DB or the destination DB, which can put a lot of pressure on them. I wonder whether we can use different Business Objects and do the mappings in InterConnect, so that the load is on the InterConnect server, just like with ETL tools. But I find that InterConnect can do transformation and mapping only within one Business Object. How can I do this? Has anyone else met this problem? Thanks for the discussion.

    For me, Business Objects are logical groupings of business processes. For example, we have a Business Object called "Maintain_Employees". Under this we have 1 Procedure (Create_Employee) and 2 Events (Update_Employee and Delete_Employee).
    We have 1 Oracle system interfacing with 23 other legacy systems. Some of these legacy systems will be using this "Maintain_Employees" Business Object (Common View), and our main transformations will be between the Common View and the legacy Application Views.
    We are using a number of techniques to assist in "validating" data in the InterConnect. The main ones are using 'Cross Reference Tables (XREF)' and 'DatabaseOperation' transformations. By using 'Content Based Routing' we are able to send the right message to the right legacy system, and therefore do the right transformation/validation on the message payload. However, this is only a small part of a complex puzzle.
    I also have the "problem" of having "very complex SQL" on our Oracle system too. This is not unusual when using the InterConnect.
    To my mind, the InterConnect does 2 main operations. Firstly, it performs some message transformation (mapping), and secondly, it acts as a transportation engine (routing) using the adapters.
    The remainder of the effort required to create or consume the message resides with the Applications themselves. Whether it is parsing an XML CLOB payload, inserting data into staging tables, writing to log files, pre-processing data, calling API's or something else, your Application side programming and processing overhead can get large.
The trade-off is to ask the question: do I want to be able to track and manage messages from start to finish in high detail? Or can I trust that all message payload data will be consumed with no additional processing on the Application side?
    My experience has shown that the bottleneck is always at the Application side, and almost never in the InterConnect.
    The short answer to your first question is "You are right. Mappings can take place only between Application Views and Common Views only - not between Business Objects.".
    To answer your second question "Probably everyone reading this forum has this problem. The intelligence that is able to really interpret message data, validate it and process it is only found in the Application, not the InterConnect. You could, however, use the Workflow engine within OAI in order to provide additional pre-validation, human interaction and logic, but this too could be complex."
    At my current client, we are architecting an Application OAI Message handling schema. This will contain staging tables, pre-processing tables, "OAI" wrapper PL/SQL scripts, "APPS" wrapper PL/SQL scripts and Message Logging and Exception tables. Ours will be a complex set of PL/SQL processes too.
    I hope this helps, just in letting you know that you are not alone with this problem.
    I wonder if anyone else would like to share how they have architected their InterConnect and Application side mapping and transformation solutions.

  • Trouble when attempting to Sum Calculated Field.

I had to create a calculated field called RI_Limit which contains static data. (The developers/DBA could not enter it into the database at this time, so this was a workaround.)
    The calculated field is setup as such:
    =iif(Fields!Location_LOCATION_NAME.Value = "a", 8, iif(Fields!Location_LOCATION_NAME.Value = "b",2, iif(Fields!Location_LOCATION_NAME.Value = "c",0, iif(Fields!Location_LOCATION_NAME.Value = "d",0, iif(Fields!Location_LOCATION_NAME.Value = "e",1, iif(Fields!Location_LOCATION_NAME.Value = "f",1, iif(Fields!Location_LOCATION_NAME.Value = "g ",0, iif(Fields!Location_LOCATION_NAME.Value = "h",0,iif(Fields!Location_LOCATION_NAME.Value = "i",4,iif(Fields!Location_LOCATION_NAME.Value = "j A",0,iif(Fields!Location_LOCATION_NAME.Value = "k",0,iif(Fields!Location_LOCATION_NAME.Value = "l",7,iif(Fields!Location_LOCATION_NAME.Value = "m",0,iif(Fields!Location_LOCATION_NAME.Value = "n", 1, iif(Fields!Location_LOCATION_NAME.Value = "o",0, iif(Fields!Location_LOCATION_NAME.Value = "p",1, iif(Fields!Location_LOCATION_NAME.Value = "q",3, iif(Fields!Location_LOCATION_NAME.Value = "r",1, iif(Fields!Location_LOCATION_NAME.Value = "s",1, iif(Fields!Location_LOCATION_NAME.Value = "t",3, iif(Fields!Location_LOCATION_NAME.Value = "u",5,iif(Fields!Location_LOCATION_NAME.Value = "v",0,iif(Fields!Location_LOCATION_NAME.Value = "w",0,"NA")))))))))))))))))))))))
and I set up a textbox with that expression, and the numbers fill in correctly with their corresponding site names.
However, when I go to subtotal or total, I get a much larger number than expected. (See screenshot.)
I have searched the internet for a way to calculate both the total by Location Level 2 and the LOB total, but found nothing that pertains to the specific issue where my calculated field contains static data. Nothing I try produces an accurate sum. Any suggestions?

    Hi Katherine,
    Sorry for taking a few days to reply.
I tried the suggestion, but it is still not summing my calculated fields. The total row will calculate my other cells okay, but not the calculated fields that I had to add to the report. When I attempted to write an expression to sum the calculated fields, I got the following error message.
    The expression used for the calculated field '=sum(iif(Fields!Location_LOCATION_NAME.Value = "test", 8, iif(Fields!Location_LOCATION_NAME.Value = "test1",2, iif(Fields!Location_LOCATION_NAME.Value = "test2",0, iif(Fields!Location_LOCATION_NAME.Value = "test3",0, iif(Fields!Location_LOCATION_NAME.Value = "test4",1, iif(Fields!Location_LOCATION_NAME.Value = "test5",1, iif(Fields!Location_LOCATION_NAME.Value = "test6 ",0, Fields!Location_LOCATION_NAME.Value = "test7",0,Fields!Location_LOCATION_NAME.Value = "test8",4,Fields!Location_LOCATION_NAME.Value = "test9",0,Fields!Location_LOCATION_NAME.Value = "test10",0,Fields!Location_LOCATION_NAME.Value = "test11",7,Fields!Location_LOCATION_NAME.Value = "test12",0,Fields!))' includes an aggregate, RowNumber, RunningValue, Previous or lookup function. Aggregate, RowNumber, RunningValue, Previous and lookup functions cannot be used in calculated field expressions.
The expression for the calculated field I am attempting to sum is the following:

  • Calculated Field formula not appearing in Power View Field List panel

    Hi,
On the Power View1 sheet of this workbook, the two PowerPivot calculated field formulas (First Visit and First date of FY) are not appearing in the Power View Fields panel on the right-hand side.
    I want to drag client and First Visit (calculated Field) to the Power View.
    Why is this happening?
    Regards, Ashish Mathur Microsoft Excel MVP www.ashishmathur.com

    I found a workaround for it - I was having the same problem as you. I wanted to display the last refresh date of my data, but Power View wouldn't display a date resulting from the [Last Refresh]=MAX(Date[Date]) formula.
HOWEVER, a Pivot Table can handle that calculated field just fine. So I renamed my measure "Refreshed Date" and added another tab in the Excel sheet. I added a Pivot Table from PowerPivot into cell A1, and the only thing I put there was that single measure (A2). To the right of that, I did =A2 and created an Excel Data Table. The title of that was "Last Refresh". Now I could add that Data Table to the PowerPivot Model and into Power View. And then I hid that tab.
The only downside to this method is that if you're refreshing single tables at a time, you have to remember to refresh the Pivot Table when new data is added AND update the data in the model. I'm only connecting to one database, so I just choose "Refresh All" in Excel and it updates everything. It works perfectly! :)
    Well shoot, I even took a screenshot that helped explain everything, but it won't let me post it because my account isn't verified and it won't tell me how. If you have any questions, please let me know and I'll clarify!

  • Calculated Field in BAM

    Hi
I am facing a problem in making a calculated field.
There are fields named UpdateTime and UpdateError in one of the data objects.
I have to make these fields calculated fields and apply the following formulas:
UpdateTime = UpdateTime + timeConsumed
UpdateError = UpdateError + 1
The field timeConsumed is coming from another data object.
While applying the formulas, BAM gives the error: BAM-01283: The calculated column _UpdateError contains a circular reference.
BAM-01283: The calculated column _UpdateTime contains a circular reference.
Please help me eradicate this error!
    Thanks

Answers to the following questions will help me narrow down the problem:
    1. Which version of BAM are you on - 10g or 11g?
    2. What condition are you using in filter i.e. "is equal to", "is less than equal to", "is like"...etc?
    3. In which format is the timestamp data being given in the filter? (by format, I mean mm-dd-yy HH:MM:SS etc)
    4. Can you give the exact datatype of the DB in which this timestamp data is stored?
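The circular reference itself is the core issue: a calculated field cannot be defined in terms of its own value, so the total has to be derived from separate base fields. Loosely analogous in plain Oracle SQL (a sketch with made-up names, not BAM calculated-field syntax):
create table update_stats (
  base_update_time  number(10),   -- stored value
  time_consumed     number(10),   -- stored value fed from the other source
  total_update_time generated always as (base_update_time + time_consumed) virtual
      -- a derived column may reference other columns, never itself
);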
