Price Band on a pre-calculated average : Don't go down the rabbit hole.

One of the issues with DAX is that you get really ambiguous error messages that give you little clue as to what the problem actually is. You can go off in one direction only to realize the problem lies in the other direction.
Using DAX Studio (http://daxstudio.codeplex.com) you can alleviate some of this pain.
I needed to assign a sales band to a fact table, but rather than basing it on a row value, it needed to be based on a simple expression, i.e. average sales per customer:
CALCULATE ( AVERAGE ( [Sales] ), FILTER ( Table1, [Customer] = EARLIER ( [Customer] ) ) )
With an unrelated Bands table, this expression can be dropped right into the basic banding pattern from Marco & Alberto's DAX Patterns site:
=
CALCULATE (
    VALUES ( Bands[Label] ),
    FILTER (
        Bands,
        CALCULATE (
            AVERAGE ( [Sales] ),
            FILTER ( Table1, [Customer] = EARLIER ( [Customer] ) )
        ) >= Bands[Min]
            && CALCULATE (
                AVERAGE ( [Sales] ),
                FILTER ( Table1, [Customer] = EARLIER ( [Customer] ) )
            ) < Bands[Max]
    )
)
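For illustration, the unrelated Bands table only needs the three columns referenced above (Label, Min, Max); the rows below are made-up values, not from the original post:

Label      Min       Max
Low        0         1000
Medium     1000      5000
High       5000      9999999999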
Lesson: sometimes problems are easier to solve than they appear!
Lee Hawthorn ACMA, CGMA View my Performance Management blog at leehbi.com

Srini Chavali wrote:
Ed - could it be that these Oracle Homes were cloned (using OUI) from the same source ? That would explain the directory names (which is the structure on the source) and the matching time stamps across multiple homes.
HTH
Srini
It was confirmed that the migration was done some time AFTER those opatch entries - nearly a year later. I'm still trying to get information on how the migration from hpux to linux was accomplished (my predecessor is still employed here, but in another dept, and he keeps odd hours and is pretty vague when I question him about anything). The various instance alert logs go back to pre-migration, so it looks like a lot was preserved/copied over.
In any event, I think I can ignore those patches and just focus on what was done post-migration.
Thanks for the clue.

Similar Messages

  • The item Hierarchical Context Menu does not support pre-calculation

    Hello,
    When trying to download pre-calculated web templates via RepAgent, using the Hierarchical Context Menu feature within the template (populated via a control query with a hierarchy), the system returned the following message upon file download:
    "The item Hierarchical Context Menu does not support pre-calculation"
    As a result, the Hierarchical Context Menu is not displayed in the downloaded HTML files, so navigation is not possible.
    I could not find any such constraint documented in the online help so far.
    Question: is something missing in our setup, or is this really out of scope for pre-calculation with navigation?
    Has anyone else experienced this, or should we open an OSS message?
    Best regards,
    Markus
    (We are on BW3.1)

    Hi,
    We have the same problem. Does anybody have a hint or a workaround? Is there a solution for this in 3.5 or any upcoming version?
    Any input is appreciated.
    Thanks,
    Stefan

  • RE: Calculated Average Valuated Stock Value

    Hi ALL,
    How do I build the Average Valuated Stock Value logic (the sum of the daily stock value for the time frame of the analysis divided by the number of days)? Please guide me.
    regards,
    ravi

    Please guide me on how to create a formula for this logic
    (the sum of the daily stock value for the time frame of the analysis divided by the number of days).
    Regards,
    ravi
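    To illustrate the stated logic with made-up numbers: if the analysis time frame covers 4 days with daily stock values of 100, 120, 110 and 130, the average valuated stock value would be (100 + 120 + 110 + 130) / 4 = 115.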

  • Send email with pre-calculated Excel

    Hello everybody.
    I'm trying to send an e-mail with a pre-calculated Excel worksheet generated by a Reporting Agent job.
    However, I don't know how to attach the last generated XLS file using ABAP.
    We have BW version 3.1, so I can't use Information Broadcasting (it would be the perfect solution).
    I've tried to send the pre-calculated results as HTML (following a very good document found in this forum), but the results didn't satisfy the users. I've already read the document about sending a workbook as an attachment, but my problem is that I don't know where to get the Excel file generated by the Reporting Agent.
    Can someone help me?
    Thanks in advance.
    Gabriele
    P.S.: This is my first post. I hope my next posts will be for helping someone, not only asking for help...

    Check this link:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/7f601a65-0a01-0010-16a5-c369f2a9fa87
    Save it as a PDF file before opening it.
    Hope it helps.
    Regards

  • Using pre-calculated web template

    I have a query that takes 20 seconds to load on our web portal.  I've done everything imaginable to optimize performance.  Finally I'm experimenting with pre-calculating the query.  I've created the reporting agent setting, set the calculate parameter to 'HTML for Web Browser', added this to a schedule and successfully ran the schedule.
    Now when I run the corresponding web template on the web it still seems to take the same 20 seconds.  I would think all it has to load now is a static HTML page so it should take 1 or 2 seconds.  How do I know if it's really loading the pre-calculated version or the normal one?  I'm not sure anything is really happening.  Maybe I missed something?  Anybody have experience with this?
    Thanks,
    -Patrick

    I think I figured it out. I had to add the parameter UPDATE_MODE=STORED to my URL.
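    For reference, a BW 3.x web template URL with this parameter would look roughly like the one below; the server, port and template name are placeholders, not values from the original post:
    http://<server>:<port>/sap/bw/BEx?CMD=LDOC&TEMPLATE_ID=ZMY_TEMPLATE&UPDATE_MODE=STORED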

  • Validate signature with pre-calculated digest

    Hello to all!
    I'm trying to build a system that signs and validates signatures on large files.
    Getting a signature from a pre-calculated digest was easy, but for validating the same signature I can't figure out how to supply the pre-calculated digest.
    I can't use the normal validation because the Java API uses the "JavaUtils.getBytesFromStream" method and tries to read the whole file into memory.
    Can someone point me to a solution (or explain why there isn't one, if that is the case)?

    Hello!
    smullan wrote:
    I assume you have tried returning an OctetStreamData wrapped around an InputStream?
    Yes. Here is the code of the URIDereferencer I'm using:
    import java.io.*;
    import javax.xml.crypto.*;

    public class ConstrulinkURIDereferencer implements URIDereferencer {
         private String filesPath;
         private URIDereferencer defaultDereferencer = null;

         public ConstrulinkURIDereferencer(String filesPath) {
              this.filesPath = filesPath;
         }

         public void setURIDereferencer(URIDereferencer defaultDereferencer) {
              this.defaultDereferencer = defaultDereferencer;
         }

         @Override
         public Data dereference(URIReference uriReference, XMLCryptoContext context)
                   throws URIReferenceException {
              try {
                   String filename = uriReference.getURI();
                   File f = new File(filesPath + filename);
                   if (!f.exists() && defaultDereferencer != null) {
                        // File not found locally: fall back to the default dereferencer
                        return defaultDereferencer.dereference(uriReference, context);
                   } else {
                        // Stream the file rather than reading it fully into memory
                        return new OctetStreamData(new FileInputStream(filesPath + filename));
                   }
              } catch (FileNotFoundException e) {
                   throw new URIReferenceException(e);
              }
         }
    }
    I try to get a file; if that fails, I pass the dereference request on to the default URIDereferencer (the one that was there before).
    From looking at the code on Koders, the problem seems to be that when the method JavaUtils.getBytesFromStream is called (you can see the stack trace above) it reads all the bytes of the stream. I didn't see whether these bytes are used for anything other than the digest.
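    For what it's worth, here is a minimal sketch of how a custom dereferencer like this is normally registered for validation with the standard javax.xml.crypto.dsig API; the wrapper class, parameter names and file path are illustrative assumptions, not taken from the original post:
    import javax.xml.crypto.KeySelector;
    import javax.xml.crypto.dsig.XMLSignature;
    import javax.xml.crypto.dsig.XMLSignatureFactory;
    import javax.xml.crypto.dsig.dom.DOMValidateContext;
    import org.w3c.dom.Element;

    public class ValidationSketch {
         // Validates the given ds:Signature element, streaming referenced files from filesPath
         public static boolean validate(Element signatureElement, KeySelector keySelector,
                   String filesPath) throws Exception {
              DOMValidateContext valContext = new DOMValidateContext(keySelector, signatureElement);
              // Register the streaming dereferencer so referenced files are not read fully into memory
              valContext.setURIDereferencer(new ConstrulinkURIDereferencer(filesPath));
              XMLSignatureFactory fac = XMLSignatureFactory.getInstance("DOM");
              XMLSignature signature = fac.unmarshalXMLSignature(valContext);
              return signature.validate(valContext);
         }
    }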
    It sounds like it could be a bug. I will try coming up with a sample and debug it further. If you are able to, please also post your code here or send it to [email protected].
    I'll try on Monday to send you executable code.
    Thanks

  • Broadcasting and pre-calculated Excel workbook

    Hello,
    Is it possible to pre-calculate an Excel workbook using the Broadcast functionality of BW 3.5 and save the calculated workbook on the BW server itself? We don't want to send the workbook via e-mail or publish it on the portal; we just want the users to open the pre-calculated workbook from the BW server itself.
    Many thanks.
    François.

    hi,
    Using the BEx Broadcaster, you can precalculate and distribute Web templates, queries and workbooks. You can distribute these reporting objects either in precalculated form or as an online link. Your distribution options include sending by e-mail or exporting into the Enterprise Portal.
    http://help.sap.com/saphelp_nw04/helpdata/en/bf/220c40ac368f5ce10000000a155106/frameset.htm
    We just want the users to open the pre-calculated workbook from the BW server itself.
    As far as I know it's not possible (not sure); I haven't done that. Let's see a reply from other BW experts.
    If it helps, assign points.
    thanks,
    senthil kumar

  • Information Broadcaster Pre-Calculation Server

    We just upgraded to version 3.5 and I need to broadcast workbooks to several users. I set up a test pre-calculation server on my PC, but I am not sure exactly what the function of this server is. Are the workbooks actually stored on this "server" or on the BW server? Does anyone know what functions the pre-calculation server performs? Also, is there any suggestion on where folks locate this in their production landscape? It seems to only run in a Windows environment. Any insight would be greatly appreciated.
    Thanks,
    Steve

    Steve,
    To calculate an Excel workbook, BEx Analyzer is launched and the reports are run. This is done on the pre-calculation server. The pre-calculation service interacts with the desktop, launches Analyzer, runs the reports and saves the Excel workbook on the server, which is then broadcast.
    For a production environment, I recommend running this on either a dedicated Windows-based PC or a shared server. It should be up at all times, otherwise the broadcast will fail.
    Aneesh

  • Refresh Workbook on Pre-Calculation Server .

    Hi,
    I am using the pre-calculation server to broadcast a workbook through e-mail, but the workbook is not getting automatically refreshed on the pre-calculation server.
    I have also enabled the 'Refresh Workbook On Open' option in the workbook settings.
    Please help me to solve this issue.
    Thanks in Advance.
    Vaibhav

    Hi,
    Please check the following notes:
    1027807 Workbook not refreshed properly, during precalculation
    1074272 Error in the precalculation server (read long text first)
    Regards,
    Amit

  • [Inf. Broadcasting]: How can I make a pre-calculated report?

    Dear experts,
    Could somebody help me with how to produce pre-calculated reports using Information Broadcasting, now that our BW version has been upgraded to 7.0?
    By a pre-calculated report I mean the following (I'll explain it through an example to make it clear):
    1. At night (or every night) BW calculates/runs the report (it is scheduled).
    2. The next morning (or every morning), when users access that report, they don't have to wait for BW to run/calculate it, since BW already did so during the night.
    Oh, by the way, I don't have the latest Enterprise Portal yet, so I still cannot use Information Broadcasting to burst the data to the portal.
    Can anybody tell me how to do this?
    Or is there another way to achieve it using BW 7.0?
    Many thanks for all your attentions.
    Best regards,,,
    Niel.

    As I understand it, many barcodes don't allow you to script the presence to protected. Changing it in the XML source code is the only way I know of to reliably change it. If you click on the barcode in Design view and then click on your XML Source tab you'll see the first line of code for that object. It will look something like:
    <field name="Code2Of5MatrixBarCode1" y="79.375mm" x="76.2mm" w="29mm" h="10.16mm" access="protected">
    If you don't have the XML Source tab available to you in Designer, you can open it by clicking the View menu and selecting XML Source from the menu. Be aware that barcode behavior is not the same as other fields, even if you change it in the XML Source to protected the user may still be able to select it with a mouse click.

  • Microsoft's definition of 「All Instances」 for calculating the average percentage of CPU usage is a mystery to me!

    Hi, I was reading the 「How to Auto Scale」 document
    http://azure.microsoft.com/en-us/documentation/articles/cloud-services-how-to-scale/
    and got stuck on the paragraph below, the part that describes how to calculate the average percentage of CPU usage.
    ==========
    All instances are included when calculating the average percentage of CPU usage and the average is based on use over the previous hour. Depending on the number of instances that your application is using, it can take longer than the specified wait time for the scale action to occur if the wait time is set very low. The minimum time between scaling actions is five minutes. Scaling actions cannot occur if any of the instances are in a transitioning state.
    ==========
    Does anyone know Microsoft's definition of 「All Instances」? Do virtual machines that are shut down but have been added to the auto-scaling group count as instances?
    The issue is that the data shown in the management portal lags well behind real time, and I was wondering whether the calculation described above is really what users want.
    I'm all ears for any information. Thanks in advance!

    Hello Tomo Shimazu,
    An instance is a presence of a service, e.g. a VM is an instance; a copy of that VM is another instance, which makes two instances of the VM.
    Although Microsoft's pricing model is readily available and even offers a simple calculator (http://www.windowsazure.com/en-us/pricing/calculator/advanced/) to assist in the process, it may be difficult to know how much of each resource you will need in order to get an estimate.
    To understand how to calculate the compute instances and the VM size, let's first take a look at the VM role sizes offered by Microsoft. The prices per month for each, at the time of writing, are $30, $90, $180, $360 and $720 from ExtraSmall to ExtraLarge respectively. So, with the exception of the transition from ExtraSmall to Small, going to the next size of VM is exactly twice the cost, which is mirrored in the increase of resources you get at each level. With each step, CPU cores, memory, disk space and bandwidth are all doubled from the previous size.
    If this is the case, then, is there any advantage to any one of these over the others? The answer is yes. In the majority of cases the best bet will be to go with the Small instance, because all of these resources scale equally with cost, so it is possible to achieve the exact equivalent of a larger VM simply by increasing the number of instances. For example, 8 Small instances are equivalent to one ExtraLarge instance, with the advantage that, when they are not needed, these Small instances can be turned off and will cost nothing.
    By hosting the application in this manner, you increase the effectiveness of the number one reason a business would transition to the cloud anyway: scalability. Obviously 16 smaller instances can be more finely adjusted to the application's usage than 2 ExtraLarge can. If traffic gets higher or lower than expected, two Small instances can be added at $180/mo for the time that they are running, versus adding another ExtraLarge for $720/mo.
    The only exception to using the smallest instance is the ExtraSmall, which offers only 1/20 of the bandwidth of the Small, making it feasible only for very lightly accessed applications or, more likely, a QA environment.
    From this point, it becomes much easier to estimate the compute instance requirements of migrating the application to Windows Azure. Take the current server(s)' total resources (CPU, RAM, etc.) and find how many Small instances it would take to recreate them. This gives a good starting point; however, remember that these instances can be turned on and off to meet demand, and beyond the first instance there is no charge when the instances are off. This can lead to significant cost savings and is the primary incentive for migrating to the cloud.
    For Azure pricing information, refer to the pricing details.
    If you are still unclear about pricing and billing, you may raise a service request with the billing team here.
    Hope this helps.
    Regards,
    Shirisha Paderu

  • How to delete the pre calculation queues

    Hi All,
    There are so many pre-calculation queues in RSPRECADMIN -> Display Current Queue, but there are no corresponding jobs running in SM37. Please suggest how to clear these queues all at once.
    Also, in the OS of the corresponding pre-calculation server, I can see a lot of Excel workbook processes running in Task Manager.
    Please also suggest how to end these Excel processes.
    Thanks and Regards,
    Subashree

    In transaction RSPRECADMIN you will see the button Display Current Queue.
    Then you will see the following:
    > Queue Overview of Open Precalculations
    > Queue Overview of Current Precalculations
    > Queue Overview of Processed Error-Free Precalculations
    To clear all queues, press F7.
    B.R.
    Edwarde John

  • I have GarageBand on my iPad and from what I can tell I have done everything right with the latest versions and iCloud turned on for the app and phone. I have a cloud in the upper right corner of the song but when I go to my iCloud Drive on a brows

    I have GarageBand on my iPad and, from what I can tell, I have done everything right with the latest updates and iCloud enabled on the iPad and in the app. A cloud icon is shown in the upper right corner of the song, but when I look at my iCloud Drive in an internet browser on my PC it is not there. Any reason why?

    Bring your phone into Apple for evaluation

  • How to consider only values > 0 when calculating averages

    Hi,
    Currently, I am using the Avg(table.column) function to calculate averages. However, one of our users wants to consider only records that are > 0 when calculating averages. For example, if a result set has 10 records and only 2 of them have values > 0, the Avg function would typically calculate (sum of 10 records / 10). Per our user's requirement, he wants (sum of the 2 records / 2), ignoring the remaining 8.
    Can you please tell me how to achieve this?

    I found the solution myself after much deliberation :-). I used the following formula instead of the AVG function and it worked.
    Note that "Fact - +fact name+".column is a numeric column and can have null values.
    SUM(IFNULL("Fact - +fact name+".column, 0)) / COUNT("Fact - +fact name+".column)
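    For what it's worth, this works because COUNT(column) counts only the non-NULL rows while IFNULL maps NULLs to 0 in the numerator, so rows without a value drop out of the average; it assumes the records to be excluded are stored as NULL rather than as 0.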

  • Key figure with pre-calculated aggregation

    Hello,
    I have to model a key figure with a pre-calculated aggregation, that is, the source flat files contain explicit values for months, quarters, half-years and years. The key figure is a percentage (service quality) and the corresponding company department provides me with results for all aggregation levels.
    I think this case could be modeled with one InfoCube for each aggregation level and a MultiCube, but I would have to create a high number of InfoCubes (approx. 180).
    I can get the formula that the department uses, but is it possible in BW to change the standard aggregation (SUM, MAX, MIN)?
    Thanks in advance and regards,
    Alberto Garcia.

    Hi Alberto,
    One of the ways of doing it could be:
    Add another characteristic to your cube called "Value Type" and assign different values to it depending upon the time level, say:
    Monthly - value type 1
    Quarter - value type 2
    and so on, and keep filling the same key figure.
    At reporting time, restrict your key figure accordingly with the value type (e.g. if you need only the monthly key figure value, restrict the value type to "1").
    Hope it helps.
    VC
