Aggregation and multiplication in Analysis

I have had a question going through my head for a few days. I have a very simple database with two tables:
- Products
- Sales
Products is the dimension table and Sales is the fact table. This creates the following model:
Sales >- Products
In the Sales table, I have two columns:
- Number_of
- Amount
Both columns are aggregated with Sum() in the Business Layer. When I create a simple analysis, I get the correct results. This produces approximately the following query:
SELECT SUM(AL1.Number_of), AL2.ProdId, AL2.ProdDesc FROM Products AL2, Sales AL1 WHERE AL2.ProdId = AL1.ProdId GROUP BY AL2.ProdId, AL2.ProdDesc
Now I want the column Number_of of Sales multiplied by the column Weight of Products (Number_of * Weight). I would rather do this in the analysis than in the Business Layer, where I already have a solution to this question :-)
The multiplication is no problem as long as I include the Product column in the analysis. But when I remove this column, I get totally incorrect results back. I would like the following query to be generated (instead of two queries that are joined together in the Business Layer with incorrect results):
SELECT SUM (AL1.Number_of * AL2.Weight) FROM Products AL2, Sales AL1 WHERE AL2.ProdId = AL1.ProdId
Isn't it possible to create this query from an analysis, or am I making a mistake in my model?
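The two query shapes can be sketched outside the BI tool with an in-memory SQLite database (the table contents here are invented purely for illustration): multiplying per row before aggregating gives one total, while multiplying the already-aggregated sums gives quite another.

```python
import sqlite3

# Tiny stand-in for the model above: Products (dimension) and Sales (fact).
# The rows are invented for illustration only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE Products (ProdId INTEGER PRIMARY KEY, ProdDesc TEXT, Weight REAL);
    CREATE TABLE Sales    (ProdId INTEGER, Number_of INTEGER, Amount REAL);
    INSERT INTO Products VALUES (1, 'Widget', 2.0), (2, 'Gadget', 5.0);
    INSERT INTO Sales    VALUES (1, 10, 100.0), (1, 5, 50.0), (2, 3, 30.0);
""")

# Desired query: multiply row by row, then aggregate.
row_level = con.execute(
    "SELECT SUM(s.Number_of * p.Weight) "
    "FROM Sales s JOIN Products p ON p.ProdId = s.ProdId"
).fetchone()[0]

# What happens when the product grain is lost: the already-summed
# measures get multiplied, which is a different number.
aggregated_first = con.execute(
    "SELECT SUM(s.Number_of) * SUM(p.Weight) "
    "FROM Sales s JOIN Products p ON p.ProdId = s.ProdId"
).fetchone()[0]
```

With these invented rows the per-row version returns 45.0, while aggregating first returns 162.0 — the kind of incorrect result described above.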
Thanks for your help in this!
Edited by: ddekock on May 20, 2012 6:07 PM

Is there nobody with a hint or tip about this?

Similar Messages

  • How to create a matrix with constant values and multiply it with the output of adc


    nitinkajay wrote:
    How to create a matrix with constant values and multiply it with the output of adc 
    Place array constant on diagram, drag a double to it, r-click "add dimension". There, a constant 2D double array, a matrix.
    /Y
    LabVIEW 8.2 - 2014
    "Only dead fish swim downstream" - "My life for Kudos!" - "Dumb people repeat old mistakes - smart ones create new ones."
    G# - Free award winning reference based OOP for LV
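    The LabVIEW steps above build the constant array graphically; as a rough text-based analogue (gain values and sample block invented for illustration), the same element-wise multiply looks like this in Python:

```python
# A constant 2D array ("matrix") multiplied element-wise with a block of
# ADC samples. The gain values and samples are invented examples.
GAIN = [
    [2.0, 2.0, 2.0],
    [0.5, 0.5, 0.5],
]

def scale(adc_block):
    """Multiply each ADC sample by the matching constant, element-wise."""
    return [
        [g * x for g, x in zip(gain_row, adc_row)]
        for gain_row, adc_row in zip(GAIN, adc_block)
    ]

scaled = scale([[1.0, 2.0, 3.0], [4.0, 6.0, 8.0]])
```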

  • Batch tracking and material usage analysis

    How do we track batches and perform material usage analysis in case of a customer complaint?

    You can use the reports BMBC, MB56, MCRE, MCRX.
    If you want to track batch-wise, use the batch traceability report MB56.
    Hope this helps you.

  • Critical Action and Role/Profile Analysis

    Hi,
    I want to know the purpose of the Batch Risk Analysis back ground job "Critical Action and Role/Profile Analysis" in RAR 5.3.
    I'm assuming that I need not run this job if I do not want the critical roles/profiles like SAP_ALL, which were defined as critical in Rule Architect, to be analysed.
    Please let me know if there is any other purpose to run the BG job "Critical Action and Role/Profile Analysis".
    Thank you,
    Partha

    Hello Partha,
      You got this right. It will analyze the defined critical actions/roles/profiles.
    Regards, Varun

  • Correlation and Principal Components Analysis software for Mac

    Greetings-
    I am looking for Mac software that will compute correlations between multiple variables and perform principal components analysis on the resulting intercorrelation matrix. These would be small data sets: 10 or 15 variables, 100 subjects. I have Excel, which could be used for data entry. I Googled it, became glassy-eyed, and may have missed the obvious. This is for personal use, so I cannot afford large programs like SPSS, which is a great program. If there is nothing runnable on a Mac, are there packages on the net that one can use?
    Thank you in advance, I appreciate your help.
    Best,
    Rich
    Huntington Beach, CA

    Hi Limnos-
    Thanks for your response. Your links will be excellent for somebody! The first link concerns Principal Coordinates Analysis for Mac, which is related to multidimensional scaling and potentially very valuable for someone looking for such a program. The second and third links provide lists of esoteric statistical software packages, some of which are for Mac, which would be highly valuable for scientific and academic users. Sadly, they didn't provide what I was looking for. Still, all three links were excellent; thanks again.
    Again, I seek a basic correlation package and a factor analysis or principal components package that is free or affordable. If it ran with Excel, that would be a plus.
    Thank you in advance,
    Best,
    Rich

  • Customer and supplier aging analysis

    Hi
    How can we do customer and supplier aging analysis?
    e.g. for more than 3 months and more than 6 months.
    What are the various settings we have to do?
    Please guide me, and let me know if there is any config document.
    Points will be awarded.
    regards

    Hi:
    Check this report for vendor aging analysis
    S_ALR_87012085
    For Customer aging analysis
    S_ALR_87012168
    Additionally, go to SE38.
    Input program RFDOPR00.
    If this does not suit your requirement, develop a customized report with the help of an ABAPer.
    Please let me know if you need more information.
    Assign points if useful.
    Regards
    Sridhar M

  • Transfer posting and Certificate of Analysis (BWUL)

    Moderator note:  I broke this off from discussion Transfer posting and Certificate of Analysis (BWUL)
    Please refer to the discussion for any background.
    This has been an interesting thread, and it's circling around the issue I am having, but I can't tell if this thread answers my question or not.  I'll ask here so as not to proliferate duplicate threads:
    I am in an environment where the business produces and tests finished goods in satellite plants and then moves them to a central DC via STO.  When I try to create COAs from the central DC, the COA program cannot find the results or specifications.
    I had the idea to change the profile to look at the production chain (thinking the batch in the production plant was part of the production chain of the identical batch in the DC).  When I did this, the program could not find the correct specifications.  I have been messing with the configuration around the results and specification origins with no success.
    I feel like this should be possible, but I can't figure out what I am missing.  Is SAP able to create a COA from a central DC for a material/batch that was produced in a satellite plant and sent via STO?
    Message was edited by: Craig S

    1) In BWUL, make sure it is set in the configuration setup to include stock transports.
    2) You don't indicate where you maintain your specs. In operations like this you should try to report specs and results from the batch, but I'm guessing you might be keeping the specs in the inspection plans only.
    3) In the COA profile, you can create your own custom FM for "results origin" and for "results specs". You may need to create them if you keep your specs in the plans and not in the batch records.
    Craig

  • Critical Action and Role/Profile Analysis job in not running in GRC 5.3

    Hi Team,
    I am working for a client where GRC 5.3 is installed (support pack 4 and patch 1).
    The installation is complete and the post-processing is done.
    We have scheduled a periodic (weekly) incremental background job for Critical Action and Role/Profile Analysis.
    The following parameter settings are used:
    Task: Risk Analysis - Batch
    Batch Mode: Incremental
    The first time, it ran successfully on 28th June '09 and completed with a spool. The next time it was supposed to run was on 4th July '09, but it did not, and it has been in the same state since then.
    I am not able to find any reason why it is behaving this way while other incremental jobs are running successfully.
    It would be helpful if anyone could guide me to a solution.
    Regards,
    Kakali

    Hi Varun,
    I went to the Job History button. It shows the following data only:
    2009-06-28 00:00:59 Done Job Completed successfully
    2009-06-27 23:45:00 Started RAR_PE1CLNT100_Critical Action and Role/Profile Analysis started :threadid: 0
    Under the Last Run column it shows 28th June (Status: Completed).
    Under Next Run Date it shows 4th July.
    The following is the list of updates available from SP05:
    - When executing the critical roles/profile jobs in background, a message "error while executing the Job: null" comes up. (This is the one that comes under the Informer tab.)
    - Background job spools are not available after upgrade from 5.2 to 5.3.
    - Critical action and critical role/profile analysis cannot be run in background by the system. (But in my case it ran once.)
    - Selection parameters (System, User and User Group) have been provided for "Critical Action and Role/Profile Analysis" in Configuration -> Background Job -> Schedule Job. (This means it usually runs.)
    - Critical Actions report in detail view shows no results after executing the Risk Analysis job in the background. The same report shows data when executed in the foreground. (This one comes under the Informer tab.)
    - When there is only one periodic job configured in RAR, this job fails to start after the first time at the specified time. (This is not true in my case, because other periodic jobs are running successfully.)
    - Unable to run Informer - audit reports - critical role and profiles with logical systems. (This is again under the Informer tab.)
    I had gone through this list earlier as well, but was not able to match any update with my problem. If you have any other suggestion, please provide it.
    Is there any way to check the job log so that I can see what the problem is? The View Log option is also greyed out, as we have SAP Logger set up as the default logger parameter. I enabled it just to check, but there is nothing.
    Please Guide.
    Regards,
    Kakali

  • How can I use two different PC's with Lr5.5 to manage and process my photos stored on external disk, and exchange the analysis between these two computers?

    I operate on two computers and would like to be able to download Lr jobs done on my laptop to my stationary PC, which is also used to process other images, all stored on the same external disk.
    In other words, I want to process my photos on two PCs and have access to all analyses from one, or even better from both, computers independently.
    Samek.

    You need to put your photos AND LR catalog file (.lrcat) on the external drive, and make sure the external drive has the same device name or drive letter on both computers.
    I have been using LR this way for several years and it works fine as long as you MAKE A BACKUP of the external disk somewhere, in case you lose or drop/damage the external drive which is easy to do.  Nightly, or sometimes more than once a day, I copy all changes from my working external disk to another similar-sized external disk.  I have lost a disk, once, and dropped/destroyed a disk, once, but recovery is as easy as using what used to be the backup disk as the primary one, and buying a new disk for the backup device.  It does take some hours to copy the 1TB of photos to the new backup disk, but once that is done, it usually only takes a few 10s of minutes or less to copy what is new or changed each night.
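    The nightly mirror step described above could be scripted in many ways; as one illustrative sketch (the function name and any paths you pass to it are assumptions, not a recommendation of a specific tool):

```python
import shutil

def mirror(src: str, dst: str) -> None:
    """Copy everything under src into dst, overwriting files already there.

    Note: shutil.copytree recopies every file rather than only the changed
    ones, so a dedicated sync tool is faster for a 1TB library, but the
    end result is the same full mirror of the working disk.
    """
    shutil.copytree(src, dst, dirs_exist_ok=True)
```

    Usage would look like mirror("/Volumes/Photos", "/Volumes/PhotosBackup") with your own (here hypothetical) drive names.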

  • Creating OLAP report with Crystal Reports and SQL Server Analysis Services 2005

    Post Author: orahc_mao
    CA Forum: Data Connectivity and SQL
    Hi!
    I am currently using the trial version of the Crystal Reports XI and I wanted to do an OLAP report. The problem is I cannot select a cube using "OLAP Connection Browser" (the popup window). I already selected Microsoft OLE DB Provider and entered the server name but still can't connect.
    I don't think the problem is with SQL Server Analysis Services (2005) since Microsoft Excel - Import Data can detect the server as well as the cube I have created.
    I also tried the "Database Expert" of Crystal Reports, created an OLE DB (ADO) connection with the "OLE DB Provider for OLAP Services 8.0" driver, entered the server name, checked integrated security, and then I could readily select the database created in SQL Server Analysis Services. However, I still need the OLAP grid to create the report, and that brings me back to my original problem... selecting a cube.
    I hope somebody would help me with this issue.
    Thanks a lot!
    orahc_mao

    Hello,
    I don't recognize those variables as CR ones so it's likely something the original developer is setting in code.
    You'll have to discuss with that person.
    If your have SDK issues then post your question to one of the .NET or Java SDK forums.
    Thank you
    Don

  • UCCX 8.5 - Historical Reports - Traffic Analysis report and Application Performance Analysis report different calls presented

    Hi,
    Please Advice.
    When I compare the Traffic Analysis report and the Application report, the calls presented are not the same. Please help!
    Also attached herewith are the reports.

    Mohammed,
    It is common, in many of the Cisco Express 8.5 environments we have looked at, for the Total Incoming Calls given on a Traffic Analysis report to be a higher number than an Application Report.
    The Traffic Analysis Report counts every unique sessionID (unique call) that is inbound (contact type of 1).  The Application Reports do a similar thing but qualify (filter) only the records that have an application assigned.
    There are simply times when inbound calls have been directed to an "agent" without having an application assigned.
    The best thing the reporting user can do is to run a query on his or her database such as:
         select * from ContactCallDetail where contactType = 1 and startDateTime > '2012-11-16 10:00:00' and startDateTime < '2012-11-16 11:00:00';
    Usually when an application is not assigned to the record in the ContactCallDetail table, it is because the destination type is equal to 1, which is an 'Agent' instead of a 'route point'.
    So if you modify your select statement to filter by destinationType, you can quickly find the records that don't have the application assigned.
    Example:  select * from ContactCallDetail where contactType = 1 and destinationType = 1 and startDateTime > '2012-11-16 10:00:00' and startDateTime < '2012-11-16 11:00:00';
    When you look at these records, you will see the agent that took the call in the destinationID field. The number in that field should match up with the field called 'resourceID' in a table called 'resource'.
         Example:  select * from resource where resourceID = 6011; (where 6011 is the number you found in the destinationID field)
    If there is still confusion about the source of the call, then talk to that agent and find out what it was.
    Good Luck and let me know if you need further help.
    Ron Reif
    [email protected]

  • Vendor and customer ageing analysis

    Hi Gurus
    I am in a project and my client wants a vendor and customer age-wise analysis to be shown in the balance sheet, that is, less than six months and more than six months. Kindly help me with doing the same.
    Thanks
    venkat

    hi,
    You can use the SAP standard report S_ALR_87012178 - Customer Open Item Analysis by Balance of Overdue Items. This gives a wide set of options where you can choose the aging period by the day, i.e. 10, 15, 20, 60, 90 days etc., based on your requirement.
    When you perform your data migration, make sure that the baseline date and payment terms are correct, so the aging will be correct for the line items you load. You don't need to bring over cleared line items, as they won't affect the aging; all they provide is payment behaviour.
    Some data migration projects assign a 'due immediately' payment term and put the due date in the baseline date.
    Regards,
    Greeshma

  • OBIEE group and filter in analysis

    Hi all,
    As an example, I have an analysis with the following table and values on it:
    Code | Description 1 | Description 2 | Date       | Description 3 | Description 4 | Description n
    -----|---------------|---------------|------------|---------------|---------------|--------------
    1    | abc1          | a             | 01/01/2001 | random value  | random value  | random value
    1    | abc1          | a             | 01/01/2003 | random value  | random value  | random value
    1    | abc123        | a             | 01/01/2005 | random value  | random value  | random value
    2    | abc2          | b             | 01/01/2001 | random value  | random value  | random value
    My goal is to perform an aggregation of this data: for the same Code with the same Description 1 and Description 2, I would like to display only the row with the minimum Date. That is, the end result would not show the second row (the one with Date 01/01/2003)!
    Is it possible to do this without changing the RPD and ETL?
    Best regards,
    André

    Hi,
    You can use a conditional format in the selection steps.
    Based on Description 1 and 2, set the minimum date in the conditional format.
    http://mkashu.blogspot.com
    Regards
    VG
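    For what it's worth, the intended grouping can be sketched in ordinary Python (data taken from the sample table above; the Description 3..n columns are omitted): keep, for each (Code, Description 1, Description 2) combination, only the row with the minimum Date.

```python
# Rows from the sample table above, without the "random value" columns.
rows = [
    ("1", "abc1",   "a", "01/01/2001"),
    ("1", "abc1",   "a", "01/01/2003"),
    ("1", "abc123", "a", "01/01/2005"),
    ("2", "abc2",   "b", "01/01/2001"),
]

def keep_min_date(rows):
    """Keep only the minimum-Date row per (Code, Description 1, Description 2)."""
    best = {}
    for row in rows:
        key = row[:3]
        # Dates are dd/mm/yyyy; compare them as (yyyy, mm, dd) tuples.
        sortable = tuple(reversed(row[3].split("/")))
        if key not in best or sortable < best[key][0]:
            best[key] = (sortable, row)
    return [r for r in rows if best[r[:3]][1] == r]

filtered = keep_min_date(rows)  # drops the 01/01/2003 row for code 1 / abc1 / a
```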

  • Javascript only works for "easy" numbers (and multiplying error)

    HIi Everyone,
    I have a scripting problem.  I used JavaScript to do some division and subtraction, but it only works some of the time in the second half of my form.  What I mean is, if I'm using a nice even number like 100, the calculations are correct, but if I use 415.75, it calculates, but not correctly.  (I actually also have that problem with some multiplying.)  It's very weird, and it only seems to affect this particular form.
    I can e-mail the form if that will help.  I'd appreciated any help you can give me.
    Thanks,
    Valerie

    Sure.  The form looks like this: (Field Names are in quotes.)
    1. Total Amount of Third Party Recovery      "total third"
    2. Accrued Compensation Lien                   "accrued"   (I clicked in calculations and just clicked "value is the sum of the following: and then clicked the field
         a. "indemnity benefits"                                             names of a and b.)
         b. "expenses of medical"
    3. Expenses of Recovery                             "recovery expenses"
    4. Balance of Recovery                              "balance"  (Its supposed to be #1minus #2)  I used this script:
                                                                 this.getField("balance").value = this.getField("totalthird").value - this.getField("accrued").value;
    HiddenLockedField "percent" (The content is set to 100.)
    5. Lien Expenses Reimbursement Rate          "5"   (#2 divided by #1 * 100)   I used this script:
                                                                       event.value = ""
                                                                       var accrued = getField("accrued").value;
                                                                       var totalthird = getField("totalthird").value;
                                                                       var percent = getField("percent").value;
                                                                   if (totalthird != null && totalthird != 0)
                                                                       {event.value = accrued / totalthird * percent;}
    6. Expenses Attributable to Lien                    "attribute"    (Supposed to be #3 times #5.)  I went to calculations, and clicked on, "value is the product
                                                                         of the following" and checked "recovery expenses" and "5".
    7. Net Lien                                                   "netlien"   (Supposed to be #2-#6) I used this script:
                                                                       this.getField("netlien").value = this.getField("accrued").value - this.getField("attribute").value;
    8. Reimbursement Rate of Future Lia.             "rate" (Supposed to be #3 divided by #1 * 100.) I used this script:
                                                                         event.value = "";
                                                                         var recoveryexpenses = getField("recoveryexpenses").value;
                                                                         var totalthird = getField("totalthird").value;
                                                                         var percent = getField("percent").value;
                                                                         if (totalthird!=null && totalthird!=0)   
                                                                     {event.value = (recoveryexpenses / totalthird) * percent;}

  • Pick a subject area at run time and create an analysis on this subject area.

    We have a requirement wherein we need to create analyses at run time. We want to give the flexibility to users to pick up a subject area on which they wish to create an analysis. This will be done through a dashboard prompt. Once the user picks up a subject area from the prompt, we will create the analyses at run time based on the selected subject area.
    The problem that we are facing at this point is how to create a dynamic 'FROM' clause for the analysis. As an example, consider the following scenario:
    1. There is a dashboard prompt called 'Select a DataStore'. The possible values are all the subject areas in the RPD. For simplicity, let's assume there are two values, DS1 and DS2.
    2. If the user selects DS1, the analysis sql should be:
    select <colgroup1>, <colgroup2>
    from DS1;
    However, if the user selects DS2, the analysis sql should be:
    select <colgroup1>, <colgroup2>
    from DS2;
    Hence the requirement for a dynamic FROM clause.
    I tried using a presentation variable in the from clause but I get the sql syntax error.
    Is there a way of implementing this requirement?

    To work around the license issue, see the Microsoft Knowledge Base article LICREQST.EXE Requesting a License Key from an Object and download licreqst.exe. Run licreqst.exe and find "CWUIControlsLib.CWSlide.1" under the registered controls section and click on it. The license string at the top can be passed to Licenses.Add and then you should be able to dynamically create the Measurement Studio controls. For example:
    Private Sub Form_Load()
        Licenses.Add "CWUIControlsLib.CWSlide.1", "[Add license string from licreqst here]"
        Dim slide
        Set slide = Controls.Add("CWUIControlsLib.CWSlide.1", "slide")
        With slide
            .Top = 120
            .Width = 1215
            .Visible = True
        End With
    End Sub
    Creating controls like this would have an impact on performance since all calls are late bound. If you are using the Measurement Studio controls in a UserControl, though, you should not need to do this since the license would get compiled into the binary that contains the control. I don't think that you would have any problems creating your control since it would not be licensed.
    - Elton
