Best Way To Handle Non-Additive Measures

I am developing a cube that looks at total reservations created against a daily allocation by customer. So let's say I am looking at these measures at the lowest level and show location X for Customer Y with an allocation of 100 and total reservations made of 50. Everything here is OK. My issue comes when I roll up to look at just the location: the allocation is being summed, but I need this value to remain static at all levels. Is there a way to set an Accounts measure to never aggregate? I have tried a few different settings, such as NEVER SHARE and setting the member aggregation property to <NONE> in my OLAP model, and it continues to aggregate at all levels. I have also tried adding this as a dimension, but because the value is numeric, and because I have a few additional allocation measures that can have the same values, I run into duplicates. As additional info, I am building this using EIS. It's entirely possible that I am approaching this the wrong way, so any feedback would be appreciated. I can provide more detail if needed.
Thanks,
Bob

Why don't you put the Account that stores the allocated amount in its own special hierarchy? This hierarchy might have a Gen2 parent called "Allocate" with a label-only tag and a series of ~ (no consolidation) tagged members underneath. Give it a goofy name so that there can be no question of the Account's purpose, e.g., "Reservations - To Be Allocated".
Your post doesn't indicate what tool you're using for input, but have a separate sheet/form/schedule for the input of the amount to be allocated, have the user enter that amount, save it, and have a calc/HBR launch on save that does the allocation.
Then your second view of the data (form, report, etc.) doesn't include that Account and no one's the wiser. You haven't lost the original input data, and since the forecaster looks at the "real" Accounts hierarchy except when inputting data to be allocated, he'll see the spread numbers only.
The only thing I might add to this approach is a series of dedicated Location members that receive the allocated number but that's really a design preference more than anything.
Regards,
Cameron Lackpour
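
Not an Essbase answer as such, but here is a minimal sketch (plain Java, illustrative names only) of the roll-up behaviour Bob is describing: reservations aggregate by SUM, while the allocation keeps its static location-level value instead of being summed across customers. The exact non-additive rule (keep the single stored value, take MAX, etc.) is a business decision; this only shows the intended semantics, not how Essbase/EIS would implement it.

    import java.util.*;

    public class AllocationRollup {
        // One fact row at the lowest level: location x customer (illustrative record type).
        record Row(String location, String customer, double allocation, double reservations) {}

        public static void main(String[] args) {
            List<Row> rows = List.of(
                new Row("X", "Y", 100.0, 50.0),
                new Row("X", "Z", 100.0, 30.0)); // second customer at the same location

            Map<String, double[]> byLocation = new LinkedHashMap<>();
            for (Row r : rows) {
                byLocation.merge(r.location(),
                    new double[] { r.allocation(), r.reservations() },
                    (acc, cur) -> new double[] {
                        acc[0],         // allocation: keep the static value, do not sum
                        acc[1] + cur[1] // reservations: additive, so sum as usual
                    });
            }

            byLocation.forEach((loc, v) -> System.out.printf(
                "Location %s: allocation=%.0f, reservations=%.0f%n", loc, v[0], v[1]));
            // Prints: Location X: allocation=100, reservations=80
            // A plain SUM would show allocation=200, which is the roll-up Bob wants to avoid.
        }
    }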

Similar Messages

  • Best way to handle multiple currencies

    I have a requirement that users should be able to report against an OLAP cube in a currency of their choice (from a list of about 20) and was wondering what the best way to handle this might be.
    One option would be to have a currency dimension containing the list of valid currencies and then to pre-calculate measures in each of the currencies and store them in the cube. However the downside of this is that the resultant cube would be 20 times larger than a cube in a single currency, take longer to maintain etc. I could of course partition the cube by currency to improve reporting performance since users would only report in one currency at a time.
    Another alternative would be to dynamically calculate the measures based on exchange rates - I guess this could be done either in the cube itself or as part of the reporting code. However, since exchange rates are daily, this would obviously prevent me from aggregating data up the time dimension (all measures are at the day level).
    Is there any standard way of doing this, and what are the pros and cons?
    Thanks,
    Chris

    Sorry - messed up - I should have posted this in the OLAP forum.....
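
    A small, self-contained illustration (plain Java, made-up figures) of why daily rates push the conversion down to the day level: converting each day at that day's rate and then summing gives a different total than summing first and applying a single rate, so pre-aggregated measures can't simply be multiplied by "the" rate.

        import java.util.Map;

        public class DailyFxRollup {
            public static void main(String[] args) {
                // Base-currency sales per day and daily EUR rates (hypothetical figures).
                Map<String, Double> salesByDay   = Map.of("2024-01-01", 100.0, "2024-01-02", 100.0);
                Map<String, Double> eurRateByDay = Map.of("2024-01-01", 0.90,  "2024-01-02", 0.95);

                // Convert at the day level, then aggregate: 100*0.90 + 100*0.95 = 185.0
                double convertedThenSummed = salesByDay.entrySet().stream()
                    .mapToDouble(e -> e.getValue() * eurRateByDay.get(e.getKey()))
                    .sum();

                // Aggregate first, then apply one rate: 200*0.95 = 190.0 (rate choice is arbitrary)
                double summedThenConverted = salesByDay.values().stream()
                    .mapToDouble(Double::doubleValue).sum() * 0.95;

                System.out.println(convertedThenSummed + " vs " + summedThenConverted);
            }
        }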

  • Best way to handle source files

    Hi there,
    After some pretty general advice, please.
    The company I work for looks after a lot of websites, and one of the headaches we have is the best way to handle source files. By source files I'm referring to Photoshop files, Flash .fla files and other non-Adobe files that relate to a site - not the .html, .asp, .aspx, .css, .js etc. type files.
    Now I'm NOT after a version control system, just a simple way to store the source files in a location that is separate from the website but still have a smooth workflow between the Dreamweaver site and its source files.
    At the moment, and I know this is unwise, we have a subdirectory within the site where we store the source files, and we use WebDAV to transfer both site and source files to and from the server. But I really want to separate the site from the source while still maintaining a link between them... if you see what I mean. The upshot is I would like to be able to open a site within Dreamweaver and instantly access that site's source files if needed. This method needs to be shared across a small team spread around the UK.
    I looked at Subversion version control, but like I said I'm not after a source control system, plus it appeared to conflict with WebDAV and Contribute, which some of our clients use to maintain content on their sites. I also looked at Version Cue, which looked promising, but I can't see a clear workflow between Dreamweaver and Version Cue which separates site from source. I might be missing something... part of my brain perhaps. :)
    Would be grateful for any advice please.
    Cheers,
    @ndyB


  • Best way to handle inversion of custom fields?

    We've added some additional numeric fields, such as quantities, to some of the LO extractors.
    What is the best way to handle the reversal of these additional fields when reversal records come over on the extractor?  I tried switching on the "inversion" switch in the extractor and this didn't work - found OSS note 382779 which explains that this is expected behavior, that one must put some logic in the user exit to do the inversion.
    Is it just a matter of coding if the ROCANCEL field (mapped to 0STORNO in BW) is 'X' or 'R' to flip the sign of the numeric field we added?  But is it better to put this in the R/3 user exit or in a start routine in the transfer rules do you think?  Does flipping on the "inversion" switch make any difference at all in this case?
    Thanks for your help!
    Chris

    Hey,
    I don't think the inversion switch matters here, since, as you said, you added quantity-type fields. If I were you, I would handle it in the user exit, which works well, but a transfer-rule start routine also helps in most cases.

  • Best way to handle Christmas music - Playlist or separate Library?

    Anyone have a suggestion about the "best" way to handle Christmas music?
    After reading the article below I'm tempted to create a new library for Christmas music... that way I can keep it on an external drive or even DVD.
    http://www.allthingsmarked.com/2006/09/13/howto-manage-multiple-libraries-in-itunes-7/

    My solution to this is to add the tag "Xmas" to the "Grouping" field of each Christmas song.
    This little tag stays tucked away out of sight until Christmastime. Then, I create a new Smart Playlist where Grouping contains "Xmas." Voila.
    The Grouping field can be useful for non-genre distinctions you want to add to your music, like, say, "live" or "remix" or "cover."

  • OIM 9031: Best way to handle application/test account on target system.

    Hi Guys,
    I am wondering what the best way is to handle application accounts created in the target system. I.e., I have the target system Active Directory, and on non-trusted reconciliation I also fetch application/test accounts which are not going to match any existing user, but which should be captured for reporting or any future action.
    Any input or idea is most welcome !!
    Cheers,
    Ankit

    There are basically two approaches to handle service accounts.
    Either you model them as a free-standing RO, very similar to a normal AD account, or you use the built-in "service account" feature and associate the account with an already existing AD RO instance. I haven't used the "service account" approach in any customer project yet, so I can't really comment on the details of that approach (hopefully someone else will be able to do that).
    Are you sure that you have service accounts in AD that you can't attribute to a specific user? Most organizations require service accounts to be linked to a user or a group of users so that the need for the account to continue existing can be verified by a human. Having live accounts in your AD that no one can say what they do or why they exist is normally a very scary thought for most organizations.
    Hope this helps
    /M

  • What is the best way to handle very large images in Captivate?

    I am just not sure the best way to handle very large electrical drawings.
    Any suggestions?
    Thanks
    Tricia

    Is converting the colorspace asking for trouble?  Very possibly!  If you were to do that to a PDF that was going to be used in the print industry, they'd shoot you!  On the other hand, if the PDF was going online or on a mobile device – they might not care.   And if the PDF complies with one of the ISO subset standards, such as PDF/X or PDF/A, then you have other rules in play.  In general, such things are a user preference/setting/choice.
    On the larger question – there are MANY MANY ways to approach PDF optimization.  Compression of image data is just one of them.   And then within that single category, as you can see, there are various approaches to the problem.  If you extend your investigation to other tools such as PDF Enhancer, you'd see even other ways to do this as well.
    As with the first comment, there is no "always right" answer.  It's entirely dependent on the user's use case for the PDF, the requirements of any additional standards, and the user's needs.

  • (workflow question) - What is the best way to handle audio in a large Premiere project?

    Hey all,
    This probably applies to any version of Premiere, but just in case: I use CS4 (Master Collection).
    I am wrestling in my brain about the best way to handle audio in my project to cut down on the time I am working on it.
    This project I just finished was a 10 minute video for a customer shot on miniDV (HVX-200) cut down from 3 hours of tape.
    I edited my whole project down to what looked good, and then I decided I needed to clean up all the audio using Soundbooth, so I had to go in clip by clip, using the Edit in Soundbooth --> Render and Replace method on every clip. I couldn't find a way to batch edit any audio in Soundbooth.
    For every clip, I performed similar actions:
    1) Both tracks of audio were recorded with two different microphones (two mono tracks), so I only needed audio from one track - I used SB to cut and paste the good track over the other track.
    2) Amplified the audio.
    3) Cleaned up the background noise with the noise filter.
    I am sure there has to be a better workflow option than what I just did (going clip by clip). Can someone give me some advice on how best to handle audio in a situation like this?
    Should I have just rendered out new audio for the whole tape I was using, and then edited from that?
    Should I have rendered out the audio after I edited the clips into one long track and performed the actions I needed on it? Or something entirely different? It was a very slow, tedious process.
    Thanks,
    Aza

    Hi, Aza.
    Given that my background is audio and I'm just coming into the brave new world of visual bits and bytes, I would second Hunt's recommendation regarding exporting the entire video's audio as one wav file, working on it, and then reimporting. I do this as one of the last stages, when I know I have the editing done, with an ear towards consistency from beginning to end.
    One of the benefits of this approach is that you can manage all audio in the same context. For example, if you want to normalize, compress or limit your audio, doing it a clip at a time will make it difficult for you to match levels consistently or find a compression setting that works smoothly across the board. It's likely that there will instead be subtle or obvious differences between each clip you worked on.
    When all your audio is in one file you can, for instance, look at the entire waveform, see that limiting to -6 dB would trim off most of the unnecessary peaks, trim it down, and then normalize it all. You may still have to do some tweaking here and there, but it gets you much farther down the road, much more easily. The same goes for reverb, EQ or other effects where you want the same feel throughout the entire video.
    Hope this helps,
    Chris

  • Best way to handle text files in OD10g

    We have a requirement to store reports in text format into a database field, to be able to view the reports, and to print them if desired using Forms 10g. What is the best way to handle this?
    - define the field in the database as clob or blob?
    - if CLOB is the choice, what tools to use to upload CLOBs to the database (since webutil transfer is for blob only)?
    - in Forms 10g, can one use the Forms data type LONG for CLOB?
    - can you do Forms search on clob and blob fields?
    - how can reports that are stored in fields be viewed without first downloading to the client workstation?
    - in Forms 10g, what is the best way to view text files residing in local PCs: "host notepad myFile"?
    Thanks much for your reply!
    gk
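
    On the CLOB side of the question: if the upload/download is done from Java rather than through webutil, standard JDBC handles a CLOB as ordinary character data. A minimal sketch, assuming a hypothetical REPORTS table with an ID column and a REPORT_TEXT CLOB column (the connection details are placeholders):

        import java.io.Reader;
        import java.sql.*;

        public class ClobReportDemo {
            public static void main(String[] args) throws Exception {
                // Hypothetical connection and table: REPORTS(ID NUMBER, REPORT_TEXT CLOB).
                try (Connection con = DriverManager.getConnection(
                        "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger")) {

                    // Store a text report: setString works for CLOB columns.
                    try (PreparedStatement ins = con.prepareStatement(
                            "INSERT INTO reports (id, report_text) VALUES (?, ?)")) {
                        ins.setInt(1, 1);
                        ins.setString(2, "REPORT HEADER\nline 1\nline 2\n");
                        ins.executeUpdate();
                    }

                    // Read it back as a character stream for display or printing.
                    try (PreparedStatement sel = con.prepareStatement(
                            "SELECT report_text FROM reports WHERE id = ?")) {
                        sel.setInt(1, 1);
                        try (ResultSet rs = sel.executeQuery()) {
                            if (rs.next()) {
                                try (Reader r = rs.getCharacterStream(1)) {
                                    StringBuilder sb = new StringBuilder();
                                    char[] buf = new char[4096];
                                    int n;
                                    while ((n = r.read(buf)) != -1) sb.append(buf, 0, n);
                                    System.out.println(sb);
                                }
                            }
                        }
                    }
                }
            }
        }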


  • Best way to handle all erros and get performance(OCI)?

    Hi there,
    I'm using Oracle Call Interface to run a batch file load process, but I have run into a problem.
    I set setExecuteBatch to the same number as the commit interval, e.g. 100, 1000 or just 10, and that part works fine:
    ((OraclePreparedStatement)globalStmt).setExecuteBatch(commit);
    I call executeUpdate and catch all SQLExceptions. I made some test files with deliberately invalid rows, but when I call executeUpdate it doesn't report the correct error line - it points to another line that is actually correct.
    ((OraclePreparedStatement)globalStmt).executeUpdate();
    Stepping through the code, I concluded that the reported error line always falls on the batch/commit boundary rather than on the real error line. For example, if there is an error at line 31 and the batch size is 50, it tells me the error is at line 50 instead of 31. If I set setExecuteBatch to 1 it reports the correct error lines, but then performance drops badly. What is the best way to handle all errors and still keep the performance?
    Sorry for my English, I am not a native speaker. Thanks, all.
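
    For what it's worth, one hedged alternative to the Oracle-specific setExecuteBatch() extension is standard JDBC batching with addBatch()/executeBatch(): you still commit in chunks for speed, but a failure raises a BatchUpdateException whose update counts can narrow down the bad row (how precisely is driver-dependent), and you can replay just the failed chunk one row at a time to pinpoint it. The table and column names below are made up:

        import java.sql.*;
        import java.util.List;

        public class BatchWithErrorReporting {
            // Insert lines in batches of batchSize; report which batch failed and why.
            static void load(Connection con, List<String> lines, int batchSize) throws SQLException {
                con.setAutoCommit(false);
                String sql = "INSERT INTO staging (line_text) VALUES (?)"; // hypothetical table
                try (PreparedStatement ps = con.prepareStatement(sql)) {
                    int inBatch = 0, firstLineOfBatch = 0;
                    for (int i = 0; i < lines.size(); i++) {
                        ps.setString(1, lines.get(i));
                        ps.addBatch();
                        inBatch++;
                        if (inBatch == batchSize || i == lines.size() - 1) {
                            try {
                                ps.executeBatch();
                                con.commit();
                            } catch (BatchUpdateException e) {
                                // The bad row is somewhere in [firstLineOfBatch, i]; getUpdateCounts()
                                // may narrow it down, but some drivers mark the whole batch as failed.
                                System.err.printf("Batch starting at line %d failed: %s%n",
                                        firstLineOfBatch + 1, e.getMessage());
                                con.rollback();
                                // Optional: replay this chunk with batchSize = 1 to find the exact
                                // bad line without slowing down the normal, large-batch path.
                            }
                            inBatch = 0;
                            firstLineOfBatch = i + 1;
                        }
                    }
                }
            }
        }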

    So by doing this, everything will transfer and look exactly the way I have it on the old machine?
    That is correct: if your old machine is Intel-based, then after using MA the new machine will look just like the old one. Here is information from Apple on MA; I'd recommend looking it over.
    My recommendation is to answer NO when the new machine's setup asks "Are you moving from another Mac?" The reason is to let your new machine get set up and run for a couple of hours to ensure it's fine. Then launch MA and follow the prompts; it's very easy, and if you use a fast connection like FireWire it should go smoothly.
    Regards,
    Roger

  • Best way to handle when Production XI is in outrage/down

    Hi All,
    My production box is running smoothly now. When we go for any outrage we will still be receiving many messages over the internet from many partners.
    So we are wondering how to receive those messages while the production XI is in outrage. My client is thinking of putting up one more instance as a replica of the production box, so that when the main production system is in outrage they can immediately switch to this replica and process the messages until the main one is ready again.
    Now I am very concerned about what the implications would be if we go this way.
    1. which is the best way to handle XI box to maintain when the system is outrage/down?
    2. Is it a good idea of creating replica of production instance and maintaing this when the main one is in outrage?
    Kindly suggest what you guys are doing for your implementations.
    Thanks
    Seema

    >
    > 1. which is the best way to handle XI box to maintain when the system is outrage/down?(An act of extreme violence or viciousness)
    >
    XI Box - Outrage -Should be a robotic XI Box. Now I know that you are thinking what to do when cops come and handcuff it. -- Just kidding..
    Ok, here's the logic, you want to replace a piece of water pipe that gets water to your faucet. What do you do without the water leaking or disrupting the water supply... you either turn off the valve or add a temp pipe to keep the water flowing..
    So check into your sys landscape for Fail safe or High Availability, may be one of those can save you money and time.
    AB

  • Best way to handle medical collections?

    Hello all, My husband and I are new to the "rebuilding your credit" world.  We're coming back from a foreclosure and actively trying to rebuild.  I have 4 active collections for 3 creditors listed on my EQ/TU report - totaling $399.  I can pay them in full right now, but I don't want to call the creditors to pay them if it's not going to be removed from my reports properly.  Some of these are old - 2011. I also have 3 collections "removed" from my TU/EQ report since March - will paying these affect what is listed on my reports?
    What is the best way to handle these accounts to ensure they are removed from my reports completely?    

    Contact each creditor and politely request a Pay for Deletion - an agreement that they will delete the entry from your reports in exchange for full payment. Get this in writing. If that doesn't work (try a few times), pay it, then try the goodwill route - write letters requesting early deletion of the entries from your reports. Even though negative items will fall off your report in 7 years (so the 2011 collection will fall off in 2018), you will still have to report that you have unpaid debt on future mortgage applications, etc. And on a manual review, paid collections look better than unpaid ones. I would not advise leaving them unpaid at all.

  • Best way to handle large files in FCE HD and iDVD.

    Hi everyone,
    I have just finished working on a holiday movie that my octogenarian parents took. They presented me with about 100 minutes of raw footage that I have managed to edit down to 64 minutes. They have viewed the final version that I recorded back to tape for them. They now want to know if I can put it onto a DVD for them as well. The problem is that the FCE HD file is 13 GB.
    So here is my question.
    What is the best way to handle this problem?
    I have spoken to a friend of mine who is a professional editor. She said reduce the movie duration down to about 15mins because it's probably too long and boring. (rather hurtful really) Anyway that is out of the question as far as my oldies are concerned.
    I have seen info on Toast 8 that mentions a "Fit to DVD" process that purports to "squash" 9 GB of movie onto a 4.7 GB disc. I can't find out whether it will also put 13 GB onto a dual-layer 8.5 GB disc.
    Do I have to split the movie into two parts and make two dual-layer DVDs? If so, I have to ask: how come "Titanic", at 3 hrs+, fits on one disc??
    Have I asked too many questions?

    Take a deep breath. Relax. All is fine.
    iDVD does not look at the size of your video file; it looks at the length. iDVD can accommodate up to 2 hours of movie.
    iDVD gives you different options depending on the length of your movie. Although I won't agree with your friend about reducing the length of your movie to 15 minutes, if you could trim out a few minutes to get it under an hour, that setting in iDVD ("Best Performance", though the new version may have renamed it) gives you the best quality. Still, any iDVD setting will give you good quality even at 64 minutes.
    In FCE, export as QuickTime Movie, NOT any flavour of QuickTime Conversion. Select chapter markers if you have them. If everything is on one system, uncheck the Make Movie Self-Contained button. Drop the QT file into iDVD.

  • Best way to handle calling the EP logon page

    I would like to be able to handle the following:
    1. Start EP in anonymous/guest mode (basically unauthenticated).
    2. Present an iView that allows a user to fill in some information, but when he/she clicks on a button it would first check whether the user is logged on and, if not, call up the EP logon page/self-registration page. After logging in/self-registering, the iView would continue processing.
    What is the best way to handle this?
    Regards,
    Mel Calucin
    Bentley Systems, Inc.

    Hi,
    You have to download the standard com.sap.portal.logon "par" and modify the JSP file to simplify the logon page (maybe only input fields for user and password).
    Then upload the new .par file to the portal and update the authschemes.xml file to redirect the default method to the new ".par".
    You can insert an "enter" button on the home page that launches a pop-up to a dummy page whose property is the "default" authentication scheme. The portal then automatically shows the simplified login page in the little pop-up so the user can enter their login data. The dummy page needs to redirect the parent page to the portallauncher component and close itself.

  • Best way to handle time taking MIS queries!

    Dear all,
    Recently we have developed some MIS reports that execute large queries and process a lot of data. These queries practically hold up the database, and users experience very slow speeds.
    What is the best way to handle these heavy-duty queries?
    Like having another server for MIS - and then what is the best way to synchronize data on a daily basis?
    OR
    Creating procedures that execute at night and populate MIS tables, with reports using this formulated data?
    Any other, better solution, please?
    Thanks, Imran

    misterimran wrote:
    > Dear all,
    > Recently we have developed some MIS reports that execute large queries and process a lot of data. These queries practically hold up the database, and users experience very slow speeds.
    > What is the best way to handle these heavy-duty queries?
    > Like having another server for MIS - and then what is the best way to synchronize data on a daily basis?
    Based on your requirement, Streams.
    > Creating procedures that execute at night and populate MIS tables, with reports using this formulated data?
    I would not recommend this, because of the maintenance involved; also, this is re-inventing the wheel.
