Report on Metadata

I need to make a report on metadata in either BW 3.5 or BI 7.0. The report should contain variables (the user should enter the values) for InfoArea, InfoCube, the queries in that InfoCube, and the InfoObjects for each query. I know two InfoObjects:
1> 0TCTQUERY ----- gives the queries in BW
2> 0TCTAPPL ----- gives the application component
But I cannot get the details of a query, such as the InfoObjects used in the query, the restrictions on the query, the variables used in the query, etc.
Please help and earn points.
Raj

There are 3rd-party tools that can give you this level of detail, or you can roll your own by querying the BW metadata tables programmatically.
Steve Clark, MCTS |
Twin-Soft Corporation
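If you do roll your own, one common route is reading the BW query directory tables over RFC. Below is a minimal Python sketch using the pyrfc library and the generic RFC_READ_TABLE function module; the table and field names (RSRREPDIR, COMPID, INFOCUBE) and the cube name 'MYCUBE' are assumptions to verify on your own system before relying on them.

```python
# Sketch only: table/field names (RSRREPDIR, COMPID, INFOCUBE) and the
# cube name 'MYCUBE' are assumptions to verify on your own BW system.

def build_read_table_params(table, fields, where=""):
    """Build the parameter set for the standard RFC_READ_TABLE module."""
    params = {
        "QUERY_TABLE": table,
        "DELIMITER": "|",
        "FIELDS": [{"FIELDNAME": f} for f in fields],
    }
    if where:
        # each OPTIONS row is limited to 72 characters
        params["OPTIONS"] = [{"TEXT": where}]
    return params

params = build_read_table_params("RSRREPDIR", ["COMPID", "INFOCUBE"],
                                 "INFOCUBE = 'MYCUBE'")

# With an actual connection it would be called like this (not executed here):
# from pyrfc import Connection
# conn = Connection(ashost=..., sysnr=..., client=..., user=..., passwd=...)
# result = conn.call("RFC_READ_TABLE", **params)
```

This only lists query headers; drilling into restrictions and variables means reading further element tables, which differ by release.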

Similar Messages

  • Bug Report: enum metadata containing 'true' and 'false' string values not displaying unless "goosed"

I have metadata whose value is 'true', but the corresponding enum title is not being displayed. If I set it explicitly to 'true' it will be displayed, even though the value is the same as it was.
Likewise for 'false'.
I do not have this problem with enums that have any other string values; e.g., 'yes' and 'no' function properly, but 'true' and 'false' do not.
    Rob

    Due to better understanding on my part, and the fixes in Beta 2.2, the
    issues raised in this thread are put to rest.
Abe White wrote:
> We'll investigate further. Stay tuned...
    "David Ezzio" <[email protected]> wrote in message
    news:[email protected]..
    Abe,
    Actually, it doesn't make sense. The first iteration shows
    jdoPostLoad is called just prior to using the fields of Widget and Box.
    After the commit, the instances are not cleared. So far, so good.
    In the second iteration, we see that jdoPreClear is called, but
    jdoPostLoad is not called. Yet the values are there for use during the
    call to Widget.toString() and Box.toString() within the iteration. How
    did that happen?
    On the third iteration, now the value of name is null, but last we
    looked it was available in the second iteration. Other than that, I
    agree that the third iteration looks ok, since it is being cleared and
    loaded prior to use. But in the jdoPreClear, the values should be there
    since the values were used in the previous iteration and are not cleared
    at transaction commit due to retainValues == true.
    David
    Abe White wrote:
    David --
I believe the behavior you are seeing to be correct. Section 5.6.1 of the
JDO spec outlines the behavior of persistent-nontransactional instances used
with data store transactions. As you can see, persistent non-transactional
instances are cleared when they enter a transaction; thus the calls to
jdoPreClear. When the default fetch group is loaded again (like when you
access the name of the widget) a call to jdoPostLoad is made.
You are seeing the name of one instance as 'null' in the third iteration of
your loop because the instance has been cleared in the second iteration, and
the jdoPreClear method is not modified by the enhancer (see section 10.3).
Make sense?

  • Report SDK roadmap or workarounds?

    Hi, my company ships a metadata-driven 3rd-party application which creates and manages universes and WebI reports.  We're finally starting to get customer interest in upgrading to BOBJ BI 4.0, so our next release will support that.  Like a lot of other people, we were dismayed to learn that SAP has gutted the Java Report SDK, which was the only SDK capable of creating WebI reports.  So right now I'm faced with the prospect of a release which will only create WebI reports for 3.1 customers, and leave 4.0 customers out of luck.
    Here are some of the workarounds I've been considering.  They are all pretty expensive/risky, so I don't want to commit to any of them until I have better insight into where SAP is taking things.  I'd appreciate any feedback on any of these ideas.
    1. Ditch WebI reports and build Crystal Reports instead.
    Pros:
    i. There's still an SDK available for Crystal Reports
    Cons:
    i. We believe our customers still use WebI more than Crystal Reports. 
    ii. We've not yet worked with this SDK so we can't estimate how hard it will be to use or how buggy it might be.
    iii. We don't know if SAP will de-support that SDK in a later release as well. 
    iv. We don't know if SAP is pushing customers toward Crystal Reports over WebI, or plans to continue with both over the long term.
2. Hope for a Report SDK replacement in a future release.
    Pros:
    i. Less to do now for us.
ii. The replacement, when it comes, may be more reliable and cheaper to use than the current Java SDK.  For example, it may be supported directly in .NET so we can strip out our Java components, or it may be a true XML SDK so we can simply build reports as XML docs and push them reliably to WebI.
    Cons:
    i. Our 4.0 customers won't get any reports.
    ii. The replacement may never come.
    3. Hack it.
    Possibilities include:
i. Use Fiddler, Java reflection, etc., to emulate WebI and bypass the API.
    ii. Look at BOBJ's report archive/migration utilities, or at the WebI repository, to find a way to fake it at a binary level
    Pros:
    i. Seamless to our customers
    Cons:
    i. Probably way too expensive and risky to really take seriously
    Thanks in advance for any feedback or advice.  We are of course pursuing the same questions in parallel with our SAP partner contacts.
    -Eric

    Hi Terry, and thanks for that update.
I work for Noetix Corporation.  We've been a Business Objects / SAP partner for almost 5 years with our Noetix Generator for Business Objects product.
Our core products (NoetixViews and Noetix Analytics) create database views and data warehouse tables used for real time reporting and analytics of Oracle EBS systems.  We also have a Noetix Generator product family, providing integration with major BI platforms (BOBJ, Cognos, OBI EE, etc.).
Noetix Generators do two main tasks:
•     Generate the semantic model for the BI platform (in this case, BOBJ Universes) using metadata describing the views and tables from NoetixViews and Noetix Analytics.
•     Generate sample reports (in this case, WebI reports) using metadata from NoetixViews that provides report definitions in a BI-platform-independent form.
Noetix Generators are admin tools and are not deployed outside of IT.   The generation process is essentially a one-time step (no "re-creating the report each time", just occasionally during upgrade processes).  So you are correct to some extent when you suggest that we'd want to create a "[set of reports] that can be easily deployed programmatically".  The catches are that:
1.     We use the same metadata to create these reports on all BI platforms.  Maintaining separate metadata for each BI platform is not an option.
2.     The reports we create are sensitive to the customizations in customer EBS configurations.  It's unlikely (I think) that SAP could create a tool that could export a set of reports from one custom environment to another without allowing for some significant (metadata-driven) tweaks in between.
All that said, here's what I think a report creation API should look like (and what the other major vendors' report creation APIs actually do look like):
1.     Define each report as a single XML document.  Make that XML easily viewable, at least to super-users or developers.
2.     Define an API or create SDK tools for exporting/importing/securing these XML documents.  Good language support (.NET, Java, etc.) is nice here, but really secondary, since 95% of the real work is in manipulating the XML.  We can always just shell out to a command-line utility for the other stuff if we have to.
That's pretty much it.  In the case of BOBJ, the XML would likely contain numerical ID pointers (rather than string references) to underlying semantic objects like Universe IDs and the columns defined in the Universe.  This might be an inconvenience, but not a significant one from our point of view, since we're already used to working with these IDs (from our experience with BOBJ XI).
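To make point 1 concrete, a hypothetical single-document report definition, invented purely for illustration (this is not any vendor's actual schema), might look like:

```xml
<report id="1001" universeId="512">
  <query>
    <!-- numerical ID pointers into the Universe, as suggested above -->
    <column objectId="4711"/>
    <column objectId="4712"/>
    <filter objectId="4720" operator="InList">
      <value>OPEN</value>
    </filter>
  </query>
  <layout type="table"/>
</report>
```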
    Thanks, Eric

Activating the standard reports for SAP CRM in BW 3.5

hi guys
I am working on SAP BW 3.5.
I have activated the standard reports for CRM Campaign, Lead, and Opportunity from BW 3.5 Business Content,
but I don't find a single standard report in BEx.
I followed these steps:
BEx - Open - Queries - InfoArea
If anyone knows, please let me know; urgent requirement.
thanks in advance
Suma

Did you try to see those reports in the Metadata Repository?
Log in to the BI system and enter transaction RSA1.
On the left panel select Metadata Repository, then Local Objects, and hit Query.
Try to find your reports by their technical names.
If a report exists here, you should be able to see it in the InfoCube.
If that doesn't help, go back to the Local Objects screen,
hit InfoCube, locate your InfoCube, and select it.
Now everything active and available for that InfoCube is visible:
the characteristic catalog, key figure catalog, queries, etc.

  • OIM report / add column to working report

    Hi,
    I need some help with OIM reports.
I cannot add a column to a working report. What I have done is change the stored procedure and then change the report XML metadata to include that column.
The report just turns out blank after the changes are made.
A snippet of the change to the stored procedure (the added line is marked with a comment):
-- construct the query
strColumnList := ' tusracc.system as "SYSTEM", ' ||
' vifs.ifsemp_employee_id as "EMP_ID", ' ||
' vusr.usr_first_name || '' '' || vusr.usr_last_name as "EMP_NAME", ' ||
' vusr.usr_status as "USR_STATUS", ' || -- added line
' tusracc.attribute as "ATTRIBUTE", ' ||
' tusracc.attrvalue1 as "ATTRVALUE" ';
    strFromClause := ' v_users vusr, v_ifsusers vifs, tmp_user_access tusracc ';
    strWhereClause := ' tusracc.usr_key = vusr.usr_key ';
    strWhereClause := strWhereClause || ' and vusr.usr_key=vifs.usr_key(+)';
    IF strsortcolumn_in IS NULL THEN
    strOrderByClause := ' 2 ' ;
    ELSE
    strOrderByClause := strsortcolumn_in ;
    END IF;
    IF strsortorder_in = 'DESC' THEN
    intSortDirection_in := 0;
    ELSE
    intSortDirection_in := 1;
    END IF;
    -- run the report query
    XL_SPG_GetPagingSql(strColumnList,
    strFromClause,
    strWhereClause,
    strOrderByClause,
    intSortDirection_in,
    intStartRow_in,
    intPageSize_in,
select_stmt);
OPEN csrresultset_inout FOR select_stmt;
    ELSIF intdocount_in = 2 THEN
    OPEN csrresultset_inout FOR select dummy from dual;
    END IF;
In the report XML metadata I have added a line for the USR_STATUS column (the added line is marked with a comment):
<Report layout="Tabular">
<StoredProcedure>
<InputParameters>
<InputParameter name="v_resource_name" parameterType="varchar2" order="1" fieldType="LookupField" fieldLabel="report.ResourceAccessList.label.resourceName" required="true">
     <ValidValues lookupCode="Lookup.Reports.Objects"/>
</InputParameter>
</InputParameters>
</StoredProcedure>
<ReturnColumns>
<ReturnColumn name="SYSTEM" label="customer.report.system" display="true" position="SectionHeader" primarySort="true" filterColumn="false" />
<ReturnColumn name="EMP_ID" label="customer.report.emp_id" display="true" position="SectionHeader" primarySort="false" filterColumn="false" />
<ReturnColumn name="EMP_NAME" label="customer.report.emp_name" display="true" position="SectionHeader" primarySort="false" filterColumn="false" />
<ReturnColumn name="USR_STATUS" label="global.label.Status" display="true" position="SectionHeader" primarySort="false" filterColumn="false" /> <!-- added line -->
    <ReturnColumn name="ATTRIBUTE" label="customer.report.attribute" display="true" position="SectionHeader" primarySort="false" filterColumn="false" />
    <ReturnColumn name="ATTRVALUE" label="customer.report.value" display="true" position="SectionHeader" primarySort="false" filterColumn="false" />
    </ReturnColumns>
    </Report>
Is there anything I have missed? We have restarted the app server, but it still doesn't work.
    Regards,
    Thomas

Arghle...
I found the reason for the blank result page:
apparently we are not allowed to name a resulting column in a stored procedure USR_STATUS; it fails silently. :/
When I renamed it to EMP_STATUS it all worked as intended.
Duh!
I wonder where it's documented that you shouldn't do this...
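A related class of silent failure (ReturnColumn names that don't match the stored procedure's column aliases) can be caught early with a quick cross-check. A rough Python sketch; the fragments below are trimmed, hypothetical versions of the ones above:

```python
import re
import xml.etree.ElementTree as ET

# Trimmed, hypothetical report XML fragment
report_xml = """
<Report layout="Tabular">
  <ReturnColumns>
    <ReturnColumn name="SYSTEM"/>
    <ReturnColumn name="EMP_ID"/>
    <ReturnColumn name="USR_STATUS"/>
  </ReturnColumns>
</Report>
"""

# Aliases as they appear in the PL/SQL column-list string
column_list = ('tusracc.system as "SYSTEM", '
               'vifs.ifsemp_employee_id as "EMP_ID", '
               'vusr.usr_status as "USR_STATUS"')

xml_names = {c.get("name") for c in ET.fromstring(report_xml).iter("ReturnColumn")}
sql_names = set(re.findall(r'as "([A-Z0-9_]+)"', column_list))

# Both sets should be empty when the XML and the SELECT list agree
missing_in_sql = xml_names - sql_names
missing_in_xml = sql_names - xml_names
print(missing_in_sql, missing_in_xml)
```

This would not have flagged the reserved-name problem itself, but it rules out simple typos before you start hunting for deeper causes.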

  • Web Service to retrieve report

I am trying to use the web services to retrieve a report's metadata, in particular the report filters. I can't find the right classes/methods to use in the documentation.
In JDeveloper, I have set up web service proxies for WebCatalogService, as well as SAWSession...
Has anyone done this, and if so, could you provide any examples?
This is on OBIEE 11g.
Thanks in advance!

Check these links:
www.rittmanmead.com/files/biforum2011/Heljula_SOA.pdf
http://www.rittmanmead.com/2011/11/web-services-in-bi-publisher-11g/
http://oraclebizint.wordpress.com/2007/07/31/customizing-obi-ee-soap-api/
http://www.rittmanmead.com/2011/10/oow2011-obiee-11g-and-adf-integration-using-the-action-framework/
Please mark if this helps.

Office 365 Reporting Web Service Message Trace - it says it can get the past 30 days, but I can only get the past 7 days

Message Trace data is stored for 90 days according to http://technet.microsoft.com/library/jj723162(v=exchg.150).aspx.
When running the MessageTrace report using the Office 365 Reporting Web Service, it only returns the past 7 days, even though http://msdn.microsoft.com/en-us/library/office/jj984335(v=office.15).aspx says up to 30 days.
PowerShell's "Get-MessageTrace" cmdlet only returns the past 7 days as well. PowerShell also has an alternative cmdlet called "Start-HistoricalSearch" which returns MessageTrace results between 7 and 90 days old.
Is it supposed to return 30 days, or is the documentation wrong, or is there an alternative report (like PowerShell) to get MessageTrace results for more than 7 days using the Office 365 Reporting Web Service?
Note: I'm aware that the results are returned in batches of 2000 MessageTrace results.

That document is pretty old (last modified March 2013), so I would say it's out of date. The reporting services are still pretty new; in the current release I believe 7 days is the limit for the REST endpoint and the Get-MessageTrace cmdlet (which is what the more recent documentation around message tracking says). There is no REST endpoint for Start-HistoricalSearch. You can see all the current endpoints by looking at the metadata via
https://reports.office365.com/ecp/reportingwebservice/reporting.svc/$metadata
These services are evolving, so new features and reports are added, which is explained in
http://msdn.microsoft.com/EN-US/library/office/jj984346(v=office.15).aspx . But there is no public information on timeframes or on what reports and features are going to be added in future updates (that I know of). If you need data older than 7 days, then automating the HistoricalSearch cmdlets would be the way to do it for now.
Cheers
Glen
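For scripting against the REST endpoint, the query can be built as an OData URL off the service root mentioned above. A small Python sketch; the $filter datetime-literal syntax follows general OData v3 conventions and should be verified against the service's current $metadata before use:

```python
from urllib.parse import urlencode

SERVICE_ROOT = "https://reports.office365.com/ecp/reportingwebservice/reporting.svc"

def message_trace_url(start, end):
    """Build an OData query URL for the MessageTrace report.

    The datetime filter literals follow OData v3 conventions; verify the
    exact syntax against the service's $metadata before relying on it."""
    filt = "StartDate eq datetime'{0}' and EndDate eq datetime'{1}'".format(start, end)
    return "{0}/MessageTrace?{1}".format(
        SERVICE_ROOT, urlencode({"$filter": filt, "$format": "json"}))

url = message_trace_url("2014-01-01T00:00:00", "2014-01-07T00:00:00")
print(url)
```

Fetching the URL (e.g. with the requests library and basic auth) is then a separate, credentialed step.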
     

  • Office 365 - Reporting web service - CsConference* report TotalConferences value clarification

Site Map: Office development > Office 365 > Reporting web service > Reference > Lync reports > CsConference* report
In the Office 365 Lync Conferences Report, the field "TotalConferences" is the number of conferences of all types in the reporting period. The sum of the per-type conference counts (IM, audio/video, app sharing, web, dial-in) doesn't equal the TotalConferences count.
In the attached screenshot for one reporting period:
IM Conference Count: 197
Audio/video Conference Count: 67
Application sharing Conference Count: 68
Web Conference Count: 10
Dial-in Conference Count: 5
The sum of the above conference types is 347, while the report displays the "Total Conference" count as 268.
Why the difference?
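For reference, the arithmetic from the screenshot in a short Python sketch. One unconfirmed explanation for the gap is that a conference using several modalities is counted once in TotalConferences but once per modality in the type columns; the "overlap" figure below is only meaningful if that theory holds:

```python
# Per-type counts from the screenshot
per_type = {"IM": 197, "AudioVideo": 67, "AppSharing": 68, "Web": 10, "DialIn": 5}
total_conferences = 268

type_sum = sum(per_type.values())       # 347, as stated in the post
overlap = type_sum - total_conferences  # conferences counted under more than
                                        # one type, if the theory above holds
print(type_sum, overlap)
```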


  • How to make Image/size/resolution agree with metadata resolution?

After scanning the help file a bit and searching through this forum with 'metadata resolution' as the search criterion, I'm still pretty confused about how it is supposed to work. I may never have actually found the right part of the help file.
Cutting to the chase: how can I make the resolution reported by the Metadata tab agree with what is reported inside Photoshop at Image > Image Size?  I didn't see any way to set how resolution is reported in Bridge.
I see many pictures reported in metadata as 72 ppi and reported in Photoshop as 200 pixels/inch.
They do BOTH mean pixels per inch, right?  If so (or really even if they don't), how can I get the two reports to show the same resolution?
There are too many processes that depend on knowing the resolution in advance of opening a file for the two to be so divergent.

    Jingshu Li wrote:
Some JPG or TIFF files with Camera Raw settings will be opened in ACR first when you double-click them. In the ACR workflow options (at the bottom of the ACR dialog) the resolution value isn't the same as the one in Bridge (Metadata panel). If you go ahead and open the image in PS by clicking the 'Open Image' button in the ACR dialog and then check the image size from Image -> Image Size in PS, the resolution will be the same as in ACR (it has actually been changed by ACR).
If you disable ACR support for JPG and TIFF in Bridge (open the Camera Raw preferences in Bridge and choose 'Disable JPEG support' and 'Disable TIFF support' under 'JPEG and TIFF Handling'), JPG and TIFF files will be opened by PS directly and the resolution matches between Bridge and PS. I believe this is an ACR bug.
So what file types are you using, and are they opened by ACR first in your workflow?
The files are *.jpg.   But I don't see where Camera Raw comes in; these are common JPG files. I'm opening them by double-click in Bridge.
Anyway, I did disable JPG and TIFF in the Camera Raw preferences as you suggested (at 'JPEG and TIFF Handling', both disabled) and restarted Bridge.
However, I see no improvement.
For example, a file reporting 72 ppi in Bridge, when opened in Photoshop, reports 480 pixels/inch at Image > Image Size.
There is obviously some kind of mishandling that has been done to these files for them to have such a high resolution.   I suspect they may have been handled at Walgreens or Walmart or similar, since something very similar happened to me once when I had a bunch of pictures developed at Walgreens and they came back with resolutions like that, when I know the camera would have given them something way lower.
But I don't see how any of that should be affecting Bridge's ability to get the resolution right.
Here is another example: some.jpg
It appears that Bridge is showing an unusually huge file size in inches, to account for the reported 72 ppi.
Bridge:
Size: 1.14 MB
Dimensions: 1496 x 1064
Dimensions in inches: 20.8 x 26.7
Resolution: 72 ppi
===========================
Photoshop:
Pixels: width 1496, height 2064
Document size (inches): width 4.987, height 6.88
Resolution: 300
    Do you have further suggestions?
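For what it's worth, the relationship both programs are applying is just inches = pixels / ppi (and yes, "ppi" and "pixels/inch" are the same unit). A quick Python check against the numbers above, assuming Photoshop's pixel dimensions are the correct ones:

```python
def print_size_inches(pixels, ppi):
    """Printed dimension in inches for a pixel count at a given resolution."""
    return pixels / ppi

# Photoshop's numbers: 1496 x 2064 px at 300 ppi
print(round(print_size_inches(1496, 300), 3))  # matches Photoshop's 4.987 in
print(round(print_size_inches(2064, 300), 2))  # matches Photoshop's 6.88 in

# The same width at Bridge's reported 72 ppi
print(round(print_size_inches(1496, 72), 1))   # matches Bridge's 20.8 in
```

Same pixels, different ppi tag, hence the wildly different print sizes; the pixel data itself is untouched.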

  • Microstrategy metadata w Oracle

Hi everybody,
I have a few questions concerning best practices for storing MicroStrategy's [BI reporting tool] metadata in Oracle, and as our experience with Oracle is limited, I kindly ask for your advice/opinion...
We are planning to configure a metadata database with two schemas, one for the production environment and one for development.
1) Are there any best practices concerning the backup procedures (in terms of type of backup, frequency, etc.)?
2) For hardware, is there any benefit in partitioning the server into drives if the server will only be used for holding the Oracle metadata? Apart from the standard RAM and disk space recommendations, are there any other hardware settings that we should take into consideration?
3) Any recommendations for the physical separation of the data contained in the two schemas?
Any input is appreciated.
Regards,

>1) Are there any best practices concerning the backup procedures (in terms of type of backup, frequency, etc.)?
Application source code (including DDL) should be retained in a source code repository, like Subversion.
>2) For hardware, is there any benefit in partitioning the server into drives, if the server will only be used for holding the Oracle metadata? Apart from the standard RAM and disk space recommendations, are there any other hardware settings that we should take into consideration?
None.
>3) Any recommendations for the physical separation of the data contained in the two schemas?
Schema data is isolated from other schemas by default;
exceptions are made by issuing GRANT.
With Oracle everything is forbidden, except that which is explicitly GRANTED.

  • How to check whether standard report or not

Hi,
Can anyone help me: how can you check whether a report is standard or not? And if it is not, how do you generate it?
Can you give the steps?

Hi,
You can see standard reports in the Metadata Repository under InfoCube.
A query name starting with 0 indicates standard content.
Assign points if useful.
Cheers
Satya

  • How does metadata get changed by both Lightroom and another application??

    I am frequently getting the message "The metadata for this photo has been changed by both Lightroom and another application. Should Lightroom import settings from disk or overwrite disk settings with those from the catalog?" on images in my catalog. I use only Lightroom to organize, add keywords, do basic adjustments, etc. On the occasions I am taking an image into Photoshop to do further work, it's exported as a TIFF or PSD and the original stays in the LR catalog untouched by anything else I might do to the image.
I have two choices with this message: use Lightroom's settings, OR overwrite them from disk. The first few times this scared me, as I had no idea where those changes were coming from and why the disk file had something different from the Lightroom catalog. I have all my metadata saved in XMP files. But I took the chance and said overwrite LR settings, and saw nothing change on the sliders or the image being displayed. I then tried a few saying 'keep LR settings', and again I saw no visible signs of changes. That leads me to believe that they are NOT changed, but LR thinks they are somehow. But I don't like seeing it; I want it to go away unless there really are changes happening outside of LR (like if I went into Bridge or something...)
    Any ideas why this message shows up? I just discovered about 100 images in a collection of images I sent to stock agency three years ago, and haven't done anything with them outside of LR since.
    Thanks for any help you can give.

    Users have complained for years about spurious metadata-has-changed notifications.  If you're confident that you're not editing the file itself outside of LR, then choosing Overwrite Settings will rewrite the metadata in the file with the catalog metadata and (usually) make the spurious notification go away.
    Unfortunately, LR doesn't tell you which metadata fields have changed, so users are left flying blind.  I believe Rob Cole had a plugin that would tell you metadata differences between the catalog and the disk file, but his Web site has been down for a couple of weeks now.
    ManiacJoe: My guess is that Photoshop is updating some EXIF data fields in the raw file, for example the "edited with software" field. Since Lightroom probably has already set that field, it gives you a warning that something changed.
    Interesting hypothesis.  I just tested that with LR 5.7.1, Photoshop CC 2014, OS X 10.10 and couldn't reproduce the problem.  I exported a TIFF from a cataloged raw file into the same folder as the raw, edited the TIFF and changed some of its metadata in Photoshop, and saved it.  LR didn't report a metadata-has-changed notification.
In general, I'd be (only moderately) surprised if editing x.tif in PS changed any metadata in x.cr2 or x.xmp.

  • Issue  in Configuring Oracle WebCenter Content: Records

Hi,
I have an issue in configuring Oracle WebCenter Content: Records.
In the "Records Management Setup Checklist":
     Checked-in Audit Entries Default Metadata
     Checked-in Screening Reports Default Metadata
     Checked-in Reservation Default Metadata
I am not able to select the category/folder in the default check-in form for the above, and I don't see anything in the dropdown either.
When I click on the Browse button of the above three,
I see Retention Schedule & Favorite Schedule,
but I am not able to select any of those.
Has anyone faced this issue? Please help.
Thanks!

The Oracle WebCenter Content: Records system will enable the retention schedule, but it won't create any default retention category. If you want to apply retention to content, you first have to create the retention category and the disposition rule. Once you create the retention category, it will be listed under Retention Schedule. In your case, complete the default configuration without applying retention; once you create a retention category you can apply it to those contents.
    Browse Content --> Retention Schedules --> Create Retention Category
    HTH..
    Regards,
    Manoj

  • Complex Essbase MDX Issue - Need Guidance

    Hi,
    I have a complex Essbase issue in ASO version 11.1.2.2. Currently I have a MDX formula with a Measure member named '10th Percentile'. It calculates the 10th Percentile perfectly. So new requirement is to create a new Measures member and instead of calculating the '10 Percentile' value, it needs to display the Customer Name of the value that is the 10th Percentile from the Customer dimension. So if I do a retrieval and the '10th Percentile' is 3.23, then it needs to display the Customer Name of the 3.23.
    So I altered the formula to do what I think needs to be done and it verifies. However if I retrieve on that new measure in the Excel Add In, I get and error: An error [1200315] Occured in Spreadsheet Extractor. However if I navigate without data I don't get the error, but I also don't get any data, which I obviously need. So my question is, if MDX support reporting on Metadata not just Data, what/how can one report on it? Ideally I need to have this work in the Excel Add In as the client is using a custom vba modified template for their end users.
    Any ideas and help?

Here's the formula. The part that is new is marked with a comment.
IIF ( [Lbs Per Yard].CurrentMember IS [Lbs Per Yard].[No_Lbs/Yd] ,
IIF( [Count_Price] = Missing, Missing, IIF( [Count_Price] < 2 , Missing,
{ Order (
Filter ( CROSSJOIN ( Leaves ( [Service].CurrentMember)
, Filter ( CROSSJOIN ( Leaves ( [Segment].CurrentMember)
, Filter ( CROSSJOIN ( Leaves ( [Customer Type].CurrentMember)
, Filter ( CROSSJOIN ( Leaves ( [Zip Code].CurrentMember)
, Filter ( CROSSJOIN ( Leaves ( [Quantities].CurrentMember)
, Filter ( CROSSJOIN ( Leaves ( [Frequencies].CurrentMember)
, Filter ( CROSSJOIN ( Leaves ( [Yardages].CurrentMember)
, Filter ( Leaves ( [Contract Year].CurrentMember)
, [$/Yd] <> Missing ))
, [$/Yd] <> Missing ))
, [$/Yd] <> Missing ))
, [$/Yd] <> Missing ))
, [$/Yd] <> Missing ))
, [$/Yd] <> Missing ))
, [$/Yd] <> Missing ))
, [$/Yd] <> Missing )
, [$/Yd] /* this is the measure we're using for the sort */
, BASC /* sort $/Yd in ascending order */
) AS OrderedSetOfItems} /* define an alias for the set so we can use it later */
.Item ( Round ( Count ( OrderedSetOfItems) *
10 / 100 /* which percentile is being calculated */
+ 0.5 , 0 ) - 1 ).Item (3-1).[MEMBER_NAME] /* NEW: take the member's name from the tuple instead of the value */
/* .Item(...) takes the Nth tuple from the ordered set (0-based index, hence the -1) */
, Missing )
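As a side note, the index arithmetic in the formula can be sanity-checked outside Essbase. A small Python sketch mirroring the Round(Count(set) * 10/100 + 0.5, 0) - 1 expression, assuming half-up rounding (which is what adding 0.5 and flooring gives for positive values):

```python
import math

def percentile_index(n, pct=10):
    """0-based index into an ascending-sorted set of n items, mirroring
    the MDX Round(Count(set) * pct/100 + 0.5, 0) - 1 with half-up rounding."""
    return math.floor(n * pct / 100 + 0.5 + 0.5) - 1

print(percentile_index(20))   # index of the value treated as the 10th percentile
```

If the member-name error persists, checking that this index stays within 0..n-1 for small sets is a cheap first test before digging into the Spreadsheet Extractor error.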

  • Question related to building cube

Hi, we are in the stage of developing the cube by understanding the report requirements.
We have 7 dimensions in our cube. For some reports the metadata exists across all the dimensions, while for some reports (expenses) the source data comes through only 5 of the dimensions.
My questions are:
1) Is it required to maintain a separate cube for expenses?
2) If we maintain everything in the same cube, after loading data through the 5 dimensions, how does the aggregation happen for the other 2 dimensions?

Hi,
1. My understanding is: you have got a cube with 7 dimensions, and some sort of reporting is already happening on it.
2. You have another requirement, i.e. the expense type, where your report needs data from only 5 dimensions, whereas your cube has 7 dimensions in total.
3. If this is the scenario, then you need not create a new cube to cater to your expense requirement. A cube works on combinations, which are unique.
Let me explain with an example.
Say you have got 5 dimensions, but your expenses have source data only for the Time, Accounts, and Type dimensions. You can load the data across all dimensions by adding 'unknown' as the member for the 2 non-required dimensions, and pull out a report by selecting only the 3 dimensions.
    Time
    -jan
    -feb
    -march
    Accounts
    -manpowercost
    Type
    -Expense
    -xyz.
    Region
    -A
    -B
    service offering
    -L
    -M
How does it consolidate?
Let's take a few combinations from the above outline:
jan, manpowercost, expense, unknown, unknown, 100
jan, manpowercost, xyz, A, L, 200
...etc.
These 2 values will never overlap, as they are altogether different combinations; in the same way, the aggregations will not collide and produce dubious data.
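The "unknown member" approach can be sketched as plain aggregation over unique member combinations; the dimension and member names below are just the ones from the example:

```python
from collections import defaultdict

# (Time, Account, Type, Region, ServiceOffering) -> value
facts = [
    (("jan", "manpowercost", "expense", "unknown", "unknown"), 100),
    (("jan", "manpowercost", "xyz", "A", "L"), 200),
]

cube = defaultdict(float)
for combo, value in facts:
    cube[combo] += value  # each unique combination aggregates independently

# An "expense report" over only 3 dimensions touches just the unknown-padded cells
expense_total = sum(v for (t, a, typ, r, s), v in cube.items() if typ == "expense")
print(expense_total)
```

Because the expense rows carry 'unknown' in the two extra dimensions, they land in cells that the fully-dimensioned rows can never share, so neither set of numbers distorts the other.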
Hope it helps; please revert for further clarity.
    Sandeep Reddy Enti
    HCC
    http://hyperionconsutlancy.com/
