Export all dimensions using a report script

Does anybody have an idea on how to export the dimensions from Essbase through a report script? I want to use the output to load the dimensions in ODI.
Any ideas are highly appreciated.
Thanks in advance.
Hanson

If you look at my blog, I talk about the <preview function. It brings back the members with zeros as the values. It's much quicker than actually returning the data.
There is a problem in general with your premise of using a report script to do this. For ODI, you will want the dimensions in a parent/child format, and a report script will not do that for you. I would suggest you look at the Outline Extractor hosted on Applied OLAP's web site (www.appliedolap.com); it's free and you can run it in batch mode.
I think you should also look at John Goodwin's blog http://john-goodwin.blogspot.com/; he has a series of articles on ODI. You should be able to extract the dimensionality from a cube and load it into another cube directly from ODI.
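If you do still want a flat member list out of a report script (keeping in mind it comes back in outline order, not parent/child, and always drags a data column along), a minimal sketch would look something like the following; the "Product" row dimension and the single "Year" column are placeholders for your own outline:
//ESS_LOCALE English_UnitedStates.Latin1@Binary
// member list for one dimension; missing data shown as 0 so every member appears
{TABDELIMIT} {ROWREPEAT} {NOINDENTGEN} {MISSINGTEXT "0"}
<COLUMN ("Year")
"Year"
<ROW ("Product")
<DIMBOTTOM "Product"
!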

Similar Messages

  • How to delete the members in one dimension using a MaxL script

    I have a question: I want to delete the members in one dimension using a MaxL script, but I do not know how to do it. Can MaxL delete the members in one dimension? If it can, please provide a sample script. Thank you so much.

    MaxL does not have commands to alter an outline directly, except the reset command, which can delete all dimensions but not members selectively. The best you could do would be to run a dimension build (import dimensions) with a load rule for the dimension, using a file that contains only the members you want to keep. As usual, the warning is to test this first before you do it on a production database.
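    A sketch of what that MaxL dimension build could look like is below. The login, application/database, file and rules file names are all placeholders, and the rules file itself has to be set up so that unspecified members are removed (and with the data-preservation behaviour you want) for this to act as a selective delete:
    /* MaxL sketch: rebuild the dimension from a file of the members to keep */
    /* verify the preserve-data clause against your version's MaxL reference before running */
    login admin identified by 'password' on localhost;
    import database Sample.Basic dimensions
         from local text data_file 'c:/files/keep_members.txt'
         using server rules_file 'DimBuild'
         preserve all data
         on error append to 'c:/files/dimbuild.err';
    exit;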

  • How to output the outline parent-child relationship using a report script?

    I'd like to extract the outline's dimension members with their parent-child relationships using a report script. Can anybody provide a sample script? Thanks.
    Example:
    DimensionX
    -----MemberX
    ----------ChildX
    Output:
    DimensionX MemberX
    MemberX ChildX
    Edited by: obelisk on Jan 11, 2010 5:16 PM

    Sorry, a report script won't do it. You have two options:
    1. Use the Essbase Outline API to walk the outline and get it for you
    2. Use the Outline Extractor available from Applied OLAP (it is a free download). It can be run interactively or as a bat file.
    Frankly, I would rather use option 2 since I don't have to code it myself.

  • Export FR without using Financial Reporting Studio

    Good morning.
    Is it possible to take the reports designed in Financial Reporting Studio and import them into another machine?
    I'm asking this because I'm not able to access this machine anymore. So I set up another machine, but now, with a new database, there are no reports =)
    Thanks.
    Cheers,
    Rafael Marques

    I know that there is a report migration tool, as we used it when we went from 7.2.5 to 9.3.1. I'm not 100% sure if you can use it as a simple 'copy' tool (i.e. copying same-software-version reports from one server to the next) or if it only works in a specific upgrade capacity; however, I would start down that route first.
    The road block you may hit there, however, is that I think the old environment would need to be up and running for this utility to work.
    If that is the case, the manual route is ugly. As I've talked about in a previous post, reports (since System 9) are scattered in a couple of different pieces. In the Workspace database, you will find information that ties a UUID (Unique Identifier) to a 'Friendly Name'. In the file system of the server running the RM web application, you will find the actual report files. The files / folders for the reports are not human friendly; therefore, you need the DB to fix the names. The other issue you have here is the database connection for the reports. When you create a database connection it also gets a generated unique ID value which is stored in the reports. If that ID does not exist in the target system, you need to update all of the reports; otherwise, they will not work correctly.
    If I HAD to do this manually (i.e. old workspace isn't operational so I can't export), here's what I would do ..........
    ------- WARNING ------------ I don't think anyone here is going to approve of this method and I really don't either, but ....................
    "Old System" to "New System"
    #1 - New System --> create your database connection(s), preferably with same name(s) as previous environment
    #2 - Old --> Locate naming information in Workspace database tables (If you don't know where this is, find a previous post I put out on reports and the database where I explain this)
    #3 - Old --> Locate the physical files / directories (See previous post on subject)
    #4 - Old --> Find at least ONE report for EACH database connection used in the old system. Open the file(s) in notepad and locate the connection ID. (NOTE : Files are compressed so get something like 7zip to get into them)
    #5 - New --> Open the Workspace database, locate the database connection information. For each connection you have created, update the ID to correspond to the ID found in the reports.
    At this point, we have a couple of different options......
    Option 1 - Create your own export archive
    Now that you know where all of the old reports are and you know the proper naming of the reports, you can make your own export archive which you can then import into the new environment. The structure of the 7zip archive would be something like this
    ReportExtract.7z
    (null) Folder --> This would correspond to your Root folder in Workspace
    My Report Folder 1 --> Your workspace stuff here
    My Report Folder 2 --> Etc.
    The important thing to recognize here is that when you first open the 7z file, you have to have that NULL folder; otherwise, this doesn't work. I'm not sure how to create such a null folder, so what I normally do is create an extract file using the Workspace Export option and then use 7zip to alter the archive to include what I really want ....
    #1 - Old --> Fix names. Using the information from the Workspace database, write a quick script to traverse the file / folder structure and rename all folders and files so that they have the friendly name (a quick sketch of such a script appears at the end of this post).
    #2 - Old --> For the files that were just renamed, uncompress them. This can be accomplished with a bat file script as well.
    #3 - New --> Go to File Export and choose the highest folder you can and export all items. Save the file somewhere handy as you're going to need it right away.
    #4 - Old --> Open the archive with 7zip. You will see a folder with no name, double click on it. Now you should see the beginning of your report structure (if you have one on the new environment....)
    #5 - Old --> Using windows explorer, DRAG and DROP the folders/files from steps #1/#2 into the 7zip window. 7zip will then add those folders/files to the report archive we created.
    #6 - New --> Log in to Workspace and select the File, Import option. Proceed to use the archive created previously. If everything is done correctly, the reports will pull in and you will be good to go at this point. (OTHER THAN SECURITY)
    Option 2 - Manually Copy Files/Folders and Database entries
    While in theory this is slightly quicker, I've never attempted it, so I won't say for a fact that it will work. I would probably only recommend trying this if the old and new machines are running the SAME version of the software AND the new machine has nothing added to it yet.....
    #1 - Locate RM1 service folder and copy all files/folders to new server's RM folder
    #2 - Locate the old database tables related to report files/folders/permissions/data connections and copy data into new database tables.
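    Purely as an illustration of the 'quick script' in step #1 of Option 1 above, here is a minimal VBScript sketch. It assumes you have already pulled the UUID-to-friendly-name pairs out of the Workspace database into a comma separated file (the file name, its layout and the root path are all hypothetical), and it only renames folders; you would extend the same idea to the report files themselves.
    ' Hypothetical inputs: mapping.csv holds lines of the form <uuid>,<friendly name>
    ' and strRoot points at the repository folder copied off the old RM server.
    Dim fso, ts, arrLine, strRoot
    strRoot = "D:\FR_Repository"
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set ts = fso.OpenTextFile("D:\mapping.csv", 1)   '1 = ForReading
    Do Until ts.AtEndOfStream
         arrLine = Split(ts.ReadLine, ",")           'arrLine(0) = UUID, arrLine(1) = friendly name
         If fso.FolderExists(strRoot & "\" & arrLine(0)) Then
              'Rename the UUID folder to its friendly name
              fso.GetFolder(strRoot & "\" & arrLine(0)).Name = arrLine(1)
         End If
    Loop
    ts.Close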

  • Export All rows using xml publisher.

    I have a requirement to show search results using a table region style and to be able to export all the rows into Excel with additional columns that are not shown in the table region. Search results could be millions of rows.
    I know that the max results shown in the table region are based on the value set in the profile option FND: View Object Max Fetch Size. In my case it is set to 5000. Since I need to show additional columns from different tables in Excel that are not in the table region, I have decided not to use the export button but created a submit button that integrates XML Publisher to launch Excel. Now the problem is Excel shows only 5000 rows. I do not want to increase FND: View Object Max Fetch Size as it may affect performance.
    My question is: for my requirement, what is the best solution performance-wise, assuming the user could search thousands of rows and export them to Excel?
    If integrating XML Publisher is the solution, how do I show all the rows?
    Thanks SC

    I tried to suggest a solution in your other thread. Please check.

  • Can't we load all dimensions using HAL

    Hi,
    I am using Hyperion Application Link to load members into my dimensions.
    I was successful in loading Accounts, but when I tried loading Scenarios, I was not able to do that as my Planning adapter doesn't have that dimension in the drop down. I tried refresh, but nothing happened.
    Can't we load all dimensions into Planning using HAL?
    Thanks in advance.
    -Balu

    Hello,
    Four dimensions, which are Entity, Accounts, Scenario & Versions, are self-generated in Planning.
    However, we cannot use HAL for Scenarios and Versions.
    Hyperguy

  • Exporting Essbase Dimensions using ODI (specifically attribute)

    I am working on creating a flat file using ODI connected to an Essbase database. I have set up my topology correctly and have been able to export the dimension into a .txt file successfully. The issue I am having is when I try to include an attribute dimension associated with the main dimension (in this case, Organization). I get the following failure:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 89, in ?
    com.hyperion.odi.essbase.ODIEssbaseException: The source column [ServiceByAUTypeAttribute] is not valid for dimension [Organization]
    I went to the specific line, but I see nothing out of the ordinary. The column is a valid attribute dimension for Organization.

    It means you have not set a staging area for the columns on the target datastore. As you are using Essbase in your interface you will need a staging area, because Essbase as a technology has no SQL capabilities.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Export to excel using deski report

    Hi,
    We are using the BO XI R2 SP4 version. We have a Deski report that generates an Excel output file and it was working fine until yesterday. Today the report successfully exported the Excel output file, but I was not able to open the Excel file and it gives me the error below:
    Excel found unreadable content in 'abc.xls'. Do you want to recover the contents of this workbook? If you trust the source of this workbook, click Yes.
    Now, if I select Yes, then it opens a blank sheet.
    I found that today there was a new record in the report for a new customer. When I filtered out that customer record it worked fine. So I thought maybe something was wrong with that customer.
    After that I instead filtered out 10 existing customers, and then I was able to open the Excel sheet with the new customer record as well. So I tried a few other things to resolve the issue: if I reduce the length of the Customer_name field from 35 to 30 by using SubStr(), it works fine. If I add a new field, Customer ID, to the left of the Customer_name field, it works fine.
    This is a strange issue. I have suggested to our client the above options that worked. But if the client is not OK with them, I will still have to find the root cause and resolve this issue.
    Please advise.
    Thanks,
    Nikhil

    With your product version there are few options:
    1. Upgrade to the latest SP and patch on XI R2 (SP6, FP6.4) and re-test
    2. Upgrade to the current product release.
    3. Figure out what has changed and how in the report, and reverse those changes.

  • Export all stories w/ abbreviated tags?

    I'm attempting to export all stories (using the .js script) in my new copy of CS4 and it is not giving me the option of selecting abbreviated versus verbose tags for the .txt files. I am pretty sure this option used to be included in earlier scripts. How can I get it to export using abbreviated tags? The default seems to be verbose, which is not useful for our purposes.

    Thanks for sharing this excellent functionality ;-)))
    Just added a single line at the end of your code :
    CLIENT_HOST('CMD /C START "Grid" excel.exe ' || LC$Filename ) ;
    and everything is fine !

  • Export all maps (mappings for all dimensions in EXCEL)

    Hi,
    it shouldn't be an unknown issue - but unfortunately I can't find the way :-(

    Below is updated code with the following changes :
    - Adjusted SaveAs logic to prevent Excel prompts in the event the file already exists, etc. (i.e. DisplayAlerts TRUE / FALSE)
    - Added Range creation logic for each worksheet page. If I really wanted perfect code, could do this better, but it gets the job done.
    Sub ExportAllCurrDimMapsForLocationtoXLS()
    'UpStream WebLink DM Custom Script:
    'Created By:         cbeyer
    'Date Created:       11-23-11
    'Purpose:               Export all dimension maps to an Excel workbook      
    'Declare Constant
    'NOTE : This will control whether the function gets the current map in the system or whether it looks back for a specific Period
    '       FDM stores the Map for each period that was loaded... You may want to export a particular POV Period for audit purposes, etc.
    '       IF you enable this, be sure to set the POV Period before running.....
    Const boolgetPOVPeriodMap = False
    'Declare working variables
    Dim intPartitionKey
    Dim strOutputMessage
    Dim strSQL
    Dim strCategoryFreq
    Dim objPeriodKey
    Dim strOutputFileName
    Dim strOutputFilePath
     'Get the location (PartitionKey)
    intPartitionKey = RES.PlngLocKey
    'Create SQL Query to get Current Map Data
    If boolgetPOVPeriodMap = False Then
         strSQL = "SELECT * FROM tDataMap where PartitionKey = " & intPartitionKey & " order by DimName ASC"
    Else
         strCategoryFreq = API.POVMgr.fCategoryFreq(API.POVMgr.PPOVCategory)
         Set objPeriodKey = API.POVMgr.fPeriodKey(API.POVMgr.PPOVPeriod, 0, strCategoryFreq)
         strSQL = "SELECT * from vDataMap where PartitionKey = " & intPartitionKey & " and PeriodKey = '" & objPeriodKey.dteDateKey & " 12:00:00 AM' order by DimName Asc"
    End If
    'Create Recordset for all Exported Entities
    Set rsMap = DW.DataAccess.farsKeySet(strSQL)
    If rsMap.EOF And rsMap.BOF Then
         'No records
         If boolgetPOVPeriodMap = False Then
              strOutputMessage = "No Mapping data was found For " & API.POVMgr.PPOVLocation & ".  If this location Is using Parent Maps, you can only export mapping data at the parent location."     
         Else
              strOutputMessage = "No Mapping data was found For " & API.POVMgr.PPOVLocation & " for period " & API.POVMgr.PPOVPeriod & ".  If this location Is using Parent Maps, you can only export mapping data at the parent location."          
         End If
    Else
         'Records Exist, process
         'Generate file name / path
         If boolgetPOVPeriodMap = False Then
              strOutputFileName = API.POVMgr.PPOVLocation & "_DimensionMaps.xls"
         Else
              strOutputFileName = API.POVMgr.PPOVLocation & "_" & objPeriodKey.strDateKey & "_DimensionMaps.xls"
         End If
         strOutputFilePath = DW.Connection.PstrDirOutbox & "\ExcelFiles\"
         'Create Excel file reference     
         'Declare Excel working variables
         Dim oExcel
         Dim oBook
         Dim oSheet 'No puns here......
         Dim oRange
         Dim intCurrentSheetOrdinal
         Dim intCurrentRowOrdinal
         Dim intCurrentColOrdinal
          'Initialize Excel
         Set oExcel = CreateObject("Excel.Application")
         Set oBook = oExcel.Workbooks.Add
         'Declare working variables
         Dim strCurrDimName
         'Initialize variables
         strCurrDimName = ""
         intCurrentSheetOrdinal = 1
         intCurrentRowOrdinal = 1
         intCurrentColOrdinal = 1
         With rsMap
              Do Until .eof
                   'Check to see if current DimName matches existing DimName.  If not, add headers
                   If rsMap.fields("DimName") <> strCurrDimName Then
                         'If the dimension name has changed to a different dimension name, show total information before starting headers
                         'If the previous dimension was not "", then we are transitioning from one range to the next.  Lets create a named range on the just
                         'finished worksheet because we can or because you may want to use this for re-uploading
                         'NOTE : The range I'm creating is more for reference as to how to implement this and I don't know if I'm making the range in a fashion that
                         'FDM will pickup for importing. 
                         'NOTE : You probably want intCurrentRowOrdinal - 1 since it is 1 row past the last row of data at this point.  If you want to clean it up,
                         'then you need to make sure RowOrdinal is not going to be less than the starting point and I didn't feel like adding the couple rows of
                         'code to do the work properly as FDM will just ignore the blank row in all likelihood.
                         If strCurrDimName <> "" Then
                              Set oRange = oSheet.Range("A6:K" & intCurrentRowOrdinal)
                              oBook.Names.Add "ups" & strCurrDimName, oRange
                         End If
                         'Create worksheet reference
                         Set oSheet = oBook.Worksheets(intCurrentSheetOrdinal)
                         'Create default header at top of each new dimension group
                         If boolgetPOVPeriodMap = False Then
                              oSheet.range("A1") = (API.POVMgr.PPOVLocation & " - Map Conversion")
                         Else
                              oSheet.range("A1") = (API.POVMgr.PPOVLocation & " - Map Conversion for " & rsMap.fields("PeriodKey"))
                         End If
                         oSheet.range("A3") = "Partition: " & API.POVMgr.PPOVLocation
                         oSheet.range("A4") = "User ID: " & DW.Connection.PstrUserID
                         'NOTE: I could make an array of the field names and do a loop here; however, this is easier to read.....
                         '      probably not how I would do it from an efficiency standpoint, but since it's a limited number of fields
                         '      this will work.....
                         oSheet.range("A5") = "PartitionKey"
                         oSheet.range("B5") = "DimName"
                         oSheet.range("C5") = "Source FM Account"
                         oSheet.range("D5") = "Description"
                         oSheet.range("E5") = "Target FM Account"
                         oSheet.range("F5") = "WhereClauseType"
                         oSheet.range("G5") = "WhereClauseValue"
                         oSheet.range("H5") = "-"
                         oSheet.range("I5") = "Sequence"
                         oSheet.range("J5") = "DataKey"
                         oSheet.range("K5") = "VBScript"
                         'Update variables
                         strCurrDimName = rsMap.fields("DimName")
                         intCurrentRowOrdinal = 6
                         intCurrentSheetOrdinal = intCurrentSheetOrdinal + 1
                         'Update worksheet name
                         oSheet.name = strCurrDimName
                    End If
                    'Write Details
                    oSheet.range("A" & intCurrentRowOrdinal) = intPartitionKey
                    oSheet.range("B" & intCurrentRowOrdinal) = rsMap.fields("DimName").Value
                    oSheet.range("C" & intCurrentRowOrdinal) = rsMap.fields("SrcKey").Value
                    oSheet.range("D" & intCurrentRowOrdinal) = rsMap.fields("SrcDesc").Value
                    oSheet.range("E" & intCurrentRowOrdinal) = rsMap.fields("TargKey").Value
                    oSheet.range("F" & intCurrentRowOrdinal) = rsMap.fields("WhereClauseType").Value
                    oSheet.range("G" & intCurrentRowOrdinal) = rsMap.fields("WhereClauseValue").Value
                    oSheet.range("H" & intCurrentRowOrdinal) = rsMap.fields("ChangeSign").Value
                    oSheet.range("I" & intCurrentRowOrdinal) = rsMap.fields("Sequence").Value
                    oSheet.range("J" & intCurrentRowOrdinal) = rsMap.fields("DataKey").Value
                    oSheet.range("K" & intCurrentRowOrdinal) = rsMap.fields("VBScript").Value
                   'Increment Counters
                   intCurrentRowOrdinal = intCurrentRowOrdinal + 1
                   'Move to the next record
                   .movenext
              Loop
         End With
         'Final Sheet Named Range addition
         'Since the loop will end and we will not execute the above logic to create the range for the previous sheet
         'the easiest (laziest) solution is to just handle the last sheet after the loop.
         'We're basically doing the same stuff we did above, just down here.
          If strCurrDimName <> "" Then
              Set oRange = oSheet.Range("A6:K" & intCurrentRowOrdinal)
               oBook.Names.Add "ups"&strCurrDimName, oRange
          End If      
         'Close / release file objects
         'Added some logic here to ensure you don't get caught up on the file replace prompt.
         oExcel.Application.DisplayAlerts = False
         oBook.SaveAs strOutputFilePath & strOutputFileName
         oExcel.Application.DisplayAlerts = True
         oExcel.Quit
         'Create output message          
         strOutputMessage = "Mapping data export for " & API.POVMgr.PPOVLocation  & " complete.  Extract file is : " & strOutputFilePath & strOutputFileName
    End If
    'Close / release data objects
    rsMap.close
    'Display output
    If LCase(API.DataWindow.Connection.PstrClientType) = "workbench" Then
              MsgBox strOutputMessage       
    Else
         'Let the user know we are done
         RES.PlngActionType = 2
         RES.PstrActionValue = strOutputMessage
    End If
    End Sub
    Edited by: beyerch2 on Dec 14, 2011 9:43 AM

  • Export for all rows of a Report Region

    Hi,
    I have a html region with a query button to restrict which rows get selected in a reports region below it.
    But I also want to have a link to export all rows in the report to a csv file. But the report region may already have selected data in it that is not necessarily all rows in the report. Thus I guess I am wondering if I should create a whole separate hidden report region that will always select all rows in the query to tie the export link to? Or can I somehow use the current report region and select all rows for the export, but not requery or change the currently displayed report region's results?
    Thanks in advance!

    Yes, I just created a hidden report, i.e. one with all columns set to
    APEX_APPLICATION.G_EXCEL_FORMAT = TRUE
    and then just created a javascript function to call the export routine...

  • Regarding report scripts

    Hi all,
    I am using two report scripts in an ASO cube:
    1. to export Budget data
    2. to export Actual data
    The amount of data in the cube is the same for these two scenarios and all the other dimensions are also fixed at the same state in the two report scripts.
    Exporting the budget data takes less than a minute but exporting the actual data takes nearly 12 minutes.
    Can anyone suggest a possible cause if they have faced a similar problem?
    The question may not be completely clear, but I can provide more details as needed.
    Thanks in advance.
    Regards
    Rav

    I tried this and it works:
    Go to the database, right click > Edit Properties > Dimensions tab.
    Just click on Members in Dimension; it will show you the dimensions from highest to lowest member count.
    Take that order and keep the same order in your report script for the one which is taking more time.
    Hope it will help.
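    For illustration, if the Dimensions tab listed the dimensions from highest member count to lowest as Scenario, Year, Product, Market, Measures (hypothetical names), you would mirror that order in the slow script's row specification:
    <ROW ("Scenario","Year","Product","Market","Measures")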

  • Report script issue (ASO cube)

    Hi All,
    I have a few report scripts in ASO which I am executing using MaxL.
    When I run the scripts manually and export to the console window, I am able to see the data.
    When I use MaxL and export to an Excel file, the log file does not show any error and the data is not exported to the Excel file.
    Can anyone help if they have faced a similar problem with report scripts.
    Thanks in advance.
    Regards
    Rav

    There are 8+ scripts and here is one similar script. A few are asymmetric scripts with columns for Year and Months.
    I think there is no issue with the script, as I have run it many times before, but for the past two days I am facing the problem that it exports partial data or else no data is exported.
    It is not throwing any errors, including the cartesian product error.
    //ESS_LOCALE English_UnitedStates.Latin1@Binary
    <SETUP <SYM { TabDelimit } { decimal 13 } { IndentGen -5 } <ACCON <QUOTE <END
    <COLUMN (Months)
    <LINK ((<Descendants(Months) AND <Descendant(Q1)) OR <Descendant(Q2) OR <Descendant(Q3) OR <Descendant(Q4))
    <COLUMN (Year)
    <MATCH (Year,&CurrentYear)
    <MATCH (Year,&NextYear)
    <Match(Year,&YearPlusTwo)
    {ROWREPEAT}
    { SUPMISSINGROWS  }
    <row (Dim1,Dim2,Dim3,Dim4,Dim5,Dim6,Dim7,Dim8,Dim9,Dim10,Account )
    &Dim1
    &Dim2
    Lev0,Dim10
    Lev0,Dim9
    Dim6
    Dim5
    Dim4
    Lev0,Dim7
    Lev0,Dim8
    "No Dim3" //fixed member in the dimension
    <LINK ( (<Descendants("Income Statement") AND <Lev(Account,0) ) AND (<Match(Account,6*) OR <Match(Account,7*) ) )
    Thanks,
    Regards
    Rav
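    For reference, a report script like the one above is usually run from MaxL with an export statement along the lines of the sketch below; the login, application/database, report script and output file names are all placeholders:
    /* run a server-side report script and spool its output to a flat file */
    login admin identified by 'password' on localhost;
    export database ASOSamp.Sample using server report_file 'repexp' to data_file '/tmp/rep_output.txt';
    exit;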

  • Report script optimization

    We have the following report script. It's taking 2 hrs to run, but if we hard code the 70 accounts we get output in 1 min. We used these 3 functions (commented out in the script below) to extract the level 0 accounts: //<DIMBOTTOM "Account", //<LINK(<LEV("Account",0)), //<LINK(<LEV("Account",0) AND <IDESC("Account")). With any of these 3 functions it takes much longer; with the 70 accounts hard coded we get the output in 1 min.
    // DATA FORMATTING
    <SPARSE
    <SUPSHARE
    <QUOTEMBRNAMES
    <SORTNONE
    <ACCON
    <SYM
    {TABDELIMIT}
    {ROWREPEAT}
    {SUPFEED}
    {SUPHEADING}
    {SUPEMPTYROWS}
    {SUPMISSINGROWS}
    {SUPBRACKETS}
    {SUPCOMMAS}
    {SUPPAGEHEADING}
    {MISSINGTEXT "#MISSING"}
    {ZEROTEXT "0.00"}
    {UNDERSCORECHAR " "}
    {DECIMAL 2}
    {NOINDENTGEN}
    // DATA LAYOUT
    //<PAGE ("Project","ProjectYear")
    <ROW("Scenario","Year","Project","Entity","FundingSource","CostCenter","Campus","ProjectYear","Fund","Account")
    <COLUMN("Period")
    // DATA SELECTION
    "Budget"
    "FY09"
    "Project"
    "Draft"
    "Input"
    "PY09"
    <LINK((<LEV("Entity",0)) AND (<IDESC("Entity")))
    <LINK((<LEV("FundingSource",0)) AND (<IDESC("FundingSource")))
    <LINK((<LEV("CostCenter",0)) AND (<IDESC("CostCenter")))
    <LINK((<LEV("Campus",0)) AND (<IDESC("Campus")))
    <LINK((<LEV("Fund",0)) AND (<IDESC("Fund")))
    //<DIMBOTTOM "Account"
    //<LINK(<LEV("Account",0))
    //<LINK(<LEV("Account",0) AND <IDESC("Account"))
    <LINK((<LEV("Period",0)) AND (<IDESC("YearTotal")))
    //&BudgetYear
    //&ProjBudgetYear
    // This substitution variable needs to be updated every Fiscal Year at the start of
    // the Budget Formulation process using the Essbase Administration Services client
    // EXECUTE THE REPORT
    !

    Hi, I just read your post. I'm not sure you're optimizing the order of the dimensions. Try doing a level 0 export, then use that order of dimensions in your report script for the ROW command. If you need a different order of the fields, use the ORDER command; see my example. I found that my needed order of fields took 6x longer than when I used the export order. Also, my LINK command took longer when I put the DIMBOTTOM part last. You really just have to play around a lot : \
    {SUPEMPTYROWS}
    {SUPMISSINGROWS}
    {SUPZEROROWS}
    //suppresses the automatic insertion of a page break
    {SUPFEED}
    {SUPCOMMAS}
    {SUPBRACKETS}
    {SUPPAGEHEADING}
    {SUPHEADING}
    {MISSINGTEXT "0.00"}
    { ROWREPEAT }
    //{TABDELIMIT}
    {DECIMAL 2}
    {NOINDENTGEN}
    //suppresses the display of duplicate shared members when you use generation or level names to extract data for your report.
    <SUPSHARE
    //forces a symmetric report, regardless of the data selection. Use SYM to change the symmetry of a report that Hyperion Essbase would create as an asymmetric report.
    //<SYM
    // 0 1 2 3 4 5 6
    //export order
    <ROW ("Business Units", VERSIONS, Products,TIME,ACCOUNTS,Departments)
    <SORTASC
    //col 7
    {CALCULATE COLUMN "DATA 2" = 6 }
    {ORDER 0 1 4 5 2 3  6 7 fixcolumns 8}
    {width 18 6 7}
    //COL0
    //<DIMBOTTOM "BUSINESS UNITS"
    {RENAME "B33000"} "BU 33000"
    {RENAME "B33009"} "BU 33009"
    {RENAME "B33010"} "BU 33010"
    {RENAME "B33030"} "BU 33030"
    {RENAME "B33050"}"BU 33050"
    {width 7 0}
    //COL1
    {RENAME "BUDGET    BUDGET     "} "2012 PLAN"
    {WIDTH 21 1}
    //COL2
    // "600000"
    <LINK (<DIMBOTTOM (Accounts) and not <MATCH (Accounts, 6?????) and not <MATCH (Accounts, 9?????) )
    {width 7 4}
    //COL3
    <DIMBOTTOM DEPARTMENTS
    {width 11 5}
    //COL4
    <DIMBOTTOM PRODUCTS
    {RENAME ""} "BLANK PRODUCT"
    {width 40 2}
    //COL5
    <SORTNONE
    {RENAME "USD     2012 001"}"PER01"
    {RENAME "USD     2012 002"}"PER02"
    {RENAME "USD     2012 003"}"PER03"
    {RENAME "USD     2012 004"}"PER04"
    {RENAME "USD     2012 005"}"PER05"
    {RENAME "USD     2012 006"}"PER06"
    {RENAME "USD     2012 007"}"PER07"
    {RENAME "USD     2012 008"}"PER08"
    {RENAME "USD     2012 009"}"PER09"
    {RENAME "USD     2012 010"}"PER10"
    {RENAME "USD     2012 011"}"PER11"
    {RENAME "USD     2012 012"}"PER12"
    {width 17 3}
    !
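    As an aside, the level 0 export suggested above can be produced from MaxL; the statement below is only a sketch (login, application/database and file name are placeholders), and the order in which the dimensions appear in the exported rows is the order to mirror in the <ROW command:
    /* level 0 export; inspect the dimension order of the output rows */
    login admin identified by 'password' on localhost;
    export database Sample.Basic level0 data in columns to data_file '/tmp/lev0.txt';
    exit;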

  • Report Script Question

    I'm trying to suppress all zero rows when using a report script to export some data. I have the {SUPEMPTYROWS} command in the script (which I thought would suppress all #missing & zero rows); however, I'm still getting a large number of zero rows in the output file. Has anyone ever encountered this and found a solution to eliminate these zero rows? Thanks.

    Hi,
    please find a sample report script which does the job; {SUPZEROROWS} is the command that suppresses rows containing only zeros:
    //ESS_LOCALE English_UnitedStates.Latin1@Binary
    {SUPZEROROWS}
    {SUPMISSINGROWS}
    {SUPEMPTYROWS}
    {DECIMAL 2
    WIDTH 9
    SUPCOMMAS
    MISSINGTEXT " "
    UNDERSCORECHAR " "
    NOINDENTGEN
    SUPFEED
    TABDELIMIT
    ROWREPEAT }
    <SUPSHARE
    <PAGE(Measures, Period, Version, Year, Scenario,Currency,HSP_Rates)
    Base Jan Final FY08 Actual Local "HSP_InputValue"
    <COLUMN(Account)
    "Pay Rate" "Bonus Target" "Commission Target"
    <ROW(Employee,Entity)
    <SORTASC
    <SORTMBRNAMES
    <DIMBOTTOM "Employee"
    <DIMBOTTOM "Entity"
    <Sparse
    hope this helps.
    Dornakal.
    www.dornakal.blogspot.com
