Loading Performance Scorecard measure with DIM

Hi,
I'm having a bit of a problem loading measures into Hyperion Performance Scorecard 9.3.1 with the DIM adapter. I have managed to load the measures and set some of the properties; however, there seems to be a problem setting the High/Low result (HigherIsBetter port) and the result frequency. I tried passing T and F for HigherIsBetter and 0 to 8 for ResultFrequency, but nothing happened. Has anybody had this problem before? I have applied the 9.3.0.1.1 patch to the DIM 9.3.0.1 adapter.
Regards,
Gerd

After compression, the request IDs are deleted, but where do the deleted request IDs go? Is there any chance to get them back?
After compression the request ID is replaced by 0; you can't find it any more.
What about loading a delta into an already compressed cube? How does compression affect the cube after the delta?
This is normal behavior: we can load delta data into a compressed cube with no issues.
Compression does not affect the delta.
Compression just moves data from the F fact table to the E fact table; your newly loaded (uncompressed) delta records stay in the F fact table.
Compression is always done on old requests, not on the latest ones.
Aggregates on a cube will improve query performance, but apart from the query response time, how do you know performance has increased?
First, note down how much time the query takes without aggregates.
Then create the aggregates and check the query time again.
You can check the query statistics in RSRT: choose the cache/aggregates options and Execute + Debug.
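The "measure before, measure after" idea above is generic; as a rough illustration (outside BW, where RSRT statistics do this for you), here is a minimal Python sketch that times the same computation against a raw scan versus a precomputed aggregate. All names are illustrative:

```python
import time

def timed(fn):
    # return (elapsed_seconds, result) for one execution of fn
    t0 = time.perf_counter()
    result = fn()
    return time.perf_counter() - t0, result

data = list(range(100_000))
aggregate = sum(data)  # built once up front, like an aggregate on a cube

t_raw, r_raw = timed(lambda: sum(data))   # "without aggregates": full scan
t_agg, r_agg = timed(lambda: aggregate)   # "with aggregates": precomputed answer
assert r_raw == r_agg                     # same result, different response time
```

Comparing t_raw and t_agg is the note-down-and-compare step described above.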
About the DSO requirement:
My guess is we rarely get a requirement to load 10 objects from one source and the remaining 4 objects from another source to fill the 14 objects of a cube.
You can load it, but through one DSO you will get data for 10 objects while the other 4 objects stay blank.
The second DSO is the same: it fills the 4 objects and leaves the other 10 blank.
Routines: we can't say anything general here; it depends purely on your requirement.
BI Content: if you want a data flow, you need the objects; based on your DataSource or cube you will find the required InfoObjects using BI Content.
Thanks

Similar Messages

  • Error "CONVT_NO_NUMBER" while performing user measurement with USMM

    Hi,
While running the user measurement with USMM I am getting the following
runtime error.
To be specific: after the preselections in USMM → System Measurement, the
error comes up when clicking the green check mark.
As per Note 1230640: you call transaction USMM and want to cancel
the user measurement using the Cancel button (red 'X') on the dialog
box of the user measurement. However, the measurement continues and is
completed as if you had chosen the OK button (green check mark).
That note applies when the problem occurs on the red button
rather than the green check mark.
Please suggest the best way to proceed.
    ST22 details:-
    Runtime Errors CONVT_NO_NUMBER
    Exception CX_SY_CONVERSION_NO_NUMBER
    Date and Time 09/16/2008 07:42:32
    Short text
    Unable to interpret " 1,410 " as a number.
    What happened?
    Error in the ABAP Application Program
    The current ABAP program "RSUVM001" had to be terminated because
    it has
    come across a statement that unfortunately cannot be executed.
    What can you do?
    Note down which actions and inputs caused the error.
    To process the problem further, contact you SAP system
    administrator.
    Using Transaction ST22 for ABAP Dump Analysis, you can look
    at and manage termination messages, and you can also
    keep them for a long time.
    Error analysis
    An exception occurred that is explained in detail below.
    The exception, which is assigned to
    class 'CX_SY_CONVERSION_NO_NUMBER', was not
    caught and
    therefore caused a runtime error.
    How to correct the error
    Whole numbers are represented in ABAP as a sequence of numbers,
    possibly
    with an algebraic sign.
    The following are the possibilities for the representation of
    floating
    point numbers:
    [mantissa]E[algebraic sign][exponent]
    [whole number part].[fraction part]
    For example, -12E+34, +12E-34, 12E34, 12.34
    If the error occurred in your own ABAP program or in an SAP
    program you modified, try to remove the error.
    If the error occures in a non-modified SAP program, you may be able
    to
    find an interim solution in an SAP Note.
    If you have access to SAP Notes, carry out a search with the
    following
    keywords:
    "CONVT_NO_NUMBER" "CX_SY_CONVERSION_NO_NUMBER"
    "RSUVM001" or "RSUVM001"
    "START-OF-SELECTION"
    If you cannot solve the problem yourself and want to send an error
    notification to SAP, include the following information:
    1. The description of the current problem (short dump)
    Information on where terminated
    Termination occurred in the ABAP program "RSUVM001" - in "START-OF-
    SELECTION".
    The main program was "RSUVM001 ".
    In the source code you have the termination point in line 96
    of the (Include) program "RSUVM001".
    85 * Analyze the user records.
    86 CALL FUNCTION 'SAPGUI_PROGRESS_INDICATOR'
    87 EXPORTING
    88 text = 'Die Benutzerstammsätze werden analysiert.'(002).
    89 * PERFORM analyze_user_records.
    90
    91 CALL FUNCTION 'SLIM_USER_MEASUREMENET'
    92 CHANGING
    93 SLIM_TUREP = LT_TUREP.
    94 .
    95 LOOP AT LT_TUREP INTO LS_TUREP .
    >>>>> MOVE-CORRESPONDING LS_TUREP TO ITUREP.
    97 APPEND ITUREP.
    98 ENDLOOP.
    99
    100 * Submit Application measurements as batch jobs.
    101 * Save additional data about this analysis in the DB.
    102 * we need meas-no ,v-date at the batch
    any ideas?

    Hi Srikanth,
The Notes below should help you solve the issue; please review them.
Note 1117010 - Runtime error CONVT_NO_NUMBER in RSUVM001
Note 1115924 - USMM composite SAP Note: Problems after importing new SPs
    Thanks & Regards,
    Pradeep Srigiri

  • Error Performance Scorecard 11.1.2.2 Login

    Hello;
I get an error when entering the Performance Scorecard application with any user except admin, which has no problems.
Logging in to Workspace works fine.
When choosing Workspace → Browse → Applications → Performance Scorecard I see this error:
Performance Scorecard is having problems processing your request. Please contact the Performance Scorecard administrator.
Message: User Account is not associated with any security role.
With the admin user all functions work; the problem is entering with any user other than admin.
Provisioning in Shared Services is OK.
If anyone has any comments or recommendations to solve the problem, I will be very grateful.
Version: Performance Scorecard 11.1.2.2
DB: Oracle 11g
The generated log is as follows:
    (self-tuning)' ERROR com.hyperion.pmd.hps.actions.HPSAction services.configuration - com.hyperion.pmd.hps.session.NoHPSSecurityRoleException: User Account is not associated with any security role
    (self-tuning)' WARN com.hyperion.pmd.hps.session.LoginHelper infrastructure.communication.servlet - There was no UserSessionInfo in the session: weblogic.servlet.internal.session.MemorySessionData@31748dc
    name=jsp_builder_EmployeeSetup.jsp
    Thread-45 WARN com.hyperion.pmd.hps.model.authorization.CASUserAccountSynchronizer services.authorizationService.accessControl - Problem in getting matching roles of the user [rceballos], will not assign any to synchronized user accounts.
    (self-tuning)' ERROR com.hyperion.pmd.hps.actions.HPSAction services.configuration - com.hyperion.pmd.hps.session.NoHPSSecurityRoleException: User Account is not associated with any security role
    (self-tuning)' WARN com.hyperion.pmd.hps.session.LoginHelper infrastructure.communication.servlet - There was no UserSessionInfo in the session: weblogic.servlet.internal.session.MemorySessionData@207d4b5
    (self-tuning)' ERROR com.hyperion.pmd.hps.model.ReverseRelationshipsContainer$InnerReverseRelationshipsContainer infrastructure.database.sql.queries - Unable to fetch the associate end [< com.hyperion.pmd.hps.model.PrimaryKey@fffaff4a: Object id=327861, Version number=<not initialized yet>, B.O. class=com.hyperion.pmd.hps.model.authorization.UserAccount >70z9.-1.56ltuh]
    of relationship [< < com.hyperion.pmd.hps.persistence.castor.jdo.authorization.UserAccountEmployeeRelationshipJDO@169d8ee - 327867: Primary key=< com.hyperion.pmd.hps.model.PrimaryKey@fffaff44: Object id=327867, Version number=<not initialized yet>, B.O. class=com.hyperion.pmd.hps.model.authorization.UserAccountEmployeeRelationship >70zf.-1.-mmqj9d >: Relationship definition=< com.hyperion.pmd.hps.model.RelationshipDefinition@217cefb: Relationship name=User account's employee, Associate class name=com.hyperion.pmd.hps.model.authorization.UserAccount, Associated class name=com.hyperion.pmd.hps.model.Employee, Type=1, Relationship class name=com.hyperion.pmd.hps.model.authorization.UserAccountEmployeeRelationship, Is composite=false >, Associate (from) b.o.=< com.hyperion.pmd.hps.model.PrimaryKey@fffaff4a: Object id=327861, Version number=<not initialized yet>, B.O. class=com.hyperion.pmd.hps.model.authorization.UserAccount >70z9.-1.56ltuh, Not restored associated (to) b.o. primary key=< com.hyperion.pmd.hps.model.PrimaryKey@ffffee1c: Object id=4579, Version number=<not initialized yet>, B.O. class=com.hyperion.pmd.hps.model.Employee >3j7.-1.382rb6 >]
    using b.o. persistence: com.hyperion.pmd.hps.persistence.castor.ReadOnlyBusinessObjectPersistence
    java.lang.Exception
         at com.hyperion.pmd.hps.model.ReverseRelationshipsContainer$InnerReverseRelationshipsContainer.fetchReverseRelationships(ReverseRelationshipsContainer.java:162)
         at com.hyperion.pmd.hps.model.ReverseRelationshipsContainer$InnerReverseRelationshipsContainer.getRelationships(ReverseRelationshipsContainer.java:60)
         at com.hyperion.pmd.hps.model.RelationshipContainerBase.getRelationships(RelationshipContainerBase.java:178)
         at com.hyperion.pmd.hps.model.BusinessObjectImpl.getOneToManyReverseRelated(BusinessObjectImpl.java:1270)
         at com.hyperion.pmd.hps.model.Employee.getAssociatedUserAccounts(Employee.java:689)
         at com.hyperion.pmd.hps.model.authorization.EmployeeUserAccountSynchronizer.synchronize(EmployeeUserAccountSynchronizer.java:89)
         at com.hyperion.pmd.hps.actions.AdminSynchronizeAction.performAction(AdminSynchronizeAction.java:51)
         at com.hyperion.pmd.hps.actions.HPSAction.perform(HPSAction.java:104)
         at com.hyperion.pmd.hps.actions.HPSAction.execute(HPSAction.java:73)
         at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:431)
         at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:236)
         at org.apache.portals.bridges.struts.PortletRequestProcessor.process(PortletRequestProcessor.java:52)
         at com.hyperion.pmd.hps.servlet.HPSRequestProcessor.process(HPSRequestProcessor.java:458)
         at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1196)
         at org.apache.struts.action.ActionServlet.doGet(ActionServlet.java:414)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:301)
         at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at com.hyperion.pmd.hps.servlet.RequestCharacterEncodingFilter.doFilter(RequestCharacterEncodingFilter.java:34)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at com.hyperion.pmd.hps.servlet.PictureCacheFilter.doFilter(PictureCacheFilter.java:40)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at com.hyperion.pmd.hps.servlet.GZIPFilter.doFilter(GZIPFilter.java:56)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
         at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:315)
         at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:442)
         at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
         at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
         at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:139)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3730)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3696)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2273)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2179)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1490)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:221)
    thanks
    Pedro

    >
(self-tuning)' ERROR com.hyperion.pmd.hps.actions.HPSAction services.configuration - com.hyperion.pmd.hps.session.NoHPSSecurityRoleException: User Account is not associated with any security role
1. Go to HSS,
    2. Expand Applications,
    3. Right Click on Performance Scorecard,
    4. Assign Security,
5. Select the user and assign a corresponding role, as you have given in HSS. (As per your update, you have provisioned the user for HPS; if not, please do that, otherwise you will not be able to see the user here.)
    Cheers...!!!
    Rahul S.

  • What are the better load/performance testing tools available for Flex Application with BlazeDS RO?

My application is built with Flex 3, ActionScript 3, and BlazeDS Remote Objects.
I tried OpenSTA, but I can't do dynamic parameterization in its generated scripts because the responses of the calls are binary, and we can't read the response with the SCL language.
While testing an HttpService with OpenSTA, I could do the dynamic parameterization and get the response.
Can anyone answer the questions below:
Can we do dynamic parameterization with OpenSTA for Flex Remote Objects?
And what better load/performance tools are available for Flex Remote Objects?

    Your approach is fine, depending on how many and what type of CFCs you are talking about. If they are "singletons" - that is, only one instance of each CFC is needed to be in memory and can be reused/shared from multiple parts of your application - caching them in the application scope is common.  Just make sure they are thread safe ("var" or local.* all your method variables).
    You might consider taking advantage of a dependency injection framework, such as DI/1 (part of the FW/1 MVC framework), ColdSpring, or WireBox (a module of the ColdBox platform that can be used independently).  They have mechanisms for handling and caching singletons.  Then you wouldn't have to go to the application scope to get your CFC instances.
    -Carl V.

  • How to export measures and scores in Hyperion Performance Scorecard

    Hi,
Please, I need the command to export measures and their scores in Hyperion Performance Scorecard.
    Regards,

    I have been able to export measures and scores.
    Is there a way to make the system include the 'Short Name' as a column in the exported csv file?
    Regards

  • SCD 2 load performance with 60 millions records

    Hey guys!
I'm wondering what the load performance would be for a type 2 SCD mapping based on the framework presented in the transformation guide (pages A1-A20). The dimension has the following characteristics:
60 million records
50 columns (including 17 tracked for changes)
    Has anyone come across a similar case?
    Mark or Igor- Is there any benchmark available on SCD 2 for large dimensions?
    Any help would be greatly appreciated.
    Thanks,
    Rene

    Rene,
It's really very difficult to guesstimate the loading time for such a configuration. Too many parameters are missing, especially hardware. We are in the process of setting up some real benchmarks later this year; maybe you can give us some interesting scenarios.
On the other hand, 50-60 million records is not that many these days, so I would personally consider anything more than several hours (on half-decent hardware) too long.
    Regards:
    Igor
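The transformation-guide framework itself isn't shown in the thread, but the per-record work a type 2 SCD load performs (compare the tracked columns, close the current version, insert a new one) can be sketched in a few lines of Python. The column names and structure here are hypothetical stand-ins, not the guide's actual framework:

```python
from datetime import date

TRACKED = ("city", "segment")  # stand-ins for the 17 change-tracked columns

def apply_scd2(dimension, incoming, today=date(2024, 1, 1)):
    """Type 2 SCD: close changed current rows and append new versions.

    dimension: list of dicts with 'key', tracked columns, 'valid_to' (None = current)
    incoming:  list of dicts with 'key' and tracked columns
    """
    current = {row["key"]: row for row in dimension if row["valid_to"] is None}
    for rec in incoming:
        old = current.get(rec["key"])
        if old is None:
            dimension.append({**rec, "valid_to": None})   # brand-new key
        elif any(old[c] != rec[c] for c in TRACKED):
            old["valid_to"] = today                       # close the old version
            dimension.append({**rec, "valid_to": None})   # open a new version
        # records with no tracked change are left alone
    return dimension
```

At 60 million rows the dominant cost is the change comparison plus the extra updates and inserts, which is why hardware and indexing dominate any estimate, as Igor notes.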

  • Hyperion Performance Scorecard automatic Synch with Planning data

    Hello all,
I was wondering if anyone knows how, and whether, it is possible to automate the data and metadata synchronization between Performance Scorecard and Planning?
I have successfully set up the Planning database as an external data source but need to automate the synch process, as opposed to having an administrator manually click the buttons at set periods.
I'm guessing the automation will involve calling the relevant class from the command line, but I don't know which one and what parameters to pass... perhaps there is even an easier way (let's hope!).
Can someone please give me any advice or information on how this can be achieved?
    thanks in advance
    Assia

    Have a read: http://docs.oracle.com/cd/E12825_01/epm.111/readme/dimapsc_11111_readme.html

  • Query : Hyperion Performance Scorecard Rollup Functionality

    Hi All,
Hyperion Performance Scorecard roll-up functionality in measure and scorecard templates.
Query: We are implementing Hyperion Performance Scorecard for a client using measure templates and scorecard templates. After the build, the client came up with the following new requirements.
The strategy hierarchy is as follows:
Level 1 - SBU (parent), Level 2 - Groups (child), Level 3 - Teams (child), Level 4 - Desks (child), Level 5 - RMs (child)
1. Along with the existing scorecard functionality, they need to roll up 2 KPIs (Credit Quality 1, Credit Quality 2) explicitly only at the RM level (the lowest level).
2. These scorecards must follow this roll-up hierarchy (RMs to Desks to Teams to Groups to SBUs).
3. KPIs to be used/modified: Credit Quality 2 means bad assets (non-performing loans) and Credit Quality 1 means good assets (performing loans).
4. Both KPIs should roll up to all RMs, but the bad-asset KPI (Credit Quality 2) should not roll up to Desks, Teams, Groups, or SBUs, while the good-asset KPI (Credit Quality 1) should roll up to all levels.
4.1. Example: assume there are 2 RMs (RM1 and RM2); RM1 has bad assets and RM2 does not. Then for RM1 both KPIs should roll up, but at the levels above, the first KPI (non-performing assets) should not roll up while the second KPI (performing assets) should.
5. Number of scorecards required: approx. 3000 for the RMs, plus the total numbers of Desks, Teams, Groups and, finally, SBUs.
As I understand the existing HPS design, we use measure templates and scorecard templates in which all KPIs automatically roll up to the RMs and from the RMs to all levels above. For this new requirement we need to explicitly create another (secondary) scorecard with the 2 KPIs; both should roll up to the RMs (approx. 3000) but not to the hierarchy above (Desks, Teams, Groups, SBUs); only 1 of the KPIs should roll up to all the levels above, not the other KPI or this new scorecard.
As I understand it, in the current design all KPIs roll up to all levels.
Question: In the current scenario, is there a possibility or solution to restrict a particular KPI from rolling up to the levels above?
    Any help would be highly appreciated.

    Thanks for your response!
I'm not sure that the Power Pivot plugin for Excel will be available here. We use Excel 2010 and we have restrictions in our environment. Additionally, we have 32-bit Office installed, and my understanding is that you need 64-bit on both client and server when working with Power Pivot, so exporting Excel Power Pivots from SharePoint 2010 to a 32-bit client (or vice versa) wouldn't work.
By robust I mean the ability to utilize different types of data sources (CSV, Excel, SQL Server, Oracle) as well as data models (transactional, analytical, tabular); the ability to handle large datasets; pivot table functionality; drill down and drill through; rich data visualizations.
Currently this data is in Oracle. I will need to export it to SQL, and I would like to leave it as OLTP since OLAP technologies are not currently in our skill set; hopefully those can be added in the next year or so. If Power View loses the bulk of its reporting power when moving to OLTP, I will just build these in SSRS, unless I can use GROUP BY, ROLLUP, and CUBE in my queries to give Power View a data set where its capabilities can be utilized.
    Love them all...regardless. - Buddha

  • Population of Hyperion Performance Scorecard

    Hi all,
After setting up a Performance Scorecard system (defining the framework, creating employee profiles, defining units/targets, building performance indicator ranges, strategy trees, accountability maps and strategy maps, defining measures and a measure template, and building scorecards),
I want to populate the scorecards with either manual or automated input of measurement data for the respective scorecards.
However, I haven't found information on how to do this.
Does anyone have an idea how it can be done?
Many thanks for your help. Regards,
    Dominic

    Hi Dominic,
From your message in this forum I see you have some knowledge of Hyperion Performance Scorecard. I am new to this technology; I have good knowledge of Hyperion Essbase and Planning, and we are planning to implement Scorecard for the first time in our company. Please let me know what you know about this tool's current/future market and implementations. Can we expect as much of a market as for Essbase and Planning? If I work on this tool, will it boost my career? Please share whatever you know about this tool; you can also mail me at [email protected]
    Thanks&Regards
    Nath

  • Load/Performance Testing using ECATT

Please provide the process to perform load/performance testing using eCATT.
Which transaction codes are required for load/performance testing using eCATT?
    Thanks in ADVANCE.

    Hello Colleague,
Here are the steps you need to follow for performance testing using ST30.
Use transaction ST30 to invoke Global Performance Analysis (widely used for performance tests of certain transactions).
On the eCATT test tab, key in the following data:
Log ID (needs to be created only for the first run);
Performance test (logically the entries for the Performance test field are of the format
Logid_name/PERF_transaction_name/systemname/client/date);
Name of the test configuration (you need to create a test configuration for the eCATT to be used in ST30; use the same name for the created test configuration as that of the test script);
Number of times the test configuration needs to run as the preprocessor to create the required backend data, and number of times it needs to run as the processor (both fields are usually filled with 5 and 5, or 5 and 10, for performance measurements, but in your case you can give 1 and 1, or 0 and 1, for your requirements).
Leave all the check boxes under Programming Guidelines and Distributed Statistics Data unchecked (unless required). In the data comparison, use the No option for With Values.
Click the "eCATT test only" button to start the performance run using ST30.
The procedure stated above makes the eCATT test configuration execute as many times as the sum of pre and pro given by the user, at one stretch only. If there is a requirement to have the eCATT execute after an interval, we follow a different approach:
We have a VB script that creates an ECA session, calls SE37, selects the required test package, executes all the required test cases (eCATTs) in the test package, and ensures the ECA session is killed at the end of the execution.
We then create a batch file to execute the VB script and call the batch file for our executions.
In your case, please schedule the execution of the batch file every 30 minutes (or any such duration) using the simple scheduler functionality provided by Windows.
The only problem with this is that whenever there are system messages, software updates, or any new screens, the scheduling is bound to fail, as the called VB script does not handle the new situation. Please also ensure that the user whose password has been given in the scheduler is the user who is logged into the system during the execution period.
So, to summarize: ST30 will only allow you to run the eCATT as many times as required, but only at one stretch; you need to use the second mechanism to make the eCATT run effectively after a predetermined time without any user interaction.
FYI: a new feature to handle the scheduling of executions is being developed; I will post the details and usage steps when it is available. We also have a new command, PERF ENDPERF, in eCATT (a new development); kindly go through the documentation for the new developments in eCATT.
    Thanks and best regards,
    Sachin
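The interval-driven part of the workflow above (run the batch file every 30 minutes) reduces to a loop that invokes a command and waits. Here is a minimal Python sketch with an injectable runner; the real setup uses a VB script plus the Windows scheduler, so these names are purely hypothetical:

```python
import subprocess
import time

def run_periodically(command, interval_sec, repetitions,
                     runner=subprocess.call, sleep=time.sleep):
    """Invoke `command` `repetitions` times, waiting `interval_sec` between runs.

    `runner` and `sleep` are injectable so the loop can be exercised in tests
    without launching real processes or actually waiting.
    """
    results = []
    for i in range(repetitions):
        results.append(runner(command))
        if i < repetitions - 1:  # no trailing wait after the last run
            sleep(interval_sec)
    return results
```

This sketch has the same weakness Sachin mentions: it does not react to unexpected screens or system messages; it only re-runs the command on schedule.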

  • Improve data load performance using ABAP code

    Hi all,
I want to improve my load performance using ABAP code. How do I do this? If I write ABAP code in SE38, how can I call it
on the BW side? A sample of code to improve load performance would be useful. Please guide me.

There are several points that can improve the performance of your ABAP code:
1. Avoid the SELECT...ENDSELECT construct; use SELECT ... INTO TABLE instead.
2. Use a WHERE clause in your SELECT statement to restrict the volume of data retrieved.
3. Use FOR ALL ENTRIES in your SELECT statement to retrieve the matching records in one shot.
4. Avoid nested SELECTs and SELECT statements within LOOPs.
5. Avoid INTO CORRESPONDING FIELDS OF; use INTO TABLE instead.
6. Avoid SELECT * and select only the required fields from the table.
7. Avoid executing a SELECT multiple times in the program.
8. Avoid nested loops when working with large internal tables.
9. Whenever using READ TABLE, use the BINARY SEARCH addition to speed up the search.
10. Use FIELD-SYMBOLS instead of a work area when there are more than 200 entries in an internal table and some fields are being manipulated.
11. Use MOVE with individual variable/field moves instead of MOVE-CORRESPONDING.
12. Use CASE instead of IF/ENDIF whenever possible.
13. Runtime analysis transaction SE30 can be used to measure application performance.
14. Transaction ST05 can be used to analyse the SQL trace and measure the performance of the program's SELECT statements.
15. Start routines can be used when a transformation is needed at the data package level. Field/individual routines can be used for a simple formula or calculation. End routines are used when you wish to populate data not present in the source but present in the target.
16. Always use a WHERE clause for a DELETE statement. To delete records for multiple values, use SELECT-OPTIONS.
17. Always use IS INITIAL instead of comparing to '', because the initial value is '' for a character but 0 for an integer.
    Hope it helps.
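Tips 8 and 9 (avoid nested loops; READ TABLE ... BINARY SEARCH on a sorted table) express a language-neutral idea: replace an O(n*m) nested scan with a sort-once-then-binary-search lookup. A Python sketch of the same pattern, with purely illustrative names:

```python
from bisect import bisect_left

def match_nested(orders, customers):
    # the anti-pattern: O(n*m), like a LOOP with an unsorted linear READ TABLE
    return [(o, c) for o in orders for c in customers if c[0] == o["cust"]]

def match_sorted(orders, customers):
    # the fix: SORT once, then READ TABLE ... BINARY SEARCH per record
    customers = sorted(customers)
    keys = [c[0] for c in customers]
    out = []
    for o in orders:
        i = bisect_left(keys, o["cust"])          # O(log m) per lookup
        if i < len(keys) and keys[i] == o["cust"]:
            out.append((o, customers[i]))
    return out
```

Both functions return the same matches; only the cost differs, which is exactly why the BINARY SEARCH addition matters on large internal tables.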

  • Loading performance issues

Hi gurus,
Please can you help with a loading issue? I am extracting data from a standard extractor in Purchasing, and for 3 lakh (300,000) records it is taking 18 hours. Can you please suggest how to address the loading performance?
    -KP

    Hi,
    Loading Performance:
    a) Always load and activate the master data before you load the transaction data so that the SIDs don't have to be created at the loading of the transaction data.
b) Use the optimum packet size. If the packet size is too small, the system writes messages to the monitor; those entries keep increasing over time and cause a slowdown. Try different packet sizes to arrive at the optimum number.
    c) Fine tune your data model. Make use of the line item dimension where possible.
    d) Make use of the load parallelization as much as you can.
    e) Check your CMOD code. If you have direct reads, change them to read all the data into internal table first and then do a binary search.
    f) Check code in your start routine, transfer rules and update rules. If you have BW Statistics cubes turned on, you can find out where most of the CPU time is spent and concentrate on that area first.
    g) Work with the basis folks and make sure the database parameters are optimized. If you search on OSS based on the database you are using, it provides recommendations.
h) Set up your load processes appropriately. Don't load all the data all the time unless you have to. For example, if your functional team says the historical fiscal years have not changed, then load only the current FY onwards.
    i) Set up your jobs to run when there is not much activity. If the system resources are already strained, your processes will have to wait for resources.
j) For the initial loads only, always buffer the number ranges for SIDs and DIM IDs.
    Hareesh

  • Loading performance help me

    Hi all,
I have a query regarding loading performance. We can improve it by:
1. loading master data before transaction data
2. packet sizing
3. multiple initializations
4. deleting indices
Can anybody help with the rest of the procedures to improve loading performance?
    thanks & regards
    KK

    Hi,
When loading transactional data for the first time, you can activate number range buffering for your dimensions as well:
Go to SE37 and execute function module RSD_CUBE_GET to get the number ranges for the dimensions:
I_INFOCUBE: <your cube's technical name>
OBJVERS: A
I_BYPASS_BUFFER: X
I_WITH_ATR_NAV
Go to the very right of the return table E_T_DIME and get NOBJECT (BIDx) for the number range of your DIMs.
Go to transaction SNRO; enter your BIDx number range, and in change mode choose menu Edit / Set Up Buffering / Main Memory; enter something around 50,000 as the number of numbers to buffer.
Doing that for dimensions loaded with a high number of records will definitely boost your performance.
Do NOT buffer the DataPackage dimension in any case!
You can do the same for master data (BIMx).
    hope this helps...
    Olivier.

  • QUERY PERFORMANCE AND DATA LOADING PERFORMANCE ISSUES

What are the query performance issues we need to take care of? Please explain and let me know the T-codes.
What are the data loading performance issues we need to take care of? Please explain and let me know the T-codes.
Will reward full points.
Regards,
Guru

    BW Back end
    Some Tips -
    1)Identify long-running extraction processes on the source system. Extraction processes are performed by several extraction jobs running on the source system. The run-time of these jobs affects the performance. Use transaction code SM37 — Background Processing Job Management — to analyze the run-times of these jobs. If the run-time of data collection jobs lasts for several hours, schedule these jobs to run more frequently. This way, less data is written into update tables for each run and extraction performance increases.
    2)Identify high run-times for ABAP code, especially for user exits. The quality of any custom ABAP programs used in data extraction affects the extraction performance. Use transaction code SE30 — ABAP/4 Run-time Analysis — and then run the analysis for the transaction code RSA3 — Extractor Checker. The system then records the activities of the extraction program so you can review them to identify time-consuming activities. Eliminate those long-running activities or substitute them with alternative program logic.
    3)Identify expensive SQL statements. If database run-time is high for extraction jobs, use transaction code ST05 — Performance Trace. On this screen, select ALEREMOTE user and then select SQL trace to record the SQL statements. Identify the time-consuming sections from the results. If the data-selection times are high on a particular SQL statement, index the DataSource tables to increase the performance of selection (see no. 6 below). While using ST05, make sure that no other extraction job is running with ALEREMOTE user.
    4)Balance loads by distributing processes onto different servers if possible. If your site uses more than one BW application server, distribute the extraction processes to different servers using transaction code SM59 — Maintain RFC Destination. Load balancing is possible only if the extraction program allows this option.
    5)Set optimum parameters for data-packet size. Packet size affects the number of data requests to the database. Set the data-packet size to optimum values for an efficient data-extraction mechanism. To find the optimum value, start with a packet size in the range of 50,000 to 100,000 and gradually increase it. At some point, you will reach the threshold at which increasing packet size further does not provide any performance increase. To set the packet size, use transaction code SBIW — BW IMG Menu — on the source system. To set the data load parameters for flat-file uploads, use transaction code RSCUSTV6 in BW.
    6)Build indexes on DataSource tables based on selection criteria. Indexing DataSource tables improves the extraction performance, because it reduces the read times of those tables.
    7)Execute collection jobs in parallel. Like the Business Content extractors, generic extractors have a number of collection jobs to retrieve relevant data from DataSource tables. Scheduling these collection jobs to run in parallel reduces the total extraction time, and they can be scheduled via transaction code SM37 in the source system.
    8)Break up your data selections for InfoPackages and schedule the portions to run in parallel. This parallel upload mechanism sends different portions of the data to BW at the same time, and as a result the total upload time is reduced. You can schedule InfoPackages in the Administrator Workbench.
    You can upload data from a data target (InfoCube and ODS) to another data target within the BW system. While uploading, you can schedule more than one InfoPackage with different selection options in each one. For example, fiscal year or fiscal year period can be used as selection options. Avoid using parallel uploads for high volumes of data if hardware resources are constrained. Each InfoPackage uses one background process (if scheduled to run in the background) or dialog process (if scheduled to run online) of the application server, and too many processes could overwhelm a slow server.
    9)Building secondary indexes on the tables for the selection fields optimizes these tables for reading, reducing extraction time. If your selection fields are not key fields on the table, primary indexes are not much help when accessing data. In this case it is better to create secondary indexes with the selection fields on the associated table using the ABAP Dictionary to improve selection performance.
    10)Analyze upload times to the PSA and identify long-running uploads. When you extract the data using PSA method, data is written into PSA tables in the BW system. If your data is on the order of tens of millions, consider partitioning these PSA tables for better performance, but pay attention to the partition sizes. Partitioning PSA tables improves data-load performance because it's faster to insert data into smaller database tables. Partitioning also provides increased performance for maintenance of PSA tables — for example, you can delete a portion of data faster. You can set the size of each partition in the PSA parameters screen, in transaction code SPRO or RSCUSTV6, so that BW creates a new partition automatically when a threshold value is reached.
    11)Debug any routines in the transfer and update rules and eliminate single selects from the routines. Using single selects in custom ABAP routines for selecting data from database tables reduces performance considerably. It is better to use buffers and array operations. When you use buffers or array operations, the system reads data from the database tables and stores it in the memory for manipulation, improving performance. If you do not use buffers or array operations, the whole reading process is performed on the database with many table accesses, and performance deteriorates. Also, extensive use of library transformations in the ABAP code reduces performance; since these transformations are not compiled in advance, they are carried out during run-time.
    12)Before uploading a high volume of transaction data into InfoCubes, activate the number-range buffer for dimension IDs. The number-range buffer is a parameter that identifies the number of sequential dimension IDs stored in the memory. If you increase the number range before high-volume data upload, you reduce the number of reads from the dimension tables and hence increase the upload performance. Do not forget to set the number-range values back to their original values after the upload. Use transaction code SNRO to maintain the number range buffer values for InfoCubes.
    13)Drop the indexes before uploading high-volume data into InfoCubes. Regenerate them after the upload. Indexes on InfoCubes are optimized for reading data from the InfoCubes. If the indexes exist during the upload, BW reads the indexes and tries to insert the records according to the indexes, resulting in poor upload performance. You can automate the dropping and regeneration of the indexes through InfoPackage scheduling. You can drop indexes in the Manage InfoCube screen in the Administrator Workbench.
    14)IDoc (intermediate document) archiving improves the extraction and loading performance and can be applied on both BW and R/3 systems. In addition to IDoc archiving, data archiving is available for InfoCubes and ODS objects.
    Hope it Helps
    Chetan
    @CP..
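
    Tip 13 above (drop indexes before a high-volume load, rebuild afterwards) is usually automated via InfoPackage or process-chain settings, but it can also be scripted. A hedged sketch: the function module names RSDU_INFOCUBE_INDEXES_DROP and RSDU_INFOCUBE_INDEXES_CREATE are commonly cited for this, but confirm they exist with this signature on your BW release before use:

    ```abap
    * Sketch: drop InfoCube indexes before a high-volume load and
    * rebuild them afterwards. The FM names and the I_INFOCUBE
    * parameter are assumptions -- verify on your BW release; the
    * same steps are available as process types in process chains.
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
      EXPORTING
        i_infocube = 'ZSALES01'.   " hypothetical cube name

    * ... run the data load here (e.g. schedule the InfoPackage) ...

    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_CREATE'
      EXPORTING
        i_infocube = 'ZSALES01'.
    ```

    The process-chain route is generally preferred in practice because it handles errors and logging for you.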

  • How to export/import data of hyperion performance scorecard?

    I have a Hyperion Performance Scorecard environment which saves its data in an Oracle database. Now I am trying to set up another Scorecard environment with exactly the same set of data, using MS SQL Server as the database. What is the best way to move the data from the Oracle one to the SQL Server one? I tried using the HPS Import/Export Utility to export data from the Oracle Scorecard environment to a .CSV file and then import the .CSV file into the SQL Server Scorecard environment. The export went well despite some warning messages. However, I keep running into trouble with the import: there are many issues such as inconsistent date formats, data dependencies, missing commas in the .csv file, etc. Could anyone suggest an easier way to do this data transfer? All I am trying to achieve is a clone of my HPS environment. Thanks a lot.

    Hi Yang,
    I saw your query regarding Hyperion Performance Scorecard. Since you seem to have some experience with this tool, I wanted to ask you for some information about it: implementations, the current and future market, and any prerequisites for working on the tool. Also, if you have any knowledge of Hyperion Essbase and Planning, please let me know how HPScorecard implementations compare to them. I would be thankful for whatever you can share.
    Regards
