From a performance point of view, what do we need to do in data loading?

Hi,
1. From a performance point of view, what do we need to do in data loading?
2. How can we increase cube performance and query performance?
Thanks,
Basava Raju

Hi,
In addition to the above steps, especially for data loading, make sure that any custom code is tuned for performance, on both the R/3 side and the BI side.
For generic DataSources, try to make them generic deltas, or use pseudo deltas for them.
From a query perspective, use include conditions in filters wherever possible instead of exclusions.
Cheers,
Kedar

Similar Messages

  • IMPORT_COUNTREQUEST API for Physical Count form data load?

    Dear All,
    Please clarify whether I can use the IMPORT_COUNTREQUEST API for the Physical Count form data load, or which other API does the job other than loading the data through Dataload.
    Please update.
    Many thanks in advance.

    Hi,
    We believe that instead of using the IMPORT_COUNTREQUEST API you can use a Dataload script to key data into the Physical Inventory form.
    Basically you need to populate 4 fields in this form, i.e., Physical Inventory, Date, Description and Snapshot Complete (checkbox).
    Go ahead and write a Dataload script for the same.
    Regards,
    S.P DASH

  • ASA 5505 WebVPN - It has taken a while for SSL VPN Relay to load. You need to verify Java is enabled in your browser

    ASA 5505
    ASA Version 9.0(2)
    Suddenly, on the WebVPN interface, when I click on my web bookmarks (and Java launches in the browser), I get this failure in Chrome and Firefox: 'It has taken a while for SSL VPN Relay to load. You need to verify Java is enabled in your browser', and nothing happens...
    Java IS enabled and running. I tried this with both Java 7.45 and 7.51.
    There is no problem in IE 11 with Java 7.45 and 7.51.
    I've googled a lot but have not been able to find any suggestions.
    Hope you have a solution
    Best Regards.

    Any resolution on this? In Firefox/Chrome my CIFS shares work but smart-tunnel RDP doesn't, and in IE my shares don't work but smart-tunnel RDP does...
    Cisco, if you're not going to do something good, just don't do it.  The SSL VPN is a hack job.

  • "A problem caused the program stop working," after I accepted the TERMS and Conditions for Adobe Reader 11 installstion. What I need to do next? Thanks.

    "A problem caused the program stop working," after I accepted the TERMS and Conditions for Adobe Reader 11 installstion. What I need to do next? Thanks.

    Thanks for your reply. I did get upgraded to 10-something; I'm not on that computer now. Reader 11 is not supported on Vista.

  • Performance Point Services data source pointing to the wrong data source after the production instance was copied to development

    Hi
    We have copied our Project Server 2010 production databases to the development server. We have a few reports created using PerformancePoint Services, but when we try to open the reports, the data source is pointing to the production data source instead of dev.
    We are getting some issues while executing the reports.
    Please let me know the exact cause and the steps to rectify the issue.
    Thanks,
    Geeth

    As you have copied data from Prod to Dev, the reports are pointing to the production data sources and are unable to display data. Whenever we copy data from one environment to the other, this kind of issue occurs.
    We need to modify the data source of all the reports manually so that the reports point to the dev data source.
    You have to open the data source for your reports and change it from prod to dev; then the reports will display the correct data.
    http://blogs.msdn.com/b/performancepoint/archive/2011/09/19/performancepoint-data-connection-libraries-and-content-lists-in-sharepoint.aspx
    http://www.networkworld.com/community/node/57687
    kirtesh

  • Performance Point Services Data Source Sql Server Table.

    Hi,
    I have a requirement where I need to create a matrix report that shows resource capacity and demand from the Project Server 2010 databases, which involves joining multiple tables and views from the database.
    But in PerformancePoint Services 2010 I am not able to write a query that joins multiple tables; the Data Connection only offers the option to connect to a database and a single table, unlike SSRS, where we can connect to a database and query multiple tables and views to render the report as required.
    So can anybody suggest whether I am missing something in PPS 2010, or whether we simply cannot join tables or write a SQL query in PPS 2010?
    Thanks

    I would join the tables in a view as Regis suggested (see the sketch below), or simply pull the information into a PowerPivot model and use that in SharePoint as the source PPS connects to. If you go that route, you will be able to use a multidimensional source and can create Analytical Reports.
    http://denglishbi.wordpress.com/2011/01/03/using-powerpivot-with-performancepoint-services-pps-2010/
    http://denglishbi.wordpress.com/2012/02/09/using-performancepoint-services-pps-with-powerpivot-sql-server-2012-rc0/
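    For the view option, here is a minimal sketch, assuming a SQL Server reporting database reachable over JDBC. The table, column, and view names (ResourceCapacity, ResourceDemand, vw_ResourceCapacityDemand) and the connection string are hypothetical placeholders, not actual Project Server 2010 schema objects; the point is only that the join is done once in the database, so the PPS data connection can treat the view like a single table.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Sketch: create a database view that pre-joins the capacity and demand
    // tables so that PerformancePoint Services can point at a single "table".
    // All table/column/connection details below are hypothetical placeholders.
    public class CreateCapacityDemandView {

        // Hypothetical SQL Server connection string; adjust server, database and credentials.
        private static final String JDBC_URL =
                "jdbc:sqlserver://devserver;databaseName=ProjectReporting;integratedSecurity=true";

        private static final String CREATE_VIEW_SQL =
                "CREATE VIEW dbo.vw_ResourceCapacityDemand AS "
              + "SELECT c.ResourceUID, c.TimeByDay, "
              + "       c.CapacityHours, d.DemandHours "
              + "FROM   dbo.ResourceCapacity c "   // hypothetical capacity table
              + "JOIN   dbo.ResourceDemand d "     // hypothetical demand table
              + "  ON   c.ResourceUID = d.ResourceUID "
              + " AND   c.TimeByDay   = d.TimeByDay";

        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(JDBC_URL);
                 Statement stmt = conn.createStatement()) {
                // DDL statement: creates the pre-joined view in the reporting database
                stmt.executeUpdate(CREATE_VIEW_SQL);
                System.out.println("View created; point the PPS data connection at it like any table.");
            }
        }
    }

    Once the view exists, point the PPS data source at it exactly as you would at a table; the one-table limitation no longer matters because the join already happened in the database.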

  • "master data deletion for requisition" before master data loading

    Hello Gurus,
    In our BW system, the process chains for loading master data InfoObjects all include the "master data deletion for requisition" ABAP
    process, except for one process chain. My questions are:
    Why is that process chain for master data loading different from the others in that it lacks the "master data deletion for requisition" step?
    And does it therefore not matter whether you include the "master data deletion for requisition" ABAP process in a process chain for master data loading?
    Many thanks.

    Hi,
    An ABAP process means that some ABAP program is being executed in that particular step.
    It is possible that for all of your process chains except that one, the requirement was to do some ABAP program processing.
    You can check which program is executed as follows:
    Open your process chain in the planning view -> double-click on that particular ABAP process -> here you can see the program name as well as the program variant.
    Hope this helps!
    Regards,
    Nilima

  • Data load scenario - help needed

    Hi gurus,
    Please help me with the below scenarios for the LO mechanism:
    Rec 1 entered in R3 between T1 and T2 (during R3 setup table population). When does Rec1 get into BW?
    Rec 2 entered in R3 between T2 and T3 (after setup table population but before initialization loads). When does Rec 2 get into BW?
    Rec 3 entered in R3 between T3 and T4 (during initial loads in BW). When does Rec 3 get into BW?
    Rec 4 entered in R3 after initialization is completed after T4. When does Rec 4 get into BW?
    Timeline:
    T1 – T2: setup tables populated (Rec 1 entered here)
    T2 – T3: after setup table population, before the init load (Rec 2 entered here)
    T3 – T4: initial loads performed in BW (Rec 3 entered here)
    T4 onwards: delta loads performed (Rec 4 entered here)

    Hi,
    Scenario 1: Rec 1 entered in R/3 between T1 and T2 (during R/3 setup table population). When does Rec 1 get into BW?
    During the setup table filling, the V3 jobs will be descheduled, and you need to lock all the users (so that no user can post documents).
    Rec 2 entered in R/3 between T2 and T3 (after setup table population but before the initialization load). When does Rec 2 get into BW?
    Here only the setup table has been filled; the init has not yet been performed, so the V3 jobs are still descheduled and the postings that were/are made sit in the application tables.
    Rec 3 entered in R/3 between T3 and T4 (during the initial loads in BW). When does Rec 3 get into BW?
    Here the init is running; this is downtime for R/3 due to the heavy load processing to BW, so the V3 jobs are still descheduled and the postings that were/are made still sit in the application tables.
    Rec 4 entered in R/3 after the initialization is completed, after T4. When does Rec 4 get into BW?
    After the init we schedule the V3 jobs again, so all the postings (those posted earlier and those posted from now on) are transferred from the application tables to the delta queue, depending on the update mode (direct/queued/unserialized).
    Hope this helps!
    cheers,
    Swapna.G

  • Deployment share: local or DFS? What is best for performance and failures? And what about multicast in relation to DFS?

    Hi,
    We use MDT 2013 on MS Server 2012 Standard and deploy LTIs. We have only one (virtual) MDT server, and it has only one deployment share. I'm having a discussion with my colleague about where to put the deployment share. He says we should put it on our file servers (which use DFS) because it's safer and more dynamic (I have no idea what he means by that). But to me it seems quite strange to separate an application from its main folder structure; my feeling is that the two need to be as close together as possible, for instance to avoid network problems. It's also a fact that our MDT server is backed up every night, so if something bad happens we just ask our colleagues to restore the latest snapshot. Can someone explain which option is better, and why: local or DFS?
    A second question, related to the first: I've read on this forum that multicast is not possible when the deployment share is not local on the server. Is that correct? Any more information about this would be welcome too. While awaiting your answers I'll keep looking for more clues everywhere.
    thanks,
    Paul

    In addition to what is mentioned here:
    MDT works great with DFS. However, note that MDT is *NOT* supported with domain-based DFS, only stand-alone DFS, because domain DFS implies that the client is joined to a domain when you make the call, and MDT needs to run within WinPE, where there is no domain infrastructure.
    Multicast should be on the same machine as your deployment share, wherever that may be. If you use DFS, your deployment share would be the root.
    Another idea is to use DFS to replicate out to your DFS leaf nodes, with each leaf node being its own WDS server and multicasting being done from there.
    Keith Garner - Principal Consultant [owner] -
    http://DeploymentLive.com

  • Efficiency - How to load what you need.

    Hello,
    I am pondering the scenario of using object-oriented programming and filling a report efficiently. I am hoping someone could provide insight or direct me to existing patterns.
    Let's say I want to create a report of all employees. Great, my first thought is to have an Employee class and an Employee container: load all the employee objects with all n attributes, then loop through to display.
    What if I only want a select few of the employees in the report? I could loop through all of them as above, looking for the correct criteria; however, I would much rather let the database do that. So what is a good way to specify the criteria for loading the employees?
    Another question: what if I only want 2 of the n attributes? How can I get around loading all the attributes of the Employee, keeping in mind that I don't want to know anything about the database at the business-object level?
    This is really only part of the problem. I'm sure everyone runs into it, so there must be best practices and/or patterns to help.
    Thanks,
    Tom

    Note: I didn't actually run this against a database to verify that it works. The only verification I did was to print the generated SQL statement to the console:
    bash-2.03$ java Example
    SELECT * FROM person WHERE name = bob AND age >= 21
    (You may want to modify it to put a ';' at the end of the query if necessary, and to quote or parameterize string values before running it against a real database.)
    This is the interface that defines the key to the Map:
    package db;

    public interface Attribute {
        String getField();
        String getOperator();
        String construct(String value);
        int hashCode();
        boolean equals(Object o);
    }

    Very flexible implementation of an attribute -- it should be able to apply to whatever you want:
    package db;

    public class GenericAttribute implements Attribute {
        private String field;
        private String operator;

        public GenericAttribute(String field) {
            this(field, "=");
        }

        public GenericAttribute(String field, String operator) {
            this.field = field;
            this.operator = operator;
        }

        public String getField() {
            return this.field;
        }

        public String getOperator() {
            return this.operator;
        }

        public String construct(String value) {
            return this.field + " " + this.operator + " " + value;
        }

        public int hashCode() {
            return field.hashCode() + operator.hashCode();
        }

        public boolean equals(Object o) {
            if (o instanceof GenericAttribute) {
                GenericAttribute attrib = (GenericAttribute) o;
                return this.field.equals(attrib.getField()) && this.operator.equals(attrib.getOperator());
            }
            return false;
        }
    }

    Pieces all of the parts together:
    package db;

    import java.util.Collection;
    import java.util.Iterator;
    import java.util.Map;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public abstract class SQLComposer {
        protected abstract String getTable();

        protected String getFields() {
            return "*";
        }

        protected abstract Collection getResults(ResultSet rs) throws SQLException;

        public String compose(Map attributes) {
            StringBuffer sb = new StringBuffer("SELECT ");
            sb.append(getFields() + " FROM " + getTable());
            Iterator iter = attributes.keySet().iterator();
            boolean first = true;
            while (iter.hasNext()) {
                Attribute attrib = (Attribute) iter.next();
                if (first) {
                    sb.append(" WHERE ");
                    first = false;
                } else {
                    sb.append(" AND ");
                }
                sb.append(attrib.construct((String) attributes.get(attrib)));
            }
            return sb.toString();
        }
    }

    Example code using the composer and the attribute:
    import db.Attribute;
    import db.GenericAttribute;
    import db.SQLComposer;

    import java.util.ArrayList;
    import java.util.Collection;
    import java.util.HashMap;
    import java.util.Map;
    import java.sql.Connection;
    import java.sql.Statement;
    import java.sql.SQLException;
    import java.sql.ResultSet;

    public class Example extends SQLComposer {
        public static final Attribute NAME  = new GenericAttribute("name", "=");
        public static final Attribute AGE   = new GenericAttribute("age", ">=");
        public static final Attribute TITLE = new GenericAttribute("title", "LIKE");

        private String table = null;

        public Example(String table) {
            this.table = table;
        }

        protected String getTable() {
            return this.table;
        }

        public Connection getConnection() {
            // do JDBC direct connection or DataSource lookup
            return null;
        }

        public Collection process(Map attributes) throws SQLException {
            Connection conn = getConnection();
            Statement s = conn.createStatement();
            return getResults(s.executeQuery(compose(attributes)));
        }

        protected Collection getResults(ResultSet rs) throws SQLException {
            Collection results = new ArrayList();
            while (rs.next()) {
                // parse out info and append to collection
            }
            return results;
        }

        public static void main(String[] args) {
            Example e = new Example("person");
            Map map = new HashMap();
            map.put(Example.NAME, "bob");
            map.put(Example.AGE, "21");
            // print the generated SQL -- this is what the console output above shows
            System.out.println(e.compose(map));
            // once getConnection() returns a real connection, run the query:
            // try {
            //     e.process(map);
            // } catch (SQLException sqle) {
            //     sqle.printStackTrace();
            //     System.exit(1);
            // }
        }
    }
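    One caveat with the sketch above, tying back to the question about loading only 2 of the n attributes: compose() concatenates values straight into the SQL string, and getFields() always returns "*". A minimal alternative, assuming a plain JDBC connection and an illustrative employee table (the table and column names here are made up), is to let a parameterized query do both the filtering and the column projection:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;

    // Sketch only: the "employee" table and its columns are hypothetical.
    public class EmployeeReportLoader {

        // Return just the two columns the report needs, filtered in the database.
        public List<String[]> loadNamesAndTitles(Connection conn, int minAge) throws SQLException {
            String sql = "SELECT name, title FROM employee WHERE age >= ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setInt(1, minAge);                 // value is bound, not concatenated
                try (ResultSet rs = ps.executeQuery()) {
                    List<String[]> rows = new ArrayList<>();
                    while (rs.next()) {
                        rows.add(new String[] { rs.getString("name"), rs.getString("title") });
                    }
                    return rows;                      // only the columns the report displays
                }
            }
        }
    }

    The database does the filtering, and only the attributes the report actually shows cross the wire; the Attribute/SQLComposer idea above can still sit on top to decide which criteria and columns go into the statement.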

  • I just got a used ipad version 6.1.3.  I have an ipod classic version 1.1.2PC and I have a laptop with windows 7 and use internet explorer.   somehow i have icloud on my ipad but no app for it. I cannot explain what I need --not enuf text--2 b continued

    I have never used this -- on my iPad it says I have an account with iCloud and that I can sync all my devices, but there is no app for iCloud on my iPad.
    I have never used my iPod myself, that is, to put material on it. A friend did it, and I registered with Apple, and I did have iTunes and a list of my stuff on my old PC, but it is gone. I tried to get iTunes on the PC I have -- no success -- 5+ attempts yesterday. Today I tried to link my iPod to iCloud. Under the Settings heading I do not have a Software choice. I do not know what to do. Can someone help me? I am not very PC literate, so you might get frustrated. Thank you.

    Hi, thanks for the suggestion. I have tried as you suggested, and when opening the "Purchased" apps some have the iCloud logo next to them, but I only have "OPEN" against "Find My iPhone". When I open it, it goes through the same routine: it needs to be updated before proceeding, and it wouldn't update because I don't have iOS 8.
    Is there anything else I could try, or am I doomed?
    All of your help is much appreciated, thanks

  • An update for an app is not loading, but I need to use the app even without the update. How do I cancel the update? Right now it is on hold.

    I use an app, and today there was an update, so I tapped it to update. It is giving some error and it is not possible to load it -- maybe the server is too busy? So the app is on hold... waiting...
    In any case, I need to use the app. But I cannot get it to open. How do I go about canceling the update?

    If there's no important data, delete and re-install.

  • Last member based on date in Performance point Scorecard

    Hello,
    I have a scorecard in PerformancePoint with a Month-Year date parameter. We have data for only a few months, such as March, Aug and Dec 2014. The date drop-down parameter lists the last 12 month-year values, like May 2014 - Apr 2015. Now when a user selects June 2014, I need to display the data for March 2014, as we don't have data for June 2014. If the user selects March 2015, then we need to display the Dec 2014 data, as we do have data for Dec 2014.
    How can I do that in PerformancePoint? Do I have to write the logic in the KPI custom MDX, or do I have to write the logic in the connection formula in the PerformancePoint dashboard? What logic do I need to write?
    Thank you!
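    Whichever place the logic ends up (the KPI MDX or the connection formula), the fallback rule itself is simple: from the months that actually have data, take the latest one that is not after the selected month. A minimal sketch of that rule in plain Java, with hypothetical YearMonth values standing in for the dimension members, looks like this:

    import java.time.YearMonth;
    import java.util.List;
    import java.util.Optional;

    // Sketch of the fallback rule only; in the real dashboard this would be
    // expressed in MDX or in the connection formula, not in Java.
    public class LastMonthWithData {

        // Months that actually contain data (hypothetical example values).
        static final List<YearMonth> MONTHS_WITH_DATA = List.of(
                YearMonth.of(2014, 3), YearMonth.of(2014, 8), YearMonth.of(2014, 12));

        // Latest month with data that is not after the selected month.
        static Optional<YearMonth> resolve(YearMonth selected) {
            return MONTHS_WITH_DATA.stream()
                    .filter(m -> !m.isAfter(selected))
                    .max(YearMonth::compareTo);
        }

        public static void main(String[] args) {
            System.out.println(resolve(YearMonth.of(2014, 6)));  // Optional[2014-03]
            System.out.println(resolve(YearMonth.of(2015, 3)));  // Optional[2014-12]
        }
    }

    In the cube this would typically be expressed with MDX set functions over the date hierarchy; the exact expression depends on the cube's member names, so treat the Java above purely as a statement of the rule.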

    Hi Satish,
    Have you tried the following:
    Create a Month1 object as =Month([Month]) and a [Total] object as =Sum([Measure]).
    Then drag both the Month1 and Total objects into the report block.
    Regards,
    Yuvraj

  • Performance during data load

    Hello Friends,
    I am having a query regarding performance in BW 3.5.
    While loading data from R3 to BW we have 4 options:
    Only PSA
    Only Data Target
    PSA and later Data Target
    PSA and Data Target in parallel
    From a system performance point of view, which is the best option, using the fewest system resources, and how?
    Your help is appreciated.
    Thanks
    Tony

    Hi,
    From a performance point of view, "PSA and later Data Target" will be the better option.
    For more info, check this link:
    http://help.sap.com/saphelp_nw04/Helpdata/EN/80/1a6567e07211d2acb80000e829fbfe/frameset.htm
    Regards,
    shikha

  • Data Loader On Demand Proxy Usage for Resume operation

    Hi,
    My project requires me to use the proxy feature available in the Data Loader R19 release.
    I can use the proxy at the command line for insert/update operations.
    However, the same doesn't work for the RESUME operation in Data Loader.
    I tried using the proxy settings from the command line as well as from the property file, but to no avail.
    Any suggestions?
    Regards,
    Sumeet

    It's a Java application, so it may run on your Linux/Unix system; you would have to test to see if it works. Last time I checked, Oracle only supports the application running on Windows.
