COPA Query

Hello,
I have a query regarding COPA. One of my clients wants to implement COPA for 2 company codes out of the company's 9, and all 9 company codes come under a single controlling area.
So my queries are:
     1. Is it possible to implement COPA for only 2 company codes?
     2. If it is possible, what precautions should be taken, and what challenges might be faced?
Thanks in advance for the help.

Hi Hari Gore,
On the face of it, it is not possible to implement COPA for only two company codes in your case.
The Operating Concern is the next level above the Controlling Area. We assign a CO Area to an Operating Concern, but not company codes directly. (You can take a similar example with plants and controlling areas: you assign plants to a company code and company codes to a controlling area; we don't assign plants directly to a controlling area.)
But please explain: when you say all 9 company codes are under the same controlling area, what is the problem with going for COPA for all 9 company codes? I would like to know why your client is looking at only 2 company codes. Apart from technical considerations, such as a higher number of posted line items, I don't see any reason for, or difference between, going for 2 company codes and going for 9.
Please revert for further discussion.
Srikanth Munnaluri
Edited by: Srikanth Munnaluri on Mar 4, 2009 11:30 PM

Similar Messages

  • COPA Query - Inventory Adj & Derivation of field for Variance

    Hello,
    I have some COPA-related queries; could you please share your experience on how the following are treated?
    1. Since everything in COPA is based on cost of sales, but variances and inventory are at cost of production, how is this adjusted in COPA reporting?
    2. In the case of variance settlement, how are the sales order and customer derived, given that these were not available at the time of settlement of the production order?
    Thanks,
    Harry

    Hi Harry
    Q1 - I can't understand the question; please elaborate.
    Q2 - The sales order is updated in the COPA document for variances only if it is MTO, i.e. Make-To-Order production. For MTS, i.e. Make-To-Stock, no customer number would be in the COPA document, and that's standard!
    br, Ajay M

  • Currency Conversion terminates the query execution

    Hi friends,
    When I execute a query, it terminates due to the following error in currency conversion: "Abort System error in program CL_RSR_RRK0_CURR and form FILL_CUDIM_02-04 (see".
    The conversion works fine and gives correct values in the target currency when the source currency is, say, Thai baht, Malaysian ringgit, or Philippine peso, but when I use Zimbabwean or Kenyan source currencies I get this error and the query terminates.
    Can anyone suggest what the problem could be, and how I should approach it systematically?
    Thanks
    Raj

    Hi Serjio
    All the mentioned currencies are available in the TCURR table, including the exchange rates from those currencies to USD.
    Hi Sven,
    I am sorry, I really did not understand what you mean by "set the flag for protocol to X"; could you throw more light on this? I tried debugging the program RSW_CUR_TRANSLATION_RATE_GET step by step in the past, but it leads into lots of other programs and I was really lost in them. Please guide me on how to debug it effectively.
    Hi Serjio/Sven,
    Further, I missed mentioning the following details in my post. We have two versions of this COPA query, one older and one newer. Both were created before we upgraded from 3.1C to NW2004s.
    The older version works fine: the conversion from source to target currencies is correct for the African currencies as well as the Asia-Pacific (AP) currencies. The newer version, however, throws the error message mentioned in my post for African currencies, though it works fine for Asia-Pacific currencies. Even the newer version worked fine before we upgraded from 3.1C to NW2004s; it is only after the upgrade that we are encountering this problem.
    In the query we also convert the data to SGD (Singapore dollar), besides USD. As I do not need this data, I have hidden the SGD values, and for that reason we have never maintained conversion rates/factors from any of the AP or African currencies to SGD. Due to these missing rates I get an information message asking me to maintain the conversion rates/factors for AP/African currencies to SGD; this message appears along with the error message (mentioned in my post) on the same screen. The same information message also appears for the older version of my query, which currently gives correct dollar values, so I think we can ignore it. Or do you think these missing rates are somehow responsible for the trouble?
    But more surprisingly, why is the older version not impacted by the upgrade while the newer one is?
    Please share your views.
    Thanks
    Raj

  • AD get records of security log..

    Hello everyone :)
    I need to get some records (by event ID, time, etc.) from the security log.
    adler_steven answered my previous post (http://forum.java.sun.com/thread.jspa?threadID=5292943&messageID=10238354#10238354)
    and said to look at http://forum.java.sun.com/thread.jspa?threadID=5116320&tstart=15
    I have to use the WMI HTTP Mapper and some WBEM library...
    OK, I installed and configured the WMI HTTP Mapper and used the source below to try to get the security log, but it doesn't work:
    the connection succeeds, but retrieving the information fails.
    adler_steven :) help me :)
    /*
     * EXHIBIT A - Sun Industry Standards Source License
     *
     * "The contents of this file are subject to the Sun Industry
     * Standards Source License Version 1.2 (the "License");
     * You may not use this file except in compliance with the
     * License. You may obtain a copy of the
     * License at http://wbemservices.sourceforge.net/license.html
     *
     * Software distributed under the License is distributed on
     * an "AS IS" basis, WITHOUT WARRANTY OF ANY KIND, either
     * express or implied. See the License for the specific
     * language governing rights and limitations under the License.
     *
     * The Original Code is WBEM Services.
     * The Initial Developer of the Original Code is:
     * Sun Microsystems, Inc.
     * Portions created by: Sun Microsystems, Inc.
     * are Copyright (c) 2001 Sun Microsystems, Inc.
     * All Rights Reserved.
     *
     * Contributor(s): _______________________________________
     */
    import java.util.Enumeration;
    import javax.wbem.cim.*;
    import javax.wbem.client.*;

    /*
     * This class performs a CIMClient.execQuery using a WQL query string
     * that is passed on the command line. If a query isn't passed on the
     * command line, the user will be prompted for the query.
     */
    public class TestQuery {
        public TestQuery(String args[]) {
            String serverName = "win2003";
            String user = "administrator";
            String pass = "welcome1";
            CIMClient cimClient = null;
            CIMObjectPath cimPath = null;
            String wbemNameSpace = "root/CIMV2";
            int wbemPortNumber = 5988;
            String wbemProtocol = CIMClient.CIM_XML;
            try {
                System.out.println("connecting..\n");
                String hostURL = "http://" + serverName + ":" + wbemPortNumber;
                CIMNameSpace cimNameSpace = new CIMNameSpace(hostURL, wbemNameSpace);
                UserPrincipal userName = new UserPrincipal(user);
                PasswordCredential userPass = new PasswordCredential(pass);
                cimClient = new CIMClient(cimNameSpace, userName, userPass, wbemProtocol);
            } catch (CIMException e) {
                System.err.println("Failed to access CIMOM: " + e);
            }
            try {
                System.out.println("get win32_ntlogevent");
                cimPath = new CIMObjectPath("Win32_NTLogEvent");
                System.out.println("cimPath");
                Enumeration e = cimClient.enumerateInstances(cimPath); // this line hangs
                System.out.println("Enumeration");
                if (e.hasMoreElements()) {
                    CIMInstance ci = (CIMInstance) e.nextElement();
                    // I think there must be properties of the Win32_NTLogEvent
                    // class here, such as Message, EventID, EventCode...
                    CIMProperty cp = ci.getProperty("Message");
                    System.out.println("   Message: " + cp.getValue());
                }
                System.out.println("stop get win32..");
                cimClient.close();
            } catch (NullPointerException e) {
                System.err.println("Null Pointer Exception: " + e);
            } catch (CIMException e) {
                System.err.println("Failed to enumerate WBEM Info: " + e);
            }
        }

        public static void main(String args[]) {
            new TestQuery(args);
        }
    }
    Edited by: Jeqpbl4 on Jun 9, 2008 4:24 AM

    I figure I've abused enough people today on the forum, that it's time to redeem myself.
    Firstly, as I've always admitted, I'm not a Java developer, so there may be better ways of doing this. Secondly, I'm not an expert on WBEM/WMI, so I'm not sure of the different classes, methods or properties that WBEM exposes. I think I recommended some references in the links mentioned in this post, so if you want to dig deeper, go read those.
    This is just a quick and dirty WBEM query that retrieves the security events. One thing I discovered is that if you have lots of events, you'll get a heap overflow exception. I guess there may be a way to retrieve pages of results, otherwise use a more refined query to return a smaller number of records.
    /*
     * WBEMQueryLog: retrieve the entries from the security log of a server,
     * demonstrating the use of a WBEM query.
     */
    import java.util.*;
    import javax.wbem.cim.*;
    import javax.wbem.client.CIMClient;
    import javax.wbem.client.UserPrincipal;
    import javax.wbem.client.PasswordCredential;

    public class wbemquerylog {
        public static void main(String args[]) throws CIMException {
            CIMClient cc = null;
            String hostname = "myServer";
            String nameSpace = "root/CIMV2";
            int portNumber = 5988;
            String hostURL = "http://" + hostname + ":" + portNumber;
            String protocol = CIMClient.CIM_XML;
            try {
                CIMNameSpace cns = new CIMNameSpace(hostURL, nameSpace);
                UserPrincipal username = new UserPrincipal("myServer/Administrator");
                PasswordCredential password = new PasswordCredential("XXXXXX");
                cc = new CIMClient(cns, username, password, protocol);
            } catch (CIMException e) {
                System.err.println("Failed to access CIMOM: " + e);
                System.exit(1);
            }
            // let's try to get the Security Log entries, using a query
            try {
                CIMObjectPath cop = new CIMObjectPath();
                String query = "Select * from Win32_NTLogEvent where Logfile='Security'";
                Enumeration e = cc.execQuery(cop, query, CIMClient.WQL);
                for (int i = 1; e.hasMoreElements(); i++) {
                    System.out.println("Event: " + i);
                    System.out.println(e.nextElement());
                }
            } catch (CIMException e) {
                System.err.println("Failed to query security log: " + e);
                System.exit(1);
            }
            System.exit(0);
        }
    }
    If you want to retrieve specific Security Log events, you could construct a more complex query, such as the one below, which will find Account Logon Failures:
    String query = "Select * from Win32_NTLogEvent where Logfile='Security' And EventCode = '681'";
    You could also use an enumeration as you have done; the only thing I haven't bothered to work out is how to enumerate the Security log itself, rather than every event in all the logs. I guess it's just a matter of working out what the CIM path is, if it is at all possible.
    /*
     * WBEMEnumLog: enumerate the NT event logs from a server.
     * Should find out the full CIM path for the security logs.
     */
    import java.util.*;
    import javax.wbem.cim.*;
    import javax.wbem.client.CIMClient;
    import javax.wbem.client.UserPrincipal;
    import javax.wbem.client.PasswordCredential;

    public class wbemenumlog {
        public static void main(String args[]) throws CIMException {
            CIMClient cc = null;
            CIMObjectPath cop = null;
            String hostname = "myServer";
            String nameSpace = "root/CIMV2";
            int portNumber = 5988;
            String hostURL = "http://" + hostname + ":" + portNumber;
            String protocol = CIMClient.CIM_XML;
            try {
                CIMNameSpace cns = new CIMNameSpace(hostURL, nameSpace);
                UserPrincipal username = new UserPrincipal("myServer/Administrator");
                PasswordCredential password = new PasswordCredential("XXXXXX");
                cc = new CIMClient(cns, username, password, protocol);
            } catch (CIMException e) {
                System.err.println("Failed to access CIMOM: " + e);
                System.exit(1);
            }
            cop = new CIMObjectPath();
            try {
                Enumeration en = cc.enumNameSpace(cop, true);
                if (en != null) {
                    while (en.hasMoreElements()) {
                        CIMObjectPath obj = (CIMObjectPath) en.nextElement();
                        System.out.println("Namespace: " + obj.toString());
                    }
                }
            } catch (CIMException e) {
                System.err.println("Failed to enumerate namespace: " + e);
                System.exit(1);
            }
            // let's try to get the event logs
            try {
                cop = new CIMObjectPath("Win32_NTLogEvent");
                System.out.println("Host: " + cop.getHost());
                System.out.println("Object Name: " + cop.getObjectName());
                String attrs[] = {"Logfile", "Sourcename", "EventIdentifier", "EventType",
                                  "TimeGenerated", "Type", "CategoryString", "User"};
                Enumeration e = cc.enumerateInstances(cop, false, false, false, false, attrs);
                for (int i = 1; e.hasMoreElements(); i++) {
                    System.out.println("Event: " + i);
                    System.out.println(e.nextElement());
                }
            } catch (CIMException e) {
                System.err.println("Failed to enumerate Event Log: " + e);
                System.exit(1);
            }
            System.exit(0);
        }
    }
    Good luck....

  • Bex query and tranport of copies

    Hi All,
    I would like to transport some BEx queries from the <b>production</b> system to the <b>development</b> system. In other words, I would like to copy the prod queries to the development system.
    It seems that this is possible with a <b>transport of copies</b> process.
    Can you please give me the procedure, or explain the way to do it?
    Thank you in advance
    Message was edited by:
            Pontoise Pontoise

    Hi,
          you can transport the queries via the transport connection.
    There you will find the option "BEx requests and development classes" for transporting query requests.

  • COPA extraction: RD155 No source could be found to answer the search query

    Hi Everyone,
    I'm trying to schedule an init load for the new fiscal year from an account-based 1_CO_PA** extractor. I've entered restrictions for Controlling Area (a single entry) and Fiscal Period (PERIO) 001.2011 - 001.2025.
    When I try to schedule the init load, or even test the extraction in RSA3, I get the error message "No source could be found to answer the search query". I've checked my summarization levels, and they have the value '*' for all characteristics.
    If someone can provide an explanation for the error message, it would be most helpful.
    Thanks,
    Ram

    Hi Ram,
    This error is almost always caused by incomplete or incorrect customizing.
    In addition, the summarization level must have status 'ACTIVE' (this applies also to the search function in the maintenance transaction for CO-PA DataSources, KEB0).
    Please also review SAP Note 392635, which is release-independent and has further information on the customizing and CO-PA extraction:
    392635 - Information: Sources with BW extraction for account-based CO-PA
    So please check your customizing and Note 392635 carefully. In particular, pay attention to the following:
    There has to be one (single) summarization level that contains all characteristics (except for the controlling area) with '*' in the characteristics view in transaction KEDV, and it must have status 'active'.
    After checking the above, you need to replicate the DataSource in the connected BW system and reactivate the transfer rules.
    Best Regards,
    Des

  • Complex query cannot cope with shared memory

    Hi All,
    SELECT s.sessionid,
      s.requestid,
      s.locumid,
      s.sessiondate,
      s.sessionstart,
      s.sessionend,
      s.status,
      decode(type,   '1',   'Surgeries',   '2',   'Surgeries and Visits',   '3',   'Surgeries and On Call',   '4',   'Surgeries, On call and Visits',   '5',   'On Call',   '6',   'Visits',   '0',   'Type not listed',   type)
    FROM sessions s,
      locumdetails l,
      locumrequest lr
    WHERE l.locumid = s.locumid
    AND lr.locumrequestid = s.requestid;
    See the error below:
    ERROR at line 1:
    ORA-04031: unable to allocate 4096 bytes of shared memory ("large pool","unknown object","hash-join .
    I am presently on 9i. Do I need to increase the shared memory size, and if so, where? Or what is the problem? Help.
    What do I do in such a situation, please?
    cube60
    Message was edited by:
    cube60

    Thanks Ranga
    Re: strange startup problem
    The database in question was set up as a shared server, not a dedicated server. So, when trying to increase the shared pool size in OEM, it would not shut down because it was running on a shared memory service. I therefore had to change the TNSnames.ora (and .bak) entry under that TNS name, switching it to a dedicated server, and it then accepted the new memory increase quite happily.
    I added (SERVER = dedicated) in the code below:
    EDIS =
      (DESCRIPTION =
        (ADDRESS_LIST =
          (ADDRESS = (PROTOCOL = TCP)(HOST = fulham)(PORT = 1521))
        )
        (CONNECT_DATA =
          (SERVER = dedicated)
          (SERVICE_NAME = edis)
        )
      )
    and it was fine.
    Thanks for the reference.
    cube60
    Message was edited by:
    cube60

  • Query Help Please

    Hi... having problems with a query.  Any assistance would be
    much appreciated.
    Two queries with identical columns: Villages_Query_1 and Villages_Query_2.
    Both have these columns: Village_ID, Village_Name, Player_ID.
    I need to find all records in Villages_Query_2 where the Village_IDs match but the Player_IDs have changed.
    Example Village_Query_1:
        Village_ID   Village_Name   Player_ID
        1            Houston        1
        2            Dallas         2
        3            Chicago        3
    Example Village_Query_2:
        Village_ID   Village_Name   Player_ID
        1            Houston        1
        2            Phoenix        4
        3            Chicago        3
        4            New York       5
    In this case, Village_ID = 2 has changed name (Dallas to Phoenix) and its Player_ID has changed (2 to 4).  In addition, a new record was added.
    The eventual output I need is to be able to report the following:
    Player 2 village "Dallas" was taken by Player 4 and renamed "Phoenix".
    New York is a new village owned by Player 5.
    How the heck do I do this??  I have been trying query after query... reading about query of queries and JOINS and and and... I am now completely confused.
    Help appreciated.
    Mark

    Well... firstly... you do not use MS Access for that volume of data.  Plain and simple.  MS Access is for DBs like "My CD collection".  It's a desktop application, and is not intended to be used other than as a desktop application.
    Part of the reason for it not being appropriate for the job is that it can't do things like bulk loading data, which is kinda what you're wanting to do here.  That aside, it's a single-user file-based DB which is simply not designed to work as the back-end for a web application (or any sort of serious application).
    Anyway, I would approach this by putting all the data from the CSV files into the DB as is.  Then on the DB run a query which gets all your changes.  You're really going to struggle with the suggestions here to use valueList() to generate a list that is then used for a NOT IN(#list here#), because you're likely to have a mighty long list there.  Even proper DBs like Oracle only allow 2000 entries in a list like that (SQL Server is about the same, from memory), so I doubt QoQ will allow even that.  The reason the DBs put limits on these things is that doing a WHERE IN (#list#) is a really poorly-performing process.
    If you've got all your data in the DB, then your query becomes pretty easy, and I'm sure even Access could cope with it.  It'd be something like this:
    SELECT VB.village_id, VB.village_name AS village_old_name, VB.player_id AS player_old_id,
    VU.village_id AS village_new_id, VU.village_name AS village_new_name, VU.player_id AS player_new_id
    FROM villages_base VB
    RIGHT OUTER JOIN villages_updates VU
    ON VB.village_id = VU.village_id
    WHERE VB.village_name <> VU.village_name
    OR VB.village_id IS NULL
    (The OR VB.village_id IS NULL is needed so that brand-new villages, where the base-side columns are NULL and the <> comparison would otherwise evaluate to unknown, still show up.)
    (that's untested and I only gave it about 1min thought before typing it in, so don't quote me on that!)
    Where VILLAGE_BASE is your original data, and VILLAGE_UPDATES is the data that indicates the changes.  I'm kinda guessing that this is the sort of thing you want.  Note: the "new" villages will be the ones which have NULLs for the village_id, village_old_name and player_old_id.
    Getting all the data into the DB is going to be a matter of looping over the CSV file and doing an INSERT for each row.  And that will take as long as it takes, so you might need to get some control over your request timeouts.  However doing these inserts will take less time than all the QoQ logic suggested before, so you might be OK.  And the query should be quick.
    What happens to the data once the report is written?  Does the "updated" data become the "live" data?  If so, after you run your report you're gonna want to do something like a TRUNCATE on villages_base, and INSERT all the records from villages_update into it (then TRUNCATE villages_update, ready for the next time you need to run this process).  Although don't take my word for it here, as I'm guessing your requirement here ;-)
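    To make that truncate-and-reload housekeeping concrete, here's a minimal SQL sketch (assuming the villages_base and villages_updates table names used above; TRUNCATE syntax varies by DB, and in Access you'd use DELETE instead):

    ```sql
    -- After the report has been generated:
    TRUNCATE TABLE villages_base;        -- clear the old "live" data

    INSERT INTO villages_base
    SELECT * FROM villages_updates;      -- promote the updates to "live"

    TRUNCATE TABLE villages_updates;     -- ready for the next CSV load
    ```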
    Adam

  • How to hide a single value node hierarchy in the query designer

    Hi,
    I would like to perform a calculation  in the infocube. It concern  a count "Nb of different customers" with the following hierarchy : Country / region / City.
    This KF would be read at the same time by city, region and country, and  certain customers exist in 2 different region, and I have to count them only once by country.
    To do this, I added a new record (at the country level when loading infocube) to deduct a number of customers in doubloons.
    This line added contain only a country code and the number of cutomers in doubloons (- 23 when I have 23 cutomers in doublons) with revrese sign.
    Now, I would like to use this value (-23) to have a right Nb of different customers by country, without showing the not assigned line in the report hierarchy.
    How can I hide, in the Query Designer or the WAD, a single-value node of the hierarchy ("not assigned") while still taking the value (-23) into account at the country level?
    Is there another, simpler solution to achieve this in the InfoCube?
    For information:
    1) The InfoCube is loaded once per month, and the KF "number of different customers" is also calculated per month.
    2) I can't do this in the report (performance problem with a calculated KF).
    Thanks.

    Hi,
    the short answer is "you can't"; the longer one: the ADF Faces tree component is stamped for better performance. This stamping means that nodes are not rendered as individual JSF objects but as copies of an existing component (just with a changed value). For this reason the nodes don't issue their own query for data and thus cannot be refreshed individually.
    Frank

  • Same query, same dataset, same ddl setup, but wildly different explain plan

    Hello o fountains of oracle knowledge!
    We have a problem that caused us a full stop when rolling out a new version of our system to a customer and a whole Sunday to boot.
    The scenario is as follows:
    1. A previous-version database schema
    2. The current version database schema
    3. A migration script to migrate the old schema to the new
    So we perform the following migration:
    1. Export the previous version database schema
    2. Import into a new schema called schema_old
    3. Create a new schema called schema_new
    4. Run migration script which creates objects, copies data, creates indexes etc etc in schema_new
    The migration runs fine in all environments (development, test and production)
    In our development and test environments performance is stellar, on the customer production server the performance is terrible.
    This using the exact same export file (from the production environment) and performing the exact same steps with the exact same migration script.
    Database version is 10.2.0.1.0 EE on all databases. OS is Microsoft Windows Server 2003 EE SP2 on all servers.
    The system is not in any sense under a heavy load (we have tested with no other load than ourselves).
    Looking at the explain plan for a query that is run frequently and does not use bind variables we see wildly different explain plans.
    The explain plan cost on our development and test servers is estimated to *7* for this query and there are no full table scans.
    On the production server the cost is *8433* and there are two full table scans of which one is on the largest table.
    We have tried to run analyse on all objects with very little effect. The plan changed very slightly, but still includes the two full table scans on the problem server and the cost is still the same.
    All tables and indexes are identical (including storage options), created from the same migration script.
    I am currently at a loss for where to look. What can be causing this? I assume it could be caused by some parameter set on the server, but I don't know what to look for.
    I would be very grateful for any pointers.
    Thanks,
    Håkon

    Thank you for your answer.
    We collected statistics only after we determined that the production server was not behaving according to expectations.
    In this case we used TOAD and the tool within to collect statistics for all objects. We used 'Analyze' and 'Compute Statistics' options.
    I am not an expert, so sorry if this is too naive an approach.
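    One thing worth checking: on 10g the cost-based optimizer is built around DBMS_STATS rather than the old ANALYZE command, so regathering statistics with DBMS_STATS may bring the plans back in line. A sketch (the schema and table names are taken from this thread; adjust to your environment):

    ```sql
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname          => 'SCHEMA_NEW',                   -- assumed schema name
        tabname          => 'WEB_SCORECARD',
        estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
        method_opt       => 'FOR ALL COLUMNS SIZE AUTO',    -- let Oracle decide on histograms
        cascade          => TRUE);                          -- gather index statistics too
    END;
    /
    ```

    Repeat for the other tables in the query (SCORE_RESULT, PRODUCT), or use DBMS_STATS.GATHER_SCHEMA_STATS to cover the whole schema in one call.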
    Here is the query:

    SELECT count(0)
    FROM score_result sr, web_scorecard sc, product p
    WHERE sr.score_final_decision like 'VENT%'
    AND sc.CREDIT_APPLICATION_ID = sr.CREDIT_APPLICATION_ID
    AND sc.application_complete = 'Y'
    AND p.product = sc.web_product
    AND p.inactive_product = '2';

    I use this as an example, but the problem exists for virtually all queries.
    The output from the 'good' server:
    | Id  | Operation                      | Name                  | Rows  | Bytes | Cost (%CPU)|
    |   0 | SELECT STATEMENT               |                       |     1 |    39 |     7   (0)|
    |   1 |  SORT AGGREGATE                |                       |     1 |    39 |            |
    |   2 |   NESTED LOOPS                 |                       |     1 |    39 |     7   (0)|
    |   3 |    NESTED LOOPS                |                       |     1 |    30 |     6   (0)|
    |   4 |     TABLE ACCESS BY INDEX ROWID| SCORE_RESULT          |     1 |    17 |     4   (0)|
    |   5 |      INDEX RANGE SCAN          | SR_FINAL_DECISION_IDX |     1 |       |     3   (0)|
    |   6 |     TABLE ACCESS BY INDEX ROWID| WEB_SCORECARD         |     1 |    13 |     2   (0)|
    |   7 |      INDEX UNIQUE SCAN         | WEB_SCORECARD_PK      |     1 |       |     1   (0)|
    |   8 |    TABLE ACCESS BY INDEX ROWID | PRODUCT               |     1 |     9 |     1   (0)|
    |   9 |     INDEX UNIQUE SCAN          | PK_PRODUCT            |     1 |       |     0   (0)|
    ---------------------------------------------------------------------------------------------
    The output from the 'bad' server:
    | Id  | Operation                 | Name                  | Rows  | Bytes | Cost (%CPU)|
    |   0 | SELECT STATEMENT          |                       |     1 |    32 |  8344   (3)|
    |   1 |  SORT AGGREGATE           |                       |     1 |    32 |            |
    |   2 |   HASH JOIN               |                       | 10887 |   340K|  8344   (3)|
    |   3 |    TABLE ACCESS FULL      | PRODUCT               |     6 |    42 |     3   (0)|
    |   4 |    HASH JOIN              |                       | 34381 |   839K|  8340   (3)|
    |   5 |     VIEW                  | index$_join$_001      | 34381 |   503K|  2193   (3)|
    |   6 |      HASH JOIN            |                       |       |       |            |
    |   7 |       INDEX RANGE SCAN    | SR_FINAL_DECISION_IDX | 34381 |   503K|   280   (3)|
    |   8 |       INDEX FAST FULL SCAN| SCORE_RESULT_PK       | 34381 |   503K|  1371   (2)|
    |   9 |     TABLE ACCESS FULL     | WEB_SCORECARD         |   489K|  4782K|  6137   (4)|
    ----------------------------------------------------------------------------------------
    I hope the formatting makes this readable.
    Stats (from SQL Developer), good table:

    NUM_ROWS     489716
    BLOCKS     27198
    AVG_ROW_LEN     312
    SAMPLE_SIZE     489716
    LAST_ANALYZED     15.12.2009
    LAST_ANALYZED_SINCE     15.12.2009

    Stats (from SQL Developer), bad table:

    NUM_ROWS     489716
    BLOCKS     27199
    AVG_ROW_LEN     395
    SAMPLE_SIZE     489716
    LAST_ANALYZED     17.12.2009
    LAST_ANALYZED_SINCE     17.12.2009

    I'm unsure what would cause the difference in average row length.
    I could obviously try to tune our SQL statements to work better on the misbehaving server, but I would rather understand why the plans are different and make sure that we can expect similar behaviour between environments.
    Thank you again for trying to help me.
    Håkon
    Edited by: ergates on 17.des.2009 05:57
    Edited by: ergates on 17.des.2009 06:02

  • Error in select query

    hi experts,
    For my Smart Form, this is the driver program I have written.
    When executed, I get an error message in the third SELECT query, which is marked in bold here.
    The error occurs at <b>WHERE objek = gt_resb-charg</b>
    The error says that both operands must have the same length and type.
    Could you please suggest how to correct this?
    *                  Tables
    TABLES:vekp,      "Handling Unit - Header Table
           vevw,      "Where-Used List for Handling Units
           afvc,      "Operation within an order
           resb,      "Reservation/dependent requirements
           makt,      "Material Descriptions
           ausp.      "Characteristic Values
    TYPES:BEGIN OF ty_vekp,
          exidv    TYPE vekp-exidv,
          handle   TYPE vekp-handle,
          vpobjkey TYPE vekp-vpobjkey,
          END OF ty_vekp.
    TYPES:BEGIN OF ty_vevw,
          objkey   TYPE vevw-objkey,
          handle   TYPE vevw-handle,
          END OF ty_vevw.
    TYPES:BEGIN OF ty_afvc,
          objnr    TYPE afvc-objnr,
          ltxa1    TYPE afvc-ltxa1,
          aufpl    TYPE afvc-aufpl,
          aplzl    TYPE afvc-aplzl,
          END OF ty_afvc.
    *                  Internal tables Declaration
    DATA: gt_resb TYPE STANDARD TABLE OF resb ,
           gt_makt TYPE STANDARD TABLE OF makt ,
           gt_ausp TYPE STANDARD TABLE OF ausp .
    *                  Work area Declaration
    DATA:wa_vekp      TYPE ty_vekp,
         wa_vevw      TYPE ty_vevw,
         wa_afvc      TYPE ty_afvc.
    *                        Variables                                   *
    DATA : lf_fmname          TYPE rs38l_fnam.
    CONSTANTS: c_fmname(25) TYPE c VALUE  'ZWM_HANDLING_UNIT_LABEL'.
    *                    S e l e c t – O p t i o n s                     *
    *                        P a r a m e t e r s                         *
    SELECTION-SCREEN BEGIN OF BLOCK block1 WITH FRAME TITLE TEXT-001.
    PARAMETERS: p_exidv  LIKE vekp-exidv,         "Handling Unit Number
                p_objkey LIKE vevw-objkey,        "Process Order Number
                p_ltxa1  LIKE afvc-ltxa1,         "Phase Description
                p_copies(3) TYPE N.               "NO of Copies to be print
    SELECTION-SCREEN END OF BLOCK block1.
    *                  AT SELECTION-SCREEN
    AT SELECTION-SCREEN.
    IF p_exidv IS NOT INITIAL.
    SELECT single exidv
                  handle
                  vpobjkey
                  FROM vekp
                  INTO wa_vekp
                  WHERE exidv = p_exidv.
    ENDIF.
    IF p_objkey IS NOT INITIAL.
    SELECT single objkey
                  FROM vevw
                  INTO wa_vevw
                  WHERE handle = wa_vekp-handle
                  AND objkey   = wa_vekp-vpobjkey
                  AND objkey   = p_objkey.
    ENDIF.
    IF p_ltxa1   IS NOT INITIAL.
    SELECT single objnr
                  ltxa1
                  aufpl
                  aplzl
                  FROM afvc
                  INTO wa_afvc
                  WHERE objnr = wa_vevw-objkey
                  AND ltxa1   = p_ltxa1  .
    ENDIF.
    *                  START-OF-SELECTION
    START-OF-SELECTION.
    SELECT * FROM resb
             INTO TABLE gt_resb
             WHERE objnr = wa_afvc-objnr
             AND   aufpl = wa_afvc-aufpl
             AND   aplzl = wa_afvc-aplzl.
    IF NOT gt_resb[] IS INITIAL.
    SELECT * FROM makt
                  INTO TABLE gt_makt
                  FOR ALL ENTRIES IN gt_resb
                  WHERE matnr = gt_resb-matnr.
    ENDIF.
    <b>IF NOT gt_resb[] IS INITIAL.
    SELECT * FROM ausp
             INTO TABLE gt_ausp
             FOR ALL ENTRIES IN gt_resb
             WHERE objek = gt_resb-charg
             AND atinn IN ('LOBM_VFDAT','LOBM_QNDAT').
          </b> ENDIF.
      CALL FUNCTION 'SSF_FUNCTION_MODULE_NAME'
        EXPORTING
          FORMNAME                = c_fmname
        IMPORTING
         FM_NAME                  = lf_fmname
       EXCEPTIONS
         NO_FORM                  = 1
         NO_FUNCTION_MODULE       = 2
         OTHERS                   = 3.
    regards,
    siri.

    IF NOT gt_resb[] IS INITIAL.
    SELECT * FROM ausp
    INTO TABLE gt_ausp
    FOR ALL ENTRIES IN gt_resb
    WHERE <b>objek = gt_resb-charg</b>
    AND atinn IN ('LOBM_VFDAT','LOBM_QNDAT').
    OBJEK has length 50 and CHARG has length 10; that is why your query gives the error. In a FOR ALL ENTRIES condition, the comparison fields must have the same type and length.
    Declare gt_resb with the field typed like AUSP-OBJEK:
    DATA: BEGIN OF gt_resb OCCURS 0,
            charg LIKE ausp-objek,   " typed like OBJEK so the comparison works
            " ... all other fields ...
          END OF gt_resb.
    Regards,

  • RC 8 error while transporting the query from Dev to Qa - Element missing

    Hi All,
    I have made a MultiProvider by copying an existing one.
    I have copied one of the BEx queries using RSZC from the source to the target MultiProvider.
    While transporting the query from Dev to QA I got the following error. (I had already moved the MultiProvider in a previous TR.)
    Has anyone seen a similar issue before? Any inputs will be highly appreciated.
    Start of the after-import method RS_ELEM_AFTER_IMPORT for object type(s) ELEM (Activation Mode)     
    Error when activating element 4LUSZK561QN74UDBJ7BPU9GFM                                             
    Element 4LUSZJXHIS1HM7TVDD9DK7HPU is missing in version M                                           
    End of after import methode RS_ELEM_AFTER_IMPORT (Activation Mode) - runtime: 00:00:06                                                                               
    Start of the after-import method RS_ELEM_AFTER_IMPORT for object type(s) ELEM (Delete Mode)         
    End of after import methode RS_ELEM_AFTER_IMPORT (Delete Mode) - runtime: 00:00:14                  
    Errors occurred during post-handling RS_AFTER_IMPORT for ELEM L                                     
    RS_AFTER_IMPORT belongs to package RS                                                               
    The errors affect the following components:                                                         
       BW-WHM (Warehouse Management)                                                                    
    Post-import method RS_AFTER_IMPORT completed for ELEM L, date and time: 20110517070045              
    Post-import methods of change/transport request BDAK959603 completed                                
         Start of subsequent processing ... 20110517070025                                              
         End of subsequent processing... 20110517070045                                            
    Kind regards,

    Hi Jith,
    First of all, try to consolidate all your objects in a single TR and then move it from D to Q.
    Check the list of objects in your TRs: if TR3 contains all the objects that were in TR1 and TR2, then transporting TR3 alone will work.
    You can also find information on the elements missing from TR1/TR2 as follows:
    1. Go to your transport logs and then to the entries marked as Error.
    2. There you will find the query element IDs; copy them.
    3. Now go to table RSZELTDIR (SE16) and enter those query element IDs.
    Now you will be able to find the elements that you missed in your TRs. Hope this helps.

  • Posting Period coming into the rows at the query runtime & couldn't drop it

    Hi All,
    We have a query on an InfoCube that we have copied to a MultiProvider containing two cubes, exact copies of the original cubes.
    The two cubes are one for previous years and one for the current year.
    Now, the query has been copied from the original cube to the MultiCube.
    The query looks exactly the same in the Query Designer. But when it is run in the Analyzer or on the web, Posting Period comes into the rows automatically, and in the Analyzer, when I double-click on Posting Period to remove it from the report, it doesn't go away. Similarly on the web, I do not have an option 'Remove Drilldown from Rows' against Posting Period. I have identified Posting Period in both cubes in the MultiProvider configuration.
    Thanks in advance for any suggestions that you have.
    Best Regards,
    - Shashi

    Hmm, now we have something interesting...
    This means that you have two structures defined in your query, right?
    Is your query cell-based? It looks to me that this is the case: you have elements created using the posting period, so it cannot be moved to the free characteristics and must be included in the drilldown.
    Remove the posting period again and perform a check before saving the query; this may reveal the element defined with your posting period.
    Hope this helps...
    Olivier.

  • Derivation Of partner profit center in COPA (Direct Posting from FI)

    When posting from SD, the partner profit center is derived and transferred to COPA, so eliminations can take place.
    Whenever we post from SD, the partner profit center comes from a derivation rule which is exclusively for sales and is maintained in transaction code 8KER. But whenever we make a direct posting in FI, data is transferred to COPA before the profit center is derived, as the system does not find any data from COPA; in other words, there is no derivation rule, table lookup or user exit in COPA from which we can derive the partner profit center.
    For testing purposes I made a derivation rule in COPA in KEDR with the same logic used in 8KER, and the partner profit center comes into COPA from this derivation rule. The derivation logic used in 8KER is very simple: the source field is KUNNR from table KNA1 and the target field is the partner profit center PPRCTR from table CEPC.
    I mean that I derive the partner profit center from the customer number in transaction 8KER, which is for SALES only and in EC-PCA only.
    Now, when I make a direct posting from FI, for example using transaction code F-02, the partner profit center does not come through. I made another derivation rule in KEDR in COPA, maintained the rule values there, and succeeded in deriving the partner profit center.
    But my query is: if I am maintaining rule values in 8KER, why should I maintain the same duplicate data in KEDR? Can we use the same rule values from 8KER for direct postings from FI?
    I have tried a table lookup and a derivation rule in KEDR, but without success.
    Please tell me how, without maintaining rule values in KEDR (as these are already maintained in 8KER), I can derive the partner profit center for FI postings. In other words: how can we get both a PCA and a COPA document with the partner profit center in the case of FI direct postings?
    Thanks in advance.
    With best Regards
    Sandeep Seth

    Dear Paolo,
    We have a live system, and we also have Cost Center Accounting active.
    Do you know how to activate automatic posting to the profitability segment through the cost center?
    For example, I post a document in FB60 and assign a cost center against a G/L expense account; if I assign a profitability segment, the amount is reflected in COPA. But I want the profitability segment to be activated automatically when I assign the required cost center.
    I hope you understand my query!
    Regards
    MuR!!!!

  • Changing value fields in COPA

    Hello,
    I want to change:
    - the text description of 2 value fields in COPA;
    - the assignment of these value fields to value field group Z002 (in transaction KEVFG), by linking them to value field group Z001 instead.
    These 2 value fields have not been used for 2 years. Can this change have a big impact in the production system? Do I need to regenerate the operating concern after these changes?
    Thanks in advance
    Barondra BERGET

    Hi,
    If you want to change the description of a characteristic or value field, you must change the descriptions of the related data element.
    Therefore you need to perform the following steps:
    1. In KEA5 / KEA6, type in the name of the characteristic / value field and click on the create / change button.
    2. On the next screen, double-click on the data element of your characteristic / value field.
    3. On the 'Display Data Element' screen, click on the tab strip "Field label". You can see the four field labels: short, medium, long and heading. You should change all four field labels and then regenerate the data element. After having done this you may have to regenerate the characteristic / value field and the operating concern.
    I hope this answers your query. If so, please close this message by pressing the confirm button; else do not hesitate to get back for more clarifications.
    Kind Regards,
    Abhisek Patnaik,
