FI Queries

Hi FI Gurus,
Can anyone reply to the queries below? We are on ECC 6.0.
Separate Balance Sheets - Can we get separate balance sheets for four divisions without quadrupling our vendor and customer databases?  In other words, can a single vendor (customer) be associated with more than one AP (AR) control account based on some other criteria such as warehouse location?
Inter-company Cash - Assuming that divisional balance sheets are possible (i.e., the answer to the above is yes), how are inter-company cash transactions handled?  Example:  Division one cuts a single check for an expense that affects multiple divisions.
GL Account Definitions - Currently we have the option to manage by warehouse or by product group.  Can we manage by both?  The goal is to drive the general ledger both by division and by product group.  (COGS, Revenues, Inventory)
Regards,
L Srikanthan.

Q1/ Yes. However, you can only do this if you define the divisions as Profit Centers and you choose to use the new G/L.
Q2/ This is standard: cash comes in for multiple Profit Centers, and the allocation then breaks out the Profit Center split for the single payment.
Q3/ I presume it is possible; it all depends on how you define your G/L accounts, Profit Centers, and so on. Also look into CO-PA.
Please award points if this is useful.

Similar Messages

  • Transport of changes to queries and CKFs & RKFs

    Hi Gurus,
    I have a MultiCube and a number of queries that have already been transported to HIQ.  I have made significant changes to these queries in HID, i.e. changes to CKFs and RKFs, including description changes, as well as deleting some KFs and adding new ones.
    I have tried to transport to HIQ by collecting these specific queries and everything below them, and putting the MultiCube in a separate transport with only the necessary objects.
    I transported the cube first and then all 25 queries in one transport.
    My problem is that in HIQ I have RKFs, CKFs, and queries that I have deleted or renamed in HID.  How do I get both environments back in sync?
    Thanks for your help.
    Laura

    Hi,
    1) Do I need a separate transport for each query ?
    No, you can capture multiple queries in one transport.
    2) Do I include the Multiprovider in one of these transports ? and/or the Cubes in the MultiCube ?
    No, there is no need to include the MultiProvider or the cubes.
    -Mayuri

  • Observing poor performance on the execution of the queries

    I am executing a relatively simple query which takes roughly 48-50 seconds to execute. Can someone suggest an alternate way to query the semantic model so that we can achieve a response time of a second or under? Here is the query:
    PREFIX bp:<http://www.biopax.org/release/biopax-level3.owl#>
    PREFIX rdf:<http://www.w3.org/1999/02/22-rdf-syntax-ns#>
    PREFIX rdfs:<http://www.w3.org/2000/01/rdf-schema#>
    PREFIX ORACLE_SEM_FS_NS:<http://oracle.com/semtech#dop=24,RESULT_CACHE,leading(t0,t1,t2)>
    SELECT distinct ?entityId ?predicate ?object
    WHERE {
      ?entityId rdf:type bp:Gene .
      ?entityId bp:name ?x .
      ?entityId bp:displayName ?y .
      ?entityId ?predicate ?object .
      FILTER(regex(?x, "GB035698", "i") || regex(?y, "GB035698", "i"))
    }
    The same query executed from SQL Developer takes about as long:
    SELECT distinct /*+ parallel(24) */ subject, p, o
    FROM TABLE
      (sem_match ('{?subject rdf:type bp:Gene .
         ?subject bp:name ?x .
         ?subject bp:displayName ?y .
         ?subject ?p ?o
         filter (regex(?x, "GB035698", "i") || regex(?y, "GB035698", "i"))
       }',
       sem_models ('biopek'),
       null,
       sem_aliases
         (sem_alias
           ('bp',
            'http://www.biopax.org/release/biopax-level3.owl#')),
       null, null, null));
    Is there anything I am missing? Can we do anything to optimize our data retrieval?
    Best Regards,
    Ami

    For better performance when using a FILTER involving regular expressions, you may want to create a full-text index on the MDSYS.RDF_VALUE$ table, as described in:
    http://download.oracle.com/docs/cd/E11882_01/appdev.112/e11828/sdo_rdf_concepts.htm#CIHJCHBJ
    I am assuming that you are checking for case-insensitive occurrence of the string GB035698 in ?x or ?y. (On the other hand if you are checking if ?x or ?y is equal to a case-insensitive form of the string GB035698, then the filter could be written in an expanded form involving just value-equality checks and would not need a full-text index for performance.)
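    The distinction matters for indexing: a containment test needs a full-text index, while a case-insensitive equality test can be rewritten as a plain value comparison (e.g. comparing lowercased values). A rough Python sketch of the two checks, using an illustrative value:

```python
import re

# Hypothetical value bound to ?x in the query above.
x = "gene GB035698 variant"

# Containment (what regex(?x, "GB035698", "i") checks): the pattern may
# appear anywhere inside the value, so a whole-value index cannot help
# and a full-text index is needed.
contains = re.search("GB035698", x, re.IGNORECASE) is not None

# Case-insensitive equality: expandable into plain equality checks
# (e.g. lcase(?x) = "gb035698" in SPARQL) that a value index can serve.
equals = x.lower() == "gb035698"

print(contains, equals)  # True False
```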
    Thanks.

  • Some queries in GL

    Hi,
    Can anyone send me examples for the transactions below, and what type of accounts we can use for them?
    F-04 Post with clearing
    F-06 Incoming Payment
    F-07 Outgoing Payment
    F-05 Valuate Foreign currency
    What is the use of the ones below? Please explain:
    Manual Accruals
    Planning
    Rgds
    sunfico

    Incoming payments are used to receive payments from customers,
    and outgoing payments are used to make all payments to vendors, etc. Basically, they clear the open items.
    If we want to clear the open items in some other way, post with clearing will be used, for example for credit notes, debit notes, etc.
    Valuate foreign currency - Sometimes G/L accounts will be managed in a foreign currency; e.g., loans from foreign banks will be managed in USD only, so the liability will be in USD only. But for reporting purposes we need to convert it to the reporting currency at the prevailing exchange rate. This is facilitated by this transaction.
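    As a worked illustration of the foreign-currency valuation described above (all figures are made up): a USD liability carried at the rate in effect at posting is revalued at the period-end rate, and the difference is the unrealized gain or loss.

```python
# Hypothetical figures: a USD bank loan revalued into the reporting currency.
loan_usd = 100_000.0
rate_at_booking = 74.0   # exchange rate when the loan was posted
rate_at_close = 76.5     # prevailing rate on the valuation key date

carrying_value = loan_usd * rate_at_booking   # value currently on the books
revalued_value = loan_usd * rate_at_close     # value at the key-date rate

# A liability that grows in reporting currency is an unrealized loss.
unrealized_loss = revalued_value - carrying_value
print(unrealized_loss)  # 250000.0
```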
    Manual accruals - Period-end adjustments, such as interest income earned but not yet received, or salaries due but not yet paid, are posted through T-Code FBS1.
    Thanks
    Siva

  • Help required - basic question, please answer

    I want to know:
    1) How many Statements can I use in my JDBC code for querying a number of tables
    from one JSP?
    2) Do I require a new Statement and ResultSet for every query?
    3) Is there a performance issue if I create more than one Statement in
    one JSP file?
    thanks

    The only thing you want to make sure of when doing multiple queries is closing. You want to make sure you call the close() method on all of your ResultSet and Statement/PreparedStatement objects when finished. Since you want to close these objects whether there is a DB error or not, a finally clause is the best place for it.
    java.sql.Connection dbc = null;
    java.sql.Statement cursor = null;
    java.sql.ResultSet data = null;
    try {
        // ...create connection, do query, etc.
    } catch (SQLException e) {
        // ...error handling
    } finally {
        try {
            if (data != null)
                data.close();
        } catch (Exception e) { /* ignore */ }
        // ...do the same with each Statement and the Connection as well.
    }
    Make sure when doing queries in a loop that you close the previous Statement object before creating a new one with dbc.createStatement();
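    The same close-everything-even-on-error discipline exists outside JDBC. For illustration only, here is the analogous pattern in Python's sqlite3 module, where contextlib.closing guarantees close() is called even if a query raises:

```python
import sqlite3
from contextlib import closing

# closing() plays the role of the finally block: the connection and the
# cursor are closed whether the statements inside succeed or raise.
with closing(sqlite3.connect(":memory:")) as conn:
    with closing(conn.cursor()) as cur:
        cur.execute("CREATE TABLE t (x INTEGER)")
        cur.executemany("INSERT INTO t (x) VALUES (?)", [(1,), (2,), (3,)])
        cur.execute("SELECT SUM(x) FROM t")
        total = cur.fetchone()[0]

print(total)  # 6
```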

  • BPM Scenario Design Doubts

    Hi Expert,
    My source system (3rd party) calls an RFC service; based on some credentials, the RFC program produces a text document in zip format at a specified location.
    I need to send the response as well as the file (zip), and the size of the file will be approximately 50 MB. How do I design my scenario? Is BPM advisable?
    Anticipating your valuable inputs..
    Warm Regards

    Hi Asha,
      Could you explain your scenario more clearly? As I understood it, the third-party system hits XI, and XI communicates with the receiver system. In the receiver system we have the RFC. The RFC will give you only the message as the response. If you want to send the zip file, I guess we should first send it to one receiver system (a file system) and from there send it to the third-party system. Of course, we have to use BPM to achieve this.
      Hope I am clear. Please let me know if you have any queries.
    Thanks and Regards,
    Chandu

  • With what intention did Sony develop the Smart Headset and Live Hi-Fi Headset?

    Hi All, I tested the sound quality of the Smart Headset with the Xperia S and the Live Hi-Fi Headset with the Xperia Ray, and I noticed that there is no clear sound in either test. I don't know why Sony developed these headsets and made them so expensive; as a Sony user, one should get much better quality (bass with treble) compared to the original headset. I get good quality from the original headset that comes along with the phones.
    The developers of these headsets should answer my queries. Guys, if you are really getting good quality from these headsets, please tell me how. I will modify the tag line.

    There are many things that play a role here: phone model, bit rate, sample frequency, and the music player used.
    But one very important thing when it comes to the Smart Headset is the earplugs. I suggest that you try all sizes to find the one that suits you best and gives you the deepest bass, because in-ear headphones are all about enclosure.
     - Community Manager Sony Xperia Support Forum

  • Steps for the developer when the server is physically moved

    Hi All,
    We are physically moving our BW server from one place to another. The Basis team is taking care of the move itself. As a BW developer, what steps do I need to perform?
    For example, should we delete all delta queues and the data mart status? I need some complete guidelines on this. This is a highly critical project for our company.
    So please send me a link or document if you have done this already.
    Thanks in advance.
    Arun Thangaraj

    As Ajay said, all you should have to do is make sure you're not running batch jobs or loads when the server is brought down.  I think it would be prudent to clear the delta queues and have your load cycle finished; it shouldn't matter, but you never know what might go wrong.
    Obviously you need to think about user impacts, letting people know what is going on and when the system won't be available.
    If you have enabled some queries as web services, you may want to make sure you know what might be impacted.  It's possible others get data from your BW via a web service and don't realize it is even coming from BW.
    If you send or receive data from other systems, you need to coordinate to make sure none are adversely affected.
    If you receive deltas from non-R/3 systems, you might want to be sure that they will not have a problem if your BW system is down; e.g., if BW were down for more than a day, would the other system accumulate the deltas properly?

  • Is it possible to change the technical name of a Z-version query?

    Hi  Friends ,
    I have changed all queries and cubes from the standard version to the Z version.
    For the queries, I used RSZC to change them to the Z version. Now, is it
    possible to change the technical name of the Z-version query?
    If it is possible, can you please tell me the process?

    Hi Hameed,
    This may be the reason you are getting an error:
    the target InfoCube (the InfoCube for the query copies) must contain all the InfoObjects of the source InfoCube (the InfoCube of the original queries).
    Another reason may be that copying queries within the same cube shouldn't be done with RSZC;
    it's better to use the BEx Query Designer.
    Hope that is clear.
    Check the link below for more information:
    Re: How to copy query elements without Bex
    http://www.bi-expertonline.com/article.cfm?session=&id=2055
    Re: copy queries + variants + workbooks -- RSZC ?
    Regards,
    Ravi Kanth
    Edited by: Ravi kanth on Jun 10, 2009 10:03 AM

  • "Check Statistics" in the Performance tab. How to see SELECT statement?

    Hi,
    In a previous post on SDN, it was explained (see below) that "Check Statistics" in the Performance tab, under Manage in the context of a cube, executes the SELECT statement below.
    Would you happen to know how to see the SELECT statements that the "Check Statistics" command executes, as mentioned in the posting below?
    Thanks
    ====================================
    When you hit the Check Statistics tab, it isn't just the fact tables that are checked, but also all master data tables for all the InfoObjects (characteristics) that are in the cube's dimensions.
    It checks the number of rows inserted, last-analyzed dates, etc.
    SELECT
    T.TABLE_NAME, M.PARTITION_NAME, TO_CHAR (T.LAST_ANALYZED, 'YYYYMMDDHH24MISS'), T.NUM_ROWS,
    M.INSERTS, M.UPDATES, M.DELETES, M.TRUNCATED
    FROM
    USER_TABLES T LEFT OUTER JOIN USER_TAB_MODIFICATIONS M ON T.TABLE_NAME = M.TABLE_NAME
    WHERE
    T.TABLE_NAME = '/BI0/PWBS_ELEMT' AND M.PARTITION_NAME IS NULL
    When you refresh the stats, all the tables that need stats refreshed are refreshed again. Since InfoCube queries access the various master data tables, it makes sense that SAP would check their status.
    In looking at some of the results in 7.0, I'm not sure that the 30-day check is being done as it was in 3.5. This is one area SAP retooled quite a bit.
    Yellow only indicates that there could be a problem. You could have stale DB stats on a table, but if they don't cause the DB optimizer to choose a poor execution plan, then they have no impact.
    Good DB stats are vital to query performance, and old stats could be responsible for poor performance. I'm just saying that the Statistics check's yellow-light status is not a definitive indicator.
    If your DBA has BRCONNECT running daily, you really should not have to worry about stats collection on the BW side, except in cases immediately after large loads/deletes when the nightly BRCONNECT hasn't run yet.
    BRCONNECT should produce a log every time it runs, showing you all the tables that it determined should have stats refreshed. That might be worth a review. It should be running daily. If it is not being run, then you need to look at running stats collection from the BW side, either in Process Chains or via InfoCube automatisms.
    Your best bet is to use ST04 to get Explain Plans of a poorly running InfoCube query; it can then be reviewed to see where the time is being spent and whether stats are a culprit.

    Hi,
    Thanks, this is what I came up with:
    ST05:
    check SQL Trace, Activate Trace.
    Now, in RSA1,
    on cube Cube1:
    Manage, Performance tab, Check Statistics.
    Again, back in ST05:
    Deactivate Trace,
    then click on Display Trace.
    Now, in the trace display, after scanning through the output:
    "... how do I see the SELECT statements that the 'Check Statistics' command executes?"
    I will appreciate your help.

  • Sample select query needed

    Hi,
    the sample table can be considered as the scott.emp table:
      CREATE TABLE "EMP"
       (     "EMPNO" NUMBER(4,0),
         "ENAME" VARCHAR2(10 BYTE),
         "JOB" VARCHAR2(9 BYTE),
         "MGR" NUMBER(4,0),
         "HIREDATE" DATE,
         "SAL" NUMBER(7,2),
         "DEPTNO" NUMBER(2,0)
    -- INSERTING into EMP
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7369,'SMITH','CLERK',7902,to_date('17-DEC-80','DD-MON-RR'),800,20);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7499,'ALLEN','SALESMAN',7698,to_date('20-FEB-81','DD-MON-RR'),1600,30);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7521,'WARD','SALESMAN',7698,to_date('22-FEB-81','DD-MON-RR'),1250,30);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7566,'JONES','MANAGER',7839,to_date('02-APR-81','DD-MON-RR'),2975,20);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7654,'MARTIN','SALESMAN',7698,to_date('28-SEP-81','DD-MON-RR'),1250,30);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7698,'BLAKE','MANAGER',7839,to_date('01-MAY-81','DD-MON-RR'),2850,30);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7782,'CLARK','MANAGER',7839,to_date('09-JUN-81','DD-MON-RR'),2450,10);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7788,'SCOTT','ANALYST',7566,to_date('19-APR-87','DD-MON-RR'),3000,20);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7839,'KING','PRESIDENT',null,to_date('17-NOV-81','DD-MON-RR'),5000,10);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7844,'TURNER','SALESMAN',7698,to_date('08-SEP-81','DD-MON-RR'),1500,30);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7876,'ADAMS','CLERK',7788,to_date('23-MAY-87','DD-MON-RR'),1100,20);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7900,'JAMES','CLERK',7698,to_date('03-DEC-81','DD-MON-RR'),950,30);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7902,'FORD','ANALYST',7566,to_date('03-DEC-81','DD-MON-RR'),3000,20);
    Insert into EMP (EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,DEPTNO) values (7934,'MILLER','CLERK',7782,to_date('23-JAN-82','DD-MON-RR'),1300,10);
    In this we can see that we do not have any data for hiredates between '01-feb-81' and '20-feb-81'. But the select statement must return
    values for the days in between too, with ename and job as NULL; empno, mgr, and sal as 0; and deptno as 0 or any number.
    I have tried something and I got the count for a few columns. The query goes something like this;
    it works on a given date range. Both queries are given below.
    select rt.business_date,
    ( select count(ename) from emp  where trunc(hiredate) = trunc(rt.business_date) )  ename ,
    ( select count(empno) from emp  where trunc(hiredate) = trunc(rt.business_date) )  empno ,
    ( select count(job) from emp  where trunc(hiredate) = trunc(rt.business_date) )  job,
    ( select count(mgr) from emp  where trunc(hiredate) = trunc(rt.business_date) )  mgr ,
    ( select count(deptno) from emp  where trunc(hiredate) = trunc(rt.business_date) )  deptno
    FROM (select ((TO_DATE(p_startdate,'mm/dd/yyyy')-1)+rnm) as business_date from (select rownum rnm from user_objects)) rt
                 WHERE TRUNC(rt.business_date) BETWEEN TO_DATE(p_startdate,'mm/dd/yyyy') AND  TO_DATE(p_enddate,'mm/dd/yyyy')
    Sample output from the select statement has to be something like this:
    empno     ename      job     mgr     HIREDATE     SAL     DEPTNO     
    7369     SMITH     CLERK     7902     17-Dec-80     800     20     
    0     NULL     NULL     0     14-Feb-81     0     0     
    0     NULL     NULL     0     15-Feb-81     0     0     
    0     NULL     NULL     0     16-Feb-81     0     0     
    0     NULL     NULL     0     17-Feb-81     0     0     
    0     NULL     NULL     0     18-Feb-81     0     0     
    0     NULL     NULL     0     19-Feb-81     0     0     
    7499     ALLEN     SALESMAN     7698     20-Feb-81     1600     30     
    7521     WARD     SALESMAN     7698     22-Feb-81     1250     30     
    7566     JONES     MANAGER     7839     2-Apr-81     2975     20     
    7698     BLAKE     MANAGER     7839     1-May-81     2850     30     
    7782     CLARK     MANAGER     7839     9-Jun-81     2450     10     
    7844     TURNER     SALESMAN     7698     8-Sep-81     1500     30     
    7654     MARTIN     SALESMAN     7698     28-Sep-81     1250     30     
    7839     KING     PRESIDENT     (null)     17-Nov-81     5000     10     
    7900     JAMES     CLERK     7698     3-Dec-81     950     30     
    7902     FORD     ANALYST     7566     3-Dec-81     3000     20     
    7934     MILLER     CLERK     7782     23-Jan-82     1300     10     
    7788     SCOTT     ANALYST     7566     19-Apr-87     3000     20     
    7876     ADAMS     CLERK     7788     23-May-87     1100     20     

    Hi,
    You changed your first message.
    sri wrote:
    hi,
    I want the table data to be displayed as shown. In the queries we are getting the count of records for each date, but I want something different: for some dates there are no records, yet the select statement needs to display rows for every date in the given range.
    The output can be considered something like this:
    the select statement needs to display hiredate data for 14-20 Feb as shown below, and this we do not have in our base table emp.
    empno     ename      job     mgr     HIREDATE     SAL     DEPTNO     
    7369     SMITH     CLERK     7902     17-Dec-80     800     20     
    0     NULL     NULL     0     14-Feb-81     0     0     
    0     NULL     NULL     0     15-Feb-81     0     0     
    0     NULL     NULL     0     16-Feb-81     0     0     
    0     NULL     NULL     0     17-Feb-81     0     0     
    0     NULL     NULL     0     18-Feb-81     0     0     
    0     NULL     NULL     0     19-Feb-81     0     0     
    7499     ALLEN     SALESMAN     7698     20-Feb-81     1600     30     
    7521     WARD     SALESMAN     7698     22-Feb-81     1250     30     
    7566     JONES     MANAGER     7839     2-Apr-81     2975     20     
    7698     BLAKE     MANAGER     7839     1-May-81     2850     30     
    7782     CLARK     MANAGER     7839     9-Jun-81     2450     10     
    7844     TURNER     SALESMAN     7698     8-Sep-81     1500     30     
    7654     MARTIN     SALESMAN     7698     28-Sep-81     1250     30     
    7839     KING     PRESIDENT     (null)     17-Nov-81     5000     10     
    7900     JAMES     CLERK     7698     3-Dec-81     950     30     
    7902     FORD     ANALYST     7566     3-Dec-81     3000     20     
    7934     MILLER     CLERK     7782     23-Jan-82     1300     10     
    7788     SCOTT     ANALYST     7566     19-Apr-87     3000     20     
    7876     ADAMS     CLERK     7788     23-May-87     1100     20
    Use a FULL OUTER JOIN instead of LEFT OUTER JOIN, like this:
    WITH all_dates AS
    (
        SELECT  startdate + LEVEL - 1  AS dt
        FROM    (
                    SELECT  TO_DATE ('02/14/1981', 'MM/DD/YYYY')  AS startdate
                    ,       TO_DATE ('02/20/1981', 'MM/DD/YYYY')  AS enddate
                    FROM    dual
                )
        CONNECT BY  LEVEL <= 1 + enddate - startdate
    )
    SELECT    NVL (e.empno, 0)        AS empno
    ,         e.ename
    ,         NVL (a.dt, e.hiredate)  AS hiredate
    --  add other columns
    FROM              all_dates  a
    FULL OUTER JOIN   emp        e  ON  a.dt = e.hiredate
    ORDER BY  NVL (a.dt, e.hiredate)
    ;
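    The idea behind the join can be sketched outside SQL as well. A rough Python illustration (using a hypothetical two-row emp table, and a left join from a generated calendar rather than a full outer join, which would additionally keep hires outside the range): every day in the range appears, with 0/NULL placeholders for days without hires.

```python
from datetime import date, timedelta

# Hypothetical miniature of scott.emp: hiredate -> (empno, ename).
emp = {
    date(1981, 2, 13): (7499, "ALLEN"),
    date(1981, 2, 20): (7521, "WARD"),
}

# Generate every calendar day in the range, like the all_dates CTE.
start, end = date(1981, 2, 13), date(1981, 2, 20)
all_dates = [start + timedelta(days=d) for d in range((end - start).days + 1)]

# Days with no hire get empno 0 and ename None, mirroring the output asked for.
report = [emp.get(d, (0, None)) + (d,) for d in all_dates]

for empno, ename, d in report:
    print(empno, ename, d.isoformat())
```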

  • PR-PO analysis

    Dear All,
    Please find my requirement for a PR-PO analysis report.
    The report should be as below.
    Output columns:
    PR, PR ITEM NO, PR CREATED ON, PR CREATED BY, PR FINAL RELEASED DATE, PR QUANTITY, DELIVERY DATE, PO, PO CREATED ON, PO CREATED BY, PO FINAL RELEASED DATE, PO QTY, PO DELIVERY DATE, BALANCE TO BE ORDERED, GAP (NO. OF DAYS BETWEEN PR FINAL RELEASE DATE & PO FINAL RELEASE DATE), GR QTY, GR DATE, BALANCE GR QTY.
    Input fields:
    PR, PR ITEM, PR CREATED BY, PR FINAL RELEASE DATE, PO, PO LINE ITEM, PO CREATED BY, PO FINAL RELEASE DATE.
    Please provide your valuable inputs for the FS (i.e. input fields, output fields, logic).
    Regards,

    Hi,
    As per your requirement, for the FS for the PR-PO analysis you need to give the table name and the reference field name. You should specify whether it should be an ALV report or a module pool program; ask your ABAPer.
    Example, for the input fields:
    Table Name    Field Name  No. of Chars.  Data Type
    EBAN              EBELN          15                  EBELN
    Similarly, for the output format you should specify, for each output value, which table and which field to refer to.
    EBAN, EKKO, EKPO, EKBE, EKAN, etc. are all table names.
    The FS should always be crystal clear for the developers. Take the help of your developers.
    Hope this helps...
    If you have any queries, let me know.
    Regards,
    Smitha

  • Settings to publish the reports in EP

    Hi Guys
    Does anyone have any documents that show clear steps on how to publish queries in the Portal? I am new to this and need to know the settings to publish the reports in the portal. We are now using the BEx Analyzer. I will assign points.
    Regards

    Hi,
        you can publish your reports directly from the BEx Analyzer by clicking the globe button.
    You use the Broadcasting layout profile to display precalculated objects or online links that you have published in a KM folder using the BEx Broadcaster.
    This layout profile is tailored to the needs of users who work with business intelligence content in the portal. By default, it is preset in the KM navigation iViews BEx Portfolio and My Portfolio in the Business Explorer portal role.
    You can apply the layout profile to the KM folder that you want to use for your business intelligence content.
    The technical name of the layout profile is Broadcasting: ConsumerBroadcasting.
    Refer to this link
    http://help.sap.com/saphelp_nw04s/helpdata/en/bf/220c40ac368f5ce10000000a155106/frameset.htm

  • Merged dimension returns summarized values from 2nd query

    I have used a merged dimension to take fields from two queries. Both queries return 30 days of data, i.e. 30 records, each record corresponding to one day.
    So far so good.
    Now, when I run the queries individually I get 30 days of data as 30 rows in the Webi document, but when I merge the queries and take one column from the first query and a second column from the second query, the problem starts.
    For the field from query 1 I get 30 records, each record showing that day's data. The column from the second query also shows 30 records, but each record shows the summarized value for all 30 days, and all 30 records show the same data.
    For example I am taking 2 days of data:
         column1     column2
             10              20
             10              20
    But I need to get as below
         column1     column2
             10              10
             10              10
    Total   20              20
    Can anyone explain how I can overcome this and get correct data when I take values from both queries?
    Thanks for your time.
    Regards
    Siva

    Check this:
    Dave's Adventures in Business Intelligence » Using ForceMerge() To Fix Unbalanced Data Providers
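    The symptom described above can be reproduced outside Webi: when the second data provider comes back aggregated at a coarser grain, joining it to a daily table repeats its total on every row. A minimal Python sketch with hypothetical two-day numbers:

```python
# Daily values from query 1 (two days, 10 each).
q1 = {"2024-01-01": 10, "2024-01-02": 10}

# Query 2 returned one summarized figure instead of daily rows, so a
# join on day fans the 30-day-style total out to every q1 row.
q2_total = 20
wrong = [(day, v1, q2_total) for day, v1 in sorted(q1.items())]

# Correct merge: both providers must be at the same (daily) grain.
q2_daily = {"2024-01-01": 10, "2024-01-02": 10}
right = [(day, v1, q2_daily[day]) for day, v1 in sorted(q1.items())]

print(wrong)   # every row repeats the total 20
print(right)   # every row shows the matching daily value 10
```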

  • How to ensure that TDS does not get deducted for material POs

    Dear Seniors,
    A single vendor is supplying material and also performing some services.  How do we ensure that TDS does not get deducted for material POs?
    Regards
    KVKR

    Hi KVKR,
    As I see it, your requirement is to execute MRRL while performing proper TDS deduction, and also using a single vendor.
    There is no standard SAP solution for this; as a workaround you have two options:
    Option 1: Create a different vendor code, so you can run MRRL with one vendor code for services and another for materials.
    The demerit of this solution is the duplication of vendor codes, which means the SCM personnel do not have consolidated information for vendor evaluation and vendor reconciliation.
    Option 2:
    Enhance the purchase order at line-item level with a custom field for the withholding tax code, to be selected at the time of PO creation. For this you need a screen enhancement in the PO, and the selection popup should display only the WHT codes available in the vendor master, so that proper control over the tax code selection is enforced.
    The practical issue here is that the person creating the PO needs to know the correct TDS code, which you need to verify is feasible; alternatively, the finance controller could be part of the PO release strategy to check these POs and ensure the correct TDS code is populated.
    Further to this, you need an additional enhancement so that, for the TDS codes selected in the PO, the code is applied at the time of the MRRL transaction, with the base for the other TDS tax codes set to 0, or the inapplicable TDS codes removed, in the exit.
    With the help of an ABAPer you will be able to find the standard enhancements available for MRRL and code this there, so that it takes effect at posting time.
    Furthermore, while writing the logic you can streamline it so that for material POs no TDS is deducted at all; then user input is needed only for service POs, and you can make this distinction using the item category.
    With this you can achieve your purpose with minimal changes.
    Do let me know if you have any queries,
    Regards,
    Bharathi
