Query on FIAP: Transaction data

Hi BW friends,
We created a customized cube FIAP: Line Item (ZFIAP_C03). It is used to analyse information about payables (how much the company has to pay its vendors). The cube is best used for ageing analysis of payables, and the characteristics can be vendor, material, company code, etc.
There is one more customized cube, FIAP: Transaction Data (ZFIAP_C02). I need to know what we can analyse from this cube and what its use is.
The characteristics of this cube are:
Vendor
Company code
Currency type
Key figures are:
Cumulative balance
Total credit postings
Total debit postings
Sales for the period
Thank You

Thank you, Simon, for the information; it is very useful. I wish to know what kind of information the cube helps to view.
I would like to know what the cube FIAP: Transaction Data (ZFIAP_C02) enables us to view information about.
The characteristics in the cube FIAP: Transaction Data are:
Vendor
Company Code
Currency Type
Key figures are:
Sales for the Period         0SALES
Total Debit Postings         0DEBIT
Cumulative Balance           0BALANCE
Total Credit Postings        0CREDIT
For example, 0SD_C03 enables us to view information about sales and billing documents from R/3, such as:
dealer-wise/customer-wise monthly sales and sales returns
sales-representative-wise monthly sales and sales returns
division-wise sales and sales returns
In the same way, what does FIAP: Transaction Data enable us to view?

Similar Messages

  • How to get master data records that do not have transaction data in a query

    Hi,
    How can we get master data records that do not have transaction data into a query output? Can we create a query, or is there any other way, to get the master data records that have no transaction data?

    Hi,
    Create a MultiProvider which includes the transactional data target and the master data InfoObject. Make sure that the identification for this master data InfoObject is ticked on both providers.
    Create a report on this MultiProvider and keep the master data InfoObject in the rows; you should then be able to see all the values in the master data InfoObject, irrespective of whether any transaction happened or not.
    Next, you may create a condition showing only zero key figure values, i.e. master data without any transactions.
    Hope that helps.
    Regards
    Mr Kapadia
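    In database terms, what Mr Kapadia describes amounts to an outer join from the master data to the transaction data, keeping the members that have no matching facts. A minimal SQL sketch of the idea with made-up table and column names (a BW MultiProvider handles this for you; this shows only the concept, not the BW implementation):
    -- vendor_master / payables_fact are illustrative names only
    SELECT m.vendor_id
    FROM   vendor_master m
    LEFT JOIN payables_fact f
           ON f.vendor_id = m.vendor_id
    WHERE  f.vendor_id IS NULL;   -- master data with no transaction data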

  • Not able to Retrieve Transaction Data based on the property of master data

    Hi,
    I am trying to retrieve transaction data based on a property of the master data for ACCOUNT (property ACCTYPE = 'EXP')
    in BPC 10 for NetWeaver.
    The transaction data is present at the backend, but I am not getting any data in the internal table after running the RSDRI query.
    I am using this code.
    DATA: lt_sel TYPE uj0_t_sel,
          ls_sel TYPE uj0_s_sel.
    ls_sel-dimension = 'ACCOUNT'.
    ls_sel-attribute = 'ACCTYPE'.
    ls_sel-sign      = 'I'.
    ls_sel-option    = 'EQ'.
    ls_sel-low       = 'EXP'.
    APPEND ls_sel TO lt_sel.
    lo_query = cl_ujo_query_factory=>get_query_adapter(
                 i_appset_id = lv_environment_id
                 i_appl_id   = lv_application_id ).
    lo_query->run_rsdri_query(
      EXPORTING
        it_dim_name       = lt_dim_list       " BPC: Dimension List
        it_range          = lt_sel            " BPC: Selection condition
        if_check_security = abap_false        " BPC: Generic indicator
      IMPORTING
        et_data           = <lt_query_result>
        et_message        = lt_message ).
    Data comes through if I use the ID of ACCOUNT directly, e.g.:
    ls_sel-dimension = 'ACCOUNT'.
    ls_sel-attribute = 'ID'.
    ls_sel-sign      = 'I'.
    ls_sel-option    = 'EQ'.
    ls_sel-low       = 'PL110'.
    APPEND ls_sel TO lt_sel.
    So in that case data is returned, but it is not returned when I filter on the property.
    Please can you help me with this?
    Thanks,
    Rishi

    Hi Rishi,
    There are 2 steps you need to do:
    1. Read all the master data with the property you require into an internal table. In your case, use ACCTYPE = 'EXP'.
    2. Read the transaction data with the master data you just selected.
    Then you will get all your results.
    Andy
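    In relational terms, Andy's two-step approach is a semi-join: first collect the member IDs whose property matches, then restrict the transaction rows to those IDs. A minimal SQL sketch of that shape, with purely hypothetical table and column names (in BPC itself, step 2 would be the run_rsdri_query call above, fed one ID selection row per member found in step 1):
    -- account_master / transaction_fact are illustrative names only
    SELECT f.*
    FROM   transaction_fact f
    WHERE  f.account_id IN (SELECT m.account_id
                            FROM   account_master m
                            WHERE  m.acctype = 'EXP');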

  • How to get XLR to show BPs with no transaction data for a given date range

    Hi -
    I am building an XLR report that does a comparison of net sales data across two periods for a given sales employee's BPs.
    The report has the row expansion:
    FACT BPA(*) SLP(SlpName = "ASalesPersonNameHere") ARDT(Code = "ARCreditMemo", "Invoice") Group by BPA.CardName
    and column expansions:
    FIG(SO_TaxDate = @StartDate:@EndDate)
    and
    FIG(SO_TaxDate = @StartDate2:@EndDate2)
    where @StartDate, @EndDate, @StartDate2, @EndDate2 are parameters that define the two ranges of dates.
    The column formulas are, from left to right:
    =ixDimGet("BPA", "CardName")
    =ixGet("SO_DocTotal")      <-- filtered by column expansion for first date range
    =ixGet("SO_DocTotal")      <-- filtered by column expansion for second date range
    The report works fine except for one problem: I would also like it to include BPs for which no transaction occurred in either date range.
    Any help is greatly appreciated!
    Thanks,
    Lang Riley

    Really appreciate your feedback! Those are good suggestions. I should have mentioned that I had already tried both of them.
    Removing FACT on BPA in this case ends up returning all the BPs and not respecting the SLP(SlpName = "aName") part of the query.
    Using **, i.e. * or #NULL, makes no change in the resulting data in this case. I had thought that ** would be the solution, but it didn't change the outcome. I still have BPs that have no transactions in either date range and that do not appear when their sales employee is used as the filter.
    I should further mention that the IXL query, as it now stands, does return BPs for which one of the periods has no data, just not both, and I have verified that applicable BPs with no transaction data for both periods do exist in my data set. It seems that perhaps the IXL query needs to be restructured? Please keep the suggestions coming, including how this query might be restructured if necessary.

  • Problem in Flat file transaction data loading to BI7

    Hi Experts,
    I am new to BI7 and have a problem activating the DataSource for transaction data and creating a transfer routine to calculate sales revenue, which is not in the flat file; the flat file only has the price per unit and the quantity sold.
    Flat file fields are:
    Cust id   SrepID  Mat ID  Price per unit  Unit measure  Quantity   TR. Date
    cu100    dr01      mat01         50                 CS              5     19991001       
    The created InfoObjects are:
    IO_CUID
    IO_SRID
    IO_MATID
    IO_PRC
    IO_QTY
    IO_REV
    0CALDAY
    When I created IO_QTY, the unit/currency was given as 0UNIT. In the creation of the DataSource, under the Fields tab, an extra field CS was shown, and I was confused about which object I should map it to. At the same time, when I entered IO_QTY it prompted a dialog box with 0UNIT, so I said OK. Now I can't map CS to a unit, as the system automatically created 0UNIT when I entered IO_QTY. Could you please give me a solution for this, and the procedure to enter a formula for calculating sales revenue at this level instead of at query level? Your solutions will be appreciated with points.
    Awaiting your solutions.
    Thanks
    Shrinu

    Hi Sunil,
    Thanks for your quick response. I tried to assign the InfoObjects under the Fields tab in the creation of the DataSource. I have not reached the data target yet.
    Could you please answer my whole question?
    Would you be able to send some screenshots to my email id [email protected]?
    Thanks
    Shri

  • To get the transaction data..please help

    Hi all,
    I need to retrieve the transaction data, i.e. the deleted and updated data.
    I tried this query
    SELECT * FROM FLASHBACK_TRANSACTION_QUERY WHERE TABLE_OWNER='CFMSDEV'
    AND TABLE_NAME='M_REGIONS'
    but this is only giving the detail fields; it is not showing a field with what data was actually changed.
    Please help me.
    Thanks & regards

    Hi,
    To my knowledge this can be achieved only by Change Data Capture in 10g.
    Let us wait for others' opinions.
    Regards
    K.Rajkumar
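    For what it's worth, FLASHBACK_TRANSACTION_QUERY does expose the change itself through its OPERATION and UNDO_SQL columns (the SQL that would undo each row change). A minimal sketch against the poster's own table, assuming the undo data for the interval in question is still retained:
    -- OPERATION shows the DML type; UNDO_SQL shows the statement that would reverse it
    SELECT xid, commit_timestamp, operation, undo_sql
    FROM   flashback_transaction_query
    WHERE  table_owner = 'CFMSDEV'
    AND    table_name  = 'M_REGIONS'
    ORDER  BY commit_timestamp;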

  • Query on a transaction table breaks after running for sometime - Oracle 11g

    Hi,
    I'm running the below query to pull the sum of amount on the trans table (which has 12 years of data, partitioned yearly) for those whose last transaction date is 19-MAY-2013. The calendardatekey column is used in place of the transaction date (each day has a unique calendardatekey). The query runs for a couple of hours and then breaks. When I contacted the DBA, they said it is doing a full table scan, pulling a huge amount of data, doing a kind of sort, and then breaking. Could someone please help if there is a better approach? If it works out for one day, I will have to perform this for a few more days.
    Oracle version - 11g
    trans table: the columns customerkey, programid and amount are indexed.
    card table: the columns customerkey and status are indexed.
    select t.customerkey, t.programid, sum(t.amount), max(calendardatekey) from trans t
    where
    exists (select null from card c where t.customerkey = c.customerkey and c.status = 'A')  
    --and t.programid = 10
    group by t.customerkey, t.instprogkey
    having sum(t.amount) < -100
    and max(calendardatekey) =4888
    Thanks for your help!

    "and breaks" is totally devoid of actionable information.  Surely you got an actual error message (ora-nnnnn) that you could share with us?
    Sounds like your DBA is trying to avoid providing any assistance .....
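    As the reply says, the exact ORA- error and the execution plan are what the thread needs. A minimal way to capture the plan for posting (a sketch only; it assumes instprogkey in the posted GROUP BY was meant to be programid, so that the statement is valid):
    -- Generate the optimizer's plan for the problem query
    EXPLAIN PLAN FOR
    SELECT t.customerkey, t.programid, SUM(t.amount), MAX(t.calendardatekey)
    FROM   trans t
    WHERE  EXISTS (SELECT NULL FROM card c
                   WHERE  c.customerkey = t.customerkey
                   AND    c.status = 'A')
    GROUP  BY t.customerkey, t.programid
    HAVING SUM(t.amount) < -100
    AND    MAX(t.calendardatekey) = 4888;
    -- Display the plan so it can be posted along with the exact ORA- error
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);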

  • Bill Presentment Architecture, how to Overide Default Rule and ensure the AR Invoice/Transaction Chooses "Customer Transaction Data Source"

    Hi Intelligentsia,
    We are on 12.2.4 on Linux. I have set up an external template with the supplementary data source "Customer Transaction Data Source". When I test it, it fails and I am not able to debug it; however, when I run the BPA Transaction Print Program (Multiple Languages), it always gives me the default layout.
    My query is:
    How do I ensure the default rule does not apply to my invoice, so that I am able to override it?
    Is there a method to explicitly ensure the supplementary data source is "Customer Transaction Data Source" when I am creating the AR invoice/transaction? Am I missing some setup in the AR Invoice Transaction Flexfield where I need to set up this "Customer Transaction Data Source" as the DFF context?
    please let me know if you need any more information.
    Abdulrahman

    Hello,
    Thanks for the answer. When you say rule date, is that the rule creation date or the "Bill Creation From Date" that we set up while creating the rule? I created a new invoice after the rule was created, but it did not pick up the new custom template.
    I have another issue; it would be great if you could help. I have split my logo area into two vertically, to display the logo in one part and the legal entity and address in the other. In the online preview I can see the logo and legal address, but in the print preview I am not able to see them. It just shows a blank space. Any idea?
    Thanks in advance

  • JUMP (RSBBS) from a query on a transaction(local)

    Hi,
    Can You help me ?
    I want to jump from a query to the transaction RSDMD (on the local system). In particular, I want to pass as a parameter the data (the document number) with which I jumped.
    So how do I have to define the assignment details in RSBBS for the transaction RSDMD?
    Because, for the moment, I get a popup where I must fill in the name of my characteristic and then my document number (for example), and it is with the document number that I do my jump.
    But I don't want to fill in the characteristic and document number; I want it to happen automatically...
    Thank you for your help...
    Carine.

    Calling the RRI with a transaction or an ABAP/4 Report as the receiver is done with the RRI
    from the SAP NetWeaver Application Server. This is possible in an ERP system, a CRM
    system or within the BI system. The selections are prepared by the BI system that does not
    recognize the transaction or the report. The assignment is transferred from the RRI of the
    SAP NetWeaver Application Server using inverse transformation rules. There must also be a
    complete chain from the DataSource of the source system to the InfoSource, through
    transformations up to the InfoProvider. This does not mean that data absolutely has to be
    loaded using this chain. If this chain does not exist, the RRI cannot transfer the selections to
    the source system. Calling the RRI only works for fields with a dictionary reference. For ABAP reports, this means that the parameter has to be declared as PARAMETERS param LIKE <table_field>.
    For transactions, this means that the Dynpro has to have a dictionary reference. Not every
    transaction can be called with the RRI of the SAP Application Server.
    These are the steps to do this
    If you want to jump from a Web application to a transaction or ABAP/4 report using the
    RRI, an ITS for the target system has to be assigned beforehand.
    ● The value of the input field to be supplied must be known at the time of the jump (for
    example by entering a single value on the selection screen of the sender or by the
    cursor position at the time of the jump).
    ● Sender and receiver fields that correspond to one another generally must link to the
    same data element or at least to the same domain, otherwise the values cannot be
    assigned to one another.
    ● The assignment of sender and receiver fields must always be a 1:1 assignment. For
    example, the transactions called from the start screen cannot have two input fields of
    the same data type. Then it is not clear which of the fields is to be supplied, which
    means neither of them is supplied.
    ● There has to be a complete chain from the DataSource of the source system to the
    InfoSource, through update rules up to the target.
    Proceed as follows after you have created the sender-receiver assignment as described
    above.
    1. As the type, choose Table Field. The columns Field Name, Data Element, Domain and
    Set-/Get Parameter become input ready.
    2.  Specify the field name, data element, domain and parameter ID for the receiver
    transaction. You need to know this information because input help is not available. You
    can usually find the parameter ID in the ABAP dictionary entry for the data element.
    If this still does not make jumping to the transaction possible, it may be necessary to
    program a short ABAP start program. For more information on how to do this, see SAP Note
    383077.

  • Transactional data modification

    Hi all,
    I have one query. In our R/3 system, wrong data was entered, and that data was uploaded into BW by a transactional DataSource.
    How can I modify the transaction data to correct it? In R/3, too, it is not getting modified.
    Regards,
    Pravin.

    Hi,
    You can have the functional team correct the data for you on the R/3 side and then repull the data into BW. That's the recommended practice.
    Of course you can correct the data on the BW side, but then your source and BW data will be out of sync, which is not a good idea.
    Cheers,
    Kedar

  • Transactional Data Copy from In-Active version to Active version(APO)

    Hi
    I want to copy the transactional planning data from an inactive version to the active version, so that from the active version I can transfer that data to ECC.
    Kindly give your valuable inputs on my query.
    Regards
    Guru

    Try the transaction /SAPAPO/VERCOP (Copy Planning Version).
    Select whether you want to copy the entire version (master data as well as transaction data).
    You can also trigger parallel processing and restrict the copy.
    Restriction is possible via:
    Location Product                  
    Location                 
    Locatn Type              
    Production Plannr        
    SNP Plannr               
    Demnd Plannr             
    Transptn Plannr          
    Purch. Group

  • SAP Best Practice: Problems with Loading of Transaction Data

    Hi!
    I am about to implement SAP Best Practices scenario "B34: Accounts Receivable Analysis".
    Therefore I load the data from the SAP ERP IDES system into the SAP NetWeaver 2004s system.
    My problems are:
    When I try to load the transaction data for InfoSources 0FI_AR_4 and 0FI_AR_6, I get the following errors/warnings:
    When I start the "schedule process", the status stays "yellow 14:27:31 (194 from 0 records)".
    On the right side I see some actions that are also "yellow", e.g. "DataStore Activation (Change Log): not yet activated".
    As a result, I cannot see any data in tcode RSRT when executing the queries "0FIAR_C03/...".
    The problems there are:
    1) The input help of the query's web template doesn't contain any values
    2) No data is shown
    Can someone help me to solve this problem?
    Thank you very much!
    Jürgen

    Go to the monitor window where you got the issue below:
    when I start the "schedule process" the status stays "yellow 14:27:31 (194 from 0 records)"
    From the menu, choose Environment -> Transact. RFC -> In the Source System.
    Give the logon details, press Enter, and from there give the correct target destination as your BI server and execute.
    If you find some IDocs pending there, push them manually using F6, then come back to your load and refresh.
    If it still doesn't turn green, you can manually change the status to red in the Status tab, then go to the Processing tab, expand your processing details, right-click on the data packet which was not yet updated, and select manual update.
    It shows a busy status; when it comes out of that, refresh once again.
    Regards,

  • Hierarchies - How can transaction data be split across different levels

    Hi everyone,
    I have an issue splitting data across the different levels of a hierarchy; how can it be resolved?
    In detail:
    We are getting all transaction and master data in flat files.
    The master data is in the format: Global Parent ID, Global ID, Local ID, Company ID.
    The transaction data is in the format given below:
    Local ID, Company ID, Total Spend.
    To get the drill-down facility, we have added the hierarchy as Global Parent ID -> Global ID -> Local ID -> Company ID.
    It is not splitting across the different levels; the total spend is fixed at Local ID, Global ID and Parent ID.
    If we have a total spend of 500 in some company (say ABCL), it is the same everywhere, so the purpose of the drill-down is not served.
    The spend of 500 should be split across the different Local IDs, then across the different Global IDs, and so on at every level, finally giving the overall total.
    Any suggestions?
    It is a high-priority issue.

    If I understand the description and the query correctly, you are joining the budget, which has monthly values, with the view that has daily values. This means that you get one row in the budget for each row in the daily data. So when you sum, you count the same month many times...
    Thus, you need to aggregate before the join, as in this example:
    WITH oms_values AS (
       SELECT Marknad, Distrikt,
              SUM(CASE WHEN convert(char(6), [Posting Date], 112) <=
                            convert(char(6), getdate(), 112)
                  THEN [CY LC]*-1
                  ELSE 0
               END) AS "CYLC",
              SUM(CASE WHEN convert(char(6), [Posting Date], 112) <=
                            convert(char(6), getdate(), 112)
                  THEN [CY-1 LC]*-1
                  ELSE 0
               END) AS "CY1LC"
        FROM  Vy_omsattning_per_marknad_och_distrikt
        WHERE [Posting Date] > convert(char(4), getdate(), 112) + '0101'
        GROUP BY Marknad, Distrikt
    ), budget_values AS (
       SELECT Marknad, Distrikt,
              SUM(CASE WHEN Period <= convert(char(6), getdate(), 112)
                  THEN [Budget_LC]
                  ELSE 0
               END) AS "BUDGET"
       FROM   ManadsBudget
       WHERE  Period >= convert(char(4), getdate(), 112) + '01'
       GROUP BY Marknad, Distrikt
    )
    SELECT oms.Marknad, oms.Distrikt, oms.CYLC, oms.CY1LC, mb.BUDGET
    FROM   oms_values as oms
    JOIN   budget_values as mb ON mb.Marknad = oms.Marknad
                              AND mb.Distrikt = oms.Distrikt
    I had to guess what's in the Period column in the Budget table. I'm assuming that it is of the form YYYYMM.
    Note that I have changed the way you filter on the date columns. Converting to format 112 is often very powerful and helps to make the code shorter. And for the WHERE clause, this means better odds that any index will be used.
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Parallel loading of Transaction Data

    Hi,
    We are trying to load transaction data into the Inventory Valuation cube (0COPC_C04).
    This cube gets data from multiple DataSources. My question is: can the different InfoPackages (for different InfoSources) be scheduled in parallel into the same cube?
    This is transaction data, and we are loading manually in production based on some selection criteria.
    Thanks

    Hi,
    For the first part, yes, your understanding is correct. Delete all cube indexes before you start any loads and then recreate them after all loading has been completed. This is to increase load performance. Make sure that you recreate the indexes, otherwise you'll have problems in querying.
    For the second part, yes, as long as the selections are different and the data loads are from different source systems, you can start another data load in parallel to the one already in progress. This again will help you load faster and will reduce load times.
    Cheers,
    Kedar

  • Transactional data synchronization

    Hi,
    I am looking for approaches to transactional data synchronization, and for guidance on where I can find further information about the pros and cons of such approaches.
    For instance, when a banking customer makes an account transfer on a self-service Web site, the site must immediately reflect changes to his account. Account information stored in the bank's back-office systems must be synchronized through a transaction with the database backing the Web site.
    Thanks.

    Maybe you can try something like this:
    MERGE INTO emp e
       USING (SELECT   cep1.operation$, cep1.empno, cep1.NAME,
                       TO_NUMBER (NULL) chg_empno, TO_CHAR (NULL) chg_name
                  FROM cdc_v_emp cep1
                 WHERE cep1.operation$ = 'I'
              UNION ALL
              SELECT   cep2.operation$, cep1.empno, cep1.NAME, cep2.empno,
                       cep2.NAME
                  FROM cdc_v_emp cep1, cdc_v_emp cep2
                 WHERE cep1.cscn$ = cep2.cscn$
                   AND cep1.rsid$ = cep2.rsid$
                   AND cep1.operation$ = 'UU'
                   AND cep2.operation$ = 'UN'
              ORDER BY cep1.cscn$, cep1.rsid$) c
       ON (e.empno = c.empno)
       WHEN MATCHED THEN
          UPDATE
             SET e.empno = c.chg_empno, e.NAME = c.chg_name
       WHEN NOT MATCHED THEN
          INSERT (e.empno, e.NAME)
    VALUES (c.empno, c.NAME);
    PS: Note that I changed the order (and name) of some columns in the "USING" query.
