Profit center-wise data mismatch in T-codes FAGLL03 & S_PL0_86000030

Data mismatch between two reports.
For example, in G/L account 610120 for the period 01.04.2009 to 30.06.2009:

Profit center       FAGLL03            S_PL0_86000030 (General -> G/L Account Balances -> G/L Account Balances (New))
1                  338,521.50            338,521.50
2                  115,104.00            453,625.50
3                  406,985.25            860,610.75
4                  345,215.50          1,205,826.25
Total            1,205,826.25          1,205,826.25

The problem: only for the HO profit center (1) do the balances of the two T-codes tally; the other three do not match, and for the fourth profit center the second report shows the total of all of them.
We want profit-center-wise data for the quarterly balance sheet, as shown at G/L account level.
What is the reason for the data mismatch?
How can we get proper data that is the same in both T-codes?
Is any report setting required?
Please guide.

Hi,
there is no mismatch between the values. If you have a closer look you will see that FAGLL03 shows the period values per profit center, whereas report S_PL0_86000030 shows cumulated values:

Profit center       FAGLL03            S_PL0_86000030 (cumulated)
1                  338,521.50            338,521.50
2                  115,104.00            453,625.50   (= 338,521.50 + 115,104.00)
3                  406,985.25            860,610.75   (= 453,625.50 + 406,985.25)
4                  345,215.50          1,205,826.25   (= 860,610.75 + 345,215.50)
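If you need the profit-center-wise period values themselves (e.g. for the quarterly balance sheet), a small custom report over the new G/L totals can also deliver them. A minimal sketch, assuming the standard totals table FAGLFLEXT, leading ledger 0L, an April-March fiscal year (so periods 1-3 cover 01.04.2009 to 30.06.2009) and placeholder company code/account values - please verify the field names in SE11 for your release:

TYPES: BEGIN OF ty_bal,
         prctr TYPE prctr,
         hsl01 TYPE faglflext-hsl01,
         hsl02 TYPE faglflext-hsl02,
         hsl03 TYPE faglflext-hsl03,
       END OF ty_bal.

DATA: lt_bal TYPE TABLE OF ty_bal,
      ls_bal TYPE ty_bal,
      lv_qtr TYPE faglflext-hsl01.

" Period totals per profit center for one account. If your release keeps
" debit and credit totals in separate DRCRK records with unsigned amounts,
" net them accordingly (check a sample row in SE16).
SELECT prctr SUM( hsl01 ) SUM( hsl02 ) SUM( hsl03 )
  FROM faglflext
  INTO TABLE lt_bal
  WHERE rldnr  = '0L'            "leading ledger - assumption
    AND ryear  = '2009'
    AND rbukrs = '1000'          "company code - placeholder
    AND racct  = '0000610120'    "G/L account 610120
  GROUP BY prctr.

LOOP AT lt_bal INTO ls_bal.
  " Quarter movement per profit center = what FAGLL03 shows per period;
  " a running total over the profit centers would reproduce the
  " cumulated column of S_PL0_86000030.
  lv_qtr = ls_bal-hsl01 + ls_bal-hsl02 + ls_bal-hsl03.
  WRITE: / ls_bal-prctr, lv_qtr.
ENDLOOP.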
Cheers,
Daniela

Similar Messages

  • GL balance  profitcenter wise

    I want the G/L balance profit-center-wise for balance sheet purposes. Is there any report or T-code that gives the proper data?


  • Data Mismatching in DSO

    Hi Folks,
    We have a data mismatch issue for one InfoObject (Base UOM) when the data goes from the staging DSO to the conformance DSO. I will explain the issue here.
    For one invoice number the Base UOM value is displayed as M2 in the staging DSO. When the data goes from the staging DSO to the conformance DSO, the Base UOM is displayed as a blank value in the conformance DSO. There is no code written for the calculation of Base UOM at transformation level; it is a direct mapping to the Base UOM field in the conformance DSO from the staging DSO. For all other invoices the Base UOM value is displayed correctly in the conformance DSO when compared to the staging DSO. Only for this one specific invoice is the Base UOM value showing as blank in the conformance DSO.
    Could you please give me your suggestions on the reasons why the Base UOM value shows as a blank value in the conformance DSO?
    Thanks in advance.
    Regards,
    Nag

    Hi,
    You will have to check the following things:
    1) Check if other records with base unit value M2 are updated properly in the conformance DSO. This will make sure that there is no issue with this particular base unit.
    2) As you have mentioned that other records are successfully getting updated, you might have to debug the data load for this particular record. Do a selective data load and check where the unit is getting cleared.
    You can also check if there is any conversion routine at the InfoObject level.
    Regards,
    Durgesh.

  • Vendor number in T-code: FAGLL03

    Hi,
    Is it possible to view the vendor number in the G/L line item display, T-code FAGLL03?
    The requirement is to view the cost details (respective cost element) period-wise and vendor-wise.
    Please let me know how it will be done.
    Thanks
    Partha

    Hi,
    I added BSEG LIFNR (Vendor) in the respective path.
    When I run the program I get the message: "%F buffer of EUINFO/LTEX has been reset".
    In FAGLL03 I am still not getting the vendor number in the vendor column.
    Please let me know what the problem is.
    Thanks
    Partha

  • Data mismatch with ECC FBL5N report and debtor ageing report(BI side)

    Hi,
    I am facing a data mismatch problem between the FBL5N report (T-code for the customer line item report) on the ECC side and the debtor ageing report on the BI side.
    The problem is:
    1) there is a mismatch in the amounts of some customers between the ECC and BI sides.
    2) there are customer numbers with amounts in the BI debtor ageing report which are not in the ECC FBL5N report.
    For the second problem, I checked the tables BSID and BSAD on the ECC side; these customer numbers are not available there either.
    One more strange thing is that, with the same selection in the report on both the ECC and BI sides, the mismatched data and the extra customers in the BI report change every day, i.e. we get a new set of data mismatches and extra customer numbers on the BI side each day.
    If anyone has worked on this type of issue, kindly help.
    Thanks in advance

    Hi,
    on the one hand it may be the delta mechanism of the FI_*_4 extractors with their timestamp issue, so that your comparison between BI and ECC is never up to date.
    FI Extraction
    On the other hand, it may be a delta problem between the data targets in your BI system, in case you load the FI data from a DSO to a cube and report on the cube. I have this problem at the moment and will watch this thread for more suggestions.

  • Inserting to MS-Access -Data mismatch

    I am trying to insert records into an MS-Access table. However, I got a data mismatch exception, and on figuring it out I realized there was a date field that I was populating with a string. Hence I converted it into a date and ran the query again; however, now I am neither getting an exception nor is the table getting updated.
    The following is the code snippet where I get the data:
    List<org.cuashi.wof.ws.nwis.ValueSingleVariable> ValueList = result.getTimeSeries().getValues().getValue();
    for (org.cuashi.wof.ws.nwis.ValueSingleVariable value : ValueList) {
    try {
    System.out.format("%20s%10.4f", value.getDateTime().toString(), value.getValue());
    System.out.println();
    System.out.println("obtaining time series data");
    String dateTime = value.getDateTime().toString().replace('T',' ');
    //to convert string into date
    SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
    java.util.Date dt = sdf.parse(dateTime);
    sdf.applyPattern("yyyy-MM-dd HH:mm:ss");
    java.util.Date dt2 = sdf.parse(dateTime);
    updateTable(siteCode,variableCode,dt2,timvalue,varCode,qualifierCode,qualityControlLevel,conn,rs);
    } catch (Exception e) {
    }
    }
    public void updateTable(String siteCode,String variableCode,java.util.Date dt2,double tmvalue,String varCode,String qualifierCode,int qualityControlLevel,Connection con,ResultSet rs) {
    try{
    System.out.println("inside update");
    // con.setAutoCommit(false);
    PreparedStatement pstmt = con.prepareStatement("INSERT INTO DataValues(ValueID,DataValue,ValueAccuracy,LocalDateTime,UTCOffset,DateTimeUTC,SiteID,VariableID,OffsetValue,OffsetTypeID,CensorCode,QualifierID,MethodID,SourceID,SampleID,DerivedFromID,QualityControlLevelID) VALUES(?,?,?,?,?,?.?,?,?,?,?,?,?,?,?,?,?)");
    pstmt.setString(1,"1");
    pstmt.setDouble(2,tmvalue);
    pstmt.setInt(3,0);
    pstmt.setDate(4,(java.sql.Date) dt2);
    pstmt.setInt(5,0);
    pstmt.setString(6,"0");
    pstmt.setString(7,siteCode);
    pstmt.setString(8,varCode);
    pstmt.setInt(9,0);
    pstmt.setInt(10,0);
    pstmt.setInt(11,0);
    pstmt.setString(12,qualifierCode);
    pstmt.setInt(13,0);
    pstmt.setInt(14,1);
    pstmt.setInt(15,0);
    pstmt.setInt(16,0);
    pstmt.setInt(17,qualityControlLevel);
    System.out.println("Statement prepared");
    pstmt.execute();
    //commit the transaction
    con.commit();
    pstmt.close();
    }catch(SQLException e) {
    System.out.println("The Exception is " + e);
    }
    }
    I found out that after field 4 the control does not go to the remaining fields at all.
    Please let me know what I am missing.

    System.out.format("%20s%10.4f",value.getDateTime().toString(),value.getValue());
    System.out.println();
    System.out.println("obtaining time series data");
    String dateTime = value.getDateTime().toString().replace('T',' ');
    //to convert string in to date
    SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
    I'd recommend that you setLenient(false) on your sdf.
    This pattern is what you've got? Sure?
    java.util.Date dt = sdf.parse(dateTime);
    sdf.applyPattern("yyyy-MM-dd HH:mm:ss");
    java.util.Date dt2 = sdf.parse(dateTime);
    updateTable(siteCode,variableCode,dt2,timvalue,varCode,qualifierCode,qualityControlLevel,conn,rs);
    }catch(Exception e)
    {
    Empty catch block? Not a smart idea. Print the stack trace.
    public void updateTable(String siteCode,String variableCode,java.util.Date dt2,double tmvalue,String varCode,String qualifierCode,int qualityControlLevel,Connection con,ResultSet rs)
    try{
    System.out.println("inside update");
    // con.setAutoCommit(false);
    PreparedStatement pstmt = con.prepareStatement("INSERT INTO DataValues(ValueID,DataValue,ValueAccuracy,LocalDateTime,UTCOffset,DateTimeUTC,SiteID,VariableID,OffsetValue,OffsetTypeID,CensorCode,QualifierID,MethodID,SourceID,SampleID,DerivedFromID,QualityControlLevelID) VALUES(?,?,?,?,?,?.?,?,?,?,?,?,?,?,?,?,?)");
    pstmt.setString(1,"1");
    pstmt.setDouble(2,tmvalue);
    pstmt.setInt(3,0);
    pstmt.setDate(4,(java.sql.Date) dt2);
    I'd recommend this:
        pstmt.setDate(4, new java.sql.Date(dt2.getTime()));
    pstmt.setInt(5,0);
    pstmt.setString(6,"0");
    pstmt.setString(7,siteCode);
    pstmt.setString(8,varCode);
    pstmt.setInt(9,0);
    pstmt.setInt(10,0);
    pstmt.setInt(11,0);
    pstmt.setString(12,qualifierCode);
    pstmt.setInt(13,0);
    pstmt.setInt(14,1);
    pstmt.setInt(15,0);
    pstmt.setInt(16,0);
    pstmt.setInt(17,qualityControlLevel);
    System.out.println("Statement prepared");
    pstmt.execute();
    //commit the transaction
    con.commit();
    pstmt.close();
    You should be closing your statement and connection in a finally block, in individual try/catch blocks.
    Set autoCommit back to true.
    }catch(SQLException e)
    {
    So you don't roll back if there's an exception? Bad idea.
    System.out.println("The Exception is " +e);
    I found out that after field 4 the control does not go to the remaining fields at all.
    Please let me know what I am missing.
    Lots of stuff. See above.

  • IDOC failed due to date mismatch

    Hi Experts,
    I am facing a problem with an IDoc.
    An IDoc is sent from one SAP system to another system.
    In the target system it failed, the reason being a date mismatch.
    When I checked it through WE02, it failed because the TODATE and FROMDATE are wrong, i.e. they got swapped.
    So is there any way to process this errored-out IDoc?
    Is it possible to change the dates and then process it manually?
    Please let me know, as it is high priority.
    Note: both SAP systems are production.
    thanks in advance

    Hi Hemant,
    Please find below the steps to edit an IDoc segment after you find the error using WE02.
    The example code can be found on this website:
    http://www.sapgenie.com/sapedi/idoc_abap.htm
    STEP 1 - Open document to edit
    CALL FUNCTION 'EDI_DOCUMENT_OPEN_FOR_EDIT'
    EXPORTING
    document_number = t_docnum
    IMPORTING
    idoc_control = itab_edidc
    TABLES
    idoc_data = itab_edidd
    EXCEPTIONS
    document_foreign_lock = 1
    document_not_exist = 2
    document_not_open = 3
    status_is_unable_for_changing = 4
    OTHERS = 5.
    STEP 2 - Loop at itab_edidd and change data
    LOOP AT itab_edidd WHERE segnam = 'E1EDKA1'.
    e1edka1 = itab_edidd-sdata.
    IF e1edka1-parvw = 'LF'.
    e1edka1-partn = t_eikto.
    itab_edidd-sdata = e1edka1.
    MODIFY itab_edidd.
    EXIT.
    ENDIF.
    ENDLOOP.
    STEP 3 - Change data segments
    CALL FUNCTION 'EDI_CHANGE_DATA_SEGMENTS'
    TABLES
    idoc_changed_data_range = itab_edidd
    EXCEPTIONS
    idoc_not_open = 1
    data_record_not_exist = 2
    OTHERS = 3.
    STEP 3a - Change control record
    CALL FUNCTION 'EDI_CHANGE_CONTROL_RECORD'
    EXPORTING
    idoc_changed_control = itab_edidc
    EXCEPTIONS
    idoc_not_open = 1
    direction_change_not_allowed = 2
    OTHERS = 3.
    STEP 4 - Close Idoc
    Update IDoc status
    CLEAR t_itab_edids40.
    t_itab_edids40-docnum = t_docnum.
    t_itab_edids40-status = '51'.
    t_itab_edids40-repid = sy-repid.
    t_itab_edids40-tabnam = 'EDI_DS'.
    t_itab_edids40-mandt = sy-mandt.
    t_itab_edids40-stamqu = 'SAP'.
    t_itab_edids40-stamid = 'B1'.
    t_itab_edids40-stamno = '999'.
    t_itab_edids40-stapa1 = 'Sold to changed to '.
    t_itab_edids40-stapa2 = t_new_kunnr.
    t_itab_edids40-logdat = sy-datum.
    t_itab_edids40-logtim = sy-uzeit.
    APPEND t_itab_edids40.
    CALL FUNCTION 'EDI_DOCUMENT_CLOSE_EDIT'
    EXPORTING
    document_number = t_docnum
    do_commit = 'X'
    do_update = 'X'
    write_all_status = 'X'
    TABLES
    status_records = t_itab_edids40
    EXCEPTIONS
    idoc_not_open = 1
    db_error = 2
    OTHERS = 3.
    Another alternative is to use WE19 with reference to the error IDoc number.
    After you change the incorrect segment(s), you can post to the application.
    As a result, a new IDoc will be created.
    You can use FM EDI_DOCUMENT_DELETE to delete the incorrect/error IDoc number.
    You can also use WE02 to change the incorrect segment.
    Double-click on the incorrect segment to get the detail info. Then go to Data Record (menu) -> Change -> make the necessary changes -> Save.
    Then you can use the program RBDINPUT to reprocess the error IDoc.
    Hope this will help.
    Regards,
    Ferry Lianto

  • Data mismatch error

    hi team
    I have a report where the data is mismatching with the R/3 data. My questions are:
    1. How do we do reconciliation with R/3 data?
    2. What is aggregated level?
    3. How do we run the statistical setup in LO?
    Can you explain step by step how to solve data mismatch errors, e.g. a missing company code?
    thanks in advance
    v.muralikrishna

    Hi
    1. How do we do reconciliation with R/3 data?
    Yes, we can. For MM, for example, we use the T-codes MB5B and MMBE and also some tables (this is only one example). The related functional people will also need to work on reconciling the data; the BW consultant alone can't do the reconciliation.
    2. What is aggregated level?
    E.g.:
    06.2010 -- 10
    06.2010 -- 20
    Aggregated level is
    06.2010 -- 30
    3. How do we run the statistical setup in LO?
    For LO, follow the steps below. These steps are for the SD module; for your DataSource, change the T-code used to fill the setup tables and replace the SD DataSource with your DataSource.
    1. First install the DataSource in RSA5, check it in RSA6 and activate it in LBWE.
    Before doing steps 2 to 6, lock the ECC system, i.e. no transactions should happen.
    2. Then delete the queues in LBWQ, e.g.:
         MCEX11  --> For 2LIS_11_*
         MCEX12  --> For 2LIS_12_*
         MCEX13  --> For 2LIS_13_*
      Be careful while doing all these deletions in production servers.
    3. Then delete any entries in RSA7, e.g.:
         2LIS_11_*
         2LIS_12_*
         2LIS_13_*
    At the time of filling the setup tables, no entries must exist in LBWQ in ECC for the following queues:
         MCEX11  --> For 2LIS_11_*
         MCEX12  --> For 2LIS_12_*
         MCEX13  --> For 2LIS_13_*
    4. Then delete the setup tables using T-code LBWG and select the application number, i.e. 11, 12 and 13.
    5. Then fill the setup tables using OLI7BW, OLI8BW and OLI9BW.
       Give Name of run = XYZ, Termination Date = tomorrow's date and execute it in the background,
       i.e. Program --> Execute in Background.
       2LIS_11_*  Use T-code OLI7BW to fill the setup tables
       2LIS_12_*  Use T-code OLI8BW to fill the setup tables
       2LIS_13_*  Use T-code OLI9BW to fill the setup tables
    6. Check the job status in SM37; once it is finished, go to RSA3, execute it and check.
    7. Then replicate the DataSource in BW.
    8. Install the InfoCube/DSO from Business Content or create the InfoCube/DSO, then map the ECC DataSource fields to the BW InfoObjects in the transfer rules (in BW 3.5) or in transformations (in BI 7.0).
    9. Map the InfoObjects of the InfoSource to the InfoObjects of the InfoCube/DSO in the update rules (in BW 3.5) or in transformations (in BI 7.0).
    10. Create an InfoPackage and load init or full.
    Thanks
    Reddy

  • Submit syntax for dynamic selections for T-code FAGLL03

    Hi Experts,
    My Z report contains the following fields on the selection screen:
    1. G/L account
    2. Company code
    3. Posting date
    4. Profit center
    5. Layout
    In my Z report I used the following syntax for passing the selection screen values to the standard program and getting the data.
    This is for T-code FAGLL03:
    SUBMIT FAGL_ACCOUNT_ITEMS_GL
    WITH SD_SAKNR IN S_SAKNR
    WITH SD_BUKRS IN S_BUKRS
    WITH %%DYN001-LOW (Profit Center)
    EXPORTING LIST TO MEMORY
    AND RETURN.
    The above syntax is not working for the dynamic selection field (profit center); the standard program fetches data for all profit centers. I want to fetch the profit center data based on my Z report selection values.
    Except for the profit center field, the SUBMIT syntax is working.
    Kindly provide the SUBMIT syntax for the above requirement.
    Any suggestions from experts?
    Edited by: PLP REDDY on Nov 25, 2009 8:59 AM

    Indeed it won't work. Instead of passing parameters one by one, use
    [SUBMIT ... WITH SELECTION-TABLE rspar|http://help.sap.com/abapdocu_70/en/ABAPSUBMIT_SELSCREEN_PARAMETERS.htm#!ABAP_ADDITION_3@3@]
    Here you add one line per parameter (field KIND = 'P') and multiple lines per select-option (field KIND = 'S'). For the latter you also need to provide SIGN and OPTION, i.e. SIGN = 'I', OPTION = 'EQ'.
    The link I gave you explains it in more detail.
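    A minimal sketch of that approach, reusing the select-option names S_SAKNR/S_BUKRS and the parameter names SD_SAKNR/SD_BUKRS quoted above (please verify them against the real selection screen of FAGL_ACCOUNT_ITEMS_GL). Note that if profit center is only offered as a dynamic selection there, it may additionally need the WITH FREE SELECTIONS addition of SUBMIT rather than a plain RSPARAMS line:
    DATA: lt_rspar TYPE TABLE OF rsparams,
          ls_rspar TYPE rsparams.
    " G/L account select-option of the Z report
    LOOP AT s_saknr.
      CLEAR ls_rspar.
      ls_rspar-selname = 'SD_SAKNR'.
      ls_rspar-kind    = 'S'.
      ls_rspar-sign    = s_saknr-sign.
      ls_rspar-option  = s_saknr-option.
      ls_rspar-low     = s_saknr-low.
      ls_rspar-high    = s_saknr-high.
      APPEND ls_rspar TO lt_rspar.
    ENDLOOP.
    " Company code select-option of the Z report
    LOOP AT s_bukrs.
      CLEAR ls_rspar.
      ls_rspar-selname = 'SD_BUKRS'.
      ls_rspar-kind    = 'S'.
      ls_rspar-sign    = s_bukrs-sign.
      ls_rspar-option  = s_bukrs-option.
      ls_rspar-low     = s_bukrs-low.
      ls_rspar-high    = s_bukrs-high.
      APPEND ls_rspar TO lt_rspar.
    ENDLOOP.
    SUBMIT fagl_account_items_gl
      WITH SELECTION-TABLE lt_rspar
      EXPORTING LIST TO MEMORY
      AND RETURN.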
    Regards
    Marcin

  • Difference in amount in T-code FAGLL03 and tables BSIS, FAGLFLEXA

    Dear Experts
    There are three G/L accounts which are maintained on an open item basis.
    When I check the balance in T-code FAGLL03 I get a balance which matches the trial balance. But when I check the balance at table level it does not match.
    The tables I am checking are BSIS and FAGLFLEXA. I am checking these tables as they contain the G/L open item data. At the moment the tables BSIS and FAGLFLEXA and T-code FAGLL03 all give different figures.
    Is there any other way that I should check the balance?
    Your suggestion is much appreciated.
    Regards
    Paul

    Hi
    I was checking the documents by clicking the general ledger view in FAGLL03.
    I have now chosen the entry view instead and it is working.
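    For reference, a rough sketch of where the two views read from: the entry view corresponds to the open/cleared item tables BSIS/BSAS, while the general ledger view lines sit in FAGLFLEXA. The table and field names below are assumptions to verify in SE11, company code and account are placeholders, and BSIS alone only covers open items:
    DATA: lv_debit  TYPE dmbtr,
          lv_credit TYPE dmbtr,
          lv_entry  TYPE dmbtr,
          lv_glview TYPE faglflexa-hsl.
    " Entry view: BSIS stores unsigned amounts with debit/credit flag SHKZG
    SELECT SUM( dmbtr ) FROM bsis INTO lv_debit
      WHERE bukrs = '1000' AND hkont = '0000610120' AND shkzg = 'S'.
    SELECT SUM( dmbtr ) FROM bsis INTO lv_credit
      WHERE bukrs = '1000' AND hkont = '0000610120' AND shkzg = 'H'.
    lv_entry = lv_debit - lv_credit.
    " General ledger view: FAGLFLEXA stores signed amounts per ledger
    SELECT SUM( hsl ) FROM faglflexa INTO lv_glview
      WHERE rldnr = '0L' AND rbukrs = '1000' AND racct = '0000610120'.
    WRITE: / 'Entry view balance (open items only):', lv_entry,
           / 'G/L view balance (FAGLFLEXA):', lv_glview.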
    Thanks

  • Migrating 3 company codes data to 1 company code.

    Hi all,
    We are one legal entity, imparting educational training and operating its activities at three locations namely Gurgaon, Dehradun and
    Rajamundary. In addition it has two administrative offices at Delhi and Mumbai besides a few other rented premises.
    The implementation team has created one company as XYZ and created three separate company codes for each of the operational unit at Gurgaon, Dehradun and Rajamundary. All the existing company codes have been assigned to one Chart of Account. The operating units are neither independent legal entities nor are required to generate location-wise Final Accounts.
    However, the existing system is not meeting our requirements and we wish to operate through one company code only.
    Accordingly, we plan to create one more company code and migrate all the data from the existing three company codes to this new one on a pre-fixed cut-off date. After that we will record all transactions in this new company code only, and the existing codes will not be used further.
    In this regard please advise us the steps required to be followed to migrate the  combined data of three company codes to one company code.
    Kind Regards,
    Sudhanshu
    Edited by: Sudhanshu Sharma on Feb 11, 2008 11:38 AM

    Please find some information on SAP SLO below.
    Sell-off of parts of the company
    A company group sells the portfolio of one of its divisions to another company. As a result, the data that relates to the sold company division must be removed from the group’s SAP system, and transferred to the new owner.
    With the "client separation” service, SLO offers two possible approaches to the solution. Either the data that the sold part of the company needs is migrated to a new client using one of the data transfer methods (Legacy System Migration Workbench or Migration Workbench), or a client copy is created for the sold part of the company, and the data that the parent company no longer requires is removed from the client copy using the SLO service "delete company code.”
    Company merger
    A company buys or takes over another company. The acquired company uses a different material numbering system than the buying company, for example. However, from the point of acquisition on, the same material numbers must be used, and it is therefore necessary to change the material numbers of the bought company.
    With the SLO conversion service "material number conversion,” the material numbers of the bought company are changed so that they correspond to the naming conventions used in the parent company. This is done using the Conversion Workbench.
    Reorganization/restructuring in the company
    A company was previously organized according to regions, and based its SAP system on this structure. However, an analysis of business processes reveals that a central, function-based organization would considerably increase efficiency in controlling. As a result, the controlling areas used to map the structure of the company in the SAP system must be adjusted accordingly.
    With the help of the SLO service "controlling area merge” and the Conversion Workbench, the old controlling areas, which were set up on a regional basis, can be merged into central units.
    Optimization of business processes
    In a government department, different standard charts of accounts are used for external and internal financial reporting. The duplicate work that this entails is becoming increasingly time-consuming and error-prone. The managers responsible decide that the external chart of accounts could also be used internally. As a result, the chart of account in the government department’s SAP system must be adjusted to reflect this change.
    The SLO conversion service "chart of account conversion” makes it possible to rename or merge accounts and cost elements in the chart of account so that they meet the new requirements. The assignment table in which the changes are defined is retained in the SAP system after conversion for documentation purposes – for an external audit, for example.
    http://www.sap.info/public/INT/int/index/Category-12613c61affe7a5bc-int/-1/articleContainer-70353ed226f24a312

  • Data Mismatch between ODS and Infocube

    Hi,
    I have a data mismatch between an ODS and an InfoCube. In the ODS, for one sender cost centre, there is only one company code; in the InfoCube it shows many. The DataSource for the ODS is 0HR_PT_1. In the update rules of the ODS, company code was assigned to company code. But in the InfoCube, should I assign company code to company code, or to another characteristic as a master data attribute? Please suggest.
    It is very urgent.

    Please post your BW-related queries in the BW forum.

  • MDX to DAX Conversion - Data Mismatch

    Hi,
    I'm pulling data from a Tabular cube for a given time period as a filter. When I run an MDX query, it pulls around 9 million records in more than 6 hours. Since I need to run it frequently and quickly, I converted the MDX to DAX (below). The DAX executes much faster (20 minutes) than the MDX, but the results of the two approaches are not the same, and I'm struggling hard to find out why.
    All the dimensions used inside the SUMMARIZE clause have a relationship with the primary dimension/table name. High-level observations:
    The number of records differs between the DAX and MDX result sets.
    The sum of each DAX measure differs from the sum of the corresponding MDX measure.
    The DAX results have no rows where all the measures have zero values, whereas the MDX results do.
    DAX omits many of the rows where all the measures except one have zero values. It appears that the data mismatch is due to the omission of these records, but the reason for the omission is unknown.
    Any help will be appreciated!
    MDX
    SELECT NON EMPTY {
    [Measures].[Calculated Measure 1],
    [Measures].[Calculated Measure 2],
    [Measures].[Calculated Measure 8]
                    } ON COLUMNS,
                    NON EMPTY { (
      [Dimension1].[DimAttribute1].children
    , [Dimension2].[DimAttribute1].children
    , [Dimension2].[DimAttribute2].children
    , [Dimension10].[DimAttribute1].children
                    ) } ON ROWS
                    FROM (SELECT ([Time].[Fiscal Month].&[May, 2014]) ON COLUMNS
                    FROM [Cube Name])
                    WHERE (Filter Condition)
    Converted DAX
    EVALUATE(CALCULATETABLE(ADDCOLUMNS(SUMMARIZE(
    PrimaryDimension/PrimaryTableName
    ,Dimension1[Attribute1]
    ,Dimension2[Attribute1]
    ,Dimension2[Attribute2]
    ,Dimension10[Attribute1]                                           
    MeasureGroup1.[Calculated Measure 1],
    MeasureGroup2.[Calculated Measure 2],                                            
    MeasureGroup4.[Calculated Measure 8]
                                    ,Time[Fiscal Month] = "May, 2014"
                                    ,Filter Condition               
    Thanks,
    Amit

    Amit, the two queries are not semantically equivalent.
    SUMMARIZE returns rows that have at least one row in the table you pass as the first argument for the given combination of columns you pass in the following arguments (think of a SELECT DISTINCT for all the columns you include, with an INNER JOIN between all
    the tables included by the columns you specified in the following arguments). You can read more about this here:
    http://www.sqlbi.com/articles/from-sql-to-dax-projection/
    The MDX query returns any existing combination of the cartesian product of the columns you included, and depending on the measures you include, you might see data also for combinations of column values that don't exist in the original source.
    The reason why you have a slow MDX is probably because of the cost of non empty evaluation for certain measures. The solution is probably to optimize the DAX code you are using in Tabular.
    The equivalent DAX statement should be something like
    EVALUATE
    ADDCOLUMNS (
        FILTER (
            CROSSJOIN (
                VALUES ( table1[columnA] ),
                VALUES ( table2[columnB] )
            ),
            [Measure] <> 0
        ),
        "Measure", [Measure]
    )
    But don't expect better performance from this compared to MDX. Only when you can make assumptions that allow you to use SUMMARIZE might you see performance benefits, but if these assumptions are not correct, you can get different results (as you experienced).
    Marco Russo (Blog, Twitter, LinkedIn) - sqlbi.com: Articles, Videos, Tools, Consultancy, Training
    Format with DAX Formatter and design with DAX Patterns. Learn Power Pivot and SSAS Tabular.

  • Delta load from ODS to cube failed - Data mismatch

    Hi all
    We have a scenario where the data flow is:
    R/3 table -> DataSource -> PSA -> InfoSource -> ODS -> Cube.
    The cube has an additional field called "monthly version",
    and since it is a history cube it is supposed to hold
    snapshots of all the data in the current cube for each month.
    We are facing the problem that the data for the current month
    is there in the history ODS but not in the cube. In the ODS -> Manage -> Requests
    tab I can see only one red request, and that with 0 records.
    However, in the Cube -> Manage -> Reconstruction tab I can see two red requests
    with the current month's date. Could these red requests be the reason for
    the data mismatch between the ODS and the cube?
    Please guide me on how I can solve this problem.
    thanks all
    annie

    Hi
    Thanks for the reply.
    The load to the cube is a delta and runs daily.
    The load to the ODS is a full load on a daily basis.
    Can you help me sort out this issue? I have to work directly in the production environment, so it has to be safe and foolproof.
    Thanks
    annie

  • How to get the values from struct data type using java code..?

    Hi ,
    I am new to Java.
    We are using an Oracle database.
    How do I get the data from a STRUCT data type using Java code?
    Thanks in advance.
    Regards,
    kumar

    Hi Rajeev,
    To retrieve a FilterContainer you will need to traverse the report structure:
    ReportStructure boReportStructure = boDocumentInstance.getStructure();
    ReportContainer boReportContainer = (ReportContainer) boReportStructure.getReportElement(0);
    FilterContainer boFilterContainer = null;
    if (boReportContainer.hasFilter()) {
         boFilterContainer = boReportContainer.getFilter();
    } else {
         boFilterContainer = boReportContainer.createFilter(LogicalOperator.AND);
    }
    Calling boDocumentInstance.getStructure() will retrieve the entire structure for the document.
    Calling boReportStructure.getReportElement(0) will retrieve the structure for the first report of the document.
    Hope this helps.
    Regards,
    Dan
