Cube Routine issue

Hi Guys
Here is the problem: I have an update rule with the characteristic "Customer Number". The incoming file obviously contains customer numbers. Some of the incoming customer numbers are correct and should go through, while a few are not. I need to run a validation that accepts the correct numbers and, for any wrong customer number, replaces it with the new number I have been given.
I need to write a routine that does this validation. Could someone help me with the code ASAP?
Regards
Chris

Hi Chris,
if you have stored the correct (A) and incorrect (B) customer numbers in a master data table (C), you can do the check in the start routine of the update rules. Here is some sample code.
* Mapping table: incorrect customer number (B) -> correct customer number (A)
DATA: BEGIN OF t_C OCCURS 0,
        B LIKE C-B,
        A LIKE C-A,
      END OF t_C.

* Load the mapping entries (active version only) from master data table C
SELECT * INTO CORRESPONDING FIELDS OF TABLE t_C
  FROM C
  WHERE NOT B IS NULL
    AND OBJVERS = 'A'.

LOOP AT DATA_PACKAGE.
  IF NOT DATA_PACKAGE-B IS INITIAL.
*   Replace the incoming customer number only if it is listed as incorrect
    READ TABLE t_C WITH KEY B = DATA_PACKAGE-B.
    IF sy-subrc = 0.
      DATA_PACKAGE-B = t_C-A.
    ENDIF.
    MODIFY DATA_PACKAGE.
  ENDIF.
ENDLOOP.
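For illustration only, here is the same lookup with concrete names; the mapping table /BIC/PZCUSTMAP, its fields /BIC/ZOLDCUST and /BIC/ZNEWCUST, and the DATA_PACKAGE field CUST_NO are assumed names, not objects from this thread.

* Sketch with hypothetical names: /BIC/PZCUSTMAP maps the wrong customer
* number (/BIC/ZOLDCUST) to the correct one (/BIC/ZNEWCUST)
DATA: BEGIN OF t_map OCCURS 0,
        old_no LIKE /bic/pzcustmap-/bic/zoldcust,
        new_no LIKE /bic/pzcustmap-/bic/znewcust,
      END OF t_map.

SELECT /bic/zoldcust /bic/znewcust
  INTO TABLE t_map
  FROM /bic/pzcustmap
  WHERE objvers = 'A'.
SORT t_map BY old_no.

LOOP AT DATA_PACKAGE.
* Replace only those customer numbers that appear in the mapping
  READ TABLE t_map WITH KEY old_no = DATA_PACKAGE-cust_no BINARY SEARCH.
  IF sy-subrc = 0.
    DATA_PACKAGE-cust_no = t_map-new_no.
    MODIFY DATA_PACKAGE.
  ENDIF.
ENDLOOP.

Sorting the mapping once and reading with BINARY SEARCH keeps the lookup fast even for large data packages.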
Ciao.
Riccardo

Similar Messages

  • Inventory 0IC_C03 Cube Transformations issue BI 7

    Hi Experts,
    we are installing the Inventory cube 0IC_C03 using the BI 7 data flow option. The 2LIS_03_BF transformation is in an active state, while the 2LIS_03_BX and 2LIS_03_UM transformations are inactive,
    and most of the key fields in the cube are not mapped to InfoSource fields.
    Has anybody implemented Inventory in a BI 7 system and faced this issue? Can you please share how you resolved it?
    We are on BI Content version 703, level 11.
    Regards,
    Raj

    Raj,
    No need to check start routine code for this.
    Go to Transformation --> Change mode --> choose Rule Group (at the top of the window, middle of the screen) --> choose 05 --> display the mapping --> check whether routines are available or not.
    Confirm back whether you are able to see the field routine or not.
    Check doc:  [Rule Groups in Transformation|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/90754b76-bcf1-2a10-3ba7-b299b2be09f2]

  • End Routine issue

    I am adding new records to the result_package in the end routine.
    How will I populate the SID and DATAPAKID values for my new records?
    I am not able to set these fields in the end routine, as they are not available there and get filled dynamically.
    Are there any system fields that hold SID value and data package value?
    Please help.
    Thanks.

    I have added MOVE-CORRESPONDING code from RESULT_PACKAGE and it has solved my issue.
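    For reference, a minimal sketch of that approach, assuming the generated end-routine types used elsewhere in this thread (tys_TG_1 / tyt_TG_1); copying from an existing line and renumbering RECORD is the standard pattern, but the exact fields you overwrite depend on your target:
    " Derive new records from existing RESULT_PACKAGE lines so that technical
    " fields such as RECORD (and DATAPAKID where present) are carried over
    DATA: ls_template TYPE tys_TG_1,
          ls_new      TYPE tys_TG_1,
          lt_new      TYPE tyt_TG_1,
          lv_record   TYPE i.

    " Continue numbering after the last existing record
    DESCRIBE TABLE RESULT_PACKAGE LINES lv_record.

    LOOP AT RESULT_PACKAGE INTO ls_template.
      " Copy all matching components, including the technical fields
      MOVE-CORRESPONDING ls_template TO ls_new.
      " ... overwrite the characteristics / key figures the new record needs ...
      lv_record = lv_record + 1.
      ls_new-record = lv_record.    " give the new line its own record number
      APPEND ls_new TO lt_new.
    ENDLOOP.

    APPEND LINES OF lt_new TO RESULT_PACKAGE.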
    Thanks.

  • End Routine Issue - It does not move data from E_T_RESULT to RESULT_PACKAGE

    Hi,
    I am facing an issue with an end routine. I have gone through previous posts on how to write end routines and wrote mine accordingly.
    Here is my scenario:
    I have 0CUST_SALES master data, which has all the Sales Org, Distribution Channel, Division, Sold-to Party, Sales Group and Sales District values.
    I am populating Sold-to Party and Distribution Channel in a field routine.
    In the end routine I use Sold-to Party, Distribution Channel and Division = '01' (the values populated by the field routine) to look up Sales Org, Sales Group and Sales District.
    All the code I wrote seems correct, but it does not populate any values into RESULT_PACKAGE.
    Here is the code I wrote in the end routine. I am not sure what is wrong with it. I used this link to write the routine:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/203eb778-461d-2c10-60b3-8a94ee91cbfc&overridelayout=true
    Global Declaration----
      DATA : BEGIN OF IT_CUST_SALES,
        DIV TYPE /bi0/pcust_sales-DIVISION,
        DIST_CH TYPE /bi0/pcust_sales-DISTR_CHAN,
        SALES_ORG TYPE /bi0/pcust_sales-SALESORG,
        CUST_SAL TYPE /bi0/pcust_sales-CUST_SALES,
        SALESDIST TYPE /bi0/pcust_sales-SALES_DIST,
        SALESGRP TYPE /bi0/pcust_sales-SALES_GRP,
        END OF IT_CUST_SALES.
    DATA: T_CUST_SALES LIKE TABLE OF IT_CUST_SALES.
    Start of End Routine
       SELECT DIVISION DISTR_CHAN SALESORG CUST_SALES SALES_DIST SALES_GRP
        from
        /bi0/pcust_sales INTO TABLE T_CUST_SALES for all entries in
        RESULT_PACKAGE
        where CUST_SALES = RESULT_PACKAGE-SOLD_TO
         AND DISTR_CHAN = RESULT_PACKAGE-DISTR_CHAN
         AND DIVISION = '01'.
        LOOP AT RESULT_PACKAGE INTO e_s_result.
          READ TABLE T_CUST_SALES INTO IT_CUST_SALES
              WITH KEY CUST_SAL = e_s_result-SOLD_TO
                 DIST_CH = e_s_result-DISTR_CHAN
                 DIV = '01'.
          IF SY-SUBRC EQ 0 .
            MOVE IT_CUST_SALES-SALES_ORG TO E_S_RESULT-SALESORG.
            MOVE IT_CUST_SALES-SALESDIST TO E_S_RESULT-SALES_DIST.
            MOVE IT_CUST_SALES-SALESGRP TO E_S_RESULT-SALES_GRP.
            APPEND E_S_RESULT  TO  E_T_RESULT .
          ENDIF.
        ENDLOOP.
        REFRESH RESULT_PACKAGE.
        MOVE E_T_RESULT[] TO RESULT_PACKAGE[] .
    End End Routine
    Data comes into E_T_RESULT but it does not move to RESULT_PACKAGE. Any inputs will be helpful.
    Regards,
    Kumar

    Hi Hegde,
    The declaration is the same; it is like this:
       DATA: e_s_result TYPE tys_TG_1.
       DATA: e_t_result TYPE tyt_TG_1.
    I don't know why; when I inserted the code into this post it looked fine initially, but after posting I also saw that it is not very readable.
    FYI, I am posting the code again below; let's see if it works this time.
      SELECT DIVISION DISTR_CHAN SALESORG CUST_SALES SALES_DIST SALES_GRP
        from    /bi0/pcust_sales INTO TABLE T_CUST_SALES for all entries in
        RESULT_PACKAGE   where CUST_SALES = RESULT_PACKAGE-SOLD_TO
         AND DISTR_CHAN = RESULT_PACKAGE-DISTR_CHAN
         AND DIVISION = '01'.
        LOOP AT RESULT_PACKAGE INTO e_s_result.
          READ TABLE T_CUST_SALES INTO IT_CUST_SALES
              WITH KEY CUST_SAL = e_s_result-SOLD_TO
                 DIST_CH = e_s_result-DISTR_CHAN
                 DIV = '01'.
          IF SY-SUBRC EQ 0 .
            MOVE IT_CUST_SALES-SALES_ORG TO E_S_RESULT-SALESORG.
            MOVE IT_CUST_SALES-SALESDIST TO E_S_RESULT-SALES_DIST.
            MOVE IT_CUST_SALES-SALESGRP TO E_S_RESULT-SALES_GRP.
            APPEND E_S_RESULT  TO  E_T_RESULT .
          ENDIF.
        ENDLOOP.
        REFRESH RESULT_PACKAGE.
        MOVE E_T_RESULT[] TO RESULT_PACKAGE[] .
    Regards,
    Kumar
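    For comparison, end routines more often update RESULT_PACKAGE in place with a field symbol instead of rebuilding it into a second table; a minimal sketch of that pattern, reusing the lookup table above and assuming the generated type tys_TG_1, would be:
    FIELD-SYMBOLS: <ls_result> TYPE tys_TG_1.

    SORT T_CUST_SALES BY CUST_SAL DIST_CH DIV.

    LOOP AT RESULT_PACKAGE ASSIGNING <ls_result>.
      READ TABLE T_CUST_SALES INTO IT_CUST_SALES
           WITH KEY CUST_SAL = <ls_result>-SOLD_TO
                    DIST_CH  = <ls_result>-DISTR_CHAN
                    DIV      = '01'
           BINARY SEARCH.
      IF sy-subrc = 0.
        " Write the looked-up values straight back into the result line
        <ls_result>-SALESORG   = IT_CUST_SALES-SALES_ORG.
        <ls_result>-SALES_DIST = IT_CUST_SALES-SALESDIST.
        <ls_result>-SALES_GRP  = IT_CUST_SALES-SALESGRP.
      ENDIF.
    ENDLOOP.
    This avoids the REFRESH/MOVE step entirely, since the fields are changed directly in RESULT_PACKAGE.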

  • Simultaneous read and write to a cube - critical issue

    Hello xperts,
    I am facing a critical issue.
    I have Cube A, which has a lot of data in it. I need to copy the data from Cube A to a new Cube B for backup purposes. This activity will take a lot of time.
    Concern 1 - At night we have the master data job, which includes the attribute change run. Will this have any effect on the load running from Cube A to Cube B?
    Concern 2 - After the master data job finishes, we run the transaction data jobs, which update Cube A with delta data. Can this delta load into Cube A happen while the data load from Cube A to Cube B is still running?
    Please help me out with this ASAP.
    Thanks & Regards
    Rohit

    Rohit,
    Attribute change runs are affected only by activities that lock the cube, such as dropping or rebuilding the indices; reads on a cube do not lock it, so reading Cube A for the copy will not affect the change run.
    However, while you are loading data from Cube A into the new cube, you should not load delta data into Cube A at the same time: the delta load drops the indices on Cube A, and that will affect the load into the new cube.
    Arun

  • Multiple columns (named the same originally) and mapped to the same lookup table are causing a Cube Build issue

    Hey folks, looking for some insight here.
    I've an implementation that contains some custom Enterprise columns mapped to lookup tables. In the instance I'm working with now, it looks like there was/is an issue with one of those columns. In this scenario, I have a column named ProjectType, created initially with that name and mapped to a lookup table. This field's name was then changed to Project Type. After that, it looks like another column was created, also called ProjectType. So now we have what I would have originally thought was two distinct columns, even though the names used are the same.
    Below is the error we're currently getting during the Cube Build Process...
    PWA:http://ps2010/PWA, ServiceApp:Project Web App, User:DOMAIN\user, PSI: SqlException occurred in DAL:  <Error><Class>1</Class><LineNumber>1</LineNumber><Number>4506</Number><Procedure>MSP_EpmProject_OlapView_B8546719-4D4C-473A-84B1-89DEDA2307E0</Procedure> 
    <Message>  System.Data.SqlClient.SqlError: Column names in each view or function must be unique. Column name 'ProjectType' in view or function 'MSP_EpmProject_OlapView_B8546719-4D4C-473A-84B1-89DEDA2307E0' is specified more than once.  </Message> 
    <CallStack>   
     at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)   
     at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)   
     at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)   
     at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)   
     at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async)   
     at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result)   
     at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe)   
     at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()   
     at Microsoft.Office.Project.Server.DataAccessLayer.DAL.SubDal.ExecuteStoredProcedureNoResult(String storedProcedureName, SqlParameter[] parameters)  </CallStack>  </Error>
    I've tried deleting the one column, but the build still gives the above error.
    Any thoughts as to how the above could be resolved?
    Thanks! - M
    Michael Mukalian | Jan 2010 - Dec 2010 MVP SharePoint Services | MCTS: MOSS 2007 Configuration | http://www.mukalian.com/blog

    We tried taking it out of the cubes, and it builds fine.  The challenge we're having is in building the cubes with that custom field "ProjectType".  It's as if the cubes still hold some reference to it even when it's deleted.
    Since the OLAP View ('MSP_EpmProject_OlapView_{guid}') is recreated, would it be as simple as deleting that View, and trying to recreate?
    Thanks - M
    Michael Mukalian | Jan 2010 - Dec 2010 MVP SharePoint Services | MCTS: MOSS 2007 Configuration | http://www.mukalian.com/blog

  • Routine Issue

    Hi Experts,
    I have a requirement to pull data into the daily sales cube using a routine.
    The daily sales cube gets updated from the billing cube, global history cube, sales order cube and sales agreement cube. In these 4 cubes I have the 0MATERIAL InfoObject.
    In the daily sales cube we don't have the 0MATERIAL InfoObject, yet I have to pull the data using 0MATERIAL.
    For this I have a routine, but when I check it, it shows the error message that table 0material is not maintained in the ABAP Dictionary. Please find the code below.
    DATA: output_package  TYPE STANDARD TABLE OF /bic/cs8sd_c02
          WITH HEADER LINE WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    DATA:  BEGIN OF g_t_material OCCURS 0,
            material type.....
    busgrp   type.......
           END OF g_t_material.
    clear g_t_material.
    * Read from 0material
      SELECT material busgrp
             FROM 0material
               INTO TABLE g_t_material
               WHERE objvers = 'A'.
    sort g_t_material.
    LOOP AT DATA_PACKAGE .
        MOVE DATA_PACKAGE TO output_package .
    * Elec Family
        IF DATA_PACKAGE-material  <> 0  OR
       read table g_t_material with key material = data_package-material
            if sy-subrc eq 0.
             output_package-busgrp = g_t_material-busgrp.
          APPEND output_package.
        ENDIF.
    endloop.
    Regards
    Prasad

    Hi,
    The daily sales cube gets its data from the billing cube, global history cube, sales order cube and schedule agreement cube. In these cubes we have the 0MATERIAL InfoObject. But the daily sales cube does not have 0MATERIAL, so I added the business group InfoObject to the cube, assigned it to a dimension and activated the daily sales cube.
    My requirement is to get the Business Group values based on 0MATERIAL. Business Group is a navigational attribute of 0MATERIAL.
    The Elec Family InfoObject is also a navigational attribute of 0MATERIAL.
    Please find the below revised code
    DATA: output_package TYPE STANDARD TABLE OF /bic/cs8sd_c02 WITH HEADER
    LINE WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    DATA:  BEGIN OF g_t_material OCCURS 0,
           material type /BI0/OIMATERIAL,
           busgrp type /BIC/OICBUSGRP,
           END OF g_t_material.
    clear g_t_material.
    * Read from 0material
      SELECT material /BIC/CBUSGRP
             FROM /BI0/PMATERIAL
               INTO TABLE g_t_material
               WHERE objvers = 'A'.
    sort g_t_material.
    LOOP AT DATA_PACKAGE .
        MOVE DATA_PACKAGE TO output_package .
    * Elec Family
        IF DATA_PACKAGE-material  <> 0.
          read table g_t_material with key material = data_package-material.
            if sy-subrc eq 0.
             output_package-busgrp = g_t_material-busgrp.
             APPEND output_package.
           endif.
        ENDIF.
    endloop.
    This routine is otherwise fine, but when I check it, the line output_package-busgrp = g_t_material-busgrp throws the error "The data object 'OUTPUT_PACKAGE' does not have a component called busgrp",
    even though business group is included as a stand-alone InfoObject at cube level.
    Do I need to write the routine at the object level?
    Regards
    Prasad
    Edited by: Vara Prasad on Oct 13, 2008 2:06 PM
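    For what it's worth, in 3.x update rules a characteristic-level routine returns a single RESULT value per record. A minimal sketch of such a routine body for the business group, assuming g_t_material is filled in the start routine as above and that the communication structure carries the 0MATERIAL field as material, could look like this:
    " Look up the business group for the current record from the buffered
    " 0MATERIAL master data (g_t_material filled in the start routine)
    READ TABLE g_t_material WITH KEY material = COMM_STRUCTURE-material.
    IF sy-subrc = 0.
      RESULT = g_t_material-busgrp.
    ELSE.
      CLEAR RESULT.
    ENDIF.
    " RETURNCODE <> 0 would skip the record; leave it at 0 here
    RETURNCODE = 0.
    Whether this or the data-package routine is the better place depends on how the business group field is defined in the update rules, so treat it only as a sketch.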

  • Inventory Cube performance Issue

    Hi All,
    This is nothing new, but a traditional issue with this cube. I have customized 0IC_C03 for my requirement and am having serious performance issues. It has 0CALDAY, 0MATERIAL and 0PLANT as non-cumulative value parameters, and I have added movement types (temporarily, for validation purposes). But my query always times out unless I specify the material. Close to 40K materials are being maintained. The values all match between ECC and BI after the data loads. So we are thinking that maybe a snapshot approach would help us resolve the performance issues.
    Has anybody implemented the snapshot approach for inventory? I know it is a loading issue, but we think we could deal with that rather than with the performance issue when users execute the query.
    If anybody has done it, could you provide the steps?
    Thanks,
    Alex.

    Hi Jameson - Thanks for your response.
    We thought that would be the case. We have raised an SR with Oracle and they are investigating it. We have also sent an EIFF file to Oracle for investigation.
    Both DBs are in the same environment (AIX 6.1) and the DBAs have confirmed that both DBs have the same system parameters.
    Even leaving aside the comparison with 11.2.0.1, for some reason 11.2.0.3 seems very slow. Even a simple cube (2 dimensions and 2 measures) with 9K records takes around 15 minutes to refresh, and it takes ages to view the data.
    We haven't generated the AWR report yet; we will see if we can do that.
    rgds,
    Prakash S

  • Oracle 10g olap cube connectivity issue using Crystal Reports 2008

    Hi
    Kindly help me out with a solution. I have created a cube in an Oracle 10gR2 database using Oracle Analytic Workspace Manager. I am trying to connect to this cube using Crystal Reports OLAP data sources, but the only options I get are Hyperion Essbase, HOLOS Cube, Microsoft OLE DB, and Microsoft OLAP Analysis Services 8.0 and 9.0. I am not getting any option for connecting to Oracle OLAP. How do I do that, or if it is not supported, kindly let me know.
    Thanks in advance
    Troyee

    Hi Troyee,
    I have discussed the issue with an OLAP engineer and he told me that connectivity to the Oracle OLAP cube is not supported.
    Please address OLAP questions to the following category next time:
    Expert Forums » Business Objects » Other Business Objects Products
    (see the relevant category description)
    Vitaly Izmaylov
    Crystal Reports Design forum moderator

  • Project Server 2013 - OLAP Cube Build Issues

    Hi,
    I have a Project Server 2013 environment with SQL Server 2012 SP1. My cube built successfully, and the Report Authors group (including me) was accessing the templates and data connections from the BI Center successfully.
    I re-built the cube and suddenly none of us can access the templates or data connections; it says 'Access denied'!
    I went to the server hosting Analysis Services and checked the role ProjectServerViewOlapDataRole: none of the database permission check boxes were ticked, unlike when I initially built the cube and added the Report Authors group and the account
    used by the Secure Store app. Membership still has both the account and the group.
    Has anyone come across this issue?
    Thank you,
    SJ

    SJ,
    The access of the BI Center, templates and data connections are handled by SharePoint permissions and are usually inherited from PWA.
    The ProjectServerViewOlapDataRole simply provides the ability to view the data from the OLAP database. It sounds like the issue isn't with the OLAP but rather with PWA itself. An OLAP error would have only precluded you from retrieving data.
    Did someone modify the BI Center site permissions recently such as breaking inheritance?
    Treb Gatte, Project MVP |
    @tgatte | http://AboutMSProject.com

  • ASO Cube Performance Issue

    We have been working on two ASO cubes and performance was great. No modifications have been made since, but we are now experiencing a performance issue. What could be the possible cause, and how could I resolve this? Thank you.

    'Performance issue' isn't very descriptive - performance of what? Query? Data load? Aggregation? Restructure?
    As a start, has the volume of data (input data cells) been increasing significantly?

  • Cube compression issue

    Hello Gurus,
    we are seeing some strange behaviour with cube compression.
    All requests are compressed, but the F table still contains some records.
    The same records are stored in the E table too, yet when we execute a BEx query we see the correct result.
    If we execute the query in debug mode in RSRT with SQL code display, the query reads only from the F table or from aggregates.
    How is this possible?
    We only inserted the COMPNOMERGE entry into the RSADMIN table after the first compression. Do you think that reinitializing the cube and running a new compression with the COMPNOMERGE entry in place could solve our problem?
    Could you help us?
    Thanks in advance.
    Regards.

    Vito Savalli wrote:>
    > Hi Lars, thanks for your support.
    > We don't have an open support message for this issue, but if it will be necessary, we will open it.
    >
    > I - The same records are stored in E table too, but with BEx query execution we can see correct result.
    > You - The first part of this sentence is technically impossible. At least the request ID must be different in F- and E-fact table.
    >
    > OK, I know about the request ID. But if we disregard the request ID (which of course is not equal) and check the characteristic values via SID analysis, we find the same complete key in both the F and the E table.
    >
    Well, but that's the whole point - the request ID!
    That's why we do compression at all - to merge together the data records with the same key, if they exist in both tables.
    It's completely normal to have this situation.
    > I - If we execute query in debug on RSRT, with SQL code display, the query reads only from F table or aggregates. How it is possible?
    > You - Easy - your statement about all requests being compressed is not true, and/or it reads the necessary data from the aggregates.
    >
    > I executed in RSRT a query on one of the records which is in both tables.
    Well, obviously there was some other implicit restriction that led to the selections made by OLAP.
    Maybe the request read from the F fact table was neither rolled up nor compressed.
    > Very helpful, thanks.
    > Any others suggestions?
    I'd check exactly the status of the requests and where they can be read from.
    You may also try disabling aggregate usage in RSRT to see whether the data is also read from the E fact table, and check the result of the query.
    regards,
    Lars

  • InvMgt cube load issues - stock movements have not happened - how to delete

    Hi,
    For a few materials in R/3 the current stock is zero,
    but for the same materials the stock is still shown in BW. Loads have been running since 2008; the material stock was initialised,
    and after that the movements were loaded, but the total stock for these materials has never been zeroed out.
    In R/3 the stock for these materials is currently zero.
    (As you may know, the cube has a non-cumulative key figure with inflow Receipt Qty and outflow Issue Qty; based on the stock movements, the non-cumulative key figure holds the last value.)
    There are no R/3 records for the material in transaction MB51 (no stock movement for the year 2008 is available).
    How do I resolve this issue? Would anyone please guide me, as this issue is now in production?
    There are many materials in the same situation. How do I get the latest stock, i.e. 0, onto all these records?
    Please guide me.
    Thanks in advance!

    Hi,
    Load 2LIS_03_BX, 2LIS_03_BF and 2LIS_03_UM into the 0IC_C03 cube and design the report.
    See these documents for the steps on how to load the data into 0IC_C03:
    Treatment of historical full loads with Inventory cube
    Setting up material movement/inventory with limit locking time
    If it is BI 7, then for BX you need to select the Extraction Mode = Non-cumulative option in the DTP's Extraction tab.
    0VALSTCKVAL      " For Value
    0VALSTCKQTY      " For Qty
    0CALMONTH        " For Month
    Use the above combinations in new selections in the columns and go with it.
    For Qty Opening:
    In a new selection, drag and drop the following InfoObjects:
    0VALSTCKQTY     " For Qty
    0CALMONTH       " For Month; restrict with a 'less than or equal to' variable (single value, user input) and set the offset
                    " value = -1, because if the user enters 12.2009 it will display the 11.2009 closing stock, which is the opening stock for 12.2009.
    For Qty Closing:
    In a new selection, drag and drop the following InfoObjects:
    0VALSTCKQTY    " For Qty
    0CALMONTH      " For Month; restrict with a 'less than or equal to' variable (single value, user input).
    Build the Value and other key figures on 0IC_C03 in the same way.
    Also drag and drop
    0MATERIAL
    0PLANT  " Give these input variables.
    Thanks
    Reddy

  • SSAS Cube Processing Issue, Need Urgent Help...

    Hi Friends,
    Good Afternoon.
    I am processing the SSAS cube and it is failing with the error message below.
    I have tried processing the cube both via an XMLA script and via direct processing.
    <return xmlns="urn:schemas-microsoft-com:xml-analysis">
      <results xmlns="http://schemas.microsoft.com/analysisservices/2003/xmla-multipleresults">
        <root xmlns="urn:schemas-microsoft-com:xml-analysis:empty">
          <Exception xmlns="urn:schemas-microsoft-com:xml-analysis:exception" />
          <Messages xmlns="urn:schemas-microsoft-com:xml-analysis:exception">
            <Error ErrorCode="3238395904" Description="OLE DB error: OLE DB or ODBC error: Cannot execute the query &quot;SELECT     &#xA;
    CASE &#xA; WHEN charindex('.', [Name]) &gt; 0 THEN upper(substring([Name], 0, charindex('.', [Name]))) &#xA;
    ELSE Name &#xA; END COLLATE DATABASE_DEFAULT AS  PackageDownloadSourceName&#xA;
    , 'Notification Server' COLLATE DATABASE_DEFAULT AS PackageDownloadSourceType&#xA;
    FROM &#xA; vNotificationServerSource as vNotificationServerSource WITH (NOLOCK)&#xA;&#xA;
    UNION&#xA;&#xA; SELECT &#xA;
    DISTINCT &#xA; vc.Name COLLATE DATABASE_DEFAULT AS  PackageDownloadSourceName&#xA;
    , 'Package Server' COLLATE DATABASE_DEFAULT AS  PackageDownloadSourceType&#xA;
    FROM&#xA; vComputer AS vc WITH (NOLOCK)&#xA;
    INNER JOIN SWDPackageServer WITH (NOLOCK) ON vc.Guid = SWDPackageServer.PkgSvrId&#xA;&#xA;
    UNION&#xA;&#xA; SELECT     &#xA;
    PackageDownloadSourceName COLLATE DATABASE_DEFAULT AS PackageDownloadSourceName&#xA;
    , 'Non-Altiris Server' COLLATE DATABASE_DEFAULT AS PackageDownloadSourceType&#xA;
    FROM         &#xA;
    (SELECT &#xA; DISTINCT &#xA;
    CASE &#xA;
    WHEN charindex('//', [URL]) &gt; 0 THEN upper(substring(substring([URL], charindex('//', [URL]) + 2, len([URL]) - charindex('//', [URL]) - 1), 0, charindex('/', replace(substring([URL], charindex('//', [URL]) + 2, len([URL]) - charindex('//',
    [URL]) - 1), '.', '/')))) &#xA; WHEN charindex('\\', [URL]) &gt; 0 THEN upper(substring(substring([URL], charindex('\\', [URL]) + 2, len([URL]) - charindex('\\', [URL]) - 1), 0, charindex('\', replace(substring([URL],
    charindex('\\', [URL]) + 2, len([URL]) - charindex('\\', [URL]) - 1), '.', '\')))) &#xA;
    WHEN charindex('Multicast download complete. Master: ', [URL]) &gt; 0 THEN upper(substring([URL], charindex('Multicast download complete. Master: ', [URL]) + 37, len([URL]) - charindex('Multicast download complete. Master: ', [URL]) - 36)) &#xA;
    ELSE NULL &#xA; END COLLATE DATABASE_DEFAULT AS PackageDownloadSourceName&#xA;
    , 'Non-Altiris Server' COLLATE DATABASE_DEFAULT AS PackageDownloadSource...; 42000; The OLE DB provider &quot;SQLNCLI11&quot; for linked server &quot;ITANALYTICS_CMDB_SYMANTEC_CMDB_725_CZCHOWV319\SQL02_SYMANTEC_CMDB_3741&quot; reported
    an error. Execution terminated by the provider because a resource limit was reached.; 42000; OLE DB provider &quot;SQLNCLI11&quot; for linked server &quot;ITANALYTICS_CMDB_SYMANTEC_CMDB_725_CZCHOWV319\SQL02_SYMANTEC_CMDB_3741&quot; returned
    message &quot;Query timeout expired&quot;.; 01000." Source="Microsoft SQL Server 2012 Analysis Services" HelpFile="" />
            <Error ErrorCode="3240034316" Description="Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Package Download Source', Name of 'Package Download Source' was being processed."
    Source="Microsoft SQL Server 2012 Analysis Services" HelpFile="" />
            <Error ErrorCode="3240034317" Description="Errors in the OLAP storage engine: An error occurred while the 'Package Download Source Type' attribute of the 'Package Download Source' dimension from the 'IT Analytics'
    database was being processed." Source="Microsoft SQL Server 2012 Analysis Services" HelpFile="" />
            <Error ErrorCode="3238002695" Description="Internal error: The operation terminated unsuccessfully." Source="Microsoft SQL Server 2012 Analysis Services" HelpFile="" />
            <Error ErrorCode="3239837702" Description="Server: The current operation was cancelled because another operation in the transaction failed." Source="Microsoft SQL Server 2012 Analysis Services" HelpFile=""
    />
          </Messages>
        </root>
      </results>
    </return>
    Thank you very much for your Help.
    Regards,
    Reddeppa G

    Hi ReddeppaG2580,
    According to your description, you get the above error when processing a cube, right?
    Based on the error message, the issue occurs in the dimension 'Package Download Source'. Check the involved tables and the query for this dimension, and check the 'Package Download Source' attribute in the dimension. If you still can't find the issue, try
    recreating that dimension. Please refer to the link below:
    Create a Dimension by Using an Existing Table
    If you have any question, please feel free to ask.
    Simon Hou
    TechNet Community Support

  • Update Routine Issue

    Hello!
    This is with regards to my previous post.
    Help with Update Routines
    I'm facing an issue and need some help.
    I have a NUMC field (calendar month/year). I'm trying to map 0FISCPER to 0CALMONTH but am unable to do so. The system offers me just one option, 0AEDAT, as the source characteristic. The fiscal year variant is V6. When I assign it to 0AEDAT and try to activate, the error message is "IC=ZPM_C45 IS=8ZPM_O50 error when checking the update rules".
    I'm not sure how to proceed.
    Could someone help on this?
    Thanks!

    Hi,
    if you have to fill your fiscal year/period (0FISCPER) starting from a calendar year/month, you have to write a routine for it.
    Insert this code
    (replace calmonth in comm_structure-calmonth with the name of your calendar month field; the offsets below assume a seven-character calmonth format such as MMMYYYY):
    CALL FUNCTION 'UMC_CALMONTH_TO_FISCPER'
      EXPORTING
        I_PERIV    = comm_structure-fiscvarnt
        I_CALYEAR  = comm_structure-calmonth+3(4)
        I_CALMONTH = comm_structure-calmonth+1(2)
      IMPORTING
        E_FISCPER  = result.
    Hope it is clear now!
    Bye,
    Roberto
