Cube Transport issue

Hi All,
We have an existing cube that contains a lot of data.
We added a new field to the cube and activated it. Now, when we transport it to the Quality system, the transport fails.
We also checked the transport logs; they show the error: Program terminated (job: RDDEXECL, no.: 01302300), with return code 12.
Any inputs on this would be appreciated.
Regards,
Mayank

>
Arun Varadarajan wrote:
> Ravi,
> Addition of a characteristic too should be fine - for a cube, if you add characteristics, your dimension tables change - this means that you will have to regenerate all the SIDs in the dimension tables - which means that the DIM table has to be regenerated...
>
> But the DIM ID is a key field in the fact table - this means that you will have to drop and reload data when adding a characteristic. Even if you are only adding a characteristic, you would have to add another key to the fact table... which necessitates a data drop.
Hi Arun,
In the InfoCube star schema, the fact table does not store individual characteristic SIDs; it stores a generated dimension key (DIM ID) for each dimension. This is also why an InfoCube cannot have more than 13 customer dimensions: the fact table is limited in the number of keys it can have (16 dimension keys - 3 standard and 13 customer).
Hence, adding a characteristic to a cube is always fine. Check transaction LISTSCHEMA for any InfoCube to see its tables.
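To make this concrete, here is a deliberately simplified SQL sketch of the star schema. The table and column names are invented for illustration - they are not the /BIC/F*, /BIC/D* and SID tables that BW actually generates - but they show why adding a characteristic normally only touches a dimension table, not the fact table key:

    -- Illustration only: names are hypothetical, not the generated /BIC/ tables.
    CREATE TABLE dim_customer (
        dimid        INTEGER PRIMARY KEY,  -- generated dimension key (DIM ID)
        sid_customer INTEGER NOT NULL,     -- SID of one characteristic in this dimension
        sid_region   INTEGER NOT NULL      -- adding a characteristic to an existing
                                           -- dimension just adds another SID column here
    );

    CREATE TABLE fact_sales (
        dimid_time     INTEGER NOT NULL,   -- at most 16 dimension keys:
        dimid_unit     INTEGER NOT NULL,   -- 3 standard (time, unit, data package)
        dimid_package  INTEGER NOT NULL,   -- and up to 13 customer dimensions
        dimid_customer INTEGER NOT NULL REFERENCES dim_customer (dimid),
        quantity       DECIMAL(17,3),
        revenue        DECIMAL(17,2)
    );

Only if the new characteristic goes into a brand-new dimension would the fact table gain another DIM ID column; as long as it is added to an existing dimension, the fact table structure is untouched.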

Similar Messages

  • Transportation Issue

    Hi Experts,
    I have a transport issue, explained below:
    While transporting the entire data flow from DEV to Quality, the transport fails with return code 12 (Program terminated (job: RDDEXECL, no.: 15512700)), which is critical. When I check in Quality, there is a short dump with the following runtime error:
    Name of runtime error: TYPELOAD_LOST....
    Can anyone please let me know the cause of this error and how to overcome it?
    It's urgent....
    Regards,
    Gattu

    Your message doesn't include any error logs, so it is difficult to nail down the issue. But I believe you are trying to export all the objects in one huge transport request, where some dependencies may be missing. So split the objects into different transports and export them in sequence.
    Just repeating the transport again and again doesn't make the error go away; try to fix it.
    Follow this sequence:
    1. DataSources, InfoObjects, InfoAreas
    2. Cubes and DSOs
    3. Transformations
    4. DTPs and InfoPackages

  • SLD Transport issue

    Hi All,
    I have a transport issue with SLD objects. I have different SLDs for my Development and Quality systems.
    1) Is CMS an option for transporting SLD objects like business systems and technical systems?
    2) I have technical system XYZ on the Development system with 2 SWCVs installed for it, but in Quality the same system XYZ has three SWCVs installed. Now, if I transport technical system XYZ from Development to Quality, I want the additional SWCV installed for XYZ to be removed, but this is not happening. I even tried transporting the concerned SWCV and product from Development to Quality, but the installed software components still don't change as desired.
    Please tell me if this is possible and how it can be achieved.
    Thanks and Regards,
    Raj.

    Hi,
    Two types of transport mechanism are used in PI:
    • File transport method
    • CMS transport method
    An upcoming option is CTS+.
    Perform the following steps to export/import Integration Directory (ID) objects:
    1. To export or import directory objects, call the configuration maintenance screen of the Integration Builder.
    2. Call the context menu for an object in a collaboration profile in the Integration Builder navigation tree and choose Export…, or choose Tools → Export Configuration Objects…
    3. Select the Transport Using File System mode and follow the wizard’s instructions. When selecting individual objects, you can use drag and drop to drag the objects from the navigation tree and drop them in the object selection field.
    The Integration Builder saves a binary export file with the suffix .tpz in the export directory of the directory server.
    Do not change the file name of the export file. If you do, the Integration Builder will not accept it as the appropriate file when you import.
    4. To import the export file(s) to another Integration Directory, first copy or move it to the import directory of the target directory.
    5. Call the configuration maintenance screen of the Integration Builder for the target directory. Choose Tools → Import Configuration Objects...
    6. Select the export file saved in the import directory by using the dialog box that appears.
    If the import is successful, the export file is moved to a subdirectory of the import directory.
    To configure groups, transport targets, and the CMS, see:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/43f5d790-0201-0010-2984-ff72d822b109
    http://help.sap.com/saphelp_nw04/helpdata/en/de/a4214174abef23e10000000a155106/content.htm
    /people/daniel.wirbser/blog/2005/10/27/tcsfilecreateexception--error-while-assembly-of-software-components-in-nwdi
    http://help.sap.com/saphelp_nw04/helpdata/en/f6/719a2172f74b67b150612a7cd3b7df/content.htm
    http://www.sap-hefte.de/download/dateien/964/074_lesesprobe.pdf
    Overview of Transition from Dev to QA in XI
    /people/sravya.talanki2/blog/2005/11/02/overview-of-transition-from-dev-to-qa-in-xi
    /people/sap.india5/blog/2005/11/09/xi-software-logistics-ii-overview
    regards
    Aashish Sinha
    PS : reward points if helpful

  • When transporting multiproviders and cubes, the transport failed.

    HI Experts,
    When transporting cubes and multiproviders in one request, the transport failed with return code 8.
    The log errors are described below:
    Start of the after-import method RS_CUBE_AFTER_IMPORT for obje
    Error/warning in dict. activator, detailed log    > Detail   
    /BIC/L101078 (specify a primary key)                         
    Table /BIC/L101078 could not be activated                    
    Return code..............: 8                                 
    DDIC Object TABL /BIC/L101078 has not been activated         
    Error when activating InfoCube 101078                        
    Error/warning in dict. activator, detailed log    > Detail   
    /BIC/L101077 (specify a primary key)                         
    Table /BIC/L101077 could not be activated                    
    Return code..............: 8                                 
    DDIC Object TABL /BIC/L101077 has not been activated         
    Error when activating InfoCube 101077                        
    Error/warning in dict. activator, detailed log    > Detail   
    /BIC/L101078 (specify a primary key)                         
    Table /BIC/L101078 could not be activated                    
    Return code..............: 8                                 
    DDIC Object TABL /BIC/L101078 has not been activated         
    Error when activating InfoCube 101078                        
    Error/warning in dict. activator, detailed log    > Detail   
    /BIC/L101077 (specify a primary key)                         
    Table /BIC/L101077 could not be activated                    
    Return code..............: 8                                 
    DDIC Object TABL /BIC/L101077 has not been activated         
    Error when activating InfoCube 101077                        
    Error/warning in dict. activator, detailed log    > Detail   
    /BIC/L101078 (specify a primary key)                         
    Table /BIC/L101078 could not be activated                    
    Please let us know if anyone has come across a similar kind of issue.
    Regards,
    Monalisa Mohanty

    The issue seems to be with the aggregates maintained on the cube.
    Were the aggregates properly maintained in your Dev system when you collected the cube in your transport?
    Check that, and also use the overwrite option when you import the transport into the target systems.
    If the aggregates are properly maintained, retry the import and see if it works fine in overwrite mode. If not, you need to collect your objects again in a new transport.
    Hope this helps.
    Murali

  • SAP BW 7.31 - Transport Issue with Generated Objects

    Team,
    We have two landscapes, Business As Usual (BAU, on SAP BW 7.01 SP8) and a Project landscape (on SAP BW 7.31 SP10). The Project landscape Development system is a copy of the BAU Development system and was upgraded to 7.31. The same applies to the QA system.
    Issue: We are trying to collect InfoProviders (which were created in 7.01 SP8) into a transport request in the 7.31 system, but they cannot be collected into a transport request. We get the message below.
    Object CUBE 0FIGL_R10 is generated and cannot be transported
    Message no. RSO887
    Diagnosis
    The object CUBE 0FIGL_R10 is generated (for example, for a semantically partitioned object) and cannot be transported.
    System Response
    The object is not written in a transport request.
    Procedure
    You do not need to do anything. The main object (the semantically partitioned object) simply needs to be transported. Here the object CUBE 0FIGL_R10 is automatically created in the target system.
    Could you please let me know if you have come across the above situation and its solution.
    Warm Regards,
    Surya

    Hi,
    you have not yet collected the cube 0FIGL_R10.
    Go to RSA1 -> select Transport Connection -> select Object Types -> select InfoCube -> expand the node -> double-click on Select Objects -> enter the cube name 0FIGL_R10 -> choose Transfer Selection.
    Collect the dependent objects such as the InfoPackage, DataSource, DSO, cube, DTP, and transformations.
    Now click on the truck symbol in the standard toolbar.
    Now it will be collected.
    Then transport it to the target system.
    Thanks,
    Phani.

  • Planning area transport issue - Urgent

    Hello Everyone,
    We are on SCM 5.1. My planning area already contains data and CVCs. When I change the key figure disaggregation calculation type in the planning area and transport the change from Dev to QA, all the CVCs and data in the QA system get deleted. One thing to note is that there is a message in the transport log - "No valid storage bucket profile exists" - but that storage bucket profile does exist in QA, and I made sure to run the time series consistency check in Dev and QA before moving the transport.
    We opened an OSS message with SAP and they released 2 OSS notes, but even after applying these 2 notes the issue is not fixed. Supposedly, in SCM 5.1 planning area changes can be transported to the target system without data loss. I wanted to find out if anyone has come across this issue, or has transported planning area changes after go-live without data loss in SCM 5.0 or 5.1. I would appreciate a quick reply to this message. Thanks.
    Regards,
    Mohammed.

    Hi,
    Do I need to run the report RSDG_IOBJ_ACTIVATE for all the characteristics and key figures, or just for the key figures I have changed in the planning area? Please note that I have only changed the disaggregation type in the planning area, not anything related to that key figure in RSA1.
    Do I also need to run the report RSDG_CUBE_ACTIVATE for the internal cube of the planning object structure?
    Please elaborate. Thanks.
    Regards,
    Mohammed.

  • Simultaneous read and write to a cube - critical issue

    Hello xperts,
    I am facing a critical issue.
    I have cube A, which holds a lot of data. I need to copy the data from cube A to a new cube B for backup purposes. This activity will take a lot of time.
    Concern 1 - At night we run the master data job, which includes the attribute change run. Will this have any effect on the load from cube A to cube B?
    Concern 2 - After the master data job finishes, we run the transaction data jobs, which update cube A with delta data. Can this delta load into cube A happen while the data load from cube A to cube B is still running?
    Please help me out with this ASAP.
    Thanks & Regards
    Rohit

    Rohit,
    Attribute change runs are affected only when you drop or rebuild indexes - that activity locks the cube, and any activity that locks the cube affects the attribute change run. Reads on a cube do not lock the cube.
    However, while you are loading data from cube A into the new cube, you should not load data into cube A - because a load into cube A drops its indexes, and that will affect the data load into the new cube.
    Arun

  • Multiple columns (named the same originally) and mapped to the same lookup table are causing a Cube Build issue

    Hey folks, looking for some insight here.
    I have an implementation that contains some custom Enterprise columns mapped to lookup tables. In the instance I'm working with now, it looks like there was/is an issue with one of those columns. In this scenario, I have a column named ProjectType, created initially with that name and mapped to a lookup table. This field's name was then changed to Project Type. After that, it looks like another column was created, also called ProjectType. So now we have what I would have originally thought were two distinct columns, even though the names used are the same.
    Below is the error we're currently getting during the cube build process...
    PWA:http://ps2010/PWA, ServiceApp:Project Web App, User:DOMAIN\user, PSI: SqlException occurred in DAL:  <Error><Class>1</Class><LineNumber>1</LineNumber><Number>4506</Number><Procedure>MSP_EpmProject_OlapView_B8546719-4D4C-473A-84B1-89DEDA2307E0</Procedure> 
    <Message>  System.Data.SqlClient.SqlError: Column names in each view or function must be unique. Column name 'ProjectType' in view or function 'MSP_EpmProject_OlapView_B8546719-4D4C-473A-84B1-89DEDA2307E0' is specified more than once.  </Message> 
    <CallStack>   
     at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)   
     at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)   
     at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)   
     at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)   
     at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async)   
     at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result)   
     at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe)   
     at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()   
     at Microsoft.Office.Project.Server.DataAccessLayer.DAL.SubDal.ExecuteStoredProcedureNoResult(String storedProcedureName, SqlParameter[] parameters)  </CallStack>  </Error>
    I've tried deleting the one column, but the build still gives the above error.
    Any thoughts as to how the above could be resolved?
    Thanks! - M
    Michael Mukalian | Jan 2010 - Dec 2010 MVP SharePoint Services | MCTS: MOSS 2007 Configuration | http://www.mukalian.com/blog

    We tried taking it out of the cubes, and it builds fine. The challenge we're having is building the cubes with that custom field "ProjectType". It's as if the cubes still hold some reference to it even after it's deleted.
    Since the OLAP view ('MSP_EpmProject_OlapView_{guid}') is recreated, would it be as simple as deleting that view and trying to recreate it? (A T-SQL sketch of that check follows this reply.)
    Thanks - M
    Michael Mukalian | Jan 2010 - Dec 2010 MVP SharePoint Services | MCTS: MOSS 2007 Configuration | http://www.mukalian.com/blog
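    For reference, here is a hedged T-SQL sketch of that drop-and-recreate check. The view name (with its GUID) is copied from the error above - verify it against your own reporting database and back the view definition up before dropping anything:

        -- Illustration only: adjust the view name/GUID to match your environment.
        -- 1) If the generated view still exists, list any column names it carries more than once.
        SELECT COLUMN_NAME, COUNT(*) AS occurrences
        FROM INFORMATION_SCHEMA.COLUMNS
        WHERE TABLE_NAME = 'MSP_EpmProject_OlapView_B8546719-4D4C-473A-84B1-89DEDA2307E0'
        GROUP BY COLUMN_NAME
        HAVING COUNT(*) > 1;

        -- 2) Drop the generated view so the next cube build can recreate it from the current field metadata.
        IF OBJECT_ID(N'dbo.[MSP_EpmProject_OlapView_B8546719-4D4C-473A-84B1-89DEDA2307E0]', N'V') IS NOT NULL
            DROP VIEW dbo.[MSP_EpmProject_OlapView_B8546719-4D4C-473A-84B1-89DEDA2307E0];

    Whether the rebuild then succeeds still depends on the duplicate ProjectType field being fully removed from the custom field metadata, so treat this as a diagnostic step rather than a guaranteed fix.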

  • Transport issue when copying a web dynpro component

    Hello All.
    I have a Web Dynpro component, let's say ZTEST, and I want to copy it. I gave the new one a different name, ZFINAL. When I copied it, it asked me for a transport, and when I chose to create a new transport I got the message "Class ZIWCI_ZFINAL does not exist"; I then clicked the green arrow again and it created a transport for me. After this I activated my Web Dynpro component successfully; it has 3 views and 3 windows. When I looked at my new transport (for ZFINAL), I only saw three objects attached to the transport: 1. Interface, 2. Info Object from the MIME Repository, and 3. Web Dynpro Component. I did not see Controller (Web Dynpro), Definition (Web Dynpro), or View (Web Dynpro) entries in my transport. Now I wonder whether I will have issues when I release my transport to Q.
    Note: My old transport (which contains ZTEST) has all the missing objects: Controller (Web Dynpro), Definition (Web Dynpro), and View (Web Dynpro).

    I resolved it myself. I had to write the transport entries manually for the views and windows, and even for the component controller definition.

  • Transportation issue: Dimension table

    Hi
    While transporting an InfoCube I am getting the following error (RC 8):
    Key field /BIC/EABC-KEY_ABC05 missing. Specify maintenance status 'read only'.
    View /BIC/VABCF could not be activated
    A new characteristic was added to this InfoCube and assigned to a newly created dimension,
    where ABC is my cube name and ABC05 is the dimension.
    Any help on this?
    Rgds,
    Ilyas

    Hi,
    Is your cube in the active version when you transport it?
    When you add new fields, you have to activate the update rules and also map the fields. If not, you have to write a routine for that particular field.
    Reg
    Pra

  • Inventory 0IC_C03 Cube Transformations issue BI 7

    Hi Experts,
    we are installing the Inventory cube 0IC_C03 with the 'in data flow' option in BI 7. The 2LIS_03_BF transformation is in active state, while the 2LIS_03_BX and 2LIS_03_UM transformations are inactive,
    and most of the key fields in the cube are not mapped to InfoSource fields.
    Has anybody implemented Inventory in a BI 7 system and faced this issue? Can you please share how you resolved it?
    We are on BI Content release 703, level 11.
    Regards,
    Raj

    Raj,
    There is no need to check the start routine code for this.
    Go to the transformation --> change mode --> choose Rule Group (at the top of the window, in the middle of the screen) --> choose 05 --> display the mapping --> check whether routines are available or not.
    Confirm back whether you are able to see the field routine or not.
    Check doc:  [Rule Groups in Transformation|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/90754b76-bcf1-2a10-3ba7-b299b2be09f2]

  • SAP BI 7.0 Transport issue with HR Structural Authorization DSO

    Hi,
    I am trying to transport the HR structural authorization DSO objects in BI 7.0 from the Dev to the QA system. The DataSources are 0PA_DS02 and 0PA_DS03. (I am aware that there are lots of changes to the authorization concept in BI 7.0.)
    1. Please suggest whether I need to make any changes or tests before moving these authorization objects to the QA system.
    2. Also, do I need to take any precautions while activating the Business Content objects 0TCTAUTH and 0TCTAUTH_T (the DataSources look like they are from 3.x), as I am getting an issue with the activation of the transfer structure for these objects?
    Thanks a lot for your valuable inputs.
    Regards
    Paramesh
    Edited by: paramesh kumar on May 5, 2009 12:45 AM

    Hi Paramesh.
    You can use the DSOs 0PA_DS02 and 0PA_DS03 in BI7.0 as well. You just need to use the new generation of analysis authorizations in transaction RSECADMIN.
    You can also use 0TCTAUTH and 0TCTAUTH_T in BI 7.0; however, we have experienced some problems with the 0TCTAUTH_T extractor, which dumped because of a poorly designed SELECT statement that was unable to cope with 10,000 records. We replaced it with a generic DataSource that reads table RSECTEXT directly.
    Regards,
    Lars

  • Transport issue- Process chain

    Hi Folks,
    I am facing an issue when transporting process chains.
    I am transporting a set of process chains using the transport connection. a) When I choose all the objects, create a request, and save it, I get a message saying 'Specify request'; however, the request is generated.
    b) When I transport the request, I get the error "Source system does not exist".
    I checked the conversion of logical system names in RSLOGSYSMAP and I don't find any issue there.
    I would appreciate a quick response.
    Thanks,
    Ravi

    Thanks Sheetal,
    My settings are fine.
    I tried removing the InfoPackage, but it did not make much of a change. However, after the 'Specify request' prompt is repeatedly confirmed, I get the error "Errors occurred when saving dependent object R3TR/PROG/ZMAJ_SUIVI of RSPV ABAP      MAJ_SU".
    I guess it is related to a process variant, but as per my understanding process chains can be transported with their variants.
    Any help??
    Regards,
    Ravi

  • Transport issue in NWDI

    Hi ,
    I am facing an issue while transporting a BPM project.
    My scenario is:
    I am using a single SLD for the Dev and Test environments.
    We already have the BPM deployed on the Dev and Test environments.
    We received a new change for the existing BPM. We undeployed the existing BPM on the Dev environment and created a new BPM on Dev with the changes added.
    The changed BPM was deployed on Dev successfully and also tested.
    After we transport the new, changed BPM to the Test environment, importing the TR gives the following error:
    -> Cannot deploy two applications with identical application aliases (context roots). Your application [xxx.com/logistics~xxx_bp
      m] defines an application alias (context root) [{urn_xxx_xxx_PurchaseOrderNotification}OutboundDeliveryConfirmation
      _In] that is already in use by application [xxx/logistics~xxx. There are two possibilities in order to proceed with
      deployment: 1) Define different application alias (context root) for your application [xxx/logistics~xxx_bpm] (this is the
      preferred option). 2) Undeploy the application [xxx/logistics~delivery~xxx'. "
    But when I try to log in to the NWDI track of the Test system to undeploy it, it does not show any details.
    Any ideas on my issue?
    Regards,
    SP

    Hi SP,
    You can undeploy your application from the Test environment by using the Undeploy view in your NWDS.
    Please also go through the thread below, which should be helpful:
    Undeploy/Delete the BPM process dc in CE7.2 from server
    To undeploy applications using NWDS, please follow the blog below, which gives a step-by-step procedure:
    NWDS step by step (In the loving memory of SDM)
    BR,
    Anurag

  • Inventory Cube performance Issue

    Hi All,
    This is nothing new, but a long-standing issue with this cube. I have customized 0IC_C03 for my requirements and am having serious performance issues. It has 0CALDAY, 0MATERIAL, and 0PLANT as non-cumulative value parameters. I have added movement types (temporarily, for validation purposes). But my query always times out unless I specify the material, and there are close to 40K materials being maintained. The values all match between ECC and BI after the data loads. So we are thinking the snapshot approach may help us resolve the performance issues.
    Has anybody implemented the snapshot approach for inventory? I know it shifts the effort to the loading side, but we think we could deal with that rather than with the performance issue when users execute the query. (A SQL sketch of the snapshot idea appears at the end of this thread.)
    If anybody has done it, could you provide the steps?
    Thanks,
    Alex.

    Hi Jameson - thanks for your response.
    We thought that would be the case. We have raised an SR with Oracle and they are investigating it. We have also sent an EIFF file to Oracle for investigation.
    Both DBs are in the same environment (AIX 6.1), and the DBAs have confirmed that both DBs have the same system parameters.
    Even leaving aside the comparison with 11.2.0.1, for some reason 11.2.0.3 seems to be very slow. Even a simple cube (2 dimensions and 2 measures) with 9K records takes around 15 minutes to refresh, and it takes ages to view the data.
    We haven't generated the AWR report yet; we will see if we can do that.
    rgds,
    Prakash S
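    To illustrate what the snapshot approach means in practice, here is a hedged SQL sketch. The table and column names are invented for illustration (they are not the actual BW-generated tables): instead of letting the non-cumulative key figure reconstruct stock from movements at query time, you persist a closing stock per period and read it directly.

        -- Illustration only: 'movements' and its columns are hypothetical.
        -- A snapshot cube would store 'closing_stock' per month, so queries read it
        -- directly instead of rolling movements forward from the stock marker.
        SELECT calmonth,
               material,
               plant,
               SUM(SUM(quantity)) OVER (PARTITION BY material, plant
                                        ORDER BY calmonth) AS closing_stock
        FROM movements
        GROUP BY calmonth, material, plant;

    The trade-off is exactly the one mentioned above: the load has to build these period snapshots, but queries no longer pay the non-cumulative aggregation cost.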
