Error_message while archiving the cube

Hi,
I am archiving a cube. The job gets started:
Step 001 started (program GP4IY4269LUILPMJOZFNME9F9SA, variant ZARV_XXXX, user ID EXXXXX)
Fill in all required entry fields
Job cancelled after system exception ERROR_MESSAGE
and it ends with the error message shown above. But I have entered all the required fields. Has anyone
come across this type of error? Please advise what should be done.
Regards,
Karthik

Hello Karthik,
Please check if there are any records in the cube that are not yet compressed.
Before you can perform time slice archiving, the data that you want to archive
has to be completely compressed.
If this does not help, then please provide more details:
What SP are you on for BW?
Are you receiving any other error messages in the job log? Any dumps?
Best regards
Barry

Similar Messages

  • Issue while archiving the processed file in sender communication channel using SFTP adapter

    Hi All,
    In one of my scenarios (File to IDOC), we are using an SFTP sender communication channel.
    We are facing an issue while archiving the processed file. Sometimes PI processes the file successfully but is unable to archive it, and in the next poll PI processes and archives the same file again, which creates duplicate orders in ECC.
    Please let us know how to resolve this issue.

    Hi Anil,
    Refer to the archiving concepts in the links below:
    http://help.sap.com/saphelp_nw73/helpdata/en/44/682bcd7f2a6d12e10000000a1553f6/content.htm?frameset=/en/44/6830e67f2a6d12e10000000a1553f6/frameset.htm
    http://scn.sap.com/docs/DOC-35572
    Warm Regards,
    DNK Siddhardha.

  • Error while archiving the repository

    Hi all
    We are facing an error while archiving the repository. It gives the following error while archiving:
    Error reading blob from A2i_CM_XMLschema table
    Operation ended in ERROR : 84020008H : Database binary object error.
    Can anybody help me with this?
    I am not getting any error while working with the repository. It works fine; it is only while archiving that it gives this error.
    Thanks in advance.

    I tried doing that. Now, after compacting, I am not getting a warning while verifying, but while archiving I am getting the following error:
    Error reading blob from A2i_CM_XMLSchema table
    $$$ Operation ended in ERROR : 84020008H : Database binary object error.
    Also, I tried unarchiving from an earlier archived file, but for unarchiving it also gives me the following error:
    MDM Repository data is out of date or locked by another MDM Server. Refresh the data and try the operation again.
    I checked; there is no other MDM server accessing the repository. What does it mean by refreshing the data?
    Are the above two problems interrelated?
    Can anybody help me with this?
    Thanks in advance

  • While browsing the cube data in Excel the circle pointer starts to spin and Excel goes into a not-responding state; any recommendations to improve performance in Excel?

    Hi,
    While browsing the cube data in Excel the circle pointer starts to spin and Excel goes into a not-responding state; are there any recommendations to improve performance in Excel?
    I have 20 measures and 8 dimensions.
    While filtering data by using dimensions in Excel it takes a lot of time.
    Ex:
    I browsed 15 measures in Excel and filtered the data based on time (quarter-wise) and another dimension, product. It takes a long time to get the data.
    Can you please help on this issue.
    Regards,
    Samba

    Hi Samba,
    What are the versions of your Office Excel and SQL Server Analysis Services? It would be helpful if you could share detailed computer resource information from when you encountered this issue.
    In addition, we don't know your cube structure and the underlying relationships. But you can take a look at the following articles to troubleshoot the performance issue:
    Improving Excel's Cube Performance:
    http://richardlees.blogspot.com/2010/04/improving-excels-cube-performance.html
    Excel Against a SSAS Cube Slow: Performance Improvement Tips:
    http://www.msbicentral.com/Blogs/tabid/131/articleType/ArticleView/articleId/136/Excel-Against-a-SSAS-Cube-Slow-Performance-Improvement-Tips.aspx
    Regards, 
    Elvis Long
    TechNet Community Support

  • Error while transporting the cube

    Hi all,
    While I was transporting my cube from the development box to the acceptance box, the following error occurred.
       Execution of programs after import (XPRA)
       Transport request   : GCDK905220
       System              : NCA
       tp path             : /usr/sap/NCA/DVEBMGS01/exe/tp
       Version and release: 372.04.10 700
       Post-import methods for change/transport request: GCDK905220
          on the application server: bdhp4684
       Post-import method RS_AFTER_IMPORT started for CUBE L, date and time: 20080802045222
       Program terminated (job: RDDEXECL, no.: 04522200)
          See job log
       Execution of programs after import (XPRA)
       End date and time : 20080802045237
       Ended with return code:  ===> 12 <===
    Actually the transport failed, but the cube was present in the acceptance box in an inactive state.
    Please help me for the same.
    With thanks and regards
    Mukul Singhal

    Hi Mukul,
    This is an RC=12 issue. Contact the Basis team, or raise a ticket with them stating that your transport request was terminated. The Basis team will resolve the issue and then you can import the transport into your acceptance system.
    I hope it will help you.
    Regards
    Suresh B.G.

  • 'dimension attribute was not found' error while refreshing the cube in Excel

    Dear All,
    Thanks for all your support and help.
    I need urgent support from you, as I have been stuck here for nearly 2-3 days and am not getting a clue about it.
    I have renamed a dimension attribute 'XX' to 'xx' (caps to small) in my cube and cleared all dependencies (in the attribute relationship tab as well).
    But while refreshing the data in the Excel client (of course after fully processing the cube) I get the error:
    'Query (1, 1911) the [Dimension Name].[Hierarchy Name].[Level Name].[XX] dimension attribute was not found' error.
    Here I am trying to refresh an existing report, without any changes, with the new data.
    Shouldn't it refresh automatically if we clear the dependencies, or is there something I am missing here (like the order of the dependencies)?
    The cube processed completely after the modifications and I am able to create new reports without any issues for the same data. What else could be the reason?
    Can someone help me here?
    Thanks in Advance and Regards,

    Thanks a lot Vikram,
    In SE11, when I was trying to activate /BIC/FZBKUPC, it shows warnings like:
    Key field KEY_ZBKUPCP has num. type INT4: Buffering not possible, Transport restricted. What does this mean?
    In the Performance window it shows:
    A numeric type was used for a key field. A table of this sort cannot be buffered. When buffering, the table records for numeric values cannot be stored in plain text.
    Procedure
    You can enter "no buffering" as an attribute of the technical settings and then repeat the activation process. The table is not buffered.
    If you want to buffer the table, you must replace the type in the key field by a character type, i.e. that you must modify the table.
    Please advise with your valuable suggestions.
    Thanks Vikram.

  • Run Time error while activating the cube 0pp_c03

    Hi All,
    I encountered this error while activating the InfoCube 0pp_c03:
      An exception occurred which is explained in detail below.
      The exception, which is assigned to class 'CX_RSR_X_MESSAGE', was not caught
      and
      therefore caused a runtime error.
      The reason for the exception is:
      No text available for this exception
    Regards
    Rohit

    Hi,
    Please check in ST22 for any relevant short dump. The error message is not of much help in analyzing the problem.
    or
    Perform the consistency test in RSRV --> All Elementary Tests --> Master Data --> 'Compare Number Range and Maximum SID for the characteristic 0REQUID', and correct the error.
    Regards,
    Marasa.

  • Getting error while Archiving the IDOCS

    Hello,
    I am using the archiving object IDOC.
    I am getting the error "Infostructure SAP_IDOC_001 contains error" when running in production mode.
    I tried to run the same in test mode and there was no error.
    The error is occurring in the delete program.
    I tried to debug it and figured out that the error occurs while selecting the archive file to delete.
    The IDOC has been written successfully to the archive file path.
    I am not able to figure out why it gives the error when running in production mode.
    Thanks
    Nishant

    If SAP gives an error "Infostructure SAP_IDOC_001 contains error", then I would focus on this.
    Of course this error can only happen in production mode, as SAP only updates the infostructure then; it does not update infostructures in test mode.
    Use SARI, click Customizing, enter infostructure SAP_IDOC_001 and check whether you can activate the infostructure without errors.
    Check whether OSS note 1358690 is of some help. Otherwise open a message with SAP directly.

  • Error for the fact table while processing the cube - attribute key cannot be found when processing

    Please help, as I am new to SSAS and this is an urgent requirement. This is a MOLAP cube, and below is the error that I am receiving when processing the cube. The cube is set to Process Full. Several similar errors pop up for various dimensions.
    "Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'Fact_Table', Column: 'ID', Value: '1'. The attribute is 'Id'. Errors in the OLAP storage engine: The attribute key was converted to an unknown member because
    the attribute key was not found. Attribute Id of Dimension: 17 - Ves - PoC Cont from Database: DB, Cube: IPNCube, Measure Group: iSrvy, Partition: Partition1, Record: 1."
    Thanks in advance.

    Thanks for the recommendations David.
    It will be really great if you can clear some of my doubts:
    To my knowledge, all the dimensions need to be processed first and then the fact table is processed.
    So if the IDs are not present in the dimension tables, then they should not be present in the fact table either.
    Here we found null values in the dimension table while the IDs were present in the fact table. What might be the reasons causing such a situation? (See the sketch below for a quick way to list such orphaned keys.)
    Also, how frequently does the cube need to be processed? Currently the ETL which processes the cube is scheduled as a SQL Server Agent job on an hourly basis every day.
    Is there any possibility that the cube might still be in a processing state while the SQL job for the next run executes and tries to access and process the cube?
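    For what it's worth, a minimal sketch (Java/JDBC) of the kind of check that helps here: it lists fact keys that have no matching row in the dimension table, which is the situation that produces the "attribute key cannot be found" error during a full process. The JDBC URL, credentials and the Dim_Table name are placeholders (only Fact_Table and the ID column come from the error message), and it assumes the Microsoft SQL Server JDBC driver is on the classpath.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class OrphanedKeyCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details for the relational source of the cube.
            String url = "jdbc:sqlserver://localhost:1433;databaseName=DB";
            try (Connection con = DriverManager.getConnection(url, "user", "password");
                 Statement st = con.createStatement();
                 // Fact rows whose ID has no matching row in the (hypothetical) dimension table.
                 ResultSet rs = st.executeQuery(
                     "SELECT DISTINCT f.ID FROM Fact_Table f "
                     + "LEFT JOIN Dim_Table d ON d.ID = f.ID "
                     + "WHERE d.ID IS NULL")) {
                while (rs.next()) {
                    System.out.println("Orphaned fact key: " + rs.getString(1));
                }
            }
        }
    }

    Any keys returned here either need to be loaded into the dimension table or handled via the dimension's unknown member before the next full process.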

  • "Transaction Errors : Aborting transaction on session " while processing the cube

    HI Team,
    I have developed a cube and successfully deployed it to the SSAS server.
    But when I process the cube, the measures in the cube get processed successfully. After that the process keeps running and shows the status "Transaction errors: aborting transaction on session XYZAB".
    Can you please guide me in solving this issue? The cube takes more than 6 hours to process.
    thanks in advance
    baskar k

    Hi,
    I have a similar issue with 2005, and in 2005 I can't execute select * from $system.discover_sessions.
    Do we have any other way to resolve it?
    If I restart the SSAS server it starts working fine, but I can't restart it during the day.
    http://blogs.msdn.com/b/sql_pfe_blog/archive/2009/08/27/deadlock-troubleshooting-in-sql-server-analysis-services-ssas.aspx
    Thanks Shiven:) If Answer is Helpful, Please Vote

  • What are the precautions to be taken while archiving POs and material documents

    Hi All,
    I want to archive purchase orders and material documents. Is there any standard rule to follow for this procedure? What are the further consequences for the subsequent documents? And how do you check for open POs?
    I have archived PRs and invoices, but these are not interlinked.
    Could you please help me with this? I would be very thankful to you.
    Regards,
    Ram

    Dear Ramesh,
    Go to transaction ARCHGUIDE --> Application-specific analyses and settings --> MM --> MM_EKKO and MM_MABTEL.
    You can follow the documentation and the link to SAP Help for all the guidance you need for this archiving.
    Regards,
    w1n

  • Fact table with datetime measure showing #value error while browsing the cube

    Hi All,
    I have a cube with a fact table that has a datetime measure.
    When I browse the cube, I am able to see the data for all measures except for the measure with datetime as its data type.
    Thanks in advance.

    Hi jarugulalaks,
    Actually this forum is to discuss:
    Visual Studio WPF/SL Designer, Visual Studio Guidance Automation Toolkit, Developer Documentation and Help System, and Visual Studio Editor.
    To understand this issue clearly, would you mind giving us more information about it? Is it a VS IDE issue? Which language are you using? Which kind of app are you developing? Maybe you could share a screenshot of it.
    But like this case posted by you here:
    https://social.msdn.microsoft.com/Forums/vstudio/en-US/bc2d30b8-a60d-4f0f-a273-b7cf0f5aaed5/value-error-for-datetime-measure-in-ssas?forum=visualstudiogeneral#bc2d30b8-a60d-4f0f-a273-b7cf0f5aaed5
    If it is an SSAS issue, please post it to the SSAS forum for dedicated support.
    Best Regards,
    Jack

  • java.sql.SQLRecoverableException error while querying the cube

    Hi
    I get the following error when I try to query the cube from a Java program
    java.sql.SQLRecoverableException: no more data to read from the socket.
    This error occurs sometimes and sometimes it does not.
    We observed that the error crops up if many people try to query the cube simultaneously.
    Is this a bug, or is there a solution or a method to deal with this sort of situation?

    That sounds like the connection to the database is not there, or was dropped. Check that you are not maxed out on the number of processes.
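    Not a fix for the root cause, but a minimal sketch of how the Java client could cope with this: validate the connection and, on java.sql.SQLRecoverableException, open a fresh connection and retry the statement once. The JDBC URL, credentials and query are placeholders, not taken from this thread.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLRecoverableException;
    import java.sql.Statement;

    public class RetryQuery {
        private static final String URL = "jdbc:oracle:thin:@//dbhost:1521/service"; // placeholder
        private static final String QUERY = "SELECT 1 FROM dual";                    // placeholder

        public static void main(String[] args) throws Exception {
            try {
                run(openConnection());
            } catch (SQLRecoverableException e) {
                // The connection was dropped ("no more data to read from socket"):
                // open a fresh connection and retry the statement once.
                System.err.println("Recoverable error, retrying: " + e.getMessage());
                run(openConnection());
            }
        }

        private static Connection openConnection() throws Exception {
            Connection con = DriverManager.getConnection(URL, "user", "password");
            if (!con.isValid(5)) {              // 5-second validation timeout
                con.close();
                throw new SQLRecoverableException("connection failed validation");
            }
            return con;
        }

        private static void run(Connection con) throws Exception {
            try (Connection c = con;
                 Statement st = c.createStatement();
                 ResultSet rs = st.executeQuery(QUERY)) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }

    If the errors cluster under concurrent load, it is also worth checking the database side for dropped sessions or process limits, as suggested above.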

  • Oracle Express Error while building the cube

    When we try to build the Express cube using Relational Access Manager's hybrid maintenance, we encounter the following error raised by the exception manager:
    "SYSTEM ERROR RSALOC02"
    We are using ODBC drivers for connecting to and fetching data from the Oracle RDBMS. When we checked the log file of the cube, junk characters were printed at the end of the log:
    signal errorname errortext
    2 FCH.D3 FCH.D4 FCH.D5 FCH.D6>-
    8 THEN-
    ' ' _sqlerrm)
    p
    2   hkT2 ! _W2p )  1 @ H H>8 @U-0 r

    It could be the version of ODBC you are using, or some invalid characters in the data you are trying to fetch. Since this issue is very specific, you would be better off opening a TAR on Metalink. Thanks.

  • Error message while archiving the PO on version 4.6C

    Hello Team,
    We are archiving POs in SAP version 4.6C.
    Some of the POs failed with the error message below: "4203018801 Item 00002 still entered in info record as last document item".
    I have set the deletion flag for the info record and tried to archive it again, but it still failed.
    Can anyone help me with how to overcome this error?

    Hi,
       Ensure that the deletion indicator has been set on the correct info record. Please check in table EINA whether the deletion indicator (EINA-LOEKZ) is flagged for the material and vendor combination.
       If everything is flagged correctly and the system is still throwing the same error message, please check note 335800 - SARA: Deletion indicator not set, but why?
    Regards,
    AKPT
