HFM Data Load Issue

Hello All,
We had an EPMA-type HFM application in which all dimensions were local. The application validated and deployed successfully.
We tried loading data into the HFM application and the data load was successful.
Then we decided to convert all of the local dimensions of this application into shared dimensions. After converting all the dimensions to shared successfully, we are getting errors while loading data into the same HFM application (the app still validates and can be deployed after the changes).
The Error log is below:
Load data started: 11/29/2014 10:53:15.
Line: 216, Error: Invalid cell for Period Oct.
ACTUAL;2014; Oct; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;11979
>>>>>>
Line: 217, Error: Invalid cell for Period Nov.
ACTUAL;2014; Nov; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;23544
>>>>>>
Line: 218, Error: Invalid cell for Period Dec.
ACTUAL;2014; Dec; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;58709
>>>>>>
Line: 219, Error: Invalid cell for Period Oct.
ACTUAL;2014; Oct; YTD; E_2100;<Entity Currency>;28050000; E_6000_20; [None]; [None]; [None]; FARM21000;-11979
>>>>>>
Line: 220, Error: Invalid cell for Period Nov.
ACTUAL;2014; Nov; YTD; E_2100;<Entity Currency>;28050000; E_6000_20; [None]; [None]; [None]; FARM21000;-11565
>>>>>>
I wanted to know whether there is something I might have missed while converting the local dimensions into shared ones (whether there is a required sequence for doing so, or any constraint I may not be aware of; the conversion itself looks good, as the application validates and deploys after the changes).
What could be the reason for the failed data load? Can anyone help?
Thanks
Arpan

Hi,
I would look at the properties of that account (89920000), in particular the TopCustom1...4Member settings. There you will find the reason behind the invalid cells.
When you converted the local dimensions to shared, did you check the 'Dimension Association' for Accounts and Entities?
The application does seem to lose the dimension associations if a proper sequence is not followed.
Regards,
S
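
To illustrate S's point about TopCustom1...4Member: a cell is valid in a Custom dimension only if the member on the data line sits at or below the account's top member for that dimension. Below is a minimal sketch of that rule in Python; the hierarchy and member names are hypothetical, not taken from the application above.

# Minimal sketch of HFM's TopCustomNMember validity rule.
# Hierarchy and member names are hypothetical examples.
PARENT = {
    "CORP": "AllCustom1",       # hypothetical parentage
    "FARM21000": "AllCustom1",  # hypothetical parentage
    "[None]": None,
}

def is_descendant_or_self(member, top):
    """True if `member` equals `top` or sits below it in the hierarchy."""
    while member is not None:
        if member == top:
            return True
        member = PARENT.get(member)
    return False

# If account 89920000's TopCustom1Member had fallen back to "[None]",
# a line posting CORP in Custom1 would be rejected as an invalid cell:
print(is_descendant_or_self("CORP", "[None]"))      # False -> invalid cell
print(is_descendant_or_self("CORP", "AllCustom1"))  # True  -> valid cell

If the conversion to shared dimensions dropped the dimension associations, the accounts may have reverted to restrictive top members, which would explain why previously loadable intersections are now reported as invalid.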

Similar Messages

  • Log Issue in HFM data load

    Hi,
    I'm new to Oracle Data Integrator.
    I have an issue with the log file name. I'm loading data into Hyperion Financial Management through ODI. In the interface, when we select the IKM SQL to HFM Data, there is an option to enable a log file. I set it to true and gave the log file name as 'HFM_dataload.log'. After executing the interface, when I navigate to the log folder and view the log file, that file is blank; a new file, 'HFM_dataloadHFM6064992926974374087.log', is created instead, and the log details are written to it. Since I have to automate picking up the log file every day:
    * I need the log details to be written to the specified log name, i.e. 'HFM_dataload.log'.
    Also, I am not able to perform any action on the newly generated log file (copy it elsewhere or send it by mail), since I cannot predict the numbers appended to the specified log file name.
    Kindly help me to overcome this issue.
    Thanks in advance.
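    One way to work around the unpredictable suffix (a sketch, not an ODI feature; the folder path is an assumption) is to pick the most recently modified file matching the pattern and copy it to the fixed name your automation expects:

    # Sketch: copy the newest HFM_dataload*.log to the fixed name the
    # downstream automation expects. The folder path is an assumption.
    import glob
    import os
    import shutil

    LOG_DIR = "/odi/logs"  # hypothetical folder; adjust to your environment

    candidates = glob.glob(os.path.join(LOG_DIR, "HFM_dataload*.log"))
    if candidates:
        newest = max(candidates, key=os.path.getmtime)
        shutil.copyfile(newest, os.path.join(LOG_DIR, "HFM_dataload.log"))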

    Thanks a lot for the idea.
    I have a question about HFM data loads: in the ODI Operator, the step shows a warning symbol when a few records get rejected, instead of an error. Is it possible to make the step fail if one or more records are rejected?
    I have experience with Essbase data loads, where the step fails once a specified number of rejected records is reached.
    Please guide me if I am missing something.
    Regards,
    PrakashV
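
    On the second question, one common workaround (a sketch, not a built-in IKM option) is an extra package step that scans the adapter log after the load and raises an error when any record was rejected; the 'Line: ..., Error: ...' format is the one visible in the load logs earlier on this page:

    # Sketch of a post-load check (for example, a Jython procedure step in
    # the ODI package): fail the step if the HFM log shows rejected records.
    LOG_FILE = "HFM_dataload.log"  # assumption: matches the IKM log option

    fh = open(LOG_FILE)
    try:
        rejects = [line for line in fh if ", Error:" in line]
    finally:
        fh.close()

    if rejects:
        # An uncaught exception fails the step, and with it the package.
        raise RuntimeError("%d record(s) rejected during HFM load" % len(rejects))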

  • Data Load Issue "Request is in obsolete version of DataSource"

    Hello,
    I am getting a very strange data load issue in production. I am able to load the data up to the PSA, but when I run the DTP to load the data into 0EMPLOYEE (a master data object), I get the message below:
    Request REQU_1IGEUD6M8EZH8V65JTENZGQHD not extracted; request is in obsolete version of DataSource
    The request REQU_1IGEUD6M8EZH8V65JTENZGQHD was loaded into the PSA table when the DataSource had a different structure to the current one. Incompatible changes have been made to the DataSource since then and the request cannot be extracted with the DTP anymore.
    I have taken the following actions:
    1. Replicated the DataSource
    2. Deleted all requests from the PSA
    3. Activated the DataSource using RSDS_DATASOURCE_ACTIVATE_ALL
    4. Re-transported the DataSource, transformation, and DTP
    I am still getting the same issue.
    If you have any ideas, please reply ASAP.
    Samit

    Hi
    Generate your DataSource in R/3, then replicate it and activate the transfer rules.
    Regards,
    Chandu.

  • TileList data load issue

    I am having an issue where the data that drives a TileList works correctly when the TileList is not loaded on the first page of the application. When it is put on a second page in a ViewStack, the TileList displays correctly when you navigate to it. When the TileList is placed on the first page of the application, I get the correct number of items in the TileList, but the information the item renderer is supposed to display (a picture, caption, and title) does not appear. The strange thing is that a Tree populates correctly in the same situation. Here is the sequence of events:
    // get_tree fetches the data for the tree; get_groups fetches the data for the tilelist
    creationComplete="get_tree.send();get_groups.send();"
    <mx:HTTPService showBusyCursor="true" id="get_groups"
        url="[some xml doc]" resultFormat="e4x"/>
    <mx:XMLListCollection id="myXMlist"
        source="{get_groups.lastResult.groups}"/>
    <mx:HTTPService showBusyCursor="true" id="get_tree"
        url="[some xml doc]" resultFormat="e4x"/>
    <mx:XMLListCollection id="myTreeXMlist"
        source="{get_tree.lastResult.groups}"/>
    The data providers of the TileList and the Tree are then set accordingly. I tried moving the data calls from creationComplete to the initialize event, thinking that it would run earlier in the process and be done by the time final creation completed, but that didn't help either. I am at a loss as to why the Tree works fine no matter where I put it, but the TileList does not. It is almost as if the Tree and the TileList will sit and wait for the data, but the item renderer in the TileList will not. That would explain why clicking on the TileList still produces the correct sequence of events while the visual component is just not rendering right. Anyone have any ideas?

    Ok, so if ASO value is wrong, then its a data load issue and no point messing around with the BSO app. You are loading two transactions to the exact same intersection. Make sure your data load is set to aggregate values and not overwrite.

  • Demantra Data Load Issue

    I am new to Demantra and have installed a standalone Demantra system on our server. In order to load data, I created a new model, defined item and location levels, and then clicked 'Build Model'. The data is loaded into 3 custom tables created by me. After creating the model, I cannot log in to 'Collaborator Workbench'; it gives the message 'There are system errors. Please contact your System Administrator'. Can anyone please tell me what I am doing wrong and how to resolve the issue?
    Thanks


  • HFM Data Load Error in ODI

    Hi,
    I'm loading data into HFM from a flat file. When the interface is executed, only some of the data gets loaded. When I checked the log for errors, I found the following message:
    'Line: 56, Error: Invalid cell for Period Apr'
    Then I found that it is an invalid intersection in HFM that I am trying to load.
    In FDM there is an option to validate invalid intersections during a data load.
    I would like to know how to do the same in ODI to overcome this kind of error, i.e. is there any option in ODI to ignore such errors?
    Kindly help me.
    Thanks in advance.

    Hi,
    I think even if the metadata exists, there might still be an issue with HFM forbidden cells. HFM rules determine which intersections are editable/loadable and which are not. Please check with your HFM admin regarding the forbidden-cell rules, or otherwise change the properties of the Custom dimensions so that they accept data at all intersections.
    Thanks,
    Debasis

  • HFM Data Load Hangs?

    Hello,
    We are trying to load data to HFM from MS SQL.
    1. Successfully reverse engineered both SQL and HFM
    2. User for SQL has DBO access
    3. Successfully mapped source and target
    4. In the flow, we are using a dedicated SQL staging area
    5. We are using LKM SQL to MSSQL and IKM SQL to HFM Data
    6. In the IKM, we are using the default settings for all properties
    7. When we execute, the interface hangs on the 5th step:
    1. DROP WORK TABLE (Success)
    2. CREATE WORK TABLE (Success)
    3. LOAD DATA (Success)
    4. SQL TO HFM PREPARE TO LOADING (Success)
    5. SQL TO HFM LOAD DATA TO HFM (RUNNING FOR 14+ hrs)
    To make sure it wasn't a large-volume issue (just 100k rows), we even created a filter to pull a single entity with very few records; the process still doesn't complete even after 12+ hours...
    We are using ODI 10.1.3.6.0; are there any known issues with IKM SQL TO HFM Data in this version?
    Please suggest.
    Appreciate your responses.
    Thanks

    Hello,
    Thanks for the response.
    Looked into the logs, and there is nothing that points to why it's hanging...
    Here's the log; it says connection to the source, connection to HFM, options, etc. are all good...
    2013-05-31 12:39:10,107 INFO [DwgCmdExecutionThread:null:0]: Load Options validated.
    2013-05-31 12:39:10,302 INFO [DwgCmdExecutionThread:null:0]: Source data retrieved.
    2013-05-31 12:39:10,303 INFO [DwgCmdExecutionThread:null:0]: Pre-load tasks completed.
    2013-05-31 12:49:30,396 INFO [DwgCmdExecutionThread:odi_agent:2]: ODI Hyperion Financial Management Adapter Version 9.3.1
    2013-05-31 12:49:30,398 INFO [DwgCmdExecutionThread:odi_agent:2]: Load task initialized.
    2013-05-31 12:49:30,407 INFO [DwgCmdExecutionThread:odi_agent:2]: Connecting to Financial Management application [XXXXX] on [XXXXX] using user-name [XXXXX].
    2013-05-31 12:49:30,923 INFO [DwgCmdExecutionThread:odi_agent:2]: Connected to Financial Management application.
    2013-05-31 12:49:30,923 INFO [DwgCmdExecutionThread:odi_agent:2]: HFM Version: 11.1.2.1.0.
    2013-05-31 12:49:30,923 INFO [DwgCmdExecutionThread:odi_agent:2]: Options for the Financial Management load task are:
    <Options>
    <Option name=LOG_FILE_NAME value=D:\LOGS_ERRORS\SQL_HFM_LOG.LOG/>
    <Option name=IMPORT_MODE value=Merge/>
    <Option name=CONSOLIDATE_ONLY value=false/>
    <Option name=CONSOLIDATE_PARAMETERS value=""/>
    <Option name=LOG_ENABLED value=true/>
    <Option name=ACCUMULATE_WITHIN_FILE value=false/>
    <Option name=CONSOLIDATE_AFTER_LOAD value=false/>
    <Option name=FILE_CONTAINS_SHARE_DATA value=false/>
    So, no clear info on why it's hanging on the load step...
    Any suggestions, experts? Is it because the adapter version is 9.3.1 while the HFM version is 11.1.2.1.0?
    Thanks for your inputs!

  • Error in 0EMPLOYEE Master Data Load Issue

    Hi,
    We have 0EMPLOYEE master data. Due to new development changes related to 0EMPLOYEE, we scheduled 2 new InfoPackages with personnel-number ranges. While creating the InfoPackages, we forgot to maintain the time interval 01.01.1900 to 31.12.9999; instead, the default range 24.04.2009 to 31.12.9999 was selected. Because of this selection in the InfoPackage, the valid-from date in the employee master data was changed to 24.04.2009 for all employees after the data load.
    Even after I corrected the selection and reloaded the data, the valid-from dates are not being corrected.
    Can you please advise how we can fix this issue ASAP, as it is a production issue?
    Thanks!
    Best regards,
    Venkata

    > Even after I corrected the selection and reloaded the data, the valid-from dates are not being corrected.
    For this you may have only one option: delete the 0EMPLOYEE master data and reload it. For this you also need to delete the dependent transaction data.
    Cheers,
    Sree

  • Master data loading issue

    Hi gurus,
    Presently I am working on BI 7.0, and I have a small issue regarding master data loading.
    I have a generic DataSource for master data loading, and I have to fetch this data to the BW side. I always have to do a full data load to the master data object. The first time, I scheduled the InfoPackage and ran the DTP to load data to the master data object; no issues, the data loaded successfully. But whenever I run the InfoPackage a second time and run the DTP, I get an error saying 'duplicate records'.
    How can I handle this?
    Best Regards
    Prasad

    Hi Prasad,
    The following is happening in your case:
    Loading 1st time:
    1. Data is loaded to the PSA through the InfoPackage. It is a full load.
    2. Data is loaded to the InfoObject through the DTP.
    Loading 2nd time:
    1. Data is again loaded to the PSA. It is a full load.
    2. At this point, the data in the PSA itself is duplicated. So when you run the DTP, it picks up the data of both requests that were loaded to the PSA, and hence you get the duplicate-record error.
    Please clear the PSA after the data is loaded to the InfoObject.
    Assign points if helpful.
    Regards,
    Tej Trivedi
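
    To make the mechanics concrete, here is a toy illustration (hypothetical records, not SAP code) of why the second run fails: two identical full requests accumulate in the PSA, so a DTP that reads the whole PSA sees every key twice.

    # Toy illustration: two identical full loads accumulate in the PSA,
    # so a DTP reading the whole PSA sees every master-data key twice.
    full_load = [("CUST01", "A"), ("CUST02", "B")]  # hypothetical records

    psa = []
    psa.extend(full_load)  # request 1: first full load
    psa.extend(full_load)  # request 2: second full load, same data

    keys = [key for key, _ in psa]
    duplicates = sorted(set(k for k in keys if keys.count(k) > 1))
    print(duplicates)  # ['CUST01', 'CUST02'] -> the duplicate-record error

    # Clearing the PSA between loads, as advised above, removes the overlap.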

  • Data Loading issues in PSA and Infocube

    Hi team,
    I am loading data into the PSA and from there into an InfoCube via a DTP.
    When I load data into the PSA, all 6000 records are transferred and the process completes successfully.
    When I execute the DTP to load data into the InfoCube, the data load process completes successfully, but in the 'Manage' tab I see:
    "Transferred 6000 || Added Records 50"
    I am not able to understand why only 50 records were loaded into the InfoCube, and if some records were rejected, where I can find them and the reason for the rejection.
    Kindly assist me in understanding this issue.
    Regards
    bs

    hi,
    The records would have been aggregated based on common values.
    If in the source you had:
    custname  xnumber  kf1
    R         56       100
    R         53       200
    S         54       200
    then after aggregation you will have:
    custname  kf1
    R         300
    S         200
    if you do not have xnumber in your cube.
    Hope it is clear now.
    Regards,
    Rathy
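
    The same collapse can be reproduced with a quick groupby; here is a sketch in pandas using the toy values above:

    # Sketch of the aggregation described above, using the same toy values.
    import pandas as pd

    src = pd.DataFrame({
        "custname": ["R", "R", "S"],
        "xnumber": [56, 53, 54],
        "kf1": [100, 200, 200],
    })

    # The cube has no xnumber, so rows aggregate on the remaining
    # characteristic: 3 transferred records become 2 added records.
    cube = src.groupby("custname", as_index=False)["kf1"].sum()
    print(cube)
    #   custname  kf1
    # 0        R  300
    # 1        S  200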

  • DSO data Load issue

    Hi all,
    I have an issue with a DSO data load. While loading, the data comes into the PSA perfectly (238 records), but when I trigger the DTP, I get only 6 records.
    Can anyone please suggest what might be wrong?
    Thanks,
    Gayatri.

    Hi Gayatri,
    if you have already loaded some data to the DSO and are now trying to do a delta, it is possible that it is picking up only the delta data,
    (or)
    you may have start/end routines or rule routines written to delete records based on some conditions,
    (or)
    it depends on the key fields you have selected in the DSO. If a key field you selected has repeated values, the data is aggregated while loading into the DSO: if you have 10 rows for a key-field value of, say, 101, the DSO will hold only one row with value 101 (10 rows becoming 1 row), with the key figure either summed or overwritten depending on what you selected in the rule details for that key figure (right-click the key-figure mapping > Rule Details > there you can see whether it is Overwrite/Summation).
    Also, as mentioned in the posts above, you can go to the DSO > Manage and check the number of rows transferred versus the number of rows added.
    Hope it is clear & helpful!
    Regards,
    Pavan

  • 0HR_PY_PP_2 - data load issue

    We have 0HR_PY_PP_2, which pulls deltas weekly. On the 29th of June the data load failed with the error 'posting runs are locked, try the transfer again', so we ran a repeat delta. The load was successful but brought 0 records, although the R/3 team said there was data. When the chain ran the next week, i.e. on the 6th of July, it got all the data, including the data that was supposed to be brought in by the previous week's load. Has anyone faced such an issue? We do not understand what the problem could be. Can someone please help?
    Thank you!

    You should have run a normal delta (not a repeat delta) instead, for the following reason:
    When you run the extractor while posting data is still locked (by the actual posting process), the extractor stops extraction immediately, before it calculates any record and puts it into the delta queue. A "repeat delta" run, however, is a process that simply takes whatever it finds in the delta queue for the specific extractor as the result of the last delta run and resends all those records to BW. It does not even invoke the application-specific extraction logic again. In this specific situation, that means 0 records are sent again and the timestamp is not touched.
    In other words, the correct procedure after detecting a lock on the posting data is to run a regular delta again once posting has finished.
    regards,
    Colin Moloney
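
    A toy model of that behaviour (hypothetical names, not SAP code): a repeat delta only resends whatever is already in the delta queue, while a normal delta invokes the extraction logic again.

    # Toy model of the delta-queue behaviour described above.
    class DeltaQueue:
        def __init__(self, extractor):
            self.extractor = extractor  # callable returning new records
            self.last_batch = []        # what the previous delta run sent

        def delta(self):
            self.last_batch = self.extractor()  # extraction runs again
            return self.last_batch

        def repeat_delta(self):
            return self.last_batch              # resend only, no extraction

    # The posting lock made the extractor abort before writing anything,
    # so the queue held 0 records and the repeat delta also sent 0 records.
    queue = DeltaQueue(extractor=lambda: ["rec1", "rec2"])
    print(queue.repeat_delta())  # [] -> the empty load seen on 29th June
    print(queue.delta())         # ['rec1', 'rec2'] -> a normal delta works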

  • HFM DATA LOAD WITH ODI HANGS LONG TIME

    Hi all,
    There's a very strange problem when I load data from MS SQL Server to HFM with ODI. Specifically, there are 2 interfaces to 2 applications on the same HFM server, with data volumes of about 1,300,000 and 650,000 rows respectively.
    The strange thing is that when I execute each interface individually, it sometimes works well. However, when I execute the package that contains the 2 interfaces, the larger one almost always hangs for about 10+ hours, whether I use an agent or not.
    After some research, it seems that the session hangs because it cannot get return info from HFM even though the data load has already completed. I found some similar problems on OTN, like a 64-bit driver and JRE compatibility error, or a deadlock on a table, but they differ from this one. So, can anyone help on this? Much appreciated in advance!
    BTW, ODI and HFM are on the same server, but the ODI repository and the source of the interface are on another MS SQL data server. The versions are as below:
    HFM 11.1.1.3.0.956
    ODI 11.1.1.6.0
    Windows Server 2003 x86
    MS SQL Server 2008 R2
    Windows Server 2008 x64
    Regards,
    Steve

    Hi SH,
    The source is MS SQL Server 2008 R2, the staging area is on the source side, and the target is HFM 11.1.1.3.0.956 based on SQL Server.
    The KM is the standard 'IKM SQL to Hyperion Financial Management Data'.
    There is no transformation logic, only a filter to select data for the current year.
    Besides, I have done some performance tuning as the guide suggests:
    REM #
    REM # Java virtual machine
    REM #
    set ODI_JAVA_HOME=D:\oracle\Java\jdk1.6.0_21
    REM #
    REM # Other Parameters
    REM #
    set ODI_INIT_HEAP=512m
    set ODI_MAX_HEAP=1024m
    set ODI_JMX_PROTOCOL=rmi
    In Regedit:
    EnableServerLocking: 1
    MaxDataCacheSizeinMB :1000
    MaxNumDataRecordsInRAM: 2100000
    MultiServerMaxSyncDelayForApplicationChanges:300
    MultiServerMaxSyncDelayForDataChanges:300
    After some research, I think the problem is located in the HFM-ODI adapter or on the HFM side (maybe HFM cannot send a completion response back to ODI). Do you have any idea? Thanks in advance.

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
    We created a generic DataSource on R/3 and replicated it to our BI system, then created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
    After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" also appears under this ODS in the InfoProvider view. In the Data Transfer Process, in the Extraction tab, we pick 'Full' in the Extraction Mode field; in the Execute tab, there is an 'Execute' button. Clicking this button (note: so far we have not created an InfoPackage yet) appears to conduct the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R/3 table).
    Then we tried to create an InfoPackage. In the Processing tab, we find the 'Only PSA' radio button is checked and all others, like 'PSA and then into Data Targets (Package by Package)', are dimmed! In the Data Target tab, the ODS can't be selected as a target! There are also some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under the column 'DTP(S) are active and load to this target' there is an inactive icon, which is weird since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records are brought in, but since the 'Only PSA' radio button is checked in the Processing tab with all others dimmed, no data goes to this ODS! Why, in BI 7.0, can the 'Only PSA' radio button be checked with all others dimmed?
    There are many new features in BI 7.0! Anyone's idea/experience on how to load data in BI 7.0 is greatly appreciated!

    You don't have to select anything.
    Once the data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the last load of the PSA.
    Go through the links below for lucid explanations:
    Infopackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    Pre-requisite -
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it Helps
    Chetan
    @CP..

  • DATA LOAD ISSUE /NO ROLL UP MEMORY

    Hello Team,
    I have a master data load failure for FIS_BELNR. The problem, I think, is that every time it tries to load, it runs out of internal table space in the backend. I really don't know what that means. This load has been failing every day because of the same problem.
    I have attached all the screenshots, including the ABAP short dump analysis. The dump explains in detail why it is failing and what exactly the problem is, but how do I fix it?
    If any more details are needed, please let me know.
    ABAP runtime error: TSV_TNEW_BLOCKS_NO_ROLL_MEMORY
    Occurred on: 25.10.2007 at 02:53:55
    >> Short dump has not been completely stored. It is too big.
    No roll storage space of length 2097424 available for internal storage.
    What happened?
    Each transaction requires some main memory space to process application data. If the operating system cannot provide any more space, the transaction is terminated.
    What can you do?
    Try to find out (e.g. by targeted data selection) whether the transaction will run with less main memory.
    If there is a temporary bottleneck, execute the transaction again.
    If the error persists, ask your system administrator to check the following profile parameters:
    o ztta/roll_area (1.000.000 - 15.000.000)
      Classic roll area per user and internal mode; the usual amount of roll area per user and internal mode.
    o ztta/roll_extension (10.000.000 - 500.000.000)
      Amount of memory per user in extended memory (EM).
    o abap/heap_area_total (100.000.000 - 1.500.000.000)
      Amount of memory (malloc) for all users of an application server. If several background processes are running on one server, temporary bottlenecks may occur. Of course, the amount of memory (in bytes) must also be available on the machine (main memory or file system swap).
      Caution: the operating system must be set up so that there is also enough memory for each process. Usually, the maximum address space is too small. Ask your hardware manufacturer or your competence center about this; in this case, consult your hardware vendor.
    o abap/heap_area_dia (10.000.000 - 1.000.000.000)
      Restriction of memory allocated to the heap with malloc for each dialog process.
    Parameters for background processes:
    Error analysis
    The internal table "IT_62" could not be enlarged further.
    You attempted to create a block table of length 2097424 for the internal table "IT_62". This happens whenever the OCCURS area of the internal table is exceeded. The requested storage space was not available in the roll area.
    The amount of memory requested is no longer available.
    How to correct the error
    Please try to decide by analysis whether this request is reasonable or whether there is a program error. You should pay particular attention to the internal table entries listed below.
    The amount of storage space (in bytes) filled at termination time was:
    Roll area...................... 2595024
    Extended memory (EM)........... 2001898416
    Assigned memory (HEAP)......... 1886409776
    Short area..................... 16639
    Paging area.................... 24576
    Maximum address space.......... "-1"
    If the error occurred in a non-modified SAP program, you may be able to find a solution in the SAP note system. If you have access to the note system yourself, use the following search criteria:
    "TSV_TNEW_BLOCKS_NO_ROLL_MEMORY"
    "SAPLZ_BW_EXTRACTORS " or "LZ_BW_EXTRACTORSU24 "
    "Z_BW_AP_GL_BELNR"
    If you cannot solve the problem yourself, please send the following documents to SAP:
    1. A hard copy print describing the problem. To obtain this, select the "Print" function on the current screen.
    Thanks

    Hello,
    The memory of your internal table went beyond the system-configured threshold.
    Decrease your package size or extend the mentioned parameters (a Basis task):
    ztta/roll_area (1.000.000 - 15.000.000)
    Classic roll area per user and internal mode
    usual amount of roll area per user and internal mode
    ztta/roll_extension (10.000.000 - 500.000.000)
    Amount of memory per user in extended memory (EM)
    abap/heap_area_total (100.000.000 - 1.500.000.000)
    Regards, Patrick Rieken
