Data Load PSA to IO (DTP) Dump: EXPORT_TOO_MUCH_DATA

Hi Gurus,
I'm loading data from PSA to the InfoObject 0BPARTNER. I have around 5 million entries.
During the load, the control job terminates with the following short dump:
EXPORT_TOO_MUCH_DATA
1. Data must be distributed into portions of 2 GB.
2. Three possible solutions:
    - increase the sequence counter (field SRTF2) to INT4, or
    - export less data, or
    - distribute the data across several IDs.
If the error occurs in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"EXPORT_TOO_MUCH_DATA" " "
"CL_RSBK_DATA_SEGMENT==========CP" or "CL_RSBK_DATA_SEGMENT==========CM00V"
"GET_DATA_FOR_EXPORT"
This is not the first time I have done such a large load.
The field SRTF2 is already of type INT4.
BI version: 7.01, SP06.
I found a lot of OSS Notes for monitoring jobs, industry solutions, the BI change run… but nothing relating to the data loading process.
Has anyone encountered this problem?
Thanks in advance
Martin

Hi Martin,
There is a series of notes that may be applicable here.
However, if you have semantic grouping enabled in the DTP, this may be a data-driven issue.
The system will try to put all records into one package in accordance with the semantic key.
If the key is too generic, too many records can end up in one data package; for example, if the grouping fields have only a few distinct values across your 5 million 0BPARTNER records, most of them are forced into a single package, which can then exceed the 2 GB export limit.
Please choose additional (more selective) fields for semantic grouping, or deselect all fields if the grouping is not necessary at all.
1409851 - Problems with package size in the case of DTPs with grouping.
Hope this helps.
Regards,
Mani

Similar Messages

  • Process Chain - How to make decision based on data in PSA and execute DTP

    We need to create a process in a process chain that executes a DTP based on data in the PSA.
    Say we have two DSOs, A and B, both getting data from the same DataSource.
    In the process chain, we need to read one of the date fields in the PSA and decide which DTP to execute.
    Say, if the date field contains year 2000, load to DSO A; if the year is 2001, load to DSO B.
    Please suggest a solution. ...

    Hi,
    Try these steps (a rough sketch follows below).
    1) Develop an ABAP process type:
    an ABAP report reads the PSA data using a function module or a direct read,
    then triggers events based on the year
                           event1 -> for even-year data
                           event2 -> for odd-year data
    2) Develop local process chains for each DTP and schedule them on the events mentioned above.
    Do reply with comments.
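
    To illustrate step 1, here is a minimal ABAP sketch of such a report. All object names are placeholders, not taken from this thread: /BIC/B0001234000 stands for your generated PSA table, CALDAY for the date field, and ZEVEN_YEAR / ZODD_YEAR for the background events your local process chains wait on.

    REPORT zpsa_year_decision.

    DATA: lv_calday TYPE d,
          lv_year   TYPE n LENGTH 4,
          lv_rest   TYPE i.

    " Read the date field from the generated PSA table of the DataSource.
    " In a real report you would restrict this to the latest request.
    SELECT SINGLE calday FROM /bic/b0001234000 INTO lv_calday.

    " First four characters of the date (YYYYMMDD) are the year.
    lv_year = lv_calday+0(4).
    lv_rest = lv_year MOD 2.

    " Raise the event that the corresponding local process chain is scheduled on.
    IF lv_rest = 0.
      CALL FUNCTION 'BP_EVENT_RAISE'
        EXPORTING
          eventid = 'ZEVEN_YEAR'.
    ELSE.
      CALL FUNCTION 'BP_EVENT_RAISE'
        EXPORTING
          eventid = 'ZODD_YEAR'.
    ENDIF.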

  • Data load failing with a short-dump

    Hello All,
    There is a data load failing with the following short-dump:
    Runtime Errors         UNCAUGHT_EXCEPTION
    Except.                    CX_FOEV_ERROR_IN_FUNCTION
    Full text:
    "UNCAUGHT_EXCEPTION" CX_FOEV_ERROR_IN_FUNCTIONC
    "CL_RSAR_FUNCTION==============CP" or "CL_RSAR_FUNCTION==============CM004"
    "DATE_WEEK"
    It seems as if it is failing in a function module - please advise.
    Regards,
    Keith Kibuuka

    Did you read OSS [Note 872193 - Error in the formula compiler|https://service.sap.com/sap/support/notes/872193] or OSS [Note 855424 - Runtime error UNCAUGHT_EXCEPTION during upload|https://service.sap.com/sap/support/notes/855424]?
    Regards

  • Data Load (Info package and DTP) performance

    Hi,
    We are planning to perform an initialization and expect around 30 million records.
    To improve the data load performance, I am planning to increase the data package size from the default value to double the default value.
    Will that improve the performance?

    Adjusting the InfoPackage size would be helpful...
    But don't just keep increasing the number; it all depends on your extract structure size and the source system's memory settings.
    I would suggest doing some trials in the DEV/quality systems with enough data.
    I remember tuning this in production itself for the CRM 0BPARTNER extractor: we actually reduced the number of records to be extracted per package to improve the performance, and we did see the impact.
    First calculate your current package size from the number of records extracted per package and your extract structure size, and then experiment (a back-of-the-envelope example follows below); it will surely help unless you have excessive ABAP code in user exits.
    Let us know if you need further help on this.
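
    For illustration, a minimal sketch of that calculation; the structure width and record count below are invented numbers, purely for illustration, not values from this thread:

    DATA: lv_record_width  TYPE i VALUE 1000,   " bytes per record in the extract structure (assumption)
          lv_recs_per_pack TYPE i VALUE 50000,  " records per data package (InfoPackage/DTP setting, assumption)
          lv_pack_mb       TYPE p DECIMALS 1.

    " Approximate memory footprint of one data package in MB.
    lv_pack_mb = lv_record_width * lv_recs_per_pack / ( 1024 * 1024 ).
    WRITE: / 'Approx. package size (MB):', lv_pack_mb.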

  • " Data Load error issues in  DTP "

    Hi Friends,
    When I load data from R/3 to BI, the data is loaded into the PSA.
    After that, when I run the DTP, the data does not arrive in the MultiProvider.
    Please let me know the solution so that I can run the DTP.

    Hi,
    A DTP loads the data into data targets.
    So, after checking the PSA, check the targeted data target (InfoCube or DSO).
    If the data is there, then check the connection between the data targets and the MultiProvider, since a MultiProvider only reads from its underlying providers.
    Otherwise, please give a more detailed and clearer picture.
    Hope you got the idea.

  • DTP Error for Data Loads

    Hello,
    I'm using BI 7.x and we have a custom DataSource and a custom transformation/DTP process. My transformation brings the data into the PSA, and then the DTP should load the PSA data into the InfoCube... instead, my DTP is going against the DataSource itself, thereby bringing in double values.
    Please let me know if I'm doing anything wrong here.
    Thanks
    Sameer

    Sameer,
    I'm a bit confused by your explanation.  A transformation doesn't bring data into the PSA (DataSource); an InfoPackage does this.  You then have a transformation between the DataSource (PSA) and an InfoProvider, and you use a DTP to move the data.  Assuming all of this is true in your example and you don't have two identical data packages in the PSA, the issue may be that your DataSource is a 3.x one, not 7.0.  If so, your InfoPackage can load the data to the target and the PSA, just like in BW 3.x.  If that is the case, then when you run the DTP you will double your data.
    Regards,
    Dae

  • Master data loading issue

    Hi gurus,
    Presently I am working on BI 7.0. I have a small issue regarding master data loading.
    I have a generic DataSource for master data loading, and I have to fetch this data to the BW side. I always have to do a full data load to the master data object. The first time, I scheduled the InfoPackage and ran the DTP to load data to the master data object: no issues, the data got loaded successfully. Whenever I run the InfoPackage a second time and run the DTP, I get an error saying that there are duplicate records.
    How can I handle this?
    Best Regards
    Prasad

    Hi Prasad,
    The following is happening in your case:
    Loading 1st time:
    1. Data is loaded to the PSA through the InfoPackage. It is a full load.
    2. Data is loaded to the InfoObject through the DTP.
    Loading 2nd time:
    1. Data is again loaded to the PSA. It is a full load.
    2. At this point, the data in the PSA itself is duplicated. So when you run the DTP, it picks up the data of both requests that were loaded to the PSA, and hence you get the duplicate record error.
    Please clear the PSA after the data has been loaded to the InfoObject.
    Assign points if helpful.
    Regards,
    Tej Trivedi

  • Data Load Statistics - Monitoring

    Hello All
    I have a quick question on how to check the number of records loaded via InfoPackage (PSA) --> DTP (DSO) --> DTP (Cube).
    1) I can check each step individually, but is there any automated program / transaction / table that can tell me the number of records loaded into PSA --> DSO --> Cube for any particular process chain?
    2) If I mistakenly deleted a request from the Cube/DSO/PSA, how would I retrieve it?
    Thanks in advance

    Hi,
    1) I can check each step individually, but is there any automated program / transaction / table that can tell me the number of records loaded into PSA --> DSO --> Cube for any particular process chain?
    --> Table RSMONICTAB -- monitor table for data targets (InfoCube/ODS), with request-related information.
    2) If I mistakenly deleted a request from the Cube/DSO/PSA, how would I retrieve it?
    --> If the data flow is PSA > DSO > Cube, then:
    If data is lost in the Cube, it can be reloaded from the DSO.
    If data is lost in the DSO and is still available in the PSA, it can be loaded/reconstructed from the PSA.
    If the data is not available in the PSA, that is not possible; you need to reload from the source.
    Hope it Helps
    Srini

  • Error while data loading in real time cube

    Hi experts,
    I have a problem. I am loading data from a flat file. The data loads correctly up to the DSO, but when I try to load it into the cube it gives an error.
    The cube is a real-time cube for planning. I have changed the load behaviour to allow data loading, but the DTP still gives an error.
    It shows the error "error while extracting from DataStore" along with an RSBK 224 error and an RSAR 051 error.

    What was the resolution to this issue?  We are having the same issue, only with an external system (not a flat file).  We get the RSAR 051 error message with a return code of 238, as if it is not even getting to the RFC connection (DI_SOURCE).  We have been facing this issue for a while and have even opened a message with SAP.

  • Alert Message When Data Load Fails

    Hi Friends,
    I have three process chains (P1, P2, P3), all for transactional data loading (PSA-ODS-Cube). Whenever a data load fails, I would like a message sent to my company mail ID. Is there any functionality to do this?
    I am not very familiar with RSPCM (Process Chain Monitor). If anyone knows it, please tell me how far it supports my requirement.
    Thanks in advance
    sam

    Hi Sam,
    Welcome to SDN ...
    You can add a 'Create Message' step by right-clicking the process type; this leads you to the mail-sending settings.
    See the thread
    Getting Failed process chain info through email
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/b0afcd90-0201-0010-b297-9184845346ca
    You may need to set up SAPconnect for email; take a look at
    http://help.sap.com/saphelp_webas620/helpdata/en/2b/d925bf4b8a11d1894c0000e8323c4f/frameset.htm
    http://help.sap.com/saphelp_webas620/helpdata/en/af/73563c1e734f0fe10000000a114084/content.htm
    hope this helps.

  • Error while loading data from PSA to DSO using DTP

    Hi,
    I have a unique alphanumeric identifier of type "Char", length "32". When I load the data from PSA to DSO using a DTP, I get the following error message:
    "An error occurred while executing a transformation rule:
    The exact error message is
    Overflow converting from ' '
    The error was triggered at the following point in the program:
    GP4JJHUI6HD7NYAK6MVCDY4A01V 425
    System response
    Processing the data record has been terminated"
    Any idea how I can resolve this....
    Thanks

    Hi,
    First check whether the data contains any special characters. If not,
    check the DataSource: on the Fields tab, check the format of the particular field (internal/external) and make sure the internal format is chosen; also review any conversion routine on the field.
    You can also use semantic groups in the DTP (a field-routine sketch for catching bad values follows below).
    Try it.
    Thank you,
    lokeeshM
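
    If the overflow comes from converting the CHAR32 identifier into a numeric target field, a defensive field routine in the transformation can catch bad values before the conversion dumps. A minimal sketch, assuming a hypothetical source field /BIC/ZIDENT and the choice to skip offending records (neither is taken from this thread); SOURCE_FIELDS and RESULT are the standard BI 7.0 routine parameters:

    " Field routine body in the transformation rule.
    IF source_fields-/bic/zident IS INITIAL.
      " Blank values would trigger "Overflow converting from ' '" - skip them.
      RAISE EXCEPTION TYPE cx_rsrout_skip_record.
    ELSEIF source_fields-/bic/zident CO '0123456789 '.
      " Value contains only digits and trailing blanks - safe to convert.
      result = source_fields-/bic/zident.
    ELSE.
      " Skip records whose identifier would overflow the numeric target.
      RAISE EXCEPTION TYPE cx_rsrout_skip_record.
    ENDIF.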

  • Error while loading data from PSA to Infoobject through DTP(URGENT)

    Hi-
    I am running into an issue while loading the data from PSA to Infoobject.
    It says
    Data Package 1 : error during processing.
    Updating Attributes for Infoobject
    Error in substep
    Process Terminated
    Please let me know your solution, as this is very urgent...

    Data Package 1 : error during processing.
    This error can occur when your flat file is open at the time of scheduling. Please close your flat file; it will then upload correctly.
    Let me know if you have any further questions.

  • Master Data load via DTP (Updating attribute section taking long time)

    Hi all,
    I am loading to a Z InfoObject. It is a master data load for attributes. Surprisingly, I find that the PSA pulls records very fast (2 minutes), but the DTP which updates the InfoObject takes a lot of time; it runs into hours.
    When observing the DTP execution monitor, which shows the breakdown of time between extraction, filter, transformation and update of attributes,
    I could see that the last step, "update of attributes for InfoObject", is taking most of the time.
    The master data InfoObject also has two InfoObjects compounded.
    In the transformation they are also mapped.
    The number of parallel processes for the DTP was set to 3 in our system,
    with job class "C".
    Can anyone think of what the reason could be?

    Hi,
    Check transaction ST22 for any short dumps while loading this master data; a short dump has probably occurred.
    There is also a chance that you are trying to load some invalid data (like a '!' character as the first character in a field) into the master data.
    Regards,
    Yogesh.

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
    We created a generic DataSource on R/3 and replicated it to our BI system, created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
    After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" also appears under this ODS in the InfoProvider view.  In the Data Transfer Process, on the Extraction tab, we pick 'Full' in the Extraction Mode field; on the Execute tab there is an 'Execute' button. Clicking this button (note: so far we have not created an InfoPackage yet) appears to run the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R/3 table).
    Then we tried to create an InfoPackage. On the Processing tab, we find the 'Only PSA' radio button is checked and all the others, like 'PSA and then into Data Targets (Package by Package)', are dimmed!  On the Data Target tab, the ODS can't be selected as a target!  There are also some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under another column, 'DTP(s) are active and load to this target', there is an inactive icon, which is weird since we have already activated the Data Transfer Process!  Anyway, we started the data load in the InfoPackage, and the monitor shows the records are brought in, but since the 'Only PSA' radio button is checked on the Processing tab with all the others dimmed, no data goes to this ODS!  Why, in BI 7.0, can only the 'Only PSA' radio button be checked, with all the others dimmed?
    There are many new features in BI 7.0!  Anyone's idea/experience on how to load data in BI 7.0 is greatly appreciated!

    You don't have to select anything.
    In BI 7.0 the InfoPackage loads only to the PSA. Once the data is in the PSA, the DTP gives you the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the latest, not yet transferred, requests from the PSA.
    Go through these links for lucid explanations:
    Infopackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    <b>Pre-requisite-</b>
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Multiple data loads in PSA with write optimized DSO objects

    Dear all,
    Could someone tell me how to deal with this situation?
    We are using write-optimized DSO objects in our staging area. These DSOs are filled with full loads from a BOB SAP environment.
    The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and solving. This also gives us the opportunity to see the differences between two data loads.
    For normal operation, the most recent package in the PSA should be loaded into these DSO objects (as in normal data staging in BW 3.5 and before).
    As far as we can see, it is not possible to load only the most recent data into the staging layer. This causes duplicate record errors when there are multiple data loads in the PSA.
    We already tried the "all new records" functionality in the DTP, but that only loads the oldest data package and does not process the new PSA loads.
    Does any of you have a solution for this?
    Thanks in advance.
    Harald

    Hi Ajax,
    I did think about this, but it is more of a workaround. Call me naive, but it should work as it did in BW 3.5!
    The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with changing PSA IDs: if you use the option to delete the content of a PSA table via the process chain, the step will fail once the DataSource is changed, because a new PSA table ID is generated.
    Regards,
    Harald
