PSA Data Load

Hi,
I am trying to load historical data from one of the Association models (Data Mining) into a DSO, but I am getting an error in the PSA stage: MISSING MESSAGES: SELECTION COMPLETED.
I am stuck now and not able to understand what to do.
Can anyone help me, please? Thanks in advance.

Hi,
Check the following threads:
Missing Messages in Extraction
Transfer (IDocs and TRFC): Missing messages or warnings
Also check, with the help of Basis, whether your source system is connected correctly.
Check the authorisation and connection test in SM59; if it fails, get it corrected.
Check the tRFCs in SM58.
Thanks and regards
Kiran

Similar Messages

  • PSA Data Load Error

    I am getting the following PSA error and am currently fixing it manually. Could someone explain to me how to handle this in a process chain?
    Failed step: PSA cleaning
    Cause: not able to truncate the PSA because of an inconsistency of data in a partition
    Steps taken: make the partition consistent using RSRV and restart the load
    Thanks,
    Priya

    Go to RSRV --> Tests in Transaction RSRV --> PSA Tables --> Consistency Between PSA Partitions and SAP Administration Information --> give the PSA table name here and press F8. Once you see errors, press the 'Correct errors' button and it should get fixed.
    You should be able to get the PSA table name from the process chain step variant.

  • Data Cleansing Before PSA data Load

    Hi
    In BI, one of my fields was off from the data type in the flat file, so I got an error while loading the flat file to PSA, and the whole file failed to load.
    Do you have an idea how to cleanse the data for the data type before loading to PSA, or what do we generally do?
    Thanks...

    Hi,
    you can write a program to check the data types in the application server file using an ABAP routine prior to loading, but even the ABAP routine will return an error if the data types mismatch :).
    In any case there will be a failure, so I think there might not be a solution in this case. A sketch of such a pre-check is below.
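    This is only a minimal sketch, assuming a comma-separated file on the application server whose third column must be numeric; the file path and the column position are illustrative, not from the thread:

      REPORT z_psa_precheck.

      " Illustrative path; point this at your real upload file
      CONSTANTS lc_file TYPE string VALUE '/usr/sap/trans/data/upload.csv'.

      DATA: lv_line   TYPE string,
            lt_fields TYPE TABLE OF string,
            lv_amount TYPE string,
            lv_row    TYPE i.

      " Read the flat file line by line from the application server
      OPEN DATASET lc_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
      IF sy-subrc <> 0.
        WRITE / 'File could not be opened.'.
        RETURN.
      ENDIF.

      DO.
        READ DATASET lc_file INTO lv_line.
        IF sy-subrc <> 0.
          EXIT.
        ENDIF.
        lv_row = lv_row + 1.
        SPLIT lv_line AT ',' INTO TABLE lt_fields.
        READ TABLE lt_fields INDEX 3 INTO lv_amount.
        " Flag rows whose amount column contains non-numeric characters
        IF sy-subrc = 0 AND lv_amount CN '0123456789.- '.
          WRITE: / 'Bad data type in row', lv_row, ':', lv_amount.
        ENDIF.
      ENDDO.

      CLOSE DATASET lc_file.

    Running such a check as a step before the InfoPackage at least tells you which rows will fail, even if, as noted above, it cannot prevent the failure itself.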
    regards,
    Arvind.

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
    We created a generic datasource on R3 and replicated it to our BI system, created an InfoSource, the Transformation from the datasource to the InfoSource, an ODS, the transformation from the InfoSource to the ODS. 
    After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" is also created under this ODS in the InfoProvider view. In the Data Transfer Process, in the Extraction tab, we pick 'Full' in the field Extraction Mode; in the Execute tab there is a button 'Execute'. Clicking this button (note: so far we have not created an InfoPackage yet) appears to conduct the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R3 table).
    Then we tried to create an InfoPackage. In the Processing tab, we find the 'Only PSA' radio button is checked and all the others, like 'PSA and then into Data Targets (Package by Package)', are dimmed! In the Data Target tab, the ODS cannot be selected as a target! Also, there are some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under another column, 'DTP(s) are active and load to this target', there is an inactive picture icon, which is weird since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records are brought in, but since the 'Only PSA' radio button is checked in the Processing tab with all the others dimmed, no data goes to this ODS! Why, in BI 7.0, can the 'Only PSA' radio button be checked with all the others dimmed?
    Many new features with BI 7.0! Anyone's idea/experience on how to load data in BI 7.0 is greatly appreciated!

    You don't have to select anything.
    Once the data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the latest load in the PSA.
    Go through these links for lucid explanations:
    Infopackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    Prerequisite:
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Multiple data loads in PSA with write optimized DSO objects

    Dear all,
    Could someone tell me how to deal with this situation?
    We are using write-optimized DSO objects in our staging area. These DSOs are filled with full loads from a BOB SAP environment.
    The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and solving. This also provides the opportunity to see the differences between two data loads.
    For normal operation, the most recent package in the PSA should be loaded into these DSO objects (as in normal data staging in BW 3.5 and before).
    As far as we can see, it is not possible to load only the most recent data into the staging layer. This causes duplicate record errors when there are multiple data loads in the PSA.
    We already tried the functionality in the DTP with 'all new records', but that only loads the oldest data package and does not process the new PSA loads.
    Does any of you have a solution for this?
    Thanks in advance.
    Harald

    Hi Ajax,
    I did think about this, but it is more of a workaround. Call me naive, but it should work as it did in BW 3.5!
    The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with changing PSA IDs after they have changed: if you use the option to delete the content of a PSA table via the process chain, it will fail when the datasource is changed, because of the newly generated PSA table ID.
    Regards,
    Harald

  • Data Load PSA to IO (DTP) Dump: EXPORT_TOO_MUCH_DATA

    Hi Gurus,
    I'm loading data from the PSA to the InfoObject (IO) 0BPARTNER. I have around 5 million entries.
    During the load, the control job terminates with the following short dump:
    EXPORT_TOO_MUCH_DATA
    1. Data must be distributed into portions of 2GB
    2. Three possible solutions:
        - either increase the sequence counter (field SRTF2) to INT4,
        - or export less data,
        - or distribute the data across several IDs.
    If the error occurs in a non-modified SAP program, you may be able to
    find an interim solution in an SAP Note.
    If you have access to SAP Notes, carry out a search with the following
    keywords:
    "EXPORT_TOO_MUCH_DATA" " "
    "CL_RSBK_DATA_SEGMENT==========CP" or "CL_RSBK_DATA_SEGMENT==========CM00V"
    "GET_DATA_FOR_EXPORT"
    This is not the first time I have done such a large load.
    The field SRTF2 is already of type INT4.
    BI version: 7.01, SP06.
    I found a lot of OSS Notes for monitoring jobs, industry solutions, the BI change run... nothing, however, related to the data loading process.
    Has anyone encountered this problem please?
    Thanks in advance
    Martin

    Hi Martin,
    There was a series of notes which may be applicable here.
    However, if you have semantic grouping enabled, it may be that this is a data-driven issue.
    The system will try to put all records into one package in accordance with the semantic key.
    If it is too generic, many records could end up in one data package.
    Please choose other (more) fields for the semantic grouping, or unselect all fields if the grouping is not necessary at all.
    1409851  - Problems with package size in the case of DTPs with grouping.
    Hope this helps.
    Regards,
    Mani

  • Data loading through PSA in 3.5

    Hi experts,
    I have always worked with 7.0, so when I create a data load chain for an InfoCube, I first delete the index, then execute the InfoPackage (with 'Only PSA'), execute the DTP, and create the index again.
    Now I am working on a 3.5 system and DTPs do not exist :S. How should I proceed to transfer the data after executing the InfoPackage? I would prefer doing it through the PSA if possible.
    Thank-you very much,
    Artur.

    Hi Artur,
    The difference between 3.5 and 7.0 is:
    - In 7.0, the InfoPackage brings the data only as far as the PSA, and then you have to execute a DTP to bring the data from the PSA to the data target. In 3.5, the InfoPackage brings the data directly to the data target, with or without the PSA, depending on the update type you select in the Processing tab of the InfoPackage.
    In 3.5 you have to follow the same steps: delete the index, load the data through the InfoPackage (with 'PSA and data target (package by package)'), and create the index.
    In 7.0, SAP broke one process into two parts, one up to the PSA (through the InfoPackage) and the other from the PSA to the data target (the DTP), for better control and to improve the ETL process.
    Regards,
    Kams

  • Data load error in PSA

    Hello gurus,
    I have run the data load for 2LIS_02_SCL at the PSA level,
    but when I went to RSMO
    I am seeing:
    Transfer (IDoc and TRFC): error occurred
    Request IDoc: Application document posted
    Info IDoc 1: Sent, not arrived; data passed to port OK
    How do I resolve this error?
    Thanks,
    points will be assigned as my gesture.
    Regards
    Rahul

    Hi,
    Check SM58 and BD87 for pending tRFCs/IDocs and execute them manually.
    Transact RFC error:
    tRFC error, status yellow/running for a long time (Transact RFC is enabled in the Status tab in RSMO).
    Step 1: Go to Details/Status, get the IDoc number, and go to BD87 in R/3. Place the cursor on the red IDoc entries in the tRFC queue under outbound processing and click on 'Display IDoc' in the menu bar.
    Step 2: In the next screen click on 'Display tRFC calls' (this takes you to the particular tRFC call in SM58), place the cursor on the particular transaction ID, go to Edit in the menu bar, and press 'Execute LUW'.
    (Display tRFC calls (takes you to the particular tRFC call in SM58) ---> select the transaction ID ---> Edit ---> Execute LUW)
    Rather than going to SM58 and executing the LUW directly, it is safer to go through BD87, giving the IDoc name, as it will take you to the particular tRFC request for that IDoc.
    OR
    Go into the job overview of the load; there you should be able to find the data package TID.
    (For this, in the RSMO screen --> Environment --> there is an option 'Job overview'.)
    This data package TID is the transaction ID in SM58.
    OR
    In SM58, enter * or the background user name (e.g. ALEREMOTE) and execute. It will show you all the pending tRFCs with their transaction IDs.
    In the Status Text column you can see two statuses:
    'Transaction Recorded' and 'Transaction Executing'.
    Don't disturb it if the status is the second one ('Transaction Executing'). If the status is the first one ('Transaction Recorded'), manually execute it via 'Execute LUWs'.
    OR
    Go directly to SM58, enter * or the background user name (e.g. ALEREMOTE) and execute. It will show the tRFCs to be executed for that user. Find the particular tRFC (SM37 --> request name --> TID from the data packet with sysfail), select the transaction ID in SM58 ---> Edit ---> Execute LUW.
    Processing IDocs manually:
    http://help.sap.com/saphelp_nw04s/helpdata/en/0b/2a6620507d11d18ee90000e8366fc2/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/dc/6b815e43d711d1893e0000e8323c4f/content.htm
    Thanks,
    JituK

  • Data Load Fails due to duplicate records from the PSA

    Hi,
    I have loaded the master data twice into the PSA. Then I created a DTP to load the data from the PSA to the InfoProvider. The data load is failing with the error "duplicate key/records found".
    Is there any setting I can configure so that, even though I have duplicate records in the PSA, I can successfully load only one set of data (without duplicates) into the InfoProvider?
    How can I set up the process chains to do so?
    Your answers to the above two questions are appreciated.
    Thanks,

    Hi Sesh,
    There are two places where the DTP checks for duplicates.
    First, it checks previous error stacks. If the records you are loading are still contained in the error stack of a previous DTP run, it will throw the error at this stage. In this case you will first have to clean up the previous error stack.
    The second stage cleans up duplicates across data packages, provided the corresponding option is set for your datasource. But note that this will not solve the problem if you have duplicates within the same data package. In that case you can do the filtering yourself in the start routine of your transformation, as in the sketch below.
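    A minimal sketch of such a start routine filter, assuming the duplicates are identified by a single key field; /BIC/ZBPARTNER is an illustrative name, not from the thread:

      " In the start routine of the transformation, SOURCE_PACKAGE holds
      " the current data package; RECORD is the standard sequence number
      " field of the generated source structure.
      SORT source_package BY /bic/zbpartner ASCENDING
                             record         DESCENDING.
      " Keep only the newest record per key within this package
      DELETE ADJACENT DUPLICATES FROM source_package
             COMPARING /bic/zbpartner.

    Sorting the sequence number descending makes DELETE ADJACENT DUPLICATES keep the most recent record of each key.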
    Hope this helps,
    Pieter

  • Data loaded from a non-existing PSA

    Hi all,
    I have this problem:
    I am loading an infocube from a PSA with a DTP.
    The problem is that the data in the InfoCube is duplicated when reading data from that PSA.
    In addition, if all PSA requests are deleted, the data arrives in the InfoCube correctly, without duplicating records.
    It is working as if there were a hidden PSA associated with the datasource.
    Does anybody know how to delete PSA data? (Not requests, as I have deleted them all.)
    Program RSAR_PSA_CLEANUP_DIRECTORY does not work...

    Thanks all, but
    there are no PSA requests. I have deleted all of them since 1900, manually and via process chain, and we need a full upload, not a delta one.
    The problem is that even though there are no PSA requests, the data is stored somewhere, and the DTP is finding it and loading it into the InfoCube.
    We have found the PSA table BIC/B00001930000 and would like to try deleting its data (not the requests, as they are all already deleted).

  • Data Loading issues in PSA and Infocube

    Hi team,
    I am loading data into the PSA and from there into an InfoCube via a DTP.
    When I load data into the PSA, all 6000 records are transferred and the process completes successfully.
    When I execute the DTP to load the data into the InfoCube, the data load process also completes successfully, but when I look at
    the "Manage" tab I see:
    "Transferred 6000 || Added Records 50"
    I am not able to understand why only 50 records were loaded into the InfoCube, and if some records were rejected, where can I find them and the reason for it?
    Kindly assist me in understanding this issue.
    Regards
    bs

    Hi,
    the records would have been aggregated based on common values.
    If in the source you had:

      custname  xnumber  kf1
      R         56       100
      R         53       200
      S         54       200

    then after aggregation you will have:

      custname  kf1
      R         300
      S         200

    if you do not have xnumber in your cube.
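    This is the same behaviour as ABAP's COLLECT statement: rows that agree on all remaining characteristics have their key figures summed. A standalone sketch of that behaviour, with illustrative names:

      REPORT z_aggregation_demo.

      TYPES: BEGIN OF ty_rec,
               custname TYPE c LENGTH 10,            " characteristic (key)
               kf1      TYPE p LENGTH 8 DECIMALS 2,  " key figure (summed)
             END OF ty_rec.

      DATA: lt_cube TYPE STANDARD TABLE OF ty_rec,
            ls_rec  TYPE ty_rec.

      " Without xnumber in the target, both 'R' rows collapse into one
      ls_rec-custname = 'R'. ls_rec-kf1 = 100. COLLECT ls_rec INTO lt_cube.
      ls_rec-custname = 'R'. ls_rec-kf1 = 200. COLLECT ls_rec INTO lt_cube.
      ls_rec-custname = 'S'. ls_rec-kf1 = 200. COLLECT ls_rec INTO lt_cube.

      LOOP AT lt_cube INTO ls_rec.
        WRITE: / ls_rec-custname, ls_rec-kf1.  " prints R 300.00 and S 200.00
      ENDLOOP.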
    Hope it is clear now.
    Regards,
    Rathy

  • Load PSA & data targets in parallel and dialog processes

    We have noticed that when we load data into BW using the processing option 'PSA and Data Targets in Parallel', the system basically uses all available dialog processes on the application server it
    is running on. On systems where we only have one application server, this obviously renders the server useless for anything else. In addition, we have seen that even when we have additional application servers, the system is still virtually useless and extremely slow. We would like to be able to use this option for large data loads, as it can theoretically allow these large loads to complete in a shorter amount of time.
    However, the issues we see make us hesitant to use it, as we cannot afford the system to be affected so negatively while these loads are running (we are basically a 24x7 shop for users).
    Is this the expected behavior?
    Is there some way we can limit the number of processes this mode
    consumes?
    Are we missing something?

    It is a Basis setting, but you can limit the number of processes one user uses in parallel.
    As I'm not Basis, I don't know the exact setting, but we were able to pull it off at my current client.
    kr,
    Tom

  • RSA3 data and PSA data not matching

    Hello All,
    I am on BI 7.0 and trying to load FI_GL data using the 0FI_GL_4 datasource from ECC6. I was able to install and activate all the necessary standard objects involved. I executed the data load from the InfoPackage doing an init with data transfer, which went okay, and now I have data up to the PSA level with everything in green status. But I noticed that the PSA maintenance screen shows a whole lot more data: 103 data packages transferred and a lot of data records in each package.
    On the R3 side, RSA3 shows only 831 records. In BI 7, the monitor screen shows more than 9 million records received.
    Could anyone tell me why this happened? Is it possible that this particular PSA already had data in it? If so, how can I find and delete the previous records from this PSA? Please suggest...
    thanks
    dk

    Hello,
    1) The error comes if the data is wrong in the DSO,
    something like the before and after image missing for a record, which is why the change log is having issues.
    You can check the data between the new table and the PSA and see whether it matches or not.
    Try to delete all the data from the PSA and the DSO, do the proper mappings again, especially in the keys of the DSO, and then reload.
    Try to load a small amount first, like one company code and one fiscal period, and check in the PSA whether it matches RSA3 or not.
    2) If the data is getting loaded, then there should not be any problem with the datasource; the DTP could be the issue. The program is there to activate the datasource in the BI 7 system, and if the datasource is already activated then you don't need it again.
    3) A rule proposal can be generated if the fields are the same and have the same technical name. In your case the field names are different, since these are datasource-to-DSO mappings, and therefore you will have to do the mappings manually.
    4) You mean RSA3?
    RSA3 displays the data from the tables only in the case of full mode. Also, try to do full loads up to the PSA and then check against RSA3 in full mode. Keep the package size and the number of calls at the maximum so that it shows you the correct number of records extracted.
    RSA3 is an extractor checker and can be used to verify a small amount of data; don't use it to verify the datasource for all the records.
    Use the underlying tables if you want.
    If the init is done properly then the data will be correct, and you should verify it against the tables and not RSA3.
    Regards
    Ajeet

  • Data load from DSO to cube fails

    Hi Gurus,
    The data loads failed last night, and when I try to dig inside the process chains I find that the cube which should be loaded from the DSO is not getting loaded.
    The DSO has been loaded without errors from ECC.
    The error message says: "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'".
    I looked in the PSA; it has about 50,000 records,
    all data packages have a green light, and all amounts have 0CURRENCY assigned.
    I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from 'no update, no reporting' to 'valid records update, no reporting (request red)' and executed; the error stack then showed 101 records.
    The ZARAMT field has 0CURRENCY blank for all these records.
    I tried to assign USD to them, the changes were saved, and I tried to execute again, but then the message says that the request ID should be repaired or deleted before the execution. I tried to repair it; it says it cannot be repaired, so I deleted it and executed. It fails, and the error stack still shows 101 records. When I look at the records, the changes I made no longer exist.
    If I delete the request ID before making the changes and then try to save them, they don't get saved.
    What should I do to resolve this issue?
    thanks
    Prasad

    Hi Prasad,
    The error stack is request-specific. Once you delete the request from the target, the data in the error stack is also deleted.
    Actually, in this case what you are supposed to do is:
    1) Change the error handling option to 'valid records update, no reporting (request red)' (as you have already done) and execute the DTP; all the erroneous records will accumulate in the error stack.
    2) Then correct the erroneous records in the error stack.
    3) Then in the DTP, in the Update tab, you will get the option 'Error DTP'. If it is not created yet, you will see the option 'Create Error DTP'; click there and execute the error DTP. The error DTP will fetch the records from the error stack and create a new request in the target.
    4) Then manually change the status of the original request to green.
    But did you check why the value of this field is blank? If these records come again as part of a delta or full load, your load will fail again. Check it in the source system and fix it there, for a permanent solution. As an interim measure you could also default the currency in the transformation, as in the sketch below.
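    A minimal sketch of such an interim fix in the start routine of the transformation, assuming the source structure has a unit field called CURRENCY (the field name is illustrative; USD is the default you already tried assigning manually):

      " Default the blank unit before the key figure check fires.
      FIELD-SYMBOLS <ls_src> LIKE LINE OF source_package.

      LOOP AT source_package ASSIGNING <ls_src>.
        IF <ls_src>-currency IS INITIAL.
          <ls_src>-currency = 'USD'.  " interim default only
        ENDIF.
      ENDLOOP.

    Treat this only as a stopgap; the permanent fix remains correcting the blank currency in the source system.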
    Regards,
    Debjani......

  • Data load from DSO to Cube in BI7?

    Hi All,
    We just migrated a dataflow from 3.5 to 7 in development and moved it to production. So until now in production, the data loads happened using the InfoPackages:
    a. InfoPackage1 from the datasource to the ODS, and
    b. InfoPackage2 from the ODS to the cube.
    Now, after we transported the migrated dataflow to production, to load the same InfoProviders I use:
    1. An InfoPackage to load the PSA.
    2. DTP1 to load from the PSA to the DSO.
    3. DTP2 to load from the DSO to the cube.
    Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. Yet when I tried step b (above), it loads the cube fine using the InfoPackage. So I am unable to understand why the DTP failed while the InfoPackage load is successful. In order to use the DTP, do we need to do any cleanup when using it for the first time? Please let me know if you have any suggestions.
    Please note that the DSO already has data loaded using an InfoPackage. (Is this causing the problem?)
    Thanks,
    Sirish.

    Hi Naveen,
    Thanks for the reply. The creation of a DTP is not possible without a transformation.
    The transformation has been moved to production successfully.
