Data loading issues in PSA and InfoCube
Hi team,
I am loading data into the PSA and from there into an InfoCube via a DTP.
When I load data into the PSA, all 6000 records are transferred and the process completes successfully.
When I execute the DTP to load data into the InfoCube, the load also completes successfully, but in the "Manage" tab I see:
"Transferred 6000 || Added Records 50"
I don't understand why only 50 records were added to the InfoCube. If some records were rejected, where can I find them and the reason for the rejection?
Kindly assist me in understanding this issue.
Regards
bs
Hi,
The records would have been aggregated based on common characteristic values.
If the source contains:

custname  xnumber  kf1
R         56       100
R         53       200
S         54       200

and you do not have xnumber in your cube, then after aggregation you will have:

custname  kf1
R         300
S         200

Hope it is clear now.
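What the aggregation does can be sketched in plain Python (illustrative only, not SAP code):

```python
from collections import defaultdict

# Source rows in the PSA: (custname, xnumber, kf1)
psa_rows = [
    ("R", 56, 100),
    ("R", 53, 200),
    ("S", 54, 200),
]

# The cube does not contain xnumber, so records collapse onto the
# remaining characteristic (custname) and the key figure kf1 is summed.
cube = defaultdict(int)
for custname, _xnumber, kf1 in psa_rows:
    cube[custname] += kf1

print(dict(cube))  # {'R': 300, 'S': 200}
```

Three transferred records become two added records, which is exactly the "Transferred 6000 || Added Records 50" effect on a larger scale.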
Regards,
Rathy
Similar Messages
-
Data display in PSA and InfoCube/InfoObject different
Hi guru,
After I load data to the PSA and InfoCube, I checked the data: in the PSA the quantity displays as 12345; in the InfoCube it displays as 12345,000.
How can I solve this problem?
Thanks!

Hi:
In Europe, a different notation is used for decimals: instead of the international standard "." for a decimal point, "," is used. Your display is using the European notation. In SU01, for your user ID, change the decimal display setting to the international "." notation.
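The two notations can be illustrated with a few lines of plain Python (nothing SAP-specific; the separator swap is done by hand so the example does not depend on installed locales):

```python
# The same quantity shown with the international "." decimal notation
# and with the European "," notation - this mirrors what the SU01
# decimal-display setting changes.
value = 12345.0

international = f"{value:,.3f}"  # "12,345.000"

# Swap the separators by hand: "," -> "." and "." -> ","
european = international.replace(",", "_").replace(".", ",").replace("_", ".")

print(international)  # 12,345.000
print(european)       # 12.345,000
```

The stored value is identical in both cases; only the rendering differs.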
Thanks for any points you assign
Ron Silberstein
SAP -
BI 7.0 data load issue: InfoPackage can only load data to PSA?
BI 7.0 backend extraction gurus,
We created a generic DataSource on R/3 and replicated it to our BI system, then created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
After the transformation between the InfoSource and the ODS was created on the BI system, a new folder called "Data Transfer Process" also appeared under this ODS in the InfoProvider view. In the Data Transfer Process, on the Extraction tab, we picked 'Full' in the Extraction Mode field; on the Execute tab there is an 'Execute' button, and clicking it (note: so far we had not created an InfoPackage yet) seems to run the data load, but we found no data available even though all the statuses showed green (we do have a couple of records in the R/3 table).
Then we tried to create an InfoPackage. On the Processing tab, the 'Only PSA' radio button is checked and all the others, like 'PSA and then into Data Targets (Package by Package)', are dimmed! On the Data Target tab, the ODS can't be selected as a target! There are also some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under another column, 'DTP(S) are active and load to this target', there is an inactive icon, which is strange since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records being brought in, but since 'Only PSA' is checked with all the other options dimmed, no data reaches the ODS! Why, in BI 7.0, can 'Only PSA' be checked with everything else dimmed?
There are many new features in BI 7.0! Anyone's ideas/experience on how to load data in BI 7.0 would be greatly appreciated!

You don't have to select anything.
Once the data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the requests added to the PSA since the last DTP run.
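The full-versus-delta behaviour of a DTP reading from the PSA can be sketched like this (illustrative Python, not SAP code; request IDs and records are made up):

```python
# Requests sitting in the PSA, keyed by request ID.
psa_requests = {
    "REQU_1": [("1000", 10)],
    "REQU_2": [("1001", 20)],
}
transferred = set()  # requests this DTP has already fetched

def dtp_full():
    # Full mode reads every request currently in the PSA.
    return [row for rows in psa_requests.values() for row in rows]

def dtp_delta():
    # Delta mode reads only requests not yet transferred by this DTP.
    new = {r: rows for r, rows in psa_requests.items() if r not in transferred}
    transferred.update(new)
    return [row for rows in new.values() for row in rows]

print(len(dtp_full()))   # 2 - all requests, every time
print(len(dtp_delta()))  # 2 - the first delta fetches everything
print(len(dtp_delta()))  # 0 - nothing new since the last delta
```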
Go through these links for lucid explanations:
Infopackage -
http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
DTP
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
Creating DTP
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
<b>Pre-requisite-</b>
You have used transformations to define the data flow between the source and target object.
Creating transformations-
http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
Hope it Helps
Chetan
@CP.. -
Data Load Issue "Request is in obsolete version of DataSource"
Hello,
I am getting a very strange data load issue in production. I am able to load the data up to the PSA, but when I run the DTP to load the data into 0EMPLOYEE (a master data object), I get the message below:
Request REQU_1IGEUD6M8EZH8V65JTENZGQHD not extracted; request is in obsolete version of DataSource
The request REQU_1IGEUD6M8EZH8V65JTENZGQHD was loaded into the PSA table when the DataSource had a different structure to the current one. Incompatible changes have been made to the DataSource since then and the request cannot be extracted with the DTP anymore.
I have taken the following actions:
1. Replicated the DataSource
2. Deleted all requests from the PSA
3. Activated the DataSource using RSDS_DATASOURCE_ACTIVATE_ALL
4. Re-transported the DataSource, transformation, and DTP
Still getting the same issue.
If you have any idea, please reply ASAP.
Samit

Hi
Generate your DataSource in R/3, then replicate it and activate the transfer rules.
Regards,
Chandu. -
Most common BW data load errors in production and how to solve them ..
Hi All,
Most common BW data load errors in production and how to solve them ..
Is there any doc on it? If so, please send it across to this id: [email protected]
Thanks in advance.
Rgrds
Shoba

Hi,
1) RFC connection lost.
2) Invalid characters while loading.
3) ALEREMOTE user is locked.
4) Lower-case letters not allowed.
5) While loading the data, an error message about a record field: the field mentioned in the error message is not mapped to any InfoObject in the transfer rules.
6) Object locked.
7) "Non-updated IDocs found in Source System".
8) While loading master data, one of the data packages has a red-light error message: master data/text of characteristic 'so and so' already deleted.
9) Extraction job aborted in R/3.
10) Request could not be activated because there is another request in the PSA with a smaller SID.
11) Repeat of last delta not possible.
12) DataSource not replicated.
13) DataSource/transfer structure not active.
14) IDoc or tRFC error.
15) ODS activation error. -
I am having an issue where the data that drives a TileList works correctly when the TileList is not loaded on the first page of the application. When it is put on a second page in a ViewStack, the TileList displays correctly when you navigate to it. When the TileList is placed on the first page of the application, I get the correct number of items in the TileList, but the information the item renderer is supposed to display (i.e. a picture, caption and title) does not appear. The strange thing is that a Tree populates correctly in the same situation. Here is the sequence of events:
<!-- get_tree is the data for the Tree and get_groups is the data for the TileList -->
creationComplete="get_tree.send();get_groups.send();"

<mx:HTTPService showBusyCursor="true" id="get_groups"
    url="[some xml doc]" resultFormat="e4x"/>
<mx:XMLListCollection id="myXMlist"
    source="{get_groups.lastResult.groups}"/>

<mx:HTTPService showBusyCursor="true" id="get_tree"
    url="[some xml doc]" resultFormat="e4x"/>
<mx:XMLListCollection id="myTreeXMlist"
    source="{get_tree.lastResult.groups}"/>
Then the data providers of the TileList and Tree are set accordingly. I tried moving the data calls from creationComplete to the initialize event, thinking it would fire earlier in the process and be done by the time creation completed, but that didn't help either. I'm at a loss as to why the Tree works fine no matter where I put it, but the TileList does not. It's almost as if the Tree and the TileList will sit and wait for the data, but the item renderer in the TileList will not. That would explain why clicking on the TileList still produces the correct sequence of events while the visual part of the TileList is just not rendering. Anyone have any ideas?

Ok, so if the ASO value is wrong, then it's a data load issue and there is no point messing around with the BSO app. You are loading two transactions to the exact same intersection. Make sure your data load is set to aggregate values and not overwrite.
-
I am new to Demantra and have installed a standalone Demantra system on our server. In order to load data, I created a new model, defined item and location levels, and then clicked 'Build Model'. The data is loaded into 3 custom tables created by me. After creating the model, I cannot log in to Collaborator Workbench; it gives the message 'There are system errors. Please contact your System Administrator'. Can anyone please tell me what I am doing wrong and how to resolve the issue?
Thanks
-
How to change it if some data is wrong in ODS and infocube
Hi all:
Could you please tell me how to correct data that is wrong in an ODS or InfoCube?
Best regards

You receive information on all requests that have run in the InfoCube, and you can delete requests if required.
http://help.sap.com/saphelp_nw04s/helpdata/en/d4/aa6437469e4f0ae10000009b38f8cf/frameset.htm
You can find the request ID in the administration view of the InfoCube. If a cube is compressed, you can't delete request-wise...
Regards
Andreas -
Data loading issue in between BI and R/3
Hi Everybody,
One of our team members made some changes on the BI server, and we have lost the LBWQ data of 21-July and 22-July. The data did not arrive in the PSA for the standard extractors.
When the connection between R/3 and BI was corrected, the data started coming in again after 23rd July. Now we want to recover the data of 21-July and 22-July. Please let me know how to do it.
Thanks
A. Jenifer

Hi Goutham,
Can you please tell me where I can find the 'Repeat delta update' option?
Regards,
Amar. -
Hello,
When I try to load data into the InfoCube, a short dump is generated with CX_RSR_X_MESSAGE (UNCAUGHT EXCEPTION).
Earlier I could load data successfully.
I get the same error message when I try to view data in SAP APO Interactive Planning.
Someone else made some changes (creating export DataSources out of the characteristics) to the 9AMATNR and 9ALOCNO characteristics that I am using.
I reversed those changes (hopefully) and reactivated the InfoObjects.
I still get the same error messages. I even deleted the InfoPackage and recreated it.
Any help would be greatly appreciated.
Regards,
Malli

Hi,
I am facing the same issue right now: whenever I try to load something, it throws a short dump saying "Uncaught Exception". Could this be a problem caused by the system copy we did a few days back? We copied our BW production system to our development and quality systems in order to synchronise all three, and it is only since then that none of the loads go through; they all end with the "Uncaught Exception" short dump.
I would be really happy if someone could throw more light on this.
Expecting a reply at the earliest.
Thanks & Regards
Manicks -
Hi gurus,
I am presently working on BI 7.0 and have a small issue regarding master data loading.
I have a generic DataSource for master data and have to fetch this data to the BW side. I always have to do a full data load to the master data object. The first time, I scheduled the InfoPackage and ran the DTP to load data to the master data object: no issues, the data loaded successfully. But whenever I run the InfoPackage a second time and then run the DTP, I get an error about duplicate records.
How can I handle this?
Best Regards
Prasad

Hi Prasad,
The following is happening in your case:
<b>Loading 1st time:</b>
1. Data is loaded to the PSA through the InfoPackage. It is a full load.
2. Data is loaded to the InfoObject through the DTP.
<b>Loading 2nd time:</b>
1. Data is again loaded to the PSA. It is a full load.
2. At this point, the data in the PSA itself is duplicated. So when you run the DTP, it picks up the data of both requests that were loaded to the PSA, and hence you get the duplicate-record error.
Please clear the PSA after the data is loaded to the InfoObject.
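What happens here can be sketched in a few lines (illustrative Python; the request IDs and records are made up):

```python
# Each full load appends a new request to the PSA.
psa = []

def full_load(request_id):
    # The extractor returns the same master data every time.
    source = [("EMP1", "Alice"), ("EMP2", "Bob")]
    psa.append((request_id, source))

def run_dtp():
    # A full DTP reads *all* requests still in the PSA.
    rows = [row for _req, data in psa for row in data]
    keys = [k for k, _ in rows]
    if len(keys) != len(set(keys)):
        raise ValueError("duplicate records")
    return rows

full_load("REQU_1")
run_dtp()            # first load: fine
full_load("REQU_2")
# run_dtp()          # would now raise ValueError: duplicate records
psa.clear()          # clearing the PSA before the next full load avoids this
```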
Assign points if helpful.
Regards,
Tej Trivedi -
Data load issue - error: ARFCSTATE = SYSFAIL
Hello .
Can you please give a solution for the issue below?
I am using a standard extractor structure to load data into an ODS from ECC to BI. The load is taking a long time; the data arrived in the PSA with no errors, but on the ECC side it shows ARFCSTATE = SYSFAIL and then the job finishes. I also checked the outbound queue: there are around 30 LUWs there, but nothing is selected in the inbound queue.
The same problem occurs every day. Is manually updating from PSA to the data target the right approach?
Can anyone help me solve the problem?
Thanks,
Siva Kumar

As you said 30 LUWs were hung: select one of the hung tRFC LUWs and double-click on it. It will display the reason it was not processed.
Go to the monitor screen -> menu "Environment" -> Job Overview -> Source System. It will ask for an ID and password; enter them, then select the job and click on the job log.
Can you post the job log?
The tRFC was not processed in your case; that is why the load did not finish. This issue is due to a deadlock: check with the Basis person.
Take the help of Basis.
Try checking the tRFC queue (transaction SM58) for any recorded entries. You can also try to process the time-limit-exceeded tRFCs manually by pressing F6.
Also check the lock waits and deadlocks in ST04/DB6COCKPIT. If you find any deadlock entries, contact the Basis team to resolve the issue.
Try it and let us know the status.
ARFCSTATE = SYSFAIL means the tRFC was not processed properly to the BW system from the source.
ARFCRSTATE is a monitoring object with information about tRFC and qRFC calls that are waiting to be executed in this system; after they have been executed, they are deleted from table ARFCRSTATE.
Check for short dumps in ST22, if any, and try to analyse them.
Check the SM21 system logs. -
Data load from two DataSources to an InfoCube via a common InfoSource in BI 7
Hi all,
I need to load data from two different DataSources into one InfoCube. Between this InfoCube and the DataSources there is one InfoSource common to both DataSources.
Between the InfoCube and the InfoSource there is one transformation, and between each DataSource and that single InfoSource there are individual transformations.
I need to move data from each DataSource to the InfoCube via the InfoSource, through all three transformations.
I tried to create a DTP to load data from DS1 to the InfoCube. It worked fine. But while creating the DTP from the other DataSource to the InfoCube, I am not able to use the transformation I created from the InfoSource to the InfoCube. It gets bypassed, and the system creates a new transformation from DS2 directly to the InfoCube.
Can anybody help me with this?

Hello all,
Thanks for the instant replies.
We need to have the InfoSource because we need currency transformation and multi-stage transformation.
My question is: when I am creating a DTP for the second DataSource, it bypasses the transformation I created between the InfoSource and the InfoCube.
It follows that transformation only for the first DataSource and not for the second one, whereas we need that transformation (the one between InfoSource and InfoCube) to be common for all DataSources loading data through the InfoSource.
Hello gurus,
I have run the data load for 2LIS_02_SCL at PSA level, but when I go to RSMO I am seeing:
Transfer (IDoc and tRFC): error occurred
Request IDoc: application document posted
Info IDoc 1: sent, not arrived; data passed to port OK
How do I resolve this error?
Thanks
Points will be assigned as my gesture.
Regards
Rahul

Hi,
Check SM58 and BD87 for pending tRFCs/IDocs and execute them manually.
Transact RFC error:
tRFC error - status yellow (running) for a long time (Transact RFC is shown on the Status tab in RSMO).
Step 1: Go to Details -> Status, get the IDoc number, then go to BD87 in R/3. Place the cursor on the red IDoc entries in the tRFC queue under outbound processing and click Display IDoc on the menu bar.
Step 2: On the next screen click Display tRFC calls (this takes you to the particular tRFC call in SM58). Place the cursor on the particular transaction ID, go to Edit in the menu bar, and press 'Execute LUW'.
(Display tRFC calls -> select the transaction ID -> Edit -> Execute LUW)
Rather than going to SM58 and executing the LUW directly, it is safer to go through BD87 with the IDoc name, as it takes you to the particular tRFC request for that IDoc.
OR
Go into the job overview of the load; there you should be able to find the data package TID.
(In the RSMO screen: Environment -> there is an option for job overview.)
This data package TID is the transaction ID in SM58.
OR
In SM58, enter * or the background (ALEREMOTE) user name and execute. It will show all pending tRFCs with their transaction IDs.
In the Status Text column you can see two statuses: 'Transaction recorded' and 'Transaction executing'.
Do not disturb a tRFC whose status is 'Transaction executing'. If the status is 'Transaction recorded', manually choose 'Execute LUW'.
OR
Go directly to SM58, enter * or the background (ALEREMOTE) user name, and execute. It will show the tRFCs to be executed for that user. Find the particular tRFC (SM37 -> request name -> TID from the data packet with SYSFAIL), select the transaction ID in SM58, then Edit -> Execute LUW.
Processing IDocs manually:
http://help.sap.com/saphelp_nw04s/helpdata/en/0b/2a6620507d11d18ee90000e8366fc2/frameset.htm
http://help.sap.com/saphelp_nw04/helpdata/en/dc/6b815e43d711d1893e0000e8323c4f/content.htm
Thanks,
JituK -
Error in 0EMPLOYEE Master Data Load Issue
Hi,
We have 0EMPLOYEE master data. Due to new development changes related to 0EMPLOYEE, we scheduled 2 new InfoPackages with a personnel number range. While creating the InfoPackages, we forgot to maintain the time interval 01.01.1900 to 31.12.9999; instead, the default range 24.04.2009 to 31.12.9999 was selected. Because of this selection in the InfoPackage, the valid-from date in the employee master data was changed to 24.04.2009 for all employees after the data load.
Even after I corrected this selection and reloaded the data, the valid-from dates are not being corrected.
Can you please advise how we can fix this issue ASAP, as it is a production issue?
Thanks!
Best regards,
Venkata

> Even after I corrected this selection and reloaded the data, the valid-from dates are not being corrected.
For this you may have the ONLY option of deleting the 0EMPLOYEE master data and reloading it. For this you also need to delete the dependent transaction data.
Cheers,
Sree