Data load along with analysis
Hello,
We had a data inconsistency situation and ran RSRV tests. One of the tests is "Initial key figure units in fact tables". This test gave a red light with the message: "Result of check for key figure units in fact tables of InfoCube 'ZXXXXX' are empty".
The description is as follows:
Message no. RSRV117
Description:
In the fact tables (E and F tables) of an InfoCube, a search is being carried out for those records that contain values other than zero for key figures that have units, but for which the unit of the key figure is blank (has no value).
Since the value of the unit has to correspond to the value of the key figure, such records would indicate an error in the data upload. The values of the units have not been loaded correctly into BW.
Repairs
There are no automatic repair options. Reload the data.
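The check described above is straightforward to picture. The following Python sketch is only an illustration of the logic, not RSRV's actual implementation; the record layout and field names (AMOUNT, UNIT) are hypothetical:

```python
def find_blank_unit_records(fact_rows):
    """Return rows where a unit-bearing key figure is non-zero but the unit is blank."""
    bad = []
    for row in fact_rows:
        if row["AMOUNT"] != 0 and not row["UNIT"].strip():
            bad.append(row)
    return bad

rows = [
    {"AMOUNT": 100.0, "UNIT": "KG"},  # consistent: value with a unit
    {"AMOUNT": 50.0,  "UNIT": ""},    # inconsistent: value but no unit
    {"AMOUNT": 0.0,   "UNIT": ""},    # fine: zero value needs no unit
]
print(find_blank_unit_records(rows))  # only the second row is flagged
```

Records matching that condition are exactly what the red light reports, which is why the only remedy is fixing the unit mapping and reloading.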
So, do I need to load all my data again? Is this error because the data coming from R/3 has no units? Do I need to correct anything in the InfoCube before loading data? Also, can somebody give me a link or documentation with a step-by-step procedure for extracting and loading from R/3 into an InfoCube while analyzing the data load at each step?
Thank you
Hi Visu,
As I understand it, he meant to check the fields in the extractor that populate the currency/unit key fields you specified in BW.
If you have a 1:1 mapping in your transfer structure:
Run the extractor in RSA3 on the R/3 side for the records that come in with no units.
Then, in the output display, make sure those fields are populated.
If you are using transfer rules to populate those fields, debug the transfer rules to find out why the units are not populated correctly.
Let us know ..
Ashish.
Similar Messages
-
Data load failing with a short-dump
Hello All,
There is a data load failing with the following short-dump:
Runtime Errors UNCAUGHT_EXCEPTION
Except. CX_FOEV_ERROR_IN_FUNCTION
Full text:
"UNCAUGHT_EXCEPTION" CX_FOEV_ERROR_IN_FUNCTIONC
"CL_RSAR_FUNCTION==============CP" or "CL_RSAR_FUNCTION==============CM004"
"DATE_WEEK"
It seems as if it is failing in a function module - please advise.
Regards,
Keith Kibuuka
Did you read OSS [Note 872193 - Error in the formula compiler|https://service.sap.com/sap/support/notes/872193] or OSS [Note 855424 - Runtime error UNCAUGHT_EXCEPTION during upload|https://service.sap.com/sap/support/notes/855424]?
Regards -
Transport data entries along with table
How can we transport data entries along with a table?
Do I need to declare it as a specific type?
Thank you,
AP
Hi,
If you have created a table maintenance dialog, you can use SM30:
Press the Maintain button.
In the menu, choose Table View -> Transport, then select the entries and include them in the transport request.
If the Transport menu option is not enabled:
Go to SE11.
In the table maintenance generator, choose the radio button "Standard recording routine" and save.
The Transport menu option will now be enabled.
Thanks,
Naren -
SQL data Load rule with Substitution variable
Hi,
I have a data load rule with MySQL as the data source. The requirement is that users do not want to see the current day's data in the cube, and they keep changing which days should not be loaded.
I have setup the substitution variable SUBv as '2009-11-04' and have scripts to update it daily.
Data Load rule in where condition contains
brand = 'HP' and region = 'east' and date <> &SUBv
Now, the above WHERE condition does not work with date <> &SUBv. But if I set the substitution variable SUBv to "brand = 'HP' and region = 'east' and date <> '2009-11-04'" and make the load rule's WHERE condition just &SUBv, it works fine. I don't know what is wrong in the first case.
I don't want to put whole WHERE conditions into substitution variables (don't we have a length limit on them?). I have 40+ load rules that could use this SUBv substitution variable if I find a way to define it so that it works as "brand = 'HP' and region = 'east' and date <> &SUBv".
Thanks,
Vikram
I did some research and learned that when using a substitution variable in a load rule, the substitution variable needs to come first in the string.
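One way to picture a substitution variable is as plain text replacement performed on the load rule before the WHERE clause is parsed. The Python sketch below is only a hedged illustration of that idea (actual Essbase behavior may differ); it shows that the variable's value is spliced in verbatim, quotes included, which is why embedding the quotes in the variable's value produces a well-formed clause:

```python
def expand(where_clause, variables):
    """Expand &NAME tokens by plain text replacement (illustrative sketch)."""
    for name, value in variables.items():
        where_clause = where_clause.replace("&" + name, value)
    return where_clause

# The raw date value is spliced in without quotes, so the resulting
# SQL compares against an unquoted token:
print(expand("date <> &SUBv", {"SUBv": "2009-11-04"}))

# Putting the quotes inside the variable's value yields a proper
# string literal in the expanded clause:
print(expand("date <> &SUBv", {"SUBv": "'2009-11-04'"}))
```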
It worked out magically. -
Data Load: Replace with Rule
Hi guys,
this might be a simple question, but I need to find out quickly:
Can I load an export with a rule file even if it is not in column format?
The data source is an export and NOT in column format, and I need to load the data with a rule file to replace a member name. The member names in the new outline have changed slightly.
Old: Team_1
New: Team_01
Export is done by right click data base -> export.
Any idea?
Regards,
Bernd
Edited by: Bernd on Mar 15, 2012 4:13 PM
A data file in free-form format doesn't need a rule file to import. Given your case, my best guess is to just replace the old names in the export file with the new names and import it.
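If you go the replace-names-in-the-export-file route, a whole-token replacement is safer than a plain find/replace, so that Team_1 does not also rewrite the inside of Team_10. A small hedged Python sketch (the export snippet is made up for illustration):

```python
import re

def rename_member(text, old, new):
    # \b anchors keep Team_1 from also matching inside Team_10
    return re.sub(rf"\b{re.escape(old)}\b", new, text)

export = '"Team_1" "Jan" 100\n"Team_10" "Jan" 200'
print(rename_member(export, "Team_1", "Team_01"))
# only the exact member Team_1 is renamed; Team_10 is untouched
```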
-
Data loading problem with Movement types
Hi Friends,
I extracted data to the BW side using the DataSource General Ledger: Line Item Data (0FI_GL_4).
The problem is that the movement types are missing for some documents.
But when I check in RSA3, they show up correctly.
When I restrict the InfoPackage on the BW side to one particular document, the data loads perfectly.
This DataSource has 53,460 records; among them, the movement types are missing for 400 records of document type 'WE'.
Please give me a solution for loading the data with the movement types.
I checked the particular document 50000313 in RSA3 and it shows the movement types, but when I load the data to BW, those movement types do not come across. When I restrict the InfoPackage to document 50000313 and load, the movement types load correctly. This extractor has 55,000 records.
This is a very urgent problem; please reply urgently. I am waiting for your replies.
Thanks & Regards,
Guna.
Edited by: gunasekhar raya on May 8, 2008 9:40 AM
Hi,
We enhanced the General Ledger extractor (0FI_GL_4) with the movement type field (MSEG-BWART).
This field is populated with data for all accounting document numbers; only in the range 50000295 to 50000615 do we not get the movement type values.
We didn't write any routines at the transfer or update rule level; we just mapped the BWART field to the 0MOVETYPE InfoObject.
When we restrict the InfoPackage to the particular document number 50000313, the data loads into the cube with the movement types.
But when we remove the restriction at the InfoPackage level, the movement type data is missing for the document numbers 50000295 to 50000615.
Please give me a solution for this; I need to solve it very urgently.
I am waiting for your reply.
Thanks,
Guna. -
Infocube data loads fail with UNCAUGHT_EXCEPTION dump after BI 7.01 upgrade
Hi folks,
We're in the middle of upgrading our BW landscape from 3.5 to 7.01. The support pack we have in the system is SAPKW70103.
Since the upgrade, all data loads going to InfoCubes have been failing with the following
ABAP dump UNCAUGHT_EXCEPTION
CX_RSR_COB_PRO_NOT_FOUND.
Error analysis
An exception occurred which is explained in detail below.
The exception, which is assigned to class 'CX_RSR_COB_PRO_NOT_FOUND', was not caught and therefore caused a runtime error.
The reason for the exception is:
0REQUEST is not a valid characteristic for InfoProvider
Has anyone come across this issue, and if so, what resolutions did you adopt to fix it? I appreciate any help in this regard.
Thanks
Mujeeb
Hi experts,
I have exactly the same problem.
But I can't find a test for InfoObjects in RSRV. How can I repair the InfoObject 0REQUEST?
Thanks for all answers!
kind regards
Edited by: Marian Bielefeld on Jul 7, 2009 8:04 AM
Data load failure with '#'
Hi Experts,
One of our data loads is failing on the special character '#'.
The exact error record is as follows: 390-166333#New, SUB, Fan 80x80x80mm, 12
We have maintained '#' in RSKC, but the load still fails. Any reason?
Editing the '#' in the PSA and reloading fixed the issue.
Can anyone suggest what should be done to fix this permanently?
Thanks & Regards,
Naveen.
Hi,
Try the field-level routine below.
DATA: l_d_length like sy-index.
DATA: l_d_offset LIKE sy-index.
DATA: CharAllowedUpper(60) TYPE C.
DATA: CharAllowedLower(60) TYPE C.
DATA: CharAllowedNumbr(60) TYPE C.
DATA: CharAllowedSondr(60) TYPE C.
DATA: CharAllowedAll(240) TYPE C.
CharAllowedUpper = 'ABCDEFGHIJKLMNOPQRSTUVWXYZÄÜÖ'.
CharAllowedLower = 'abcdefghijklmnopqrstuvwxyzäüöß'.
CharAllowedNumbr = '0123456789'.
CharAllowedSondr = '!"§$%&/()=?{[]}´`*+~;:_,.-><|@'''.
CONCATENATE CharAllowedUpper CharAllowedLower CharAllowedNumbr
CharAllowedSondr INTO CharAllowedAll.
RESULT = SOURCE_FIELDS-ENTITYNAME.
l_d_length = strlen( RESULT ).
* Blank out each disallowed character in place; condense only after the
* loop so that the character offsets stay valid while scanning.
IF NOT RESULT CO CharAllowedAll.
  DO l_d_length TIMES.
    l_d_offset = sy-index - 1.
    IF NOT RESULT+l_d_offset(1) CO CharAllowedAll.
      RESULT+l_d_offset(1) = ' '.
    ENDIF.
  ENDDO.
ENDIF.
CONDENSE RESULT NO-GAPS.
TRANSLATE RESULT TO UPPER CASE.
Thanks,
Phani. -
Data load failure with datasource Infoset Query
Dear Experts,
I had a data load failure today, where I get data from a DataSource built on an InfoSet query.
We had a source system upgrade, and when I checked the InfoSet query in the development system of the source system, I got the message below:
"Differences in field EKES-XBLNR
Output length in Dictionary: 035
Output length in InfoSet: 020"
The message says to adjust the InfoSet. I don't have authorization to create a transport in the source system, so I asked the responsible person to adjust the InfoSet, regenerate it, and move it to the production system.
I think this will solve my problem; please correct me if I am wrong.
Regards,
Sunil Kumar.B
Hi Suman,
I am still facing the problem even after adjusting the InfoSet: we are getting a short dump due to a length mismatch.
When I checked the InfoSet, we take the field XBLNR from the table EKES, and the data element for the field is XBLNR_LONG (CHAR 35). But when I checked the DataSource in RSA2, the data element for XBLNR shows as BBXBL (CHAR 20).
I think this is causing the problem. In SQ02 we take the field directly from the table, so how can the data element have changed?
Please help me correct this.
Regards,
Sunil Kumar.B -
Data load issue with export data source - BW 3.5
Hi,
We are facing issues loading data through an export DataSource.
We created an export DataSource for the 0PCA_C01 cube and use it to load data into another custom cube. The scenario works fine in the development server.
But after we transported the objects to the quality server, data is not being loaded into the custom target cube.
It extracts zero records. All transports are OK, and we generated the export DataSource in quality before the transports. We also regenerated the export DataSource after transport and activated the InfoSource and update rules via the RS* programs. Every object is active, but no data is extracted.
RSA3 for the 80PCA_C01 DataSource isn't extracting any records in quality; records are extracted in development. We are on BW 3.5 with patch level 19.
Please guide us in resolving the issue.
Thanks,
Aditya
Hi,
Make sure you have the relevant roles and authorizations in quality/production.
You have to transport the source cube first and then generate the export DataSource in QAS. Then replicate the DataSources for the BW QAS source system and make sure the replicated DataSource exists in QAS. Only then can you transport the new update rules for the second cube.
Hope it helps.
Data load problem with Sales Header and Item
Hi all,
The delta loads have failed, so I want to reload the delta. I deleted the failed requests in the data targets and am now trying to reload the data. I get a message saying:
"Last delta update is not yet completed. Therefore, no new delta update is possible. You can start the request again if the last delta request is red or green in the monitor (QM activity)."
Is it in RSA7 that I have to set the delta to red? Please let me know where I have to do this. The data is going to an ODS object.
thanks
Sabrina.
Hi Sabrina,
I hope you have sorted this out by now, as there are other responses as well, and that you now know the cause of the failure.
Regarding the message "If you set a 'red' delta upload to green, with the next request BW sends a delta request to the source system and this delivers the next delta data. The data from the 'red' delta load is then no longer available (even with a repeat). This can lead to loss of data. Do you want to execute a QM action?": say OK, set the QM status to red (status not OK), and when prompted to activate the M records, say "no" (as indicated in my previous post).
However, to be able to load this request from the PSA, the request must exist in the PSA (depending on the processing options). Go to RSA1 > Modeling > PSA, find your request under the relevant application component, right-click it, and start the update immediately.
Hope it helps,
Sree -
Sql loader - Data loading issue with no fixed record length
Hi All,
I am trying to load the following data through SQL*Loader. However, only records 1, 3, and 4 load successfully into the table; the rest of the records end up in the BAD file. What is missing in my syntax?
.ctl file:
LOAD DATA
INFILE 'C:\data.txt'
BADFILE 'c:\data.BAD'
DISCARDFILE 'c:\data.DSC' DISCARDMAX 50000
INTO TABLE icap_gcims
TRAILING NULLCOLS
(
CUST_NBR_MAIN POSITION(1:9) CHAR NULLIF (CUST_NBR_MAIN=BLANKS),
CONTACT_TYPE POSITION(10:11) CHAR NULLIF (CONTACT_TYPE=BLANKS),
INQUIRY_TYPE POSITION(12:13) CHAR NULLIF (INQUIRY_TYPE=BLANKS),
INQUIRY_MODEL POSITION(14:20) CHAR NULLIF (INQUIRY_MODEL=BLANKS),
INQUIRY_COMMENTS POSITION(21:60) CHAR NULLIF (INQUIRY_COMMENTS=BLANKS),
OTHER_COLOUR POSITION(61:75) CHAR NULLIF (OTHER_COLOUR=BLANKS),
OTHER_MAKE POSITION(76:89) CHAR NULLIF (OTHER_MAKE=BLANKS),
OTHER_MODEL_DESCRIPTION POSITION(90:109) CHAR NULLIF (OTHER_MODEL_DESCRIPTION=BLANKS),
OTHER_MODEL_YEAR POSITION(110:111) CHAR NULLIF (OTHER_MODEL_YEAR=BLANKS)
)
data.txt file:
000000831KHAN
000000900UHFA WANTS NEW WARRANTY ID 000001017OHAL
000001110KHAP
000001812NHDE231291COST OF SERVICE INSPECTIONS TOO HIGH MAXIMA 92 MK
000002015TPFA910115CUST UPSET WITH AIRPORT DLR. $200 FOR PLUGS,OIL,FILTER CHANGE. FW
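For reference, SQL*Loader's POSITION(m:n) addresses a 1-based, inclusive character range, and TRAILING NULLCOLS turns fields that start past the end of a short record into NULLs. Here is a hedged Python sketch of how the first (short) record is sliced, using only the first few fields of the control file:

```python
# First few fields of the control file, as (name, start, end) with
# SQL*Loader's 1-based inclusive positions.
FIELDS = [
    ("CUST_NBR_MAIN", 1, 9),
    ("CONTACT_TYPE", 10, 11),
    ("INQUIRY_TYPE", 12, 13),
    ("INQUIRY_MODEL", 14, 20),
]

def parse_record(line):
    """Slice a fixed-width record; blank or missing fields become None."""
    row = {}
    for name, start, end in FIELDS:
        value = line[start - 1:end].strip()   # 1-based -> Python slice
        row[name] = value if value else None  # mimic NULLIF / TRAILING NULLCOLS
    return row

print(parse_record("000000831KHAN"))
# CUST_NBR_MAIN='000000831', CONTACT_TYPE='KH', INQUIRY_TYPE='AN',
# INQUIRY_MODEL=None
```

So short records themselves are not the problem with this control file; if records are rejected, the cause is usually elsewhere (e.g. target column lengths or a syntax issue), as the answer below demonstrates by loading all five rows.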
Thanks,
Hi,
It would have been better if you had given the table structure. I checked your script and it was fine:
11:39:01 pavan_Real>create table test1(
11:39:02 2 CUST_NBR_MAIN varchar2(50),
11:39:02 3 CONTACT_TYPE varchar2(50),
11:39:02 4 INQUIRY_TYPE varchar2(50),
11:39:02 5 INQUIRY_MODEL varchar2(50),
11:39:02 6 INQUIRY_COMMENTS varchar2(50),
11:39:02 7 OTHER_COLOUR varchar2(50),
11:39:02 8 OTHER_MAKE varchar2(50),
11:39:02 9 OTHER_MODEL_DESCRIPTION varchar2(50),
11:39:02 10 OTHER_MODEL_YEAR varchar2(50)
11:39:02 11 );
Table created.
11:39:13 pavan_Real>select * from test1;
no rows selected
C:\Documents and Settings\ivy3905>sqlldr ara/ara@pavan_real
control = C:\control.ctl
SQL*Loader: Release 9.2.0.1.0 - Production on Sat Sep 12 11:41:27 2009
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Commit point reached - logical record count 5
11:42:20 pavan_Real>select count(*) from test1;
COUNT(*)
5 control.ctl
LOAD DATA
INFILE 'C:\data.txt'
BADFILE 'c:\data.BAD'
DISCARDFILE 'c:\data.DSC' DISCARDMAX 50000
INTO TABLE test1
TRAILING NULLCOLS
(
CUST_NBR_MAIN POSITION(1:9) CHAR NULLIF (CUST_NBR_MAIN=BLANKS),
CONTACT_TYPE POSITION(10:11) CHAR NULLIF (CONTACT_TYPE=BLANKS),
INQUIRY_TYPE POSITION(12:13) CHAR NULLIF (INQUIRY_TYPE=BLANKS),
INQUIRY_MODEL POSITION(14:20) CHAR NULLIF (INQUIRY_MODEL=BLANKS),
INQUIRY_COMMENTS POSITION(21:60) CHAR NULLIF (INQUIRY_COMMENTS=BLANKS),
OTHER_COLOUR POSITION(61:75) CHAR NULLIF (OTHER_COLOUR=BLANKS),
OTHER_MAKE POSITION(76:89) CHAR NULLIF (OTHER_MAKE=BLANKS),
OTHER_MODEL_DESCRIPTION POSITION(90:109) CHAR NULLIF (OTHER_MODEL_DESCRIPTION=BLANKS),
OTHER_MODEL_YEAR POSITION(110:111) CHAR NULLIF (OTHER_MODEL_YEAR=BLANKS)
)
data.txt
000000831KHAN
000000900UHFA WANTS NEW WARRANTY ID 000001017OHAL
000001110KHAP
000001812NHDE231291COST OF SERVICE INSPECTIONS TOO HIGH MAXIMA 92 MK
000002015TPFA910115CUST UPSET WITH AIRPORT DLR. $200 FOR PLUGS,OIL,FILTER CHANGE. FW
CUST_NBR_MAIN CONTACT_TYPE INQUIRY_TYPE INQUIRY_MODEL INQUIRY_COMMENTS OTHER_COLOUR OTHER_MAKE OTHER_MODEL_DESCRIPTION OTHER_MODEL_YEAR
000000831 KH AN NULL NULL NULL NULL NULL NULL
000000900 UH FA WANTS NEW WARRANTY ID 000001017OHAL NULL NULL NULL NULL
000001110 KH AP NULL NULL NULL NULL NULL NULL
000001812 NH DE 231291C OST OF SERVICE INSPECTIONS TOO HIGH MAXI MA 92 MK NULL NULL NULL
000002015 TP FA 910115C UST UPSET WITH AIRPORT DLR. $200 FOR PLU GS,OIL,FILTER C HANGE. FW NULL NULL
- Pavan Kumar N
Edited by: Pavan Kumar on Sep 12, 2009 11:46 AM -
Flat file data load failed with the below error
hi guys,
When scheduling the load (InfoPackage) from the flat file, I got the error below:
Error 'The argument 'INR' cannot be interpreted as a number ' on assignment field /BI
Thanks
Hi,
It seems that the flat file and the DataSource are not in sync. Check your flat file format and prepare it in sync with your DataSource.
I guess you have an amount field followed by a currency in your flat file, but in your DataSource the currency field is not next to the amount key figure, or vice versa. It could also be that the key figure you defined is not of type Amount, so it does not expect a currency field.
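If that guess is right, the failure is just a numeric conversion landing on a currency key. A hypothetical Python sketch of the situation (the real flat-file layout is unknown; the values are made up for illustration):

```python
def to_amount(field):
    """Convert a flat-file field to a number, mimicking the BW error text."""
    try:
        return float(field)
    except ValueError:
        raise ValueError(
            "The argument '%s' cannot be interpreted as a number" % field)

# Amount and currency columns in the expected order: conversion succeeds.
print(to_amount("1500.00"))

# Currency key sitting in the amount column (columns shifted or swapped):
# the conversion fails with a message like the one in the load monitor.
try:
    to_amount("INR")
except ValueError as err:
    print(err)
```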
Hope this helps
Godhuli -
Master data load issue with flexible data source
Hi All,
I have the DataSource 1CL_OLIS001 for loading configuration data. The changed records are not updating into this DataSource regularly.
I run a delta load on it every day. It loads correct data for a month, then suddenly starts loading 0 records every day without failing. If I reset the delta, it works fine for a month or two and then becomes corrupted again.
Has anyone had this kind of issue before? Please suggest how to resolve it permanently.
Thanks and Regards,
Pooja
Hi Pooja,
1) It might be a database tablespace issue in the background. Please check it with BASIS.
Marasa. -
Hi Experts,
I am a bit confused about the ISE distributed deployment model.
I have two data centers: one is the DC, and the other acts as the DR. I have a requirement to implement guest access using CWA and to get a public certificate for HTTPS, to avoid certificate errors on client devices.
How do I deploy the ISE personas for HA across these two data centers?
After reading the Cisco documentation, I understand that we can have two PANs (primary in the DC and secondary in the DR), and likewise for MnT (monitoring is set up the same way as the PAN). I can have 5 PSNs running in the secondary, i.e. in the DR ISE; however, I am confused about HA for the PSNs, since all the PSNs are in the secondary and HA would not work if it fails.
Can anybody suggest me the best deployment solution for this scenario ?
Another doubt, about the public certificate:
The ISE domain must be a registered domain name, or part of one, on the Internet; for that I need the domain name the customer uses.
Please correct me if I am wrong about the certificates:
since guests will be outside users, we cannot use a certificate from an internal CA; we need to get the certificate from a public provider and install it on both ISE servers.
Can anybody explain the procedure for obtaining a public certificate for HTTPS from a provider, and how to install it on both ISE servers?
Hi there. Let me try answering your questions:
PSN HA: The PSNs are not configured as "primary" or "secondary" inside your ISE deployment. They are just PSN nodes as far as ISE is concerned. Instead, inside your NADs (In your case WLCs) you can specify which PSN is primary, which one is secondary, etc. You can accomplish this by:
1. Defining all PSN nodes as AAA radius servers inside the WLC
2. Then under the SSID > AAA Servers Tab, you can list the AAA servers in the order that you prefer. As a result, the WLC will always use the first server listed until that server fails/gets reloaded, etc.
3. As a result, you can have one WLC or SSID prefer PSN server A (located in primary DC) while a second WLC or SSID prefer PSN server B (located in backup DC)
Last but not least, you could also place the PSNs behind a load balancer, so that traffic is distributed equally among multiple PSNs. However, the PSN nodes must be Layer 2 adjacent, which is probably not the case if they are located in two different data centers.
Certificates: Yes, you would want to get a public certificate to service the guest portal. Getting a public/well known certificate would ensure that most devices out there would trust the CA that signed your ISE certificate. For instance, VeriSign, GoDaddy, Entrust are some of the ones out there that would work just fine. On the other hand, if you use a certificate that was signed by your internal CA, then things would be fine for your internal endpoints that trust your internal CA but for any outsiders (Guests, contractors, etc) that do not trust and do not know who your internal CA is would get a certificate error when being redirected to the ISE guest portal. This in general is only a "cosmetic" issue and if the users click "continue" and add your CA as a trusted authority, the guest page would load and the session would work. However, most users out there would not feel safe to proceed and you will most likely get a lot of calls to your helpdesk :)
I hope this helps!
Thank you for rating helpful posts!