Records Rejected
Hi, I am new to BI.
I am trying to create an InfoCube with three fields: Custno (primary key), City, and Land.
When I run a DTP, the records go into the error stack, even though I give Custno a unique key every time.
Can anyone tell me why it keeps rejecting the records? There are no entries in the P tables either.
Regards,
KB.
Hi Kamala,
I think you are loading the cube through a flat-file DataSource.
I assume either the flat file contains the City and Land data in capital letters, or the City and Land InfoObjects are checked for lowercase; you need one of the two. If not, the records will go to the error stack, where you can correct them and trigger the error DTP.
This could be one reason, I guess.
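If the InfoObjects must stay uppercase-only, the flat file can be normalized before the load. A minimal Python sketch, not part of the original answer; the 0-based column positions of City and Land are assumed:

```python
import csv
import io

def uppercase_columns(csv_text, cols=(1, 2)):
    """Uppercase the given 0-based columns (here City and Land) of CSV text."""
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    for row in csv.reader(io.StringIO(csv_text)):
        writer.writerow([v.upper() if i in cols else v for i, v in enumerate(row)])
    return out.getvalue()
```

Running the file through a step like this before the DTP removes the case mismatch as a variable.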
Regards
vamsi
Similar Messages
-
SQL*Loader- Records Rejected - Error on table ORA-01722: invalid number
I am getting the following errors.
Please tell me where I am going wrong.
Attached are the log file and snippets of the data file, along with the control file.
Also, please tell me how I can upload all 4900 records in one go.
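On the 4900-records question: SQL*Loader aborts once the ERRORS limit is hit (50 by default, as "Errors allowed: 50" in the log below shows). Raising the limit with an OPTIONS clause lets the whole file be attempted in one run; a sketch against the poster's control file (the limit value 5000 is arbitrary):

```
OPTIONS (ERRORS=5000)
LOAD DATA
INFILE 'Cities.csv'
INTO TABLE city
FIELDS TERMINATED BY ','
(id integer external,
 name char(35) enclosed by "'",
 countrycode char(3) enclosed by "'",
 population integer external terminated by '\n')
```

This only stops the run from aborting early; the ORA-01722 rejections themselves still need the data fix discussed below.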
SQL*Loader: Release 11.1.0.7.0 - Production on Fri Oct 14 03:06:06 2011
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Control File: sample.ctl
Data File: Cities.csv
Bad File: Cities.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table CITY, loaded from every logical record.
Insert option in effect for this table: INSERT
Column Name Position Len Term Encl Datatype
ID FIRST * , CHARACTER
NAME NEXT 35 , ' CHARACTER
COUNTRYCODE NEXT 3 , ' CHARACTER
POPULATION NEXT * WHT CHARACTER
Record 1: Rejected - Error on table CITY, column POPULATION.
ORA-01722: invalid number
Record 2: Rejected - Error on table CITY, column POPULATION.
ORA-01722: invalid number
Record 3: Rejected - Error on table CITY, column POPULATION.
ORA-01722: invalid number
[... records 4 through 50 rejected with the identical ORA-01722 error on column POPULATION ...]
Record 51: Rejected - Error on table CITY, column POPULATION.
ORA-01722: invalid number
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table CITY:
0 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 35840 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 64
Total logical records rejected: 51
Total logical records discarded: 0
Run began on Fri Oct 14 03:06:06 2011
Run ended on Fri Oct 14 03:06:12 2011
Elapsed time was: 00:00:06.18
CPU time was: 00:00:00.03
my control file (sample.ctl):
load data infile 'Cities.csv'
into table city
fields terminated by ','
(id integer external,
name char(35) enclosed by "'",
countrycode char(3) enclosed by "'",
population integer external terminated by '\n')
my datafile (Cities.csv) (it contains 4900 records, but I am showing here just 4 records for ease)
3830,'Virginia Beach','USA',425257
3831,'Atlanta','USA',416474
3832,'Sacramento','USA',407018
3833,'Oakland','USA',399484
Thanks in advance!!
Note that when I change your datafile a little, as follows:
1,'Kabul','AFG',1780000
2,'Qandahar','AFG','237500'
3,'Herat','AFG','186800'
I got the same error (the last 2 rows rejected with the same invalid-number error):
mhouri > select * from cities;
ID NAME COU POPULATION
1 Kabul AFG 1780000
SQL*Loader: Release 10.2.0.3.0 - Production on Fri Oct 14 10:38:06 2011
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: cities.ctl
Data File: cities.dat
Bad File: cities.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table CITIES, loaded from every logical record.
Insert option in effect for this table: INSERT
Column Name Position Len Term Encl Datatype
ID FIRST * , CHARACTER
NAME NEXT 35 , ' CHARACTER
COUNTRYCODE NEXT 3 , ' CHARACTER
POPULATION NEXT * WHT CHARACTER
Record 4: Rejected - Error on table CITIES, column ID.
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 2: Rejected - Error on table CITIES, column POPULATION.
ORA-01722: invalid number
Record 3: Rejected - Error on table CITIES, column POPULATION.
ORA-01722: invalid number
Table CITIES:
1 Row successfully loaded.
3 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 35840 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 4
Total logical records rejected: 3
Total logical records discarded: 0
Run began on Fri Oct 14 10:38:06 2011
Run ended on Fri Oct 14 10:38:06 2011
Elapsed time was: 00:00:00.23
CPU time was: 00:00:00.09
The population value within the data file should be a number, not a quoted string.
Best regards
Mohamed Houri -
UJK_EXECUTION_EXCEPTION:Data records rejected when writing to the cube
Hello,
I ran into an issue while testing a script using UJKT.
I can plan the same intersection/record from the BPC template, but it doesn't work from the script/UJKT.
I checked work statuses, scripts, and rules, and found nothing that locks the intersection.
Anything else I should check?
Is there a way to get a more detailed log to see why the data records are rejected?
Appreciate your help. Thanks.
P.S. I looked at many SCN posts on this error, but nothing really helped.
UJKT Log:
#dim_memberset=9
REC :%VALUE% * 110
CALCULATION BEGIN:
QUERY PROCESSING DATA
QUERY TIME : 571.13 ms. 1 RECORDS QUERIED OUT.
QUERY REFERENCE DATA
CALCULATION TIME IN TOTAL :197.47 ms.
1 RECORDS ARE GENERATED.
CALCULATION END.
ENDWHEN ACCUMULATION: 1 RECORDS ARE GENERATED.
UJK_EXECUTION_EXCEPTION:Data records rejected when writing to the cube
Thanks Vadim. I debugged UJKT but didn't find anything helpful yet.
Will keep digging.
Here is the script.
*XDIM_MEMBERSET CAP_VIEW = 27
*XDIM_MEMBERSET PPS_TIME = 2014.03
*XDIM_MEMBERSET PPS_ITEM = 52540046EAC51ED3B2E49C10DBDB1565
*XDIM_MEMBERSET CAP_GRP = ZHM_IT00050001
*XDIM_MEMBERSET PPS_ENTITY = US
*XDIM_MEMBERSET CAP_CATG = ZHM_IT0005
*XDIM_MEMBERSET PPS_ACCOUNT = COST
*XDIM_MEMBERSET RPTCURRENCY = USD
*DESTINATION_APP=PPS_PLANNING
*SKIP_DIM = CAP_CATG,CAP_GRP,CAP_VIEW
*ADD_DIM PPS_CATG = ZHM_IT0001
*ADD_DIM PPS_GROUP = ZHM_IT00010001
*ADD_DIM PPS_VIEW = ZHM_IT0001000127
*WHEN PPS_ACCOUNT
*IS COST
*REC(EXPRESSION=%VALUE%*10)
*ENDWHEN
*COMMIT
LGX:
LOG:
FILE:\ROOT\WEBFOLDERS\HM_PLAM \ADMINAPP\PPS_CAPPLAN\TEST.LGF
USER:SMULLAPUDI
APPSET:HM_PLAM
APPLICATION:PPS_CAPPLAN
[INFO] GET_DIM_LIST(): I_APPL_ID="PPS_CAPPLAN", #dimensions=9
CAP_CATG,CAP_GRP,CAP_VIEW,MEASURES,PPS_ACCOUNT,PPS_ENTITY,PPS_ITEM,PPS_TIME,RPTCURRENCY
#dim_memberset=8
CAP_VIEW:27,1 in total.
PPS_TIME:2014.03,1 in total.
PPS_ITEM:52540046EAC51ED3B2E49C10DBDB1565,1 in total.
CAP_GRP:ZHM_IT00050001,1 in total.
PPS_ENTITY:US,1 in total.
CAP_CATG:ZHM_IT0005,1 in total.
PPS_ACCOUNT:COST,1 in total.
RPTCURRENCY:USD,1 in total.
REC :%VALUE%*10
CALCULATION BEGIN:
QUERY PROCESSING DATA
QUERY TIME : 1038.58 ms. 1 RECORDS QUERIED OUT.
QUERY REFERENCE DATA
CALCULATION TIME IN TOTAL :173.04 ms.
1 RECORDS ARE GENERATED.
CALCULATION END.
ENDWHEN ACCUMULATION: 1 RECORDS ARE GENERATED.
UJK_EXECUTION_EXCEPTION:Data records rejected when writing to the cube
Thanks,
Vasu -
How to avoid record rejection due to conversion failed for dirty data
There is a source delimited flat file created with codepage CP936, simplified
Chinese. Data Services XI on HP Unix has to extract this file to a UTF-8 db2
table.
However, because the flat file contains many dirty or corrupted characters in the Chinese word columns, some records fail conversion and are rejected by DS before loading to DB2. Typically, the corrupted character makes DS think there are fewer delimiters on the record than there should be.
As a less-than-perfect workaround I tell DS the input is a UTF-8 codepage file instead of a CP936 codepage file; DS then no longer complains about missing delimiters, but the extracted Chinese columns are not readable.
However, the problem resurfaces when DS encounters this gibberish data value in one column of a record:
u20AC\ \u2021 u20ACxu2019 u20AC
The record is rejected entirely, and DS generates no error and no warning to explain the rejection. Even when I use a double-quote text delimiter to enclose this data value, DS is still unable to extract it and rejects the whole record without warning.
Is there a way to use the Validation transform (or other transforms) to catch all the conversion errors beforehand and assign a null value to dirty data values, rather than letting the records be rejected?
Or is there a way to set the options to relax the conversion errors so that all data records are successfully extracted? Currently, in the Error handling section of the flat-file format object, all error handling and logging options are set to Yes.
Edited by: Chow Ming Darren Cheeng on Aug 7, 2009 10:31 AM
Thank you for your reply. Currently I am using this ifthenelse function to check whether the data value contains the escape character, setting the whole value to null to avoid the rejection:
ifthenelse(index(column_value, '
', 1) IS NOT NULL, NULL, column_value)
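The same guard can be mirrored outside DS for offline testing. A minimal Python sketch; the actual escape character was lost from the post above, so a backslash and ESC are assumed here as stand-ins:

```python
def null_if_dirty(value, bad_chars="\\\x1b"):
    """Mirror the ifthenelse guard: return None when the value contains an
    assumed escape character (backslash or ESC here), else return it unchanged."""
    return None if any(c in value for c in bad_chars) else value
```

Applied to a column before loading, this nulls out dirty values instead of losing the whole record.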
The source file team says that whenever they have nonprintable characters inside the data value, the character will be escaped with
I agree with what you said: if the record is broken up too much, DS has no choice but to skip it. -
Record rejected using SQL*Loader
I'm not too sure why I'm getting the following error message:
Record 1: Rejected - Error on table LOAD_DATA, column CLIENT_NUMBER.
ORA-01401: inserted value too large for column
Here is my info:
Table def:
SQL> describe load_data
Name Null? Type
CLIENT_NUMBER NOT NULL VARCHAR2(4)
PROD_SER VARCHAR2(5)
FEE_TYPE VARCHAR2(5)
QUANTITY NUMBER(13,5)
ALLOC_CODE VARCHAR2(11)
PROCESS_DATE DATE
My control file def:
LOAD DATA
INFILE "DATA.DAT"
APPEND
INTO TABLE <schema>.LOAD_DATA
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(CLIENT_NUMBER CHAR,
PROD_SER CHAR,
FEE_TYPE CHAR,
QUANTITY "TO_NUMBER(:QUANTITY,'99999999.99999')",
ALLOC_CODE CHAR,
PROCESS_DATE "TO_DATE(:PROCESS_DATE, 'MMDDYYYY')")
Sample Data:
A348,04201,PR200,88,,01182007
A348,04201,PR200,9,,01192007
A348,04201,PR200,93,,01192007
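ORA-01401 means a field exceeded its column width, so one quick check is to scan the raw CSV against the declared VARCHAR2 sizes before loading. A minimal Python sketch written for this answer; the widths are copied from the table definition above:

```python
import csv
import io

# Column order follows the control file; widths come from DESCRIBE load_data.
COLS = ["CLIENT_NUMBER", "PROD_SER", "FEE_TYPE", "QUANTITY", "ALLOC_CODE", "PROCESS_DATE"]
WIDTHS = {"CLIENT_NUMBER": 4, "PROD_SER": 5, "FEE_TYPE": 5, "ALLOC_CODE": 11}

def oversize_fields(csv_text):
    """Return (line_no, column, value) for raw fields wider than their VARCHAR2 limit."""
    hits = []
    for n, row in enumerate(csv.reader(io.StringIO(csv_text)), start=1):
        for name, value in zip(COLS, row):
            if name in WIDTHS and len(value) > WIDTHS[name]:
                hits.append((n, name, value))
    return hits
```

A hit on CLIENT_NUMBER with a 5-character value (for instance an invisible BOM or stray byte at the start of the file) would explain the rejection even though the visible data fits.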
Error Message:
Table LOAD_DATA, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
CLIENT_NUMBER FIRST * , CHARACTER
PROD_SER NEXT * , CHARACTER
FEE_TYPE NEXT * , CHARACTER
QUANTITY NEXT * , CHARACTER
SQL string for column : "TO_NUMBER(:QUANTITY,'99999999.99999')"
ALLOC_CODE NEXT * , CHARACTER
PROCESS_DATE NEXT * , CHARACTER
SQL string for column : "TO_DATE(:PROCESS_DATE, 'MMDDYYYY')"
Record 1: Rejected - Error on table LOAD_DATA, column CLIENT_NUMBER.
ORA-01401: inserted value too large for column
Your exact data and control file worked for me...
Table LOAD_DATA:
3 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
SQL> select * from load_data;
CLIE PROD_ FEE_T QUANTITY ALLOC_CODE PROCESS_DAT
A348 04201 PR200 88 18-JAN-2007
A348 04201 PR200 9 19-JAN-2007
A348 04201 PR200 93 19-JAN-2007 -
AP Invoices Records Rejected but Error Table is blank
Hi,
I am uploading total 300 AP Invoices using following Interfaces:
ap_invoices_interface
ap_invoice_lines_interface
but 22 of the invoices are being rejected.
The problem is that the ap_interface_rejections table is blank.
How can I trace the errors?
I got the error. It's a bug; the following is the exact reference:
Bug 8739726: PAYABLES OPEN INTERFACE IMPORT 'REJECTS' THE RECORDS, BUT DOESN'T DISPLAY ERRORS -
Data records rejected while using Data Manager Packages
Hello all ,
We have configured the dimensions and applications in SAP BPC 7.5 NW. However, while loading data into the application using a data manager package, all the records are rejected. The following is the error:
Line 3 :Dimension: P_CC member: SALESTEAM1 is invalid or is a calculated member
Line 3 :Dimension: PRODUCT member: PRODUCTA is invalid or is a calculated member
However, these members are present in the dimensions and have been processed as well. Any help is appreciated.
Thanks
Vijay
The message seems to be saying you have a dimension formula defined for those two members. Please check your dimension members' master data and make sure there is nothing defined in a property column called "Formula" for those two members. If something is populated, delete the cell contents, then save and process your dimension, and try your data manager load again.
Best regards,
[Jeffrey Holdeman|http://wiki.sdn.sap.com/wiki/display/profile/Jeffrey+Holdeman]
SAP Labs, LLC
BusinessObjects Division
Americas Applications Regional Implementation Group (RIG) -
Records rejected while loading by SQL *Loader
Hi,
I am loading CSV data into an Oracle table T1. This table has a column C1 with an FK constraint against C2 in a parent table T2. I added a value to C2 that is present in T1, but SQL*Loader is rejecting all rows, including the matching records.
Is there an option to let SQL*Loader load the matching records and reject the rest?
Thanks in advance.
Rajeev.
If I understand your requirements right, you may want to consider Loading Records Based on a Condition.
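SQL*Loader itself cannot consult the parent table while loading, so a common pattern is to load everything into a constraint-free staging table and copy across only the rows with a valid parent. A sketch with SQLite standing in for Oracle; the table and column names follow the post:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE t2 (c2 INTEGER PRIMARY KEY);
    CREATE TABLE t1 (c1 INTEGER REFERENCES t2(c2), payload TEXT);
    CREATE TABLE t1_stage (c1 INTEGER, payload TEXT);
""")
conn.executemany("INSERT INTO t2 VALUES (?)", [(1,), (2,)])
# Raw load (the part SQL*Loader would do, into the staging table):
conn.executemany("INSERT INTO t1_stage VALUES (?, ?)",
                 [(1, "ok"), (2, "ok"), (99, "orphan")])
# Copy only rows whose C1 has a matching parent; orphans stay behind for review.
conn.execute("""
    INSERT INTO t1
    SELECT c1, payload FROM t1_stage
    WHERE c1 IN (SELECT c2 FROM t2)
""")
loaded = conn.execute("SELECT COUNT(*) FROM t1").fetchone()[0]
```

Here `loaded` comes out as 2, and the orphan row remains in t1_stage instead of being silently dropped.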
-
All data load records rejected by EIS
I have a very strange behavior with EIS:
We used to successfully build an Essbase cube (V6.5.3) from a SQL 2000 database using EIS (V6.5.4).
A few weeks ago we doubled the size of a dimension (from about 13000 members to 27000) by adding a new branch exclusively based on shared members (no increase in the data).
Since then, most of the time (but not all the time!) the data load rejects absolutely all the records.
I can certify that the records are OK. When we reduce this dimension to smaller parts, each individual section can be built and loaded successfully. But when loading everything, EIS rejects all the data records.
We were able to do the full load only once or twice, just after restarting EIS...
Has anybody already faced such a problem? Are you aware of a limit on the number of members EIS can handle in one dimension?
Thanks for any tips or workarounds; this issue is becoming very critical for us.
Best regards,
Stephane
I faced a similar problem with AIS 9.3.0.1.
Check the .rul file for the data load in Essbase; EIS automatically creates one .rul file for data loading.
11g Cube not showing any data with no Rejected records
Hi David ,
Strangely, one of my 11g cubes is not showing data as of today: the load reports all records rejected. However, when I look at the rejected-records table, I don't find any records, so I'm not sure what is happening. When I take the queries AWM issues during the cube load (from CUBE_BUILD_LOG) and run them against the database in the AWM schema, the records return perfectly fine. I wonder how the same query can fire during the cube load yet load no data? My cube build script has only LOAD and AGGREGATE.
After maintaining my dimensions, the data looks fine, but no data is populated after cube maintenance. MVs are switched off across all dimensions and cubes.
I navigated to CUBE_OPERATION_LOG but am not able to make sense of its content.
Any advice?
Thanks and Regards,
DxP
Hi David,
To be very frank, today is a very bad day... Please see my observations below:
I executed the queries below to make sure that no key value present in the fact is missing from a dimension. All the queries below return no rows.
select distinct owner_postn_wid from w_synm_rx_t_f
minus
select distinct row_wid from postn_dh
select distinct payer_type_Wid from w_synm_rx_t_f
minus
select distinct row_wid from wc_ins_plan_dh
select distinct market_wid from w_synm_rx_t_f
minus
select distinct row_wid from w_product_dh
select distinct period_day_wid from w_synm_rx_t_f
minus
select distinct row_wid from w_daytime_D
select distinct contact_wid from w_synm_rx_t_f
intersect
select distinct row_wid from w_person_d
select distinct X_TERR_TYPE_WID from w_synm_rx_t_f
minus
select distinct row_wid from W_LOV_D
============================
Below returns count of 0 rows : ensure no NULL present
select count(1) from w_synm_rx_t_f where contact_wid is null;
select count(1) from w_synm_rx_t_f where owner_postn_wid is null;
select count(1) from w_synm_rx_t_f where payer_type_Wid is null;
select count(1) from w_synm_rx_t_f where period_day_wid is null;
select count(1) from w_synm_rx_t_f where X_TERR_TYPE_WID is null;
select count(1) from w_synm_rx_t_f where market_wid is null;
+++++++++++++++++++++++++++++++++
Cube Build Log has below entry:
796 0 STARTED CLEAR VALUES MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.08.267000000 PM +05:30 JAVA 1 C 47141 67 0 1
796 0 COMPLETED CLEAR VALUES MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.08.267000000 PM +05:30 JAVA 1 C 47141 67 0 2
796 0 STARTED LOAD MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.08.283000000 PM +05:30 JAVA 1 C 47142 68 0 1
796 0 SQL LOAD MKT_SLS_CUBE CUBE "<SQL>
<![CDATA[
SELECT /*+ bypass_recursive_check cursor_sharing_exact no_expand no_rewrite */
T16_ROW_WID ALIAS_127,
T13_ROW_WID ALIAS_128,
T10_ROW_WID ALIAS_129,
T7_ROW_WID ALIAS_130,
T4_ROW_WID ALIAS_131,
T1_ROW_WID ALIAS_132,
SUM(T20_MKT_TRX) ALIAS_133,
SUM(T20_MKT_NRX) ALIAS_134
FROM
(SELECT /*+ no_rewrite */
T1."CONTACT_WID" T20_CONTACT_WID,
T1."MARKET_WID" T20_MARKET_WID,
T1."OWNER_POSTN_WID" T20_OWNER_POSTN_WID,
T1."PAYER_TYPE_WID" T20_PAYER_TYPE_WID,
T1."PERIOD_DAY_WID" T20_PERIOD_DAY_WID,
T1."MKT_NRX" T20_MKT_NRX,
T1."MKT_TRX" T20_MKT_TRX,
T1."X_TERR_TYPE_WID" T20_X_TERR_TYPE_WID
FROM
NN_OLAP_POC."W_SYNM_RX_T_F" T1 )
T20,
(SELECT /*+ no_rewrite */
T1."ROW_WID" T16_ROW_WID
FROM
NN_OLAP_POC."W_DAYTIME_D" T1 )
T16,
(SELECT /*+ no_rewrite */
T1."ROW_WID" T13_ROW_WID
FROM
NN_OLAP_POC."W_PERSON_D" T1 )
T13,
(SELECT /*+ no_rewrite */
T1."ROW_WID" T10_ROW_WID
FROM
NN_OLAP_POC."WC_INS_PLAN_DH" T1 )
T10,
(SELECT /*+ no_rewrite */
T1."ROW_WID" T7_ROW_WID
FROM
NN_OLAP_POC."W_LOV_D" T1 )
T7,
(SELECT /*+ no_rewrite */
T1."ROW_WID" T4_ROW_WID
FROM
NN_OLAP_POC."POSTN_DH" T1 )
T4,
(SELECT /*+ no_rewrite */
T1."ROW_WID" T1_ROW_WID
FROM
NN_OLAP_POC."W_PRODUCT_DH" T1 )
T1
WHERE
((T20_PERIOD_DAY_WID = T16_ROW_WID)
AND (T16_ROW_WID = 20100101)
AND (T20_CONTACT_WID = T13_ROW_WID)
AND (T20_PAYER_TYPE_WID = T10_ROW_WID)
AND (T7_ROW_WID = T20_X_TERR_TYPE_WID)
AND (T20_OWNER_POSTN_WID = T4_ROW_WID)
AND (T20_MARKET_WID = T1_ROW_WID)
AND ((T20_PERIOD_DAY_WID) IN ((20100107.000000) , (20100106.000000) , (20100128.000000) , (20100124.000000) , (20100121.000000) , (20100118.000000) , (20100115.000000) , (20100109.000000) , (20100125.000000) , (20100114.000000) , (20100111.000000) , (20100110.000000) , (20100104.000000) , (20100101.000000) , (20100129.000000) , (20100123.000000) , (20100117.000000) , (20100113.000000) , (20100108.000000) , (20100131.000000) , (20100120.000000) , (20100116.000000) , (20100119.000000) , (20100105.000000) , (20100102.000000) ,
(20100130.000000) , (20100127.000000) , (20100122.000000) , (20100112.000000) , (20100103.000000) , (20100126.000000) ) ) )
GROUP BY
(T1_ROW_WID, T4_ROW_WID, T7_ROW_WID, T10_ROW_WID, T13_ROW_WID, T16_ROW_WID)
ORDER BY
T1_ROW_WID ASC NULLS LAST ,
T4_ROW_WID ASC NULLS LAST ,
T7_ROW_WID ASC NULLS LAST ,
T10_ROW_WID ASC NULLS LAST ,
T13_ROW_WID ASC NULLS LAST ,
T16_ROW_WID ASC NULLS LAST ]]>/>
</SQL>" NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.08.627000000 PM +05:30 JAVA 1 MAP1 C 47142 68 0 2
796 0 COMPLETED LOAD MKT_SLS_CUBE CUBE "<CubeLoad
LOADED="0"
REJECTED="4148617"/>" NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.40.486000000 PM +05:30 JAVA 1 C 47142 68 0 3
796 0 STARTED UPDATE MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.40.501000000 PM +05:30 JAVA 1 C 47143 69 0 1
796 0 COMPLETED UPDATE MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.40.548000000 PM +05:30 JAVA 1 C 47143 69 0 2
+++++++++++++++++
You can observe a clear rejection of 4 million rows... I ran the above query and it returns my data successfully.
Looking at the CUBE_REJECTED records, I took a sample record and put it into the above query; it returns the data fine with my measures and dimension WIDs (please see the filters on ROW_WID below):
=========================
SELECT /*+ bypass_recursive_check cursor_sharing_exact no_expand no_rewrite */
T16_ROW_WID ALIAS_127,
T13_ROW_WID ALIAS_128,
T10_ROW_WID ALIAS_129,
T7_ROW_WID ALIAS_130,
T4_ROW_WID ALIAS_131,
T1_ROW_WID ALIAS_132,
SUM(T20_MKT_TRX) ALIAS_133,
SUM(T20_MKT_NRX) ALIAS_134
FROM
(SELECT /*+ no_rewrite */
T1."CONTACT_WID" T20_CONTACT_WID,
T1."MARKET_WID" T20_MARKET_WID,
T1."OWNER_POSTN_WID" T20_OWNER_POSTN_WID,
T1."PAYER_TYPE_WID" T20_PAYER_TYPE_WID,
T1."PERIOD_DAY_WID" T20_PERIOD_DAY_WID,
T1."MKT_NRX" T20_MKT_NRX,
T1."MKT_TRX" T20_MKT_TRX,
T1."X_TERR_TYPE_WID" T20_X_TERR_TYPE_WID
FROM
NN_OLAP_POC."W_SYNM_RX_T_F" T1 )
T20,
(SELECT /*+ no_rewrite */
T1."ROW_WID" T16_ROW_WID
FROM
NN_OLAP_POC."W_DAYTIME_D" T1 )
T16,
(SELECT /*+ no_rewrite */
T1."ROW_WID" T13_ROW_WID
FROM
NN_OLAP_POC."W_PERSON_D" T1 )
T13,
(SELECT /*+ no_rewrite */
T1."ROW_WID" T10_ROW_WID
FROM
NN_OLAP_POC."WC_INS_PLAN_DH" T1 )
T10,
(SELECT /*+ no_rewrite */
T1."ROW_WID" T7_ROW_WID
FROM
NN_OLAP_POC."W_LOV_D" T1 )
T7,
(SELECT /*+ no_rewrite */
T1."ROW_WID" T4_ROW_WID
FROM
NN_OLAP_POC."POSTN_DH" T1 )
T4,
(SELECT /*+ no_rewrite */
T1."ROW_WID" T1_ROW_WID
FROM
NN_OLAP_POC."W_PRODUCT_DH" T1 )
T1
WHERE
((T20_PERIOD_DAY_WID = T16_ROW_WID)
AND (T16_ROW_WID = 20100101)
AND (T20_CONTACT_WID = T13_ROW_WID)
AND (T20_PAYER_TYPE_WID = T10_ROW_WID)
AND (T7_ROW_WID = T20_X_TERR_TYPE_WID)
AND (T20_OWNER_POSTN_WID = T4_ROW_WID)
AND (T20_MARKET_WID = T1_ROW_WID)
AND T13_ROW_WID = 255811
AND T7_ROW_WID = 122
AND T4_ROW_WID =3
AND T1_ROW_WID=230
AND T10_ROW_WID = 26
AND ((T20_PERIOD_DAY_WID) IN ((20100107.000000) , (20100106.000000) , (20100128.000000) , (20100124.000000) , (20100121.000000) , (20100118.000000) , (20100115.000000) , (20100109.000000) , (20100125.000000) , (20100114.000000) , (20100111.000000) , (20100110.000000) , (20100104.000000) , (20100101.000000) , (20100129.000000) , (20100123.000000) , (20100117.000000) , (20100113.000000) , (20100108.000000) , (20100131.000000) , (20100120.000000) , (20100116.000000) , (20100119.000000) , (20100105.000000) , (20100102.000000) ,
(20100130.000000) , (20100127.000000) , (20100122.000000) , (20100112.000000) , (20100103.000000) , (20100126.000000) ) ) )
GROUP BY
(T1_ROW_WID, T4_ROW_WID, T7_ROW_WID, T10_ROW_WID, T13_ROW_WID, T16_ROW_WID)
ORDER BY
T1_ROW_WID ASC NULLS LAST ,
T4_ROW_WID ASC NULLS LAST ,
T7_ROW_WID ASC NULLS LAST ,
T10_ROW_WID ASC NULLS LAST ,
T13_ROW_WID ASC NULLS LAST ,
T16_ROW_WID ASC NULLS LAST
=================================
THE XML export of CUBE as below:
<!DOCTYPE Metadata [
<!ENTITY % BIND_VALUES PUBLIC "OLAP BIND VALUES" "OLAP METADATA">
%BIND_VALUES;
]>
<Metadata
Version="1.2"
MinimumDatabaseVersion="11.2.0.1">
<Cube
ETViewName="MKT_SLS_CUBE_VIEW"
Name="MKT_SLS_CUBE">
<Measure>
<BaseMeasure
SQLDataType="NUMBER"
ETMeasureColumnName="TRX"
Name="TRX">
<Description
Type="LongDescription"
Language="AMERICAN"
Value="TRX">
</Description>
<Description
Type="ShortDescription"
Language="AMERICAN"
Value="TRX">
</Description>
<Description
Type="Description"
Language="AMERICAN"
Value="TRX">
</Description>
</BaseMeasure>
</Measure>
<Measure>
<BaseMeasure
SQLDataType="NUMBER"
ETMeasureColumnName="NRX"
Name="NRX">
<Description
Type="LongDescription"
Language="AMERICAN"
Value="NRX">
</Description>
<Description
Type="ShortDescription"
Language="AMERICAN"
Value="NRX">
</Description>
<Description
Type="Description"
Language="AMERICAN"
Value="NRX">
</Description>
</BaseMeasure>
</Measure>
<CubeMap
Name="MAP1"
IsSolved="False"
Query="W_SYNM_RX_T_F"
WhereClause="W_DAYTIME_D.ROW_WID = 20100101">
<MeasureMap
Name="TRX"
Measure="TRX"
Expression="W_SYNM_RX_T_F.MKT_TRX">
</MeasureMap>
<MeasureMap
Name="NRX"
Measure="NRX"
Expression="W_SYNM_RX_T_F.MKT_NRX">
</MeasureMap>
<CubeDimensionalityMap
Name="TIME"
Dimensionality="TIME"
MappedDimension="TIME.CALENDER.MONTHLY"
JoinCondition="W_SYNM_RX_T_F.PERIOD_DAY_WID = W_DAYTIME_D.ROW_WID"
Expression="W_DAYTIME_D.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="CUSTOMER"
Dimensionality="CUSTOMER"
MappedDimension="CUSTOMER.CUSTOMER_HIERARCHY.DETAIL_LEVEL"
JoinCondition="W_SYNM_RX_T_F.CONTACT_WID = W_PERSON_D.ROW_WID"
Expression="W_PERSON_D.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="INS_PLAN_DH"
Dimensionality="INS_PLAN_DH"
MappedDimension="INS_PLAN_DH.INS_PLAN.DETAIL"
JoinCondition="W_SYNM_RX_T_F.PAYER_TYPE_WID = WC_INS_PLAN_DH.ROW_WID"
Expression="WC_INS_PLAN_DH.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="LIST_OF_VALUES"
Dimensionality="LIST_OF_VALUES"
MappedDimension="LIST_OF_VALUES.LOV_HIERARCHY.DETAIL_LEVEL"
JoinCondition="W_LOV_D.ROW_WID = W_SYNM_RX_T_F.X_TERR_TYPE_WID"
Expression="W_LOV_D.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="POSITIONDH"
Dimensionality="POSITIONDH"
MappedDimension="POSITIONDH.POST_HIER.DETAIL"
JoinCondition="W_SYNM_RX_T_F.OWNER_POSTN_WID = POSTN_DH.ROW_WID"
Expression="POSTN_DH.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="PRODH"
Dimensionality="PRODH"
MappedDimension="PRODH.PRODHIER.DETAILLVL"
JoinCondition="W_SYNM_RX_T_F.MARKET_WID = W_PRODUCT_DH.ROW_WID"
Expression="W_PRODUCT_DH.ROW_WID">
</CubeDimensionalityMap>
</CubeMap>
<Organization>
<AWCubeOrganization
MVOption="NONE"
SparseType="COMPRESSED"
MeasureStorage="SHARED"
NullStorage="MV_READY"
CubeStorageType="NUMBER"
PrecomputePercent="35"
PrecomputePercentTop="0"
PartitionLevel="TIME.CALENDER.MONTHLY"
AW="&AW_NAME;">
<SparseDimension
Name="TIME"/>
<SparseDimension
Name="CUSTOMER"/>
<SparseDimension
Name="INS_PLAN_DH"/>
<SparseDimension
Name="LIST_OF_VALUES"/>
<SparseDimension
Name="POSITIONDH"/>
<SparseDimension
Name="PRODH"/>
<DefaultBuild>
<![CDATA[BUILD SPEC LOAD_AND_AGGREGATE (
LOAD NO SYNCH,
SOLVE
)]]>
</DefaultBuild>
</AWCubeOrganization>
</Organization>
<Dimensionality
Name="TIME"
ETKeyColumnName="TIME"
Dimension="TIME">
</Dimensionality>
<Dimensionality
Name="CUSTOMER"
ETKeyColumnName="CUSTOMER"
Dimension="CUSTOMER">
</Dimensionality>
<Dimensionality
Name="INS_PLAN_DH"
ETKeyColumnName="INS_PLAN_DH"
Dimension="INS_PLAN_DH">
</Dimensionality>
<Dimensionality
Name="LIST_OF_VALUES"
ETKeyColumnName="LIST_OF_VALUES"
Dimension="LIST_OF_VALUES">
</Dimensionality>
<Dimensionality
Name="POSITIONDH"
ETKeyColumnName="POSITIONDH"
Dimension="POSITIONDH">
</Dimensionality>
<Dimensionality
Name="PRODH"
ETKeyColumnName="PRODH"
Dimension="PRODH">
</Dimensionality>
<Description
Type="LongDescription"
Language="AMERICAN"
Value="MKT SLS CUBE">
</Description>
<Description
Type="ShortDescription"
Language="AMERICAN"
Value="MKT SLS CUBE">
</Description>
<Description
Type="Description"
Language="AMERICAN"
Value="MKT SLS CUBE">
</Description>
<ConsistentSolve>
<![CDATA[SOLVE (
SUM
MAINTAIN COUNT
OVER ALL
)]]>
</ConsistentSolve>
</Cube>
</Metadata>
+++++++++++++++++++++++
I dropped the AW, created a new one from the exported XML, maintained all dimensions, and then rebuilt. Still have the issue :(
Anything you can highlight from the above?
Thanks,
DxP
Also, I suspect it may be related to the error below, which occurs when I click on one of my Position_Hier views from AWM: even just selecting that view throws the error in SQL Developer after displaying the first couple of rows (while paging down):
java.sql.SQLException: ORA-33674: Data block size 63 exceeds the maximum size of 60 bytes.
at oracle.olap.awm.util.jdbc.SQLWrapper.execute(Unknown Source)
at oracle.olap.awm.querydialog.PagedQueryDialog$1.construct(Unknown Source)
at oracle.olap.awm.ui.SwingWorker$2.run(Unknown Source)
at java.lang.Thread.run(Thread.java:595)
Edited by: e_**** on Aug 17, 2011 8:41 PM -
In EIS, a few records are getting rejected during data loading
Hello,
I am loading data into Essbase through EIS, using user-defined SQL for the data load.
The log shows that 22,000 records were loaded and 564 records were rejected.
Can any expert tell me why these records are getting rejected?
Thanks,
Shivendra
If you look on your EIS server, there is a logfiles directory under hyperion\eis, with a folder for every load. These folders contain the error files; the first column of each file shows the member that was rejected. You can then check whether it is in your outline. My guess is that your custom SQL is bringing back rows that are not in the cube.
-
SQL*Loader record length problem
Hi, I'm trying to load a log file into a table for processing. I want to load each line as a single record into a table with one VARCHAR2(2000) column. However, it loads only the first character of each line and then fails after 12 records. What am I doing wrong?
Below is my control file.
LOAD DATA
INFILE 'proxyLog.20060627'
BADFILE 'badproxy.dat'
DISCARDFILE 'disproxy.dat'
TRUNCATE
INTO TABLE STAGE_PROXY_LOG
TRAILING NULLCOLS
(
error_text
)
Here's some of the data, and the error log:
[27 Jun 2006, 00:17] Processing Customers .....
Customer 2649513 [Record 202732] processed.
[27 Jun 2006, 00:32] Processing Customers .....
Customer 2649516 [Record 202733] processed.
[27 Jun 2006, 00:47] Processing Customers .....
Error creating customer profile
ExitStateMsg:
ExitStateType: 3
ExportWkErrorStatusCode: 7
ExportWkErrorStatusMessageTextToInsert: City name
F
ProxyCallException
at CustomerEnrollProcess.<init>(CustomerEnrollProcess.java:229)
at ProcessCustomers.main(ProcessCustomers.java:156)
ERROR LOG -----
Control File: cm.ctl
Data File: proxyLog.20060627
Bad File: badproxy.dat
Discard File: disproxy.dat
(Allow 0 discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 10000
Bind array: 64 rows, maximum of 65536 bytes
Continuation: none specified
Path used: Conventional
Table STAGE_PROXY_LOG, loaded from every logical record.
Insert option in effect for this table: TRUNCATE
Column Name Position Len Term Encl Datatype
ERROR_TEXT FIRST 1 CHARACTER
Record 13: Discarded - all columns null.
Discard limit reached - processing terminated on data file proxyLog.20060627.
Table STAGE_PROXY_LOG:
12 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
1 Row not loaded because all fields were null.
Space allocated for bind array: 256 bytes(64 rows)
Space allocated for memory besides bind array: 0 bytes
Total logical records skipped: 0
Total logical records read: 13
Total logical records rejected: 0
Total logical records discarded: 1
Run began on Fri Jun 30 11:20:27 2006
Run ended on Fri Jun 30 11:20:27 2006 -
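A likely cause for the STAGE_PROXY_LOG case above: when a field in the control file has no datatype or length, SQL*Loader defaults it to a one-byte CHAR, which matches the `ERROR_TEXT FIRST 1 CHARACTER` line in the log; the blank line at record 13 is then discarded because all columns are null, and the run stops because the log shows "(Allow 0 discards)". A sketch of a corrected control file (not tested against that exact file):

```
LOAD DATA
INFILE 'proxyLog.20060627'
BADFILE 'badproxy.dat'
DISCARDFILE 'disproxy.dat'
-- allow blank lines in the log file to be discarded without stopping the run
DISCARDMAX 9999
TRUNCATE
INTO TABLE STAGE_PROXY_LOG
TRAILING NULLCOLS
(
  -- give the field an explicit width so the whole line is loaded
  error_text POSITION(1) CHAR(2000)
)
```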
SQL*Loader not loading records
I have my control file like this
options (skip=1)
LOAD DATA
INFILE xxx.csv
into table xxx
TRUNCATE
FIELDS TERMINATED BY ',' optionally enclosed by '"'
RECORD_STATUS,
ITEM_NUMBER,
SQL*Loader is not loading all records, and gives output like:
Commit point reached - logical record count 14
Commit point reached - logical record count 26
Commit point reached - logical record count 84
Commit point reached - logical record count 92
and successfully loaded only 41 records out of 420.
Please help me.
Hi Phiri,
Thanks for your reply. Here is the log file.
SQL*Loader: Release 8.0.6.3.0 - Production on Wed May 12 21:26:30 2010
(c) Copyright 1999 Oracle Corporation. All rights reserved.
Control File: saba_price_break_allcur_test.ctl
Data File: saba_price_break_allcur_test.csv
Bad File: saba_price_break_allcur_test.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 1
Errors allowed: 50
Bind array: 64 rows, maximum of 65536 bytes
Continuation: none specified
Path used: Conventional
Table SABA_PRICE_BREAK_ALLCUR_TEST, loaded from every logical record.
Insert option in effect for this table: TRUNCATE
Column Name Position Len Term Encl Datatype
RECORD_STATUS FIRST * , O(") CHARACTER
ITEM_NUMBER NEXT * , O(") CHARACTER
PA1 NEXT * , O(") CHARACTER
PA2 NEXT * , O(") CHARACTER
UOM_CODE NEXT * , O(") CHARACTER
RANGE_PRICING NEXT * , O(") CHARACTER
RANGE_FROM NEXT * , O(") CHARACTER
RANGE_TO NEXT * , O(") CHARACTER
PRICING_ATTRIBUTE NEXT * , O(") CHARACTER
PRICING_METHOD NEXT * , O(") CHARACTER
PRICE_BREAK_LINE_NO NEXT * , O(") CHARACTER
TEMPLATE_NAME NEXT * , O(") CHARACTER
ITEM_DESC NEXT * , O(") CHARACTER
PRICE_USD NEXT * , O(") CHARACTER
PRICE_EUR NEXT * , O(") CHARACTER
PRICE_GBP NEXT * , O(") CHARACTER
PRICE_JPY NEXT * , O(") CHARACTER
GL_ACCOUNT NEXT * , O(") CHARACTER
LONG_DESC NEXT * , O(") CHARACTER
STATUS NEXT * , O(") CHARACTER
MESSAGE NEXT * , O(") CHARACTER
Record 12: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 13: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 27: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 28: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 29: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 30: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 31: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 32: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 33: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 34: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 35: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 36: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 37: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 38: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 39: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 40: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 41: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 42: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 43: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 44: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 45: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 46: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 47: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 48: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 49: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 50: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 51: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 52: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 53: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 54: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 55: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 56: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 57: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 58: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 59: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 60: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 61: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 62: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 63: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 64: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 65: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 66: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 67: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 68: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 69: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 70: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 73: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 74: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 87: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 91: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 92: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table SABA_PRICE_BREAK_ALLCUR_TEST:
41 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 65016 bytes(12 rows)
Space allocated for memory besides bind array: 0 bytes
Total logical records skipped: 1
Total logical records read: 92
Total logical records rejected: 51
Total logical records discarded: 0
Run began on Wed May 12 21:26:30 2010
Run ended on Wed May 12 21:27:06 2010
Elapsed time was: 00:00:36.08
CPU time was: 00:00:00.00 -
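For the LONG_DESC rejections above: when a delimited field is declared as plain CHAR with no length, SQL*Loader caps it at 255 bytes, so any longer value raises "Field in data file exceeds maximum length". A sketch of the usual fix, assuming the table column itself is wide enough, is to give the field an explicit width in the control file:

```
-- widen just the offending field; other columns can stay as they are
LONG_DESC   CHAR(4000) TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"',
```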
How can I audit the bad records that get rejected from the target table? That is, I want to store the rejected records in a separate table.
I tried checking the wb_rt_error_sources table in the runtime repository, but it holds only the error column name and its value. I want to keep the entire record.
Thanking in anticipation,
Amudha
Amudha,
There are two main methods of dealing with the error records:
1. If the error records violate key constraints, you can configure the target object (right-click the mapping, select Configure, and expand the Sources and Targets Configuration > Constraints section). Set the constraints to false so that OWB disables them during the load (which also gives better performance) and re-enables them afterwards. Records that violate the target table constraints will appear in the runtime audit error table, as well as in the exceptions table if you specify one.
2. If you can detect the error condition with a filter, a lookup table, etc., you can channel the erroneous rows through a splitter operator to a separate target table.
Regards:
Igor -
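As a further option outside OWB itself, Oracle's DML error logging (10gR2 and later) can capture entire rejected rows, including all column values, into a shadow table. A sketch with hypothetical table names:

```sql
-- One-time setup: creates ERR$_TARGET_TABLE to hold full failed rows
BEGIN
  DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'TARGET_TABLE');
END;
/

-- Load: rows that fail go to the error table instead of aborting the insert
INSERT INTO target_table
SELECT * FROM staging_table
LOG ERRORS INTO err$_target_table ('nightly load') REJECT LIMIT UNLIMITED;
```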
How to reject week 41 to 52 from 2007 data in rules file
Hi there,
I have to reject 2007W41 to 2007W52 from the data load, meaning I want data only up to 2007W40.
So far, I have edited the rules file: select the Time column > Records > Reject. Since I have to reject more than one value, I selected "And".
The field is shown as "Time" at the top:
Type: String | String/Number: 2007W41 | Condition: contains
Can anyone tell me what I am doing wrong here?
Thanks in advance...
I think you are reading it wrong.
Think of the string aaaa.
Let's state the rejection criteria in English: you want to reject if the string contains aaaa or contains bbbb.
If you say "reject if the string contains aaaa", that is good and meets the criteria.
If you then add "and the string contains bbbb", the condition is no longer true, since the string does not contain bbbb.
If my string were aaaabbbb, then the "and" would work.
If you were using "does not contain", then you would have to use "and" instead of "or": reject if the string does not contain aaaa and does not contain bbbb.
All of this should not be confused with the global and/or logic, which works across multiple columns.