Error in Loading Data with SQLLDR in Oracle 10G

Hello,
Can anyone suggest what the problem is with the control file below, which I am using to load data through SQL*Loader?
LOAD DATA
INFILE 'D:\test\temt.txt'
BADFILE 'test.bad'
DISCARDFILE 'test.dsc'
INTO TABLE "TEST"
INSERT
(SRNO INTEGER(7),
PROD_ID INTEGER(10),
PROMO_ID INTEGER(10),
CHANNEL_ID INTEGER(10),
UNIT_COST INTEGER(10),
UNIT_PRICE INTEGER(10)
I am trying to load data into the SCOTT schema as user scott.
Why do I get the errors shown in the attached log file below?
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 14:43:35 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: D:\test\temt.ctl
Data File: D:\test\temt.txt
Bad File: test.bad
Discard File: test.dsc
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table "TEST", loaded from every logical record.
Insert option in effect for this table: INSERT
Column Name                  Position   Len  Term Encl Datatype
SRNO                         FIRST      7              INTEGER
PROD_ID                      NEXT       10             INTEGER
PROMO_ID                     NEXT       10             INTEGER
CHANNEL_ID                   NEXT       10             INTEGER
UNIT_COST                    NEXT       10             INTEGER
UNIT_PRICE                   NEXT       10             INTEGER
Record 1: Rejected - Error on table "TEST".
ORA-01460: unimplemented or unreasonable conversion requested
Record 2: Rejected - Error on table "TEST".
ORA-01460: unimplemented or unreasonable conversion requested
Record 3: Rejected - Error on table "TEST".
ORA-01460: unimplemented or unreasonable conversion requested
[... records 4 through 51 rejected with the same ORA-01460 error ...]
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table "TEST":
0 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 3648 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 64
Total logical records rejected: 51
Total logical records discarded: 0
Run began on Fri Mar 20 14:43:35 2009
Run ended on Fri Mar 20 14:43:43 2009
Elapsed time was: 00:00:07.98
CPU time was: 00:00:00.28
Below are the table details and the way I am invoking sqlldr.
SQL> desc test
Name            Null?    Type
SRNO                     NUMBER(7)
PROD_ID                  NUMBER(10)
PROMO_ID                 NUMBER(10)
CHANNEL_ID               NUMBER(10)
UNIT_COST                NUMBER(10)
UNIT_PRICE               NUMBER(10)
Method for using sqlldr (from the command prompt):
D:\> sqlldr scott/tiger control=D:\test\temt.ctl
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 15:55:50 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 64
I also tried a couple of variations. Does either of the control files below make sense?
--1
LOAD DATA
INFILE 'D:\test\temt.txt'
BADFILE 'test.bad'
DISCARDFILE 'test.dsc'
INTO TABLE "TEST"
INSERT
FIELD TERMINATED BY (,)
(SRNO INTEGER(7),
PROD_ID INTEGER(10),
PROMO_ID INTEGER(10),
CHANNEL_ID INTEGER(10),
UNIT_COST INTEGER(10),
UNIT_PRICE INTEGER(10)
--2
LOAD DATA
INFILE 'D:\test\temt.txt'
BADFILE 'test.bad'
DISCARDFILE 'test.dsc'
INTO TABLE "TEST"
INSERT
FIELD TERMINATED BY (,) optionally enclosed by '"'
(SRNO INTEGER(7),
PROD_ID INTEGER(10),
PROMO_ID INTEGER(10),
CHANNEL_ID INTEGER(10),
UNIT_COST INTEGER(10),
UNIT_PRICE INTEGER(10)
For control file --1 I get the error below:
D:\>sqlldr scott/tiger
control = D:\test\temt.ctl
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 16:36:00 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
SQL*Loader-350: Syntax error at line 8.
Expecting "(", found "FIELD".
FIELD TERMINATED BY (,)
^
And for control file --2 I get this error:
D:\>sqlldr scott/tiger
control = D:\test\temt.ctl
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 16:39:22 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
SQL*Loader-350: Syntax error at line 8.
Expecting "(", found "FIELD".
FIELD TERMINATED BY (,) optionally enclosed by '"'
^
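For comparison, a control file along the following lines is the usual shape for comma-separated character data like the sample shown further below. In SQL*Loader, INTEGER(n) describes an n-byte binary field, which is what appears to trigger ORA-01460 against this text data; the portable form for character digits is INTEGER EXTERNAL (or simply omitting the datatype), and the delimiter clause is spelled FIELDS TERMINATED BY ','. This is only a sketch against the TEST table described above, not a verified fix:
LOAD DATA
INFILE 'D:\test\temt.txt'
BADFILE 'test.bad'
DISCARDFILE 'test.dsc'
INSERT
INTO TABLE "TEST"
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(SRNO       INTEGER EXTERNAL,
 PROD_ID    INTEGER EXTERNAL,
 PROMO_ID   INTEGER EXTERNAL,
 CHANNEL_ID INTEGER EXTERNAL,
 UNIT_COST  INTEGER EXTERNAL,
 UNIT_PRICE INTEGER EXTERNAL)
It would be invoked exactly as above (sqlldr scott/tiger control=D:\test\temt.ctl).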
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

below is the data i am trying to load through sqlldr
1,14,999,3,89098,111287
2,14,999,2,88645,110899
3,14,999,2,90418,117623
4,14,999,3,89272,115999
5,14,999,4,86364,117623
6,15,999,3,87522,101399
7,15,999,4,84671,99999
8,15,999,4,84671,99999
9,15,999,4,86364,101399
10,15,999,4,88735,100399
11,15,999,2,90418,101399
12,15,999,3,89272,101399
13,15,999,2,90418,101399
14,17,999,2,9694,155099
15,17,999,3,97618,155099
16,17,999,3,97618,155099
17,18,999,3,11333,169783
18,18,999,2,11479,163279
19,18,999,3,11333,163279
20,18,999,3,115141,173504
21,18,999,2,117086,165565
22,18,999,2,116856,17532
23,18,999,2,117086,169783
24,19,999,4,489,6237
25,19,999,3,4968,6302
26,20,999,2,52876,60839
27,20,999,3,52202,59999
28,20,999,3,53246,59999
29,20,999,3,54205,60209
30,20,999,3,54205,60209
31,21,999,4,76204,106773
32,21,999,4,76204,106773
33,21,999,3,7877,105299
34,21,999,4,76204,106773
35,21,999,4,77576,105668
36,21,999,3,7877,105299
37,21,999,4,76204,105299
38,21,999,2,81377,107595
39,21,999,2,81377,107595
40,21,999,4,77728,107595
41,22,999,3,2187,2656
42,22,999,2,2216,2661
43,22,999,3,2187,2656
44,22,999,2,2251,2632
45,22,999,3,2187,2656
46,22,999,4,2154,2628
47,22,999,3,2187,2656
48,22,999,3,2231,2661
49,22,999,3,2231,2624
50,22,999,2,2296,2632
51,22,999,3,2231,2661
52,22,999,4,2158,2661
53,23,999,3,1913,2408
54,23,999,3,1951,2375
55,23,999,3,1987,2383
56,23,999,3,1951,2408
57,24,999,4,3946,4943
58,24,999,3,4073,4883
59,24,999,2,4053,4934
60,24,999,2,4053,4866
61,24,999,4,3946,4943
62,24,999,3,4001,4943
63,24,999,3,4154,4892
64,24,999,4,4025,4875
65,24,999,4,4025,4875
66,24,999,2,4134,4875
67,24,999,3,4081,4943
68,24,999,2,4134,4934
69,24,999,4,4025,4943
70,24,999,4,4025,4875
71,24,999,3,4081,4943
72,25,999,3,983,12655
73,25,999,3,983,12655
74,25,999,2,9958,12655
75,25,999,3,983,12655
76,25,999,2,9958,12832
77,25,999,3,10027,12832
78,25,999,2,10157,12774
79,25,999,4,9888,12655
80,25,999,2,10157,12832
81,25,999,4,9888,12832
82,25,999,4,9888,12832
83,26,999,3,1305,17034
84,26,999,3,1305,16799
85,26,999,3,13551,16858
86,27,999,3,3992,4927
87,27,999,3,4064,4876
88,29,999,3,43761,56175
89,29,999,2,44942,55621
90,29,999,4,42335,55399
91,29,999,2,44322,55399
92,29,999,2,45208,56175
93,29,999,2,45208,56175
94,29,999,4,43182,56175
95,29,999,3,44636,56175
96,29,999,4,43182,56175
97,29,999,4,43182,56175
98,30,999,3,869,1094
99,30,999,4,857,1079
100,30,999,2,898,1079
----------------------------------------------------------------------------------------------------------------------------------------------------------------------

Similar Messages

  • Data error while loading data using SQLLDR

    Hi Gurus,
    Kindly let me know the possible reasons for getting the below error returned by SQL*Loader after loading data:
    x no. of rows not loaded due to data errors in SQLLDR
    Could it be due to issues in the control file?

    You'll find it well explained here:
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14215/part_ldr.htm#i436326
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_concepts.htm#i1004846
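    As an aside, the rejected rows are written to the bad file and the exact ORA- error for each is written to the log file named in the control file; the ERRORS command-line parameter sets how many data errors are tolerated before the load aborts. A hedged example invocation (the file names here are placeholders, not from the post above):
    sqlldr scott/tiger control=load.ctl log=load.log bad=load.bad errors=1000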

  • SQL*Loader-704 and ORA-12154 error messages when trying to load data with SQL*Loader

    I have a database with two tables that is used by Apex 4.2. One table has 800,000 records; the other has 7 million records.
    The client recently upgraded from Apex 3.2 to Apex 4.2. We exported/imported the data to the new location with no problems.
    The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
    The first time I loaded the data I did it from a command line with SQL*Loader.
    Now when I try to load the data I get this message:
    SQL*Loader-704: Internal error: ulconnect OCISERVERATTACH
    ORA-12154: TNS:could not resolve the connect identifier specified
    I've searched for postings on these error messages and they all seem to say that SQL*Loader can't find my TNSNAMES file.
    I am able to connect and load data with SQL Developer, so SQL Developer is able to find the TNSNAMES file.
    However, SQL Developer will not let me load a file this big.
    I have also tried to load the file within Apex (SQL Workshop / Utilities), but again the file is too big.
    So it seems like SQL*Loader is the only option.
    I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work.
    Not sure what else to try or where to look.
    Thanks

    Hi,
    You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in ed's link that you were already pointed at) is the following (I assume you are on Windows):
    Open a command prompt.
    set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
    This will tell Oracle to use the config files it finds there and no others.
    Then try sqlldr user/pass@db (in the same DOS window).
    See if that connects and let us know.
    Cheers,
    Harry
    http://dbaharrison.blogspot.com
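    Putting that together, the session might look something like this (a sketch; the TNS_ADMIN path, connect identifier, and control file name are placeholders for your environment):
    C:\> set TNS_ADMIN=C:\oracle\product\10.2.0\db_1\NETWORK\ADMIN
    C:\> sqlldr scott/tiger@orcl control=D:\load\big_table.ctl log=D:\load\big_table.log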

  • Oracle BPM 10gR3 Dashboards - Error in loading data

    We have an issue with Oracle BPM 10g dashboards. The chart component in the presentation displays "Error in loading data". There is no exception, and the attached method returns valid x,y coordinates. If we enable the 'user refresh' property of the chart component and click refresh in the presentation, it displays the chart. Any change in the presentation (changing a drop-down value, etc.) sets it back to "Error in loading data", and clicking refresh brings back the chart.
    Adding this.refreshGraphicImage(graphicId : "chart") in the code to refresh the chart also didn't help.
    Please let me know if anybody has faced a similar issue.
    Thanks,
    Gopi

    You have to set "fuego.workspace.execution.ajax.enabled" property to FALSE in the workspace.properties, that is located in ...\webapps\workspace\WEB-INF
    Hope it helps!
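    In other words, the entry in workspace.properties would read roughly as follows (a sketch; only the property name comes from the reply above):
    fuego.workspace.execution.ajax.enabled=false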

  • SQL*Loader: Load data with format MM/DD/YYYY HH:MI:SS PM

    Please advise how to load data in the format MM/DD/YYYY HH:MI:SS PM into an Oracle table using SQL*Loader.
    - What format should I give in the control file?
    - What column type should the table use for loading this data?
    Sample data below;
    MM/DD/YYYY HH:MI:SS PM
    12/9/2012 2:40:20 PM
    11/29/2011 11:23:12 AM
    Thanks in advance
    Avinash

    Hello Srini,
    I had tried the creation date as a DATE datatype but got this error:
    ORA-01830: date format picture ends before converting entire input string
    I am running SQL*Loader from the Oracle R12 EBS front end.
    The contents of my control file are:
    LOAD DATA
    INFILE "$_FileName"
    REPLACE
    INTO TABLE po_recp_int_lines_stg
    WHEN (01) = 'L'
    FIELDS TERMINATED BY "|"
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (INDICATOR        POSITION(1) CHAR,
     TRANSACTION_MODE "TRIM(:TRANSACTION_MODE)",
     RECEIPT_NUMBER   "TRIM(:RECEIPT_NUMBER)",
     INTERFACE_SOURCE "TRIM(:INTERFACE_SOURCE)",
     RECEIPT_DATE     "TO_CHAR(TO_DATE(:RECEIPT_DATE,'MM/DD/YYYY'),'DD-MON-YYYY')",
     QUANTITY         "TRIM(:QUANTITY)",
     PO_NUMBER        "TRIM(:PO_NUMBER)",
     PO_LINE_NUMBER   "TRIM(:PO_LINE_NUMBER)",
     CREATION_DATE    "TO_CHAR(TO_DATE(:CREATION_DATE,'MM/DD/YYYY HH:MI:SS AM'),'DD-MON-YYYY HH:MI:SS AM')",
     ERROR_MESSAGE    "TRIM(:ERROR_MESSAGE)",
     PROCESS_FLAG     CONSTANT 'N',
     CREATED_BY       "fnd_global.user_id",
     LAST_UPDATE_DATE SYSDATE,
     LAST_UPDATED_BY  "fnd_global.user_id")
    My data file goes like:
    H|CREATE|123|ABC|12/10/2012||||
    L|CREATE|123|ABC|12/10/2012|100|PO12345|1|12/9/2012  2:40:20 PM
    L|CORRECT|123|ABC|12/10/2012|150|PO12346|2|11/29/2011 11:23:12 AM
    Below is the desc of the table:
    INDICATOR           VARCHAR2(1 BYTE)
    TRANSACTION_MODE    VARCHAR2(10 BYTE)
    RECEIPT_NUMBER      NUMBER
    INTERFACE_SOURCE    VARCHAR2(20 BYTE)
    RECEIPT_DATE        DATE
    QUANTITY            NUMBER
    PO_NUMBER           VARCHAR2(15 BYTE)
    PO_LINE_NUMBER      NUMBER
    CREATION_DATE       TIMESTAMP(0)
    ERROR_MESSAGE       VARCHAR2(4000 BYTE)
    PROCESS_FLAG        VARCHAR2(5 BYTE)
    CREATED_BY          NUMBER
    LAST_UPDATE_DATE    DATE
    LAST_UPDATED_BY     NUMBER
    Thanks,
    Avinash
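    For reference, a field clause along these lines is one common way to handle that format (a sketch based on the control file above; since CREATION_DATE is a TIMESTAMP(0) column, there is no need to convert the parsed value back into a string):
    RECEIPT_DATE     "TO_DATE(:RECEIPT_DATE, 'MM/DD/YYYY')",
    CREATION_DATE    "TO_TIMESTAMP(:CREATION_DATE, 'MM/DD/YYYY HH:MI:SS AM')",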
                                                                                                                                                                                                                                                                                                                                                                                                                                                                           

  • Error in Loading Data In Planning Application

    Hi
    We are getting the following error when loading data with the Outline Load utility, specifying driver members on the Planning Data Load Administration page:
    Client Enable Status false
    com.hyperion.planning.InvalidMemberException: The member USD,BS,Current,BU Version_1 ,FY11 does not exist or you do not have access to it.
    We also tried to load the data using a .CSV file; it shows NO ERROR in the CMD, but we are unable to see the data in Essbase after refreshing the database in Planning.
    Cheers,
    Mohammed

    I could be totally wrong, but did you intentionally put a space between BU Version_1 and the comma?
    USD,BS,Current,BU Version_1 ,FY11
    I found this guideline: http://download.oracle.com/docs/cd/E12825_01/epm.111/hp_admin/ch05s02s01.html Are you sure you are following the correct steps?
    Finally, are you able to load the data through Essbase Load Rules?
    Cheers,
    Mehmet

  • Error While loading data for LIS InfoSources.

    Hi All,
    I am repeatedly receiving load failure errors while loading data using 2lis_01_s001 (This is the case with all the InfoSources).
    The error message is:
    An error occurred in the source system.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    In our quality system, we set the LIS update mode to "No update (R3)", loaded the data, and then changed the update mode back to "Asynchronous update (R3)". Now we are loading data in production. How should we proceed? Do we have to disable LIS updating whenever we load from R3 to BW?
    Regards
    Jay

    Hi Jayanthy,
    Please check the order of the fields in the two setup tables for the S001 structure. The order of fields in both tables should be the same.
    You can see the structure in transaction SE11.
    If the order is different, you need to ask the BASIS person to change it so that the order of fields in both setup tables is the same. This should fix the issue.
    Thanks,
    Raj

  • Error while loading data to ODS in BI7

    I am getting the following errors when loading data to an ODS through a DTP (Data Transfer Process) in BI7:
    Runtime error while executing rule -> see long text (RSTRAN)
    Record filtered because records with the same key contain errors (RSM2)
    and many more...
    The source system is OK; there is no problem with the connection.
    Please help.

    Hi Anup.
    Probably your records are not unique, while you have told the system they should be.
    Try to move a discriminating characteristic from the data fields to the key fields.
    Or
    try to set the "duplicate records allowed" setting in your DataSource (tab "General Settings", Delivery of Duplicate Data Recs. = Allowed).
    Udo

  • Error while loading data from a file on application server

    Hi all,
    I am facing an error while loading data from a flat file.
    Error: 'The argument '##yyyymmdd;@##' cannot be interpreted as a number' while assigning character.
    I changed the format of the date fields in the xls (tried number, general, and date (international)), but I still get the same error. I did check all the data types in the DataSource; all the fields are DATS.
    Can you please tell me what could be the problem?
    Thank you all,
    Praveen

    Hi all,
    I got through my first question, but I have one more field in my flat file that is actually a timestamp. In my flat file the data is in this format:
    10/21/2006  5:11:48 AM, which I need to change to 10/21/2006.
    One more note: some of the values in this field are NULL.
    Last Updated Date
    10/21/2006  5:11:48 AM
    10/21/2006  5:11:48 AM
    NULL
    NULL
    10/21/2006  5:11:48 AM
    NULL
    I want to display the values as 10/21/2006, and NULL as it is.
    Please let me know if there is a conversion routine in the DataSource that can solve my problem.
    Regards,
    Praveen

  • Error while loading data from DS into cube

    Hello All
    I am getting the below error while loading data from the DataSource to the cube:
    Record 1: Time conversion from 0CALDAY to 0FISCPER (fiscal year) failed with value 20070331
    What am I supposed to do?
    I need your input in this regard.
    Regards
    Rohit

    Hi
    Simply map 0CALDAY to 0FISCPER (you have already done this). You might have forgotten to map the fiscal year variant in the update rules. If it comes from the source, just map it; if it does not come from the source, set it as a constant in the update rules and give it a value.
    If your business year is April to March, make it 'V3'.
    If your business year is January to December, make it 'K4'.
    Activate your update rules, delete the old data, and upload again.
    Hope it helps
    Thanks
    Teja

  • Error while loading data from 0FI_TX_4 to ODS

    Dear Friends,
    For a week now, one of our process chains has been failing every day with the same error while loading data from the R3 DataSource 0FI_TX_4 to an ODS.
    A short dump is raised in the R3 system:
    "DBIF_RSQL_SQL_ERROR" CX_SY_OPEN_SQL_DBC
    "SAPLBWFIR" or "LBWFIRU13"
    "BWFIR_READ_BSET_CPUDT_DATA"
    ORA-01555
    The number of records is only below 10000.
    All other chains are OK.
    We have already searched SDN and Notes and applied the solutions below, but still with no luck:
    Increase the size of all the rollback segments
    Increase the number of REDO logs
    Decrease the number of records per data packet
    Any more suggestions, please?
    Thanks
    Tony

    How do you know that the number of records is below 10000? Have you checked the number of records in the PSA?
    Now I will tell you what I did for this error: I increased the package size so that all the data gets transferred in a single data package. Please make the package size as large as possible so that all the data can be processed in a single package.
    I know you have read many threads on the same topic, but what I am suggesting is what I have done, and it successfully rectified this issue.
    regards,
    rocks

  • Error while loading data to ODS

    Hello BW folks,
    I am getting an error while loading data to an ODS from an R/3 DataSource.
    The error message is "Error in PSA".
    Yesterday we successfully deleted data from the PSA for the related InfoSource/DataSource.
    Since then this error has been occurring. Previously the data upload was successful.
    In the Detail tab of the monitor we get an error message (red mark) for the request,
    and the same request is not going to R/3.
    The related job itself is not present in R/3.
    This looks very strange and new to me.
    Could anyone help me rectify this issue?
    Thanks in Advance.
    Best Regards ,
    Amol.

    Hello All,
    Thanks for reply.
    I tried to replicate the DataSource once again and activate the transfer rules in BW Production by running the program RS_TRANS* (activate all transfer structures) in SE38. I then got an error message that one InfoObject had been added to the transfer rules in BW Production.
    Interestingly, the same transfer rules do not have this InfoObject in the BW Development server. So I created a request for the transfer rules in Dev once again and transported it to BW Production,
    and this solved the problem.
    The error message in the monitor was "Error in PSA", and the request itself was not going to R/3, but the real issue was with the transfer rules. After rectifying that, we solved the data upload problem.
    Best Regards ,
    Amol Kulkarni.

  • Unable to load data with impdp

    Hi friends,
    I've encountered the following errors while loading data through the impdp utility:
    ORA-31626: job does not exist
    ORA-31633: unable to create master table "TRACE.SYS_IMPORT_FULL_05"
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT", line 863
    ORA-00955: insufficient privileges
    I think the problem is with the last line, "insufficient privileges". What is your opinion? Kindly tell me what privileges a user needs to import/export a dump file.
    Looking for your help and suggestions.
    Regards,
    Abbasi

    Does this dump file consist of only TRACE schema objects, or of other schemas' objects as well?
    There is no need to grant DBA privileges to TRACE; you can import using the SYS/SYSTEM user.
    impdp system/****@TNRDB directory=tnr_dump_dir dumpfile=tnrsms.dmp logfile=loading.log
    Thanks

  • Error while Loading data through .csv file

    Hi,
    I am getting the date error below when loading data into OLAP tables through a .csv file.
    The data stored in the .csv is 20071113121100.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TE_7007 Transformation Evaluation Error [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')]
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11132 Transformation [Exp_FILE_CHNL_TYPE] had an error evaluating output column [CREATED_ON_DT_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')].
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11019 There is an error in the port [CREATED_ON_DT_OUT]: The default value for the port is set to: ERROR(<<Expression Error>> [ERROR]: transformation error
    ... nl:ERROR(u:'transformation error')).
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11021 An error occurred moving data from the transformation Exp_FILE_CHNL_TYPE: to the transformation W_CHNL_TYPE_DS.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> CMN_1086 Exp_FILE_CHNL_TYPE: Number of errors exceeded threshold [1].
    Any help is greatly appreciated.
    Thanks,
    Poojak

    1) Wrong format. You won't get much support loading OLAP cubes in here, I think.
    2) Has your CSV file been anywhere near Excel by any chance? The conversion of 20071113121100 to 2.00711E+13 looks very much like what I see when Excel has an invalid number mask / precision specified for a cell.
    *** We are using EBS. The file was already generated by consultants through a SQL query. We are getting the error while loading through Informatica. The target table is set up as a date datatype and the source is String(19).
    Expression in Informatica is setup as below.
    IIF(ISNULL(CREATED_ON_DT), NULL, TO_DATE(CREATED_ON_DT, 'YYYYMMDDHH24MISS'))
    Thanks,
    Poojak

  • Error while loading data into BW (BW as Target) using Data Services

    Hello,
    I'm trying to extract data from SQL Server 2012 and load it into BW 7.3 using Data Services. Data Services shows that the job finished successfully. But when I go into BW, I see the below/attached error:
    Error while accessing repository Violation of PRIMARY KEY constraint 'PK__AL_BW_RE_
    Please let me know what this means and how to fix it. I'm not sure if I gave sufficient information; please let me know if you need anything else.
    Thanks
    Pradeep

    Hi Pradeep,
    Regarding your query, please refer to the SCN threads below for the same issue:
    SCN Thread:
    FIM10 to BW 73- Violation of PRIMARY KEY -table AL_BW_REQUEST
    Error in loading data from BOFC to BW using FIM 10.0
    Thanks,
    Daya
