Data Load Fails due to duplicate records from the PSA
Hi,
I have loaded the master data twice into the PSA. Then I created a DTP to load the data from the PSA to the InfoProvider. The data load is failing with the error "duplicate key/records found".
Is there any setting I can configure so that, even though there are duplicate records in the PSA, only one set of data (without duplicates) is successfully loaded into the InfoProvider?
How can I set up the process chains to do so?
Your answer to the above two questions is appreciated.
Thanks,
Hi Sesh,
There are 2 places where the DTP checks for duplicates.
In the first, it checks previous error stacks. If the records you are loading are still contained in the error stack of a previous DTP run, it will throw the error at this stage. In this case you will first have to clean up the previous error stack.
The second stage will clean up duplicates across data packages, provided the option is set in your DataSource. But note that this will not solve the problem if you have duplicates within the same data package. In that case you can do the filtering yourself in the start routine of your transformation, for example as in the sketch below.
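As a minimal sketch of that start routine filtering (the field MATNR below is only a placeholder for the actual semantic key of your master data; if the key consists of several fields, list all of them in both statements):

* Start routine sketch: keep only one record per key within the data package.
SORT SOURCE_PACKAGE BY matnr.
DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE COMPARING matnr.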
Hope this helps,
Pieter
Similar Messages
-
Master data load failed due to duplicate records.
Hello friends,
I need some help.
I am loading master data from the source system, and it is throwing an error for 56 duplicate records.
I repeated the step, but found the same error once again.
I could not find the duplicate records, as there are more than 24,000 records and only 56 of them are duplicates; the duplicates also look the same as the originals.
When I click on the error records, it shows me the procedure below:
maintain the attribute in the PSA screen.
I could not find the duplicate records; could you please let me know how I can maintain this?
Regards
Hi,
Reload the master data with the 'ignore duplicate records' checkbox ticked. Since master data has overwriting capability, the duplicate records will simply be overwritten.
cheers,
Swapna.G -
While loading master data to an InfoObject, the load failed due to duplicate records
Hi Experts,
While loading master data to the InfoObject, the load failed.
The error it shows is "24 duplicate records found. 23 recordings used in table".
Please help me solve this issue.
Thanks in Advance.
Regards,
Gopal.
In the InfoPackage settings you will find a checkbox for 'delete duplicate records'.
I think it appears beside the radio button for 'To PSA'; also tick the checkbox for 'subsequent update to data targets'.
This will remove the duplicate records (if any) from the PSA before they are processed further by the transfer and update rules.
Use this and reload the master data.
cheers,
Vishvesh -
Master Data load fails because of Duplicate Records
Hi BW Experts,
I am loading historical data for an InfoObject using flexible update. First I tried to delete the existing data, but that was not possible because it is used in InfoCubes and an ODS. Since I am reworking those cubes and the ODS anyway, I have to reload all the data again. Without deleting, I tried loading the data into the InfoObject, but it threw an error that duplicate records were found. When I tried again, it threw the error that ALEREMOTE has locked the object (lock could not be set for the object).
Please suggest what to do in this scenario.
Please consider it urgent.
Thanks in advance.
Sunil Morwal
Sunil,
First unlock the objects: go to SM12, enter the user name ALEREMOTE, choose List, then select the lock entry and delete it.
Load the data from the PSA....
Or reload; remember you have an option at the InfoPackage level, on the Processing tab: "Ignore duplicate records".
Let me know the status.
Thanks
Ram
"BW is Everywhere"
-
How to delete Duplicate records from the informatica target using mapping?
Hi, I have a scenario like this: in my mapping I have a source, which may contain unique or duplicate records. Source and target are different tables. The target in my mapping contains duplicate records. Using the mapping, I have to delete the duplicate records in the target table. The target table does not contain any surrogate key. We can use the target table as a lookup table, but it cannot be used as a source in the mapping. We cannot use post SQL.
Hi All, I have multiple flat files which I need to load into a single table. I did that using the indirect option at session level. But I need to work out how to populate a substring of the header into the Name column of the target table. I have two columns, Id and Name. Each input file has only one column, 'id', with a header like H|ABCD|Date. I need to populate the target as in the example below.
File 1 (header H|ABCD|Date): ids 1, 2, 3
File 2 (header H|EFGH|Date): ids 4, 5, 6
Target table:
Id  Name
1   ABCD
2   ABCD
3   ABCD
4   EFGH
5   EFGH
6   EFGH
Can anyone help with what the logic should be to get this data into a table in Informatica?
-
Load failed due to Overflow converting from 8.85315E+16
Hi All,
The load from one DSO to another DSO is failing with the following error message (it is the AP/AR load).
Runtime error while executing rule -> see long text
Message no. RSTRAN301
Diagnosis
An error occurred while executing a transformation rule:
The exact error message is:
Overflow converting from '8.85315e+16'
The error was triggered at the following point in the program:
Z_BFOD_A_AR_LOOKUP_ITEM_E 278
System Response
Processing the data record has been terminated.
Procedure
The following additional information is included in the higher-level node of
the monitor:
Transformation ID
Data record number of the source record
Number and name of the rule which produced the error
Thanks,
Asim
Hey,
Ask the developer who wrote the end routine to check the code where the error occurs. First check whether too much data is transferred and whether it can be reduced. Then also check whether the key figure needs to be extended (Note 460652 - Extending key figures in BW).
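If it turns out that a single oversized value is the culprit rather than the data volume, a possible variant (only a sketch, not the actual routine; the field AMOUNT, the lookup variable and the use of the generated end routine type _ty_s_tg_1 are assumptions) is to trap the conversion per record, so that only the offending record is affected instead of the whole package:

* End routine sketch: catch the overflow per record instead of terminating the package.
FIELD-SYMBOLS <result_fields> TYPE _ty_s_tg_1.
DATA lv_lookup_value TYPE p LENGTH 16 DECIMALS 2.   "value read in the lookup

LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
  TRY.
      <result_fields>-amount = lv_lookup_value.
    CATCH cx_sy_conversion_overflow cx_sy_arithmetic_overflow.
*     the value does not fit into the target key figure - clear or log the record
      CLEAR <result_fields>-amount.
  ENDTRY.
ENDLOOP.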
Regards,
Michael -
Master data infoobject can't handle duplicate records after SP10
Hi
I am trying to load master data which happens to contain duplicate records from the source system. In the DTP of the master data InfoObject, I have ticked the 'Handle Duplicate Record Keys' checkbox. After executing this DTP, the duplicate master data records were trapped in the error stack, whereas I expected the duplicate master data to simply be overwritten. I understand that this error was fixed with Note 954661 (Updating master data texts when error in data package), which is from SP9, but even after applying Support Package 10, the master data InfoObject just can't handle records with duplicate keys.
Please let me know if you manage to fix this problem.
Many thanks,
Anthony
Found a fix for this problem. I just applied OSS Note 986196 - Error during duplicate record handling of master data texts.
-
Delete overlapping/duplicate records from cube
Hi All,
Kindly let me know how to delete overlapping requests from a cube. The cube is loaded from various InfoSources, but some records get duplicated and are not wanted, so how do I delete the duplicate records from the cube?
Regards,
dola
I think what Arun said is perfectly right...
Use a DSO for consolidation of the various requests from the different InfoSources.
Then load from the DSO to the cube; it is very much possible, though it will require a little work.
The 'delete duplicate records' option is usually used for master data. With transaction data I don't think it is advisable.
Regards,
RK -
Data load failed while loading data from one DSO to another DSO..
Hi,
The data load failed on SID generation while loading data from the source DSO to the target DSO.
The following errors are occurring:
Value "External Ref # 2421-0625511EXP " (HEX 450078007400650072006E0061006C0020005200650066
Error when assigning SID: Action VAL_SID_CONVERT, InfoObject 0BBP
So I do not understand why it was successful in one DSO (the source) but failed in the other DSO (the target).
While analyzing this, I checked that 'SIDs Generation upon Activation' is ticked in the source DSO but not in the target DSO; is that the reason it failed?
Please explain..
Thanks,
Sneha
Hi,
I hope your data flow has been designed so that the 1st DSO acts as a staging layer, all transformation rules and routines are maintained between the 1st and the 2nd DSO, and 'SID generation upon activation' is maintained in the 2nd DSO. That way the data in the 1st DSO stays the same as in the source system, since you are not applying any transformation rules or routines there, which helps to avoid data load failures.
Please analyze the following:
Have you loaded the master data before the transaction data? If not, please do that first.
Go to the properties of the first DSO and check whether 'SID generation upon activation' is maintained there (I guess it may not be).
Go to the properties of the 2nd DSO and check whether 'SID generation upon activation' is maintained there (I expect it is).
This may be the reason.
Also check whether there are any special characters involved in your transaction data (even lowercase letters count); see the sketch below.
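For the special character check, a rough start routine sketch (the field REFDOC and the allowed-character list are assumptions; the authoritative list is whatever is maintained in RSKC):

* Start routine sketch: catch values that would fail SID creation because they
* contain characters outside the permitted set (lowercase letters included).
CONSTANTS lc_allowed TYPE string
  VALUE ' !"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ'.
FIELD-SYMBOLS <source_fields> TYPE _ty_s_sc_1.

LOOP AT SOURCE_PACKAGE ASSIGNING <source_fields>.
  IF NOT <source_fields>-refdoc CO lc_allowed.
*   either clean the value ...
    TRANSLATE <source_fields>-refdoc TO UPPER CASE.
*   ... or delete the record / raise a monitor message instead
  ENDIF.
ENDLOOP.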
Regards
BVR -
Deleting duplicate records from different data packets in BI data source.
Hi,
I am getting the same (duplicate) records from different data packets in a BI DataSource after the extraction completes.
I tried to store the key fields of the first data packet in an internal table, but this internal table does not carry the previous data over to the extraction of the second data packet.
Is there any other way to remove duplicate records after completion of the extraction?
Thanks in advance.
I have not worked extensively with BI routines, but I reckon there is a routine that gets executed before the data mapping part: a start routine, in which you can validate the existence of the data before it is passed from the DataSource to the cube.
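If the duplicates have to be filtered inside the extractor function module itself, one way is to keep the keys that were already delivered in a STATICS table, because STATICS (unlike a normal local internal table) survive between the per-package calls of the function module within one extraction run. A rough sketch, assuming a hypothetical extract table E_T_DATA with a key field DOCNO:

* Extractor sketch: remember the keys sent in earlier data packages.
TYPES ty_docno TYPE c LENGTH 10.
STATICS st_sent_keys TYPE HASHED TABLE OF ty_docno WITH UNIQUE KEY table_line.
FIELD-SYMBOLS <ls_data> LIKE LINE OF e_t_data.

LOOP AT e_t_data ASSIGNING <ls_data>.
  READ TABLE st_sent_keys WITH TABLE KEY table_line = <ls_data>-docno
       TRANSPORTING NO FIELDS.
  IF sy-subrc = 0.
    DELETE e_t_data.                      "already delivered in an earlier package
  ELSE.
    INSERT <ls_data>-docno INTO TABLE st_sent_keys.
  ENDIF.
ENDLOOP.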
Hope this helps,
Regards,
Murthy. -
How to handle the failed records from the table when using DB Adapter
Hi,
I am reading some records from a table using the DB Adapter inside my synchronous BPEL process. Say I am reading 100 records from the table and, after successfully reading 90 records, an error occurs on the 91st record for various possible reasons (DB down, connection interrupted, etc.). How do I handle this situation: do I have to read all the records from the beginning, or is there an option to continue from where it stopped reading?
Can anybody please help me out in this regard?
Thanks in advance
Regards,
Aejaz
We had the same requirement some time ago and had two options:
1. Ask the R/3 development team to add a deletion indicator to the table (and thus not actually delete the record). This deletion indicator could then be used like for any other standard DataSource.
This option was however refused, due to the huge data volume after a while.
2. At the end of the load we copied ZTABLE1 to ZTABLE2. Then, at the beginning of the next day's load, we compare the data of table 1 to table 2: entries available in table 2 but not in table 1 have been deleted, and we set a 'D' in the deletion indicator (see the sketch below). As we only keep the deleted entries for one day, the volume of the new table is acceptable.
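A rough ABAP sketch of option 2, assuming both tables share a single key field KEYFLD and the snapshot table has a deletion-indicator field DELIND (all table and field names here are illustrative only):

* Entries present in yesterday's copy (ZTABLE2) but missing from today's table
* (ZTABLE1) have been deleted and are flagged with 'D'.
DATA lt_deleted TYPE STANDARD TABLE OF ztable2.
FIELD-SYMBOLS <ls_del> TYPE ztable2.

SELECT * FROM ztable2 INTO TABLE lt_deleted
  WHERE keyfld NOT IN ( SELECT keyfld FROM ztable1 ).

LOOP AT lt_deleted ASSIGNING <ls_del>.
  <ls_del>-delind = 'D'.                  "deletion indicator for the delta load
ENDLOOP.

MODIFY ztable2 FROM TABLE lt_deleted.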
M. -
Incomplete Data on report (report does not show all records from the table)
Hello,
I have a problem with CR XI. I'm running the same report on the same data with a simple "select all records from the table" (no sorting, no grouping, no filters).
Sometimes the report shows all records, sometimes not; mostly not all records appear on the report. When the report is incomplete, it sometimes shows a different number of records.
I'm using CR XI runtime on Windows Server 2003
Any help appreciated
Thanks!
Sorry Alexander, I missed the last line where you clearly say it is the runtime.
A few more questions:
- Which CR SDK are you using? The Report Designer Component or the CR assemblies for .NET?
- What is the exact version of CR you are using (from help | about)
- What CR Service Pack are you on?
And a troubleshooting suggestion:
Since this works on some machines, it would be a good idea to compare all the runtime files (both CR and non-CR) being loaded on a working and on a non-working machine.
Download the modules utility from here:
https://smpdl.sap-ag.de/~sapidp/012002523100006252802008E/modules.zip
and follow the steps as described in this thread:
https://forums.sdn.sap.com/click.jspa?searchID=18424085&messageID=6186767
The download also includes instructions on how to use modules.
Ludek -
Importing and Updating Non-Duplicate Records from 2 Tables
I need some help with the code to import data from one table into another if it is not a duplicate or if a record has changed.
I have 2 tables, Members and NetNews. I want to check NetNews and import non-duplicate records from Members into NetNews, and update an email address in NetNews if it has changed in Members. I figured it could be as simple as checking Members.MemberNumber and Members.Email against the existence of NetNews.MemberNumber and NetNews.Email and, if a record does not exist in NetNews, creating it, and if the email address in Members.Email has changed, updating it in NetNews.Email.
Here is what I have from all of the suggestions received in another category last year. It is not complete, but I am stuck on the solution. Can someone please help me get this code working?
Thanks!
<cfquery datasource="#application.dsrepl#"
name="qryMember">
SELECT distinct Email,FirstName,LastName,MemberNumber
FROM members
WHERE memberstanding <= 2 AND email IS NOT NULL AND email
<> ' '
</cfquery>
<cfquery datasource="#application.ds#"
name="newsMember">
SELECT distinct MemberNumber
FROM NetNews
</cfquery>
<cfif
not(listfindnocase(valuelist(newsMember.MemberNumber),qryMember.MemberNumber)
AND isnumeric(qryMember.MemberNumber))>
insert into NetNews (Email_address, First_Name, Last_Name,
MemberNumber)
values ('#trim(qryMember.Email)#',
'#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
trim(qryMember.MemberNumber)#')-
</cfif>
</cfloop>
</cfquery>
------------------
Dan,
My DBA doesn't have the experience to help with a VIEW. Did I mention that these are 2 separate databases on different servers? This project is over a year old now and it really needs to get finished, so I thought the import would be the easiest way to go. Thanks to your help, it is almost working.
I added some additional code to check for a changed email address and update the NetNews database. It runs without error, but I don't have a way to test it right now. Can you please look at the code and see if it looks OK?
I am also still getting an error on line 10 after the routine runs, on the line that has this code: "and membernumber not in (<cfqueryparam list="yes" value="#valuelist(newsmember.membernumber)#" cfsqltype="cf_sql_integer">)", even with the cfif that Phil suggested.
<cfquery datasource="#application.ds#"
name="newsMember">
SELECT distinct MemberNumber, Email_Address
FROM NetNewsTest
</cfquery>
<cfquery datasource="#application.dsrepl#"
name="qryMember">
SELECT distinct Email,FirstName,LastName,MemberNumber
FROM members
WHERE memberstanding <= 2 AND email IS NOT NULL AND email
<> ' '
and membernumber not in (<cfqueryparam list="yes"
value="#valuelist(newsmember.membernumber)#"
cfsqltype="cf_sql_integer">)
</cfquery>
<CFIF qryMember.recordcount NEQ 0>
<cfloop query ="qryMember">
<cfquery datasource="#application.ds#"
name="newsMember">
insert into NetNewsTest (Email_address, First_Name,
Last_Name, MemberNumber)
values ('#trim(qryMember.Email)#',
'#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
trim(qryMember.MemberNumber)#')
</cfquery>
</cfloop>
</cfif>
<cfquery datasource="#application.dsrepl#"
name="qryEmail">
SELECT distinct Email
FROM members
WHERE memberstanding <= 2 AND email IS NOT NULL AND email
<> ' '
and qryMember.email NEQ newsMember.email
</cfquery>
<CFIF qryEmail.recordcount NEQ 0>
<cfloop query ="qryEmail">
<cfquery datasource="#application.ds#"
name="newsMember">
update NetNewsTest (Email_address)
values ('#trim(qryMember.Email)#')
where email_address = #qryEmail.email#
</cfquery>
</cfloop>
</cfif>
Thank you again for the help. -
How to remove duplicate records from the output?
How do I remove duplicate records from the output? I used DELETE ADJACENT DUPLICATES, but the duplicate records keep coming back. Please suggest.
Hi Shruthi,
Thanks for your answer, but the duplicate records are still coming back.
Here is my code; please check it out.
*& Report ZCRM_TROUBLE_TICKET
REPORT zcrm_trouble_ticket.
TYPES : BEGIN OF ty_qmih,
qmnum TYPE qmnum,
equnr TYPE equnr,
iloan TYPE iloan,
ausvn TYPE ausvn,
ausbs TYPE ausbs,
auztv TYPE auztv,
auztb TYPE auztb,
iwerk TYPE iwerk,
END OF ty_qmih,
BEGIN OF ty_qmel,
qmnum TYPE qmnum,
qmtxt TYPE qmtxt,
indtx TYPE indltx,
priok TYPE priok,
strmn TYPE strmn,
strur TYPE strur,
ltrmn TYPE ltrmn,
ltrur TYPE ltrur,
objnr TYPE qmobjnr,
arbpl TYPE lgwid,
vkorg TYPE vkorg,
vtweg TYPE vtweg,
spart TYPE spart,
END OF ty_qmel,
BEGIN OF ty_ihpa,
parnr TYPE i_parnr,
parvw TYPE parvw,
objnr TYPE qmobjnr,
END OF ty_ihpa,
BEGIN OF ty_crhd,
arbpl TYPE arbpl,
objid TYPE cr_objid,
END OF ty_crhd,
BEGIN OF ty_crtx,
ktext TYPE cr_ktext,
objid TYPE cr_objid,
END OF ty_crtx,
BEGIN OF ty_qmfe,
fecod TYPE fecod,
fegrp TYPE fegrp,
qmnum TYPE qmnum,
END OF ty_qmfe,
BEGIN OF ty_qmur,
urcod TYPE urcod,
urgrp TYPE urgrp,
urtxt TYPE urstx,
qmnum TYPE qmnum,
END OF ty_qmur,
BEGIN OF ty_iloa,
tplnr TYPE tplnr,
iloan TYPE iloan,
END OF ty_iloa,
BEGIN OF ty_output,
qmnum TYPE qmnum,
equnr TYPE equnr,
iloan TYPE iloan,
ausvn TYPE ausvn,
ausbs TYPE ausbs,
auztv TYPE auztv,
auztb TYPE auztb,
iwerk TYPE iwerk,
qmtxt TYPE qmtxt,
indtx TYPE indltx,
priok TYPE priok,
strmn TYPE strmn,
strur TYPE strur,
ltrmn TYPE ltrmn,
ltrur TYPE ltrur,
objnr TYPE qmobjnr,
arbpl TYPE lgwid,
vkorg TYPE vkorg,
vtweg TYPE vtweg,
spart TYPE spart,
parnr TYPE i_parnr,
parvw TYPE parvw,
arbpl TYPE arbpl,
objid TYPE cr_objid,
arbpl1 TYPE arbpl,
ktext TYPE cr_ktext,
fecod TYPE fecod,
fegrp TYPE fegrp,
urcod TYPE urcod,
urgrp TYPE urgrp,
urtxt TYPE urstx,
tplnr TYPE tplnr,
END OF ty_output.
DATA : it_qmih TYPE STANDARD TABLE OF ty_qmih,
it_qmel TYPE STANDARD TABLE OF ty_qmel,
it_ihpa TYPE STANDARD TABLE OF ty_ihpa,
it_crhd TYPE STANDARD TABLE OF ty_crhd,
it_crtx TYPE STANDARD TABLE OF ty_crtx,
it_qmfe TYPE STANDARD TABLE OF ty_qmfe,
it_qmur TYPE STANDARD TABLE OF ty_qmur,
it_iloa TYPE STANDARD TABLE OF ty_iloa,
it_output TYPE STANDARD TABLE OF ty_output,
wa_qmih TYPE ty_qmih,
wa_qmel TYPE ty_qmel,
wa_ihpa TYPE ty_ihpa,
wa_crhd TYPE ty_crhd,
wa_crtx TYPE ty_crtx,
wa_qmfe TYPE ty_qmfe,
wa_qmur TYPE ty_qmur,
wa_iloa TYPE ty_iloa,
wa_output TYPE ty_output.
INITIALIZATION.
REFRESH : it_qmih,
it_qmel,
it_ihpa,
it_crhd,
it_crtx,
it_qmfe,
it_qmur,
it_iloa,
it_output.
CLEAR: wa_qmih,
wa_qmel,
wa_ihpa,
wa_crhd,
wa_crtx,
wa_qmfe,
wa_qmur,
wa_iloa,
wa_output.
start-of-selection.
SELECT qmnum
equnr
iloan
ausvn
ausbs
auztv
auztb
iwerk
FROM qmih
INTO TABLE it_qmih.
SORT it_qmih BY qmnum .
DELETE ADJACENT DUPLICATES FROM it_qmih COMPARING qmnum equnr iloan ausvn ausbs auztv auztb iwerk.
SELECT qmnum
qmtxt
indtx
priok
strmn
strur
ltrmn
ltrur
objnr
arbpl
vkorg
vtweg
spart
FROM qmel
INTO TABLE it_qmel
FOR ALL ENTRIES IN it_qmih
WHERE qmnum = it_qmih-qmnum.
SORT it_qmel BY qmnum.
DELETE ADJACENT DUPLICATES FROM it_qmel COMPARING qmnum
qmtxt
indtx
strmn
strur
ltrmn
ltrur
objnr
arbpl
vkorg
vtweg
spart.
IF it_qmel IS NOT INITIAL.
SELECT parnr
parvw
objnr
FROM ihpa
INTO TABLE it_ihpa
FOR ALL ENTRIES IN it_qmel
WHERE objnr = it_qmel-objnr.
ENDIF.
DELETE ADJACENT DUPLICATES FROM it_ihpa COMPARING parnr
parvw
objnr.
IF it_qmel IS NOT INITIAL.
SELECT arbpl
objid
FROM crhd
INTO TABLE it_crhd
FOR ALL ENTRIES IN it_qmel
WHERE objid = it_qmel-arbpl.
ENDIF.
DELETE ADJACENT DUPLICATES FROM it_crhd COMPARING arbpl
objid.
IF it_qmel IS NOT INITIAL.
SELECT ktext
objid
FROM crtx
INTO TABLE it_crtx
FOR ALL ENTRIES IN it_crhd
WHERE objid = it_crhd-objid.
ENDIF.
DELETE ADJACENT DUPLICATES FROM it_crtx COMPARING ktext
objid.
IF it_qmih IS NOT INITIAL.
SELECT fecod
fegrp
qmnum
FROM qmfe
INTO TABLE it_qmfe
FOR ALL ENTRIES IN it_qmih
WHERE qmnum = it_qmih-qmnum.
ENDIF.
SORT it_qmfe BY qmnum.
DELETE ADJACENT DUPLICATES FROM it_qmfe COMPARING fecod
fegrp.
IF it_qmih IS NOT INITIAL.
SELECT urcod
urgrp
urtxt
qmnum
FROM qmur
INTO TABLE it_qmur
FOR ALL ENTRIES IN it_qmih
WHERE qmnum = it_qmih-qmnum.
ENDIF.
SORT it_qmur BY qmnum.
DELETE ADJACENT DUPLICATES FROM it_qmur COMPARING urcod
urgrp
urtxt.
IF it_qmih IS NOT INITIAL.
SELECT tplnr
iloan
FROM iloa
INTO TABLE it_iloa
FOR ALL ENTRIES IN it_qmih
WHERE iloan = it_qmih-iloan.
ENDIF.
DELETE ADJACENT DUPLICATES FROM it_iloa COMPARING tplnr
iloan.
LOOP AT it_qmih INTO wa_qmih.
wa_output-qmnum = wa_qmih-qmnum.
wa_output-equnr = wa_qmih-equnr.
wa_output-iloan = wa_qmih-iloan.
wa_output-ausvn = wa_qmih-ausvn.
wa_output-ausbs = wa_qmih-ausbs.
wa_output-auztv = wa_qmih-auztv.
wa_output-auztb = wa_qmih-auztb.
wa_output-iwerk = wa_qmih-iwerk.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_qmel INTO wa_qmel WITH KEY qmnum = wa_qmih-qmnum.
wa_output-qmtxt = wa_qmel-qmtxt.
wa_output-indtx = wa_qmel-indtx.
wa_output-priok = wa_qmel-priok.
wa_output-strmn = wa_qmel-strmn.
wa_output-strur = wa_qmel-strur.
wa_output-ltrmn = wa_qmel-ltrmn.
wa_output-ltrur = wa_qmel-ltrur.
wa_output-objnr = wa_qmel-objnr.
wa_output-arbpl = wa_qmel-arbpl.
wa_output-vkorg = wa_qmel-vkorg.
wa_output-vtweg = wa_qmel-vtweg.
wa_output-spart = wa_qmel-spart.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_ihpa INTO wa_ihpa WITH KEY objnr = wa_qmel-objnr.
wa_output-parnr = wa_ihpa-parnr.
wa_output-parvw = wa_ihpa-parvw.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_crhd INTO wa_crhd WITH KEY objid = wa_qmel-arbpl.
wa_output-arbpl = wa_crhd-arbpl.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_crtx INTO wa_crtx WITH KEY objid = wa_crhd-objid.
wa_output-ktext = wa_crtx-ktext.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_qmfe INTO wa_qmfe WITH KEY qmnum = wa_qmih-qmnum.
wa_output-fecod = wa_qmfe-fecod.
wa_output-fegrp = wa_qmfe-fegrp.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_qmur INTO wa_qmur WITH KEY qmnum = wa_qmih-qmnum.
wa_output-urcod = wa_qmur-urcod.
wa_output-urgrp = wa_qmur-urgrp.
wa_output-urtxt = wa_qmur-urtxt.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_iloa INTO wa_iloa WITH KEY iloan = wa_qmih-iloan.
wa_output-tplnr = wa_iloa-tplnr.
APPEND wa_output TO it_output.
CLEAR wa_output.
ENDLOOP.
DELETE ADJACENT DUPLICATES FROM it_output COMPARING qmnum
equnr
ausvn
ausbs
auztv
auztb
iwerk
qmtxt
indtx
priok
strmn
strur
ltrmn
ltrur
vkorg
vtweg
spart
parnr
parvw
arbpl
ktext
fecod
fegrp
urcod
urgrp
urtxt
tplnr.
*CALL FUNCTION 'STATUS_TEXT_EDIT'
*  EXPORTING
*    CLIENT            = SY-MANDT
*    FLG_USER_STAT     = ' '
*    objnr             =
*    ONLY_ACTIVE       = 'X'
*    spras             = en
*    BYPASS_BUFFER     = ' '
*  IMPORTING
*    ANW_STAT_EXISTING =
*    E_STSMA           =
*    LINE              =
*    USER_LINE         =
*    STONR             =
*  EXCEPTIONS
*    OBJECT_NOT_FOUND  = 1
*    OTHERS            = 2.
*IF sy-subrc <> 0.
*  MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
*          WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
*ENDIF.
*CALL FUNCTION 'READ_TEXT'
*  EXPORTING
*    CLIENT                  = SY-MANDT
*    id                      =
*    language                =
*    name                    =
*    object                  =
*    ARCHIVE_HANDLE          = 0
*    LOCAL_CAT               = ' '
*  IMPORTING
*    HEADER                  =
*  TABLES
*    lines                   =
*  EXCEPTIONS
*    ID                      = 1
*    LANGUAGE                = 2
*    NAME                    = 3
*    NOT_FOUND               = 4
*    OBJECT                  = 5
*    REFERENCE_CHECK         = 6
*    WRONG_ACCESS_TO_ARCHIVE = 7
*    OTHERS                  = 8.
*IF sy-subrc <> 0.
*  MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
*          WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
*ENDIF.
*LOOP AT IT_OUTPUT INTO WA_OUTPUT.
*  WRITE : / WA_OUTPUT-qmnum,
*            WA_OUTPUT-equnr,
*            WA_OUTPUT-iloan,
*            WA_OUTPUT-ausvn,
*            WA_OUTPUT-ausbs,
*            WA_OUTPUT-auztv,
*            WA_OUTPUT-auztb,
*            WA_OUTPUT-qmtxt,
*            WA_OUTPUT-indtx,
*            WA_OUTPUT-strmn,
*            WA_OUTPUT-strur,
*            WA_OUTPUT-ltrmn,
*            WA_OUTPUT-ltrur,
*            WA_OUTPUT-objnr,
*            WA_OUTPUT-arbpl,
*            WA_OUTPUT-parnr,
*            WA_OUTPUT-parvw,
*            WA_OUTPUT-objid,
*            WA_OUTPUT-ktext,
*            WA_OUTPUT-fecod,
*            WA_OUTPUT-fegrp,
*            WA_OUTPUT-urcod,
*            WA_OUTPUT-urgrp,
*            WA_OUTPUT-urtxt,
*            WA_OUTPUT-tplnr.
*ENDLOOP.
CALL FUNCTION 'GUI_DOWNLOAD'
  EXPORTING
*   BIN_FILESIZE              =
    filename                  = 'E:\CRM1.TXT'
    FILETYPE                  = 'ASC'
    APPEND                    = ' '
    write_field_separator     = '|'
    HEADER                    = '00'
    TRUNC_TRAILING_BLANKS     = ' '
    WRITE_LF                  = 'X'
    COL_SELECT                = ' '
    COL_SELECT_MASK           = ' '
    DAT_MODE                  = ' '
    CONFIRM_OVERWRITE         = ' '
    NO_AUTH_CHECK             = ' '
    CODEPAGE                  = ' '
    IGNORE_CERR               = ABAP_TRUE
    REPLACEMENT               = '#'
    WRITE_BOM                 = ' '
    TRUNC_TRAILING_BLANKS_EOL = 'X'
    WK1_N_FORMAT              = ' '
    WK1_N_SIZE                = ' '
    WK1_T_FORMAT              = ' '
    WK1_T_SIZE                = ' '
    WRITE_LF_AFTER_LAST_LINE  = ABAP_TRUE
* IMPORTING
*   FILELENGTH                =
  TABLES
    data_tab                  = it_output
*   FIELDNAMES                =
  EXCEPTIONS
    FILE_WRITE_ERROR          = 1
    NO_BATCH                  = 2
    GUI_REFUSE_FILETRANSFER   = 3
    INVALID_TYPE              = 4
    NO_AUTHORITY              = 5
    UNKNOWN_ERROR             = 6
    HEADER_NOT_ALLOWED        = 7
    SEPARATOR_NOT_ALLOWED     = 8
    FILESIZE_NOT_ALLOWED      = 9
    HEADER_TOO_LONG           = 10
    DP_ERROR_CREATE           = 11
    DP_ERROR_SEND             = 12
    DP_ERROR_WRITE            = 13
    UNKNOWN_DP_ERROR          = 14
    ACCESS_DENIED             = 15
    DP_OUT_OF_MEMORY          = 16
    DISK_FULL                 = 17
    DP_TIMEOUT                = 18
    FILE_NOT_FOUND            = 19
    DATAPROVIDER_EXCEPTION    = 20
    CONTROL_FLUSH_ERROR       = 21
    OTHERS                    = 22.
IF sy-subrc <> 0.
  MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
          WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
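One general remark on the code above: DELETE ADJACENT DUPLICATES only removes rows that sit directly next to each other, so each internal table has to be sorted by exactly the fields listed under COMPARING immediately before the delete, otherwise non-adjacent duplicates survive. For the final output table that would look roughly like this (a sketch listing only the leading fields):

SORT it_output BY qmnum equnr ausvn ausbs auztv auztb iwerk qmtxt.
DELETE ADJACENT DUPLICATES FROM it_output
       COMPARING qmnum equnr ausvn ausbs auztv auztb iwerk qmtxt.

The same applies to IT_IHPA, IT_CRHD, IT_CRTX and IT_ILOA, which are currently not sorted before their DELETE ADJACENT DUPLICATES statements.
-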
Problem loading data from the PSA to the InfoCube
Hello experts.
I'm having a problem loading data from the PSA to the InfoCube.
I'm using a DTP for this process but is happening the following error:
"Diagnosis
An error occurred while executing the transformation rule:
The exact error message is:
Overflow converting from''
The error was triggered at the point in the Following Program:
GP4KMDU7EAUOSBIZVE233WNLPIG 718
System Response
Processing the record date has Been terminated.
Procedure
The Following is additional information included in the higher-level
node of the monitor:
Transformation ID
Data record number of the source record
Number and the name of the rule Which produced the error
Procedure for System Administration
I have already created new DTPs and deactivated and reactivated the InfoCube and the transformation, but nothing solves it.
Does anyone have any idea what to do?
Thank you.
Hi,
Is it a flat file load, or are you loading from another DataSource?
Try to display the generated program GP4KMDU7EAUOSBIZVE233WNLPIG in SE38 and check that it is active and has no syntax errors.
Check the mapping of the fields in the transformation: whether
a data field is mapped to a decimal field, a CHAR32 field is mapped to a RAW16 field,
or CALWEEK/CALMONTH is mapped to CALDAY, etc.
Check in ST22 whether there are any short dumps.
Regards
KP