To insert duplicate records in VO
Hi,
I have a button Duplicate Record on page.
All the existing details on the page are read only except a checkbox for all rows.
When I select the checkbox and click the 'Duplicate Record' button, an editable row with the same data as the checked row should be created on the page. The row is getting created, but the issue is that the row from which it was duplicated also becomes editable, and changes made to the new row are reflected in the old row as well.
Any solution to keep the old row read-only and only the new row editable?
OK, what I understand is as follows:
In the VO, I create a transient attribute 'RowRef' and select it in my VO query.
In my results table, I create a form value "evtSrcRowRef" with View attribute 'RowRef'.
In the CO, I write:
String rowReference = pageContext.getParameter("evtSrcRowRef");
Please correct me if I am wrong or missing something. Also, please detail how I use this row reference to make my original row read-only.
Similar Messages
-
Purchase Order Import inserts duplicate records in po_line_locations
Hi,
I'm running the standard Purchase Order Import program to import a few POs. We have only one shipment for each item, so there is only one record per line in po_line_locations. But after running the import, it inserts a duplicate record with the same qty into po_line_locations. Basically it is inserting the same item twice in the po_line_locations_all table, and the quantity is getting doubled at the line level. Searched Metalink, but no hits for this so far.
This is in R12 (12.0.6).
Did anyone encounter this problem earlier? Any hints or comments would help.
Thanks in advance.
Edited by: user2343071 on Sep 2, 2009 3:54 PM
Hi,
Please debug the particular program with the help of an ABAPer; that may resolve your issue. Thank you. -
Data Loader inserting duplicate records
Hi,
There is an import that we need to run everyday in order to load data from another system into CRM On Demand . I have set up a data loader script which is scheduled to run every morning. The script should perform insert operation.
Every morning a file with new insert data is available in the same location (generated by someone else) and with the same name. The data loader script must insert all records in it.
One morning, there was a problem in the other job and a new file was not produced. When the data loader script ran, it found the old file and re-inserted the records (there were 3 in the file). I had specified the -duplicatecheckoption parameter as the external id, since the records come from another system, but I came to know that the option works for update operations only.
How can a situation like this be handled in future? The external id should be checked for duplicates before the insert operation is performed. If we can't check on the Data Loader side, is it possible to somehow mark the field as 'unique' in the UI so that an error is raised when a duplicate record is inserted? Please suggest.
Regards,
Hi
You can use something like this:
CURSOR crs IS SELECT DISTINCT deptno, dname, loc FROM dept;
Now you can insert all the records present in this cursor.
Assumption: You do not have duplicate entry in the dept table initially.
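For the original external-id requirement, the check can also be pushed into the schema itself: put a UNIQUE constraint (here a PRIMARY KEY) on the external id and let re-runs of the same file be skipped. A minimal sketch using Python's sqlite3 as a stand-in for the target system; the table and column names are made up for illustration:

```python
import sqlite3

def load_records(conn, records):
    """Insert records, silently skipping any whose external_id already exists."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS contacts ("
        "external_id TEXT PRIMARY KEY, name TEXT)"
    )
    # INSERT OR IGNORE skips rows that would violate the PRIMARY KEY,
    # so re-running the same input file does not create duplicates.
    conn.executemany(
        "INSERT OR IGNORE INTO contacts (external_id, name) VALUES (?, ?)",
        records,
    )
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM contacts").fetchone()[0]

conn = sqlite3.connect(":memory:")
batch = [("X1", "Ann"), ("X2", "Bob"), ("X3", "Cid")]
print(load_records(conn, batch))  # first run loads 3 rows
print(load_records(conn, batch))  # re-running the same batch adds nothing
```

The key design point: idempotence comes from the constraint, not from the loader remembering what it already did, so a re-delivered file is harmless.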
Cheers
Sudhir -
Using Rownum and ROwid returns duplicate records
Hi All,
We have implemented pagination as below using rowid and rownum
SELECT id
FROM emp
WHERE ROWID IN (SELECT RID
                FROM (SELECT ROWID RID, ROWNUM RNUM
                      FROM (SELECT id
                            FROM emp
                            WHERE T_ID IN (200005,200229,200230,200249,200250,200049)
                              AND dte >= SYSDATE - 90
                              AND LOWER(DESC) = LOWER('A')
                              AND LOWER(NVL(FLAG,'0')) != LOWER('3')
                              AND LOWER(MODDE) LIKE LOWER('%210%')
                            ORDER BY dte ASC)
                      WHERE ROWNUM < 11)
                WHERE RNUM >= 1)
ORDER BY dte ASC
But we face that the query returns duplicate records on consecutive pages. For example:
1. If a, b, c, d, e is returned for the first iteration, then for the next iteration f, g, h, a, y is returned.
Is it because the ORDER BY clause doesn't have a unique key column? Please help, or suggest how to efficiently implement pagination without a performance hit.
Try DISTINCT; since you are selecting only one column it will eliminate any duplicates:
SELECT distinct id
FROM emp
WHERE (ROWID IN ( SELECT RID
FROM (SELECT ROWID RID,ROWNUM RNUM
FROM (SELECT ID
FROM emp
WHERE ((T_ID IN (200005,200229,200230,200249,200250,200049)))
AND (dte >= sysdate-90)
AND (LOWER(DESC) = LOWER ('A')
AND LOWER(NVL(FLAG,'0')) != LOWER ('3')
AND LOWER(MODDE) like LOWER ('%210%'))
ORDER BY dte ASC ))
WHERE ROWNUM < 11)
WHERE RNUM>= 1))
ORDER BY dte ASC -
Identify newly inserted duplicate row
Hi,
I have an issue: I want to identify the newly inserted duplicate record in the table. But the problem is that the old record has a ROWID greater than that of the newly inserted duplicate record. How can this happen?
Has anyone faced such problem? Please let me know.
Urgent.
Thanks & Regds,
Nandakumar
Mark, I might disagree with that. Just because the fields that make up a unique constraint are duplicates doesn't make the records identical in all fields. Take for example four entries in the Social Security database with my social security number: one for Rory, another for Jose, a third for Manual and the fourth for Juan. Four records, with three violating a unique constraint, but the records aren't identical.
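The distinction matters in practice: rows can violate a unique key while differing in every other column. A sqlite3 sketch of the social-security example above (the schema is hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ssn_registry (ssn TEXT UNIQUE, name TEXT)")

names = ["Rory", "Jose", "Manual", "Juan"]
inserted, rejected = 0, 0
for name in names:
    try:
        # Same SSN each time: only the first insert succeeds.
        conn.execute("INSERT INTO ssn_registry (ssn, name) VALUES (?, ?)",
                     ("123-45-6789", name))
        inserted += 1
    except sqlite3.IntegrityError:
        # The records are not identical, but they still violate the key.
        rejected += 1

print(inserted, rejected)  # 1 3
```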
First thing to do is put a unique constraint on the table to prevent duplicate key entries. -
Prevent duplicate records?
hi
Can anyone tell me how I can prevent the user from typing duplicate records into a tabular form? Which trigger should I use, and how do I do that?
Also, how do I restrict inserting duplicate records into my table through code in my form?
Kevin in his post said
>>
Although you cannot normally read other records in a multi row block without navigating to them, I have found a cunning method to do this validation using the power of calculation properties. You need three extra hidden fields, two of which have calculation properties, and a little function. (If you want to see how it works, try making the hidden fields visible).
Form program unit:
function COMPARISON (in1 number, in2 number) return number is
begin
  if in1 = in2 then
    return 1;
  else
    return 0;
  end if;
end;
3 new hidden fields:
CONTROL.PK_COPY
DATABLOCK.MATCH_FOUND
calculation mode: formula
formula: COMPARISON(:control.PK_COPY, :datablock.PK)
CONTROL.NUMBER_OF_MATCHES
calculation_mode: summary
summary_function: Sum
summarised_block: DATABLOCK
summarised_item: MATCH_FOUND
WHEN_VALIDATE_ITEM on DATABLOCK.PK
:control.pk_copy := :datablock.pk;
if :control.number_of_matches > 1 then
message('matching key found');
end if;
(DATABLOCK must have query_all_records = TRUE)
<<
What do I have to name my fields, and what do I do in each field?
Because I got lost in what Kevin said.
Please answer me! -
Check for duplicate record in SQL database before doing INSERT
Hey guys,
This is part of a PowerShell app doing a SQL insert, but my question really relates to the SQL insert. I need to check the database PRIOR to doing the insert for duplicate records, and if one exists then that record needs
to be overwritten. I'm not sure how to accomplish this task. My back end is a SQL 2000 Server. I'm piping the data into my insert statement from a PowerShell FileSystemWatcher app. In my scenario, if the file dumped into a directory starts with I it gets
written to a SQL database, otherwise it gets written to an Access table. I know, silly, but that's the environment I'm in, haha.
Any help is appreciated.
Thanks in Advance
Rich T.
#### DEFINE WATCH FOLDERS AND DEFAULT FILE EXTENSION TO WATCH FOR ####
$cofa_folder = '\\cpsfs001\Data_pvs\TestCofA'
$bulk_folder = '\\cpsfs001\PVS\Subsidiary\Nolwood\McWood\POD'
$filter = '*.tif'
$cofa = New-Object IO.FileSystemWatcher $cofa_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents= $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
$bulk = New-Object IO.FileSystemWatcher $bulk_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents= $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
#### CERTIFICATE OF ANALYSIS AND PACKAGE SHIPPER PROCESSING ####
Register-ObjectEvent $cofa Created -SourceIdentifier COFA/PACKAGE -Action {
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
#### CERTIFICATE OF ANALYSIS PROCESS BEGINS ####
$test=$name.StartsWith("I")
if ($test -eq $true) {
$pos = $name.IndexOf(".")
$left=$name.substring(0,$pos)
$pos = $left.IndexOf("L")
$tempItem=$left.substring(0,$pos)
$lot = $left.Substring($pos + 1)
$item=$tempItem.Substring(1)
Write-Host "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timestamp" -fore green
Out-File -FilePath c:\OutputLogs\CofA.csv -Append -InputObject "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timestamp"
start-sleep -s 5
$conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=PVSNTDB33; Initial Catalog=adagecopy_daily; Integrated Security=TRUE")
$conn.Open()
$insert_stmt = "INSERT INTO in_cofa_pvs (in_item_key, in_lot_key, imgfileName, in_cofa_crtdt) VALUES ('$item','$lot','$name','$timestamp')"
$cmd = $conn.CreateCommand()
$cmd.CommandText = $insert_stmt
$cmd.ExecuteNonQuery()
$conn.Close()
}
#### PACKAGE SHIPPER PROCESS BEGINS ####
elseif ($test -eq $false) {
$pos = $name.IndexOf(".")
$left=$name.substring(0,$pos)
$pos = $left.IndexOf("O")
$tempItem=$left.substring(0,$pos)
$order = $left.Substring($pos + 1)
$shipid=$tempItem.Substring(1)
Write-Host "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timestamp" -fore green
Out-File -FilePath c:\OutputLogs\PackageShipper.csv -Append -InputObject "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timestamp"
}
}
Rich Thompson
Hi
Since SQL Server 2000 has been out of support, I recommend you to upgrade the SQL Server 2000 to a higher version, such as SQL Server 2005 or SQL Server 2008.
According to your description, you can try the following methods to check duplicate record in SQL Server.
1. You can use RAISERROR to check for the duplicate record: if a duplicate exists, raise the error; otherwise perform the insert. A code block is given below:
IF EXISTS (SELECT 1 FROM TableName AS t
           WHERE t.Column1 = @Column1
             AND t.Column2 = @Column2)
BEGIN
    RAISERROR('Duplicate records', 18, 1)
END
ELSE
BEGIN
    INSERT INTO TableName (Column1, Column2, Column3)
    SELECT @Column1, @Column2, @Column3
END
2. Also you can create UNIQUE INDEX or UNIQUE CONSTRAINT on the column of a table, when you try to INSERT a value that conflicts with the INDEX/CONSTRAINT, an exception will be thrown.
Add the unique index:
CREATE UNIQUE INDEX Unique_Index_name ON TableName(ColumnName)
Add the unique constraint:
ALTER TABLE TableName
ADD CONSTRAINT Unique_Contraint_Name
UNIQUE (ColumnName)
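Since the original requirement was to overwrite an existing record rather than reject it, the same IF EXISTS pattern can branch to an UPDATE instead of raising an error. A sketch of that flow using Python's sqlite3 as a stand-in; the table and columns are illustrative, not the poster's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE in_cofa (item TEXT, lot TEXT, imgfile TEXT)")

def insert_or_overwrite(item, lot, imgfile):
    """Check for an existing (item, lot) row first; overwrite it if found."""
    exists = conn.execute(
        "SELECT 1 FROM in_cofa WHERE item = ? AND lot = ?",
        (item, lot)).fetchone()
    if exists:
        conn.execute("UPDATE in_cofa SET imgfile = ? WHERE item = ? AND lot = ?",
                     (imgfile, item, lot))
    else:
        conn.execute("INSERT INTO in_cofa (item, lot, imgfile) VALUES (?, ?, ?)",
                     (item, lot, imgfile))
    conn.commit()

insert_or_overwrite("1001", "L7", "I1001L7.tif")
insert_or_overwrite("1001", "L7", "I1001L7_rescan.tif")  # overwrites, no duplicate
rows = conn.execute("SELECT item, lot, imgfile FROM in_cofa").fetchall()
print(rows)  # [('1001', 'L7', 'I1001L7_rescan.tif')]
```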
Thanks
Lydia Zhang -
Avoiding duplicate records while inserting into the table
Hi
I tried the following insert statement, where I want to avoid duplicate records during the insert itself,
but it gives me an error like "invalid identifier", though the column exists in the table.
Please let me know Where i'm doing the mistake.
INSERT INTO t_map tm(sn_id,o_id,txt,typ,sn_time)
SELECT 100,
sk.obj_id,
sk.key_txt,
sk.obj_typ,
sysdate,
FROM S_KEY sk
WHERE sk.obj_typ = 'AY'
AND SYSDATE BETWEEN sk.start_date AND sk.end_date
AND sk.obj_id IN (100170,1001054)
and not exists (select 1
FROM t_map tm1 where tm1.O_ID=tm.o_id
and tm1.sn_id=tm.sn_id
and tm1.txt=tm.txt
and tm1.typ=tm.typ
and tm1.sn_time=tm.sn_time )
Then you have referenced the table with the alias tm; where is that defined? Do you want it like this?
INSERT INTO t_map (sn_id, o_id, txt, typ, sn_time)
SELECT 100,
       sk.obj_id,
       sk.key_txt,
       sk.obj_typ,
       sysdate
FROM S_KEY sk
WHERE sk.obj_typ = 'AY'
  AND SYSDATE BETWEEN sk.start_date AND sk.end_date
  AND sk.obj_id IN (100170, 1001054)
  AND NOT EXISTS (SELECT 1
                  FROM t_map tm
                  WHERE sk.obj_id = tm.o_id
                    AND 100 = tm.sn_id
                    AND sk.key_txt = tm.txt
                    AND sk.obj_typ = tm.typ
                    AND sysdate = tm.sn_time) -
The ABAP/4 Open SQL array insert results in duplicate Record in database
Hi All,
I am trying to transfer 4 plants from R/3 to APO. The IM contains only these 4 plants. However, a queue gets generated in APO saying 'The ABAP/4 Open SQL array insert results in duplicate record in database'. I checked tables /SAPAPO/LOC, /SAPAPO/LOCMAP and /SAPAPO/LOCT for a duplicate entry, but the entry is not found.
Can anybody guide me how to resolve this issue?
Thanks in advance
Sandeep Patil
Hi Sandeep,
Now try to delete your locations before activating the IM again.
Use the program /SAPAPO/DELETE_LOCATIONS to delete locations.
Note :
1. Set the deletion flag (in /SAPAPO/LOC : Location -> Deletion Flag)
2. Remove all the dependencies (like transportation lane, Model ........ )
Check now and let me know.
Regards,
Siva.
-
Hi,
I am providing support to one of our clients, where we have jobs scheduled to load the data from the tables in the source database to the destination database via SSIS packages. The first-time load is a full load where we truncate all the tables in the destination
and load them from the source tables. But from the next day, we perform the incremental load from source to destination, i.e., only modified records fetched using the change tracking concept will be loaded to the destination. After the full load, if we run the incremental
load, the job fails with an error on one of the packages: "Violation of PRIMARY KEY constraint. Cannot insert duplicate key in object '<tablename>'. The duplicate key value is <1234>", even though there are no duplicate records. When we
try debugging and running the failing package, it runs successfully. We are not able to figure out why the package fails and why, when we run it the next day, it runs successfully. Request you to help me in this regard.
Thank you,
Bala Murali Krishna Medipally.
Hi,
I suspect you are trying to insert modified records instead of updating. -
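If that is the case, the incremental step needs an upsert rather than a plain insert: update the row when the key already exists, insert it otherwise. A sqlite3 sketch of the idea (SSIS would typically do this with a lookup transformation or a MERGE; the table and values here are illustrative, and the ON CONFLICT syntax needs SQLite 3.24+):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dest (pk INTEGER PRIMARY KEY, val TEXT)")
conn.execute("INSERT INTO dest VALUES (1234, 'original')")

# Change tracking hands us both new and modified rows; ON CONFLICT
# turns the modified ones into updates instead of PK violations.
changed = [(1234, "modified"), (5678, "new")]
conn.executemany(
    "INSERT INTO dest (pk, val) VALUES (?, ?) "
    "ON CONFLICT(pk) DO UPDATE SET val = excluded.val",
    changed)
conn.commit()
print(sorted(conn.execute("SELECT pk, val FROM dest").fetchall()))
# [(1234, 'modified'), (5678, 'new')]
```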
Do we have a shortcut to insert a duplicate record except PK ?
Do we have a shortcut to insert a duplicate record except PK ?
Thanks.
Do you want to insert another row with all of the columns the same but with a different PK? If so, then:
INSERT INTO t (pk, col1, col2, col3)
SELECT new_value_for_pk, col1, col2, col3
FROM t
WHERE pk = <value>
What you use for new_value_for_pk will depend on how the PK is generated.
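That INSERT ... SELECT pattern can be exercised end to end; a sqlite3 sketch with a throwaway table (names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (pk INTEGER PRIMARY KEY, col1 TEXT, col2 TEXT)")
conn.execute("INSERT INTO t VALUES (1, 'a', 'b')")

def duplicate_row(old_pk, new_pk):
    # Copy every column except the PK, which is supplied explicitly.
    conn.execute(
        "INSERT INTO t (pk, col1, col2) "
        "SELECT ?, col1, col2 FROM t WHERE pk = ?",
        (new_pk, old_pk))
    conn.commit()

duplicate_row(1, 2)
print(conn.execute("SELECT * FROM t ORDER BY pk").fetchall())
# [(1, 'a', 'b'), (2, 'a', 'b')]
```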
John -
HR_INFOTYPE_OPERATION in insert mode - duplicate records
Hi,
I'm using the above function to insert a record into IT0015 (Additional Payments), i.e. the operation mode is 'INS',
but when I already have a record in the system for the specified date it just inserts a new record. Is there a parameter I can pass to the function that will update the existing record rather than create a new one?
I know I can check prior to the operation via function 'HR_READ_INFOTYPE' whether a record exists and then change the operation mode to 'MOD' before I call 'HR_INFOTYPE_OPERATION'. Is this the only way, or can HR_INFOTYPE_OPERATION handle this step itself?
CALL FUNCTION 'HR_INFOTYPE_OPERATION'
EXPORTING
infty = '0015'
number = p_pernr
subtype = p0015-subty
* OBJECTID =
* LOCKINDICATOR =
validityend = p0015-begda
validitybegin = p0015-begda
* RECORDNUMBER = 1 "RP 26.11.09
record = p0015
operation = p_operation
* TCLAS = 'A' "RP 26.11.09
* DIALOG_MODE = '0'
* NOCOMMIT =
* VIEW_IDENTIFIER =
* SECONDARY_RECORD =
IMPORTING
return = fs_return
* KEY =
Many thanks in advance.
Raj
Hello Volker,
Are you in any way associated with the Consulut IDES server? I remember seeing a similar name in it. Sorry if that's not you.
Vikranth -
How to find out duplicate record contained in a flat file
Hi Experts,
For my project I have written a program for flat file upload.
Requirement 1
In the flat file there may be some duplicate record like:
Field1 Field2
11 test1
11 test2
12 test3
13 test4
Field1 is primary key.
Can you please let me know how I can find out the duplicate record.
Requirement 2
The flat file contains the header row as shown above
Field1 Field2
How can our program skip this record and start reading / inserting records from row no. 2, i.e.
11 test1
onwards.
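For reference, both requirements reduce to a few lines outside ABAP too. A Python sketch, used purely for illustration, with the tab-delimited layout shown above (here loaded from an in-memory string instead of a frontend file):

```python
import io

flat_file = io.StringIO(
    "Field1\tField2\n"
    "11\ttest1\n"
    "11\ttest2\n"
    "12\ttest3\n"
    "13\ttest4\n"
)

seen, duplicates, records = set(), [], []
lines = flat_file.read().splitlines()
for line in lines[1:]:          # requirement 2: skip the header row
    field1, field2 = line.split("\t")
    if field1 in seen:          # requirement 1: this key was already loaded
        duplicates.append(field1)
    else:
        seen.add(field1)
        records.append((field1, field2))

print(duplicates)  # ['11']
print(records)     # [('11', 'test1'), ('12', 'test3'), ('13', 'test4')]
```

The same shape applies in ABAP: loop over the internal table from index 2 and track which Field1 keys have been seen.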
Thanks
S
FORM upload1.
DATA : wf_title TYPE string,
lt_filetab TYPE filetable,
l_separator TYPE char01,
l_action TYPE i,
l_count TYPE i,
ls_filetab TYPE file_table,
wf_delemt TYPE rollname,
wa_fieldcat TYPE lvc_s_fcat,
tb_fieldcat TYPE lvc_t_fcat,
rows_read TYPE i,
p_error TYPE char01,
l_file TYPE string.
DATA: wf_object(30) TYPE c,
wf_tablnm TYPE rsdchkview.
wf_object = 'myprogram'.
DATA i TYPE i.
DATA:
lr_mdmt TYPE REF TO cl_rsdmd_mdmt,
lr_mdmtr TYPE REF TO cl_rsdmd_mdmtr,
lt_idocstate TYPE rsarr_t_idocstate,
lv_subrc TYPE sysubrc.
TYPES : BEGIN OF test_struc,
/bic/myprogram TYPE /bic/oimyprogram,
txtmd TYPE rstxtmd,
END OF test_struc.
DATA : tb_assum TYPE TABLE OF /bic/pmyprogram.
DATA: wa_ztext TYPE /bic/tmyprogram,
myprogram_temp TYPE ziott_assum,
wa_myprogram TYPE /bic/pmyprogram.
DATA : test_upload TYPE STANDARD TABLE OF test_struc,
wa2 TYPE test_struc.
DATA : wa_test_upload TYPE test_struc,
ztable_data TYPE TABLE OF /bic/pmyprogram,
ztable_text TYPE TABLE OF /bic/tmyprogram,
wa_upld_text TYPE /bic/tmyprogram,
wa_upld_data TYPE /bic/pmyprogram,
t_assum TYPE ziott_assum.
DATA : wa1 LIKE test_upload.
wf_title = text-026.
CALL METHOD cl_gui_frontend_services=>file_open_dialog
EXPORTING
window_title = wf_title
default_extension = 'txt'
file_filter = 'Tab delimited Text Files (*.txt)'
CHANGING
file_table = lt_filetab
rc = l_count
user_action = l_action
EXCEPTIONS
file_open_dialog_failed = 1
cntl_error = 2
OTHERS = 3. "#EC NOTEXT
IF sy-subrc <> 0.
EXIT.
ENDIF.
LOOP AT lt_filetab INTO ls_filetab.
l_file = ls_filetab.
ENDLOOP.
CHECK l_action = 0.
IF l_file IS INITIAL.
EXIT.
ENDIF.
l_separator = 'X'.
wa_fieldcat-fieldname = 'test'.
wa_fieldcat-dd_roll = wf_delemt.
APPEND wa_fieldcat TO tb_fieldcat.
CALL FUNCTION 'MESSAGES_INITIALIZE'.
CLEAR wa_test_upload.
* Upload file from front-end (PC)
* File format is tab-delimited ASCII
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = l_file
has_field_separator = l_separator
TABLES
* data_tab = i_mara
data_tab = test_upload
EXCEPTIONS
file_open_error = 1
file_read_error = 2
no_batch = 3
gui_refuse_filetransfer = 4
invalid_type = 5
no_authority = 6
unknown_error = 7
bad_data_format = 8
header_not_allowed = 9
separator_not_allowed = 10
header_too_long = 11
unknown_dp_error = 12
access_denied = 13
dp_out_of_memory = 14
disk_full = 15
dp_timeout = 16
OTHERS = 17.
IF sy-subrc <> 0.
EXIT.
ELSE.
CALL FUNCTION 'MESSAGES_INITIALIZE'.
IF test_upload IS NOT INITIAL.
DESCRIBE TABLE test_upload LINES rows_read.
CLEAR : wa_test_upload,wa_upld_data.
LOOP AT test_upload INTO wa_test_upload.
CLEAR : p_error.
rows_read = sy-tabix.
IF wa_test_upload-/bic/myprogram IS INITIAL.
p_error = 'X'.
MESSAGE s153 WITH wa_test_upload-/bic/myprogram sy-tabix.
CONTINUE.
ELSE.
TRANSLATE wa_test_upload-/bic/myprogram TO UPPER CASE.
wa_upld_text-txtmd = wa_test_upload-txtmd.
wa_upld_text-txtsh = wa_test_upload-txtmd.
wa_upld_text-langu = sy-langu.
wa_upld_data-chrt_accts = 'xyz1'.
wa_upld_data-co_area = '12'.
wa_upld_data-/bic/zxyzbcsg = 'Iy'.
wa_upld_data-objvers = 'A'.
wa_upld_data-changed = 'I'.
wa_upld_data-/bic/zass_mdl = 'rrr'.
wa_upld_data-/bic/zass_typ = 'I'.
wa_upld_data-/bic/zdriver = 'yyy'.
wa_upld_text-langu = sy-langu.
MOVE-CORRESPONDING wa_test_upload TO wa_upld_data.
MOVE-CORRESPONDING wa_test_upload TO wa_upld_text.
APPEND wa_upld_data TO ztable_data.
APPEND wa_upld_text TO ztable_text.
ENDIF.
ENDLOOP.
DELETE ADJACENT DUPLICATES FROM ztable_data.
DELETE ADJACENT DUPLICATES FROM ztable_text.
IF ztable_data IS NOT INITIAL.
CALL METHOD cl_rsdmd_mdmt=>factory
EXPORTING
i_chabasnm = 'myprogram'
IMPORTING
e_r_mdmt = lr_mdmt
EXCEPTIONS
invalid_iobjnm = 1
OTHERS = 2.
CALL FUNCTION 'MESSAGES_INITIALIZE'.
**Lock the Infoobject to update
CALL FUNCTION 'RSDG_IOBJ_ENQUEUE'
EXPORTING
i_objnm = wf_object
i_scope = '1'
i_msgty = rs_c_error
EXCEPTIONS
foreign_lock = 1
sys_failure = 2.
IF sy-subrc = 1.
MESSAGE i107(zddd_rr) WITH wf_object sy-msgv2.
EXIT.
ELSEIF sy-subrc = 2.
MESSAGE i108(zddd_rr) WITH wf_object.
EXIT.
ENDIF.
*****Update Master Table
IF ztable_data IS NOT INITIAL.
CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
EXPORTING
i_iobjnm = 'myprogram'
i_tabclass = 'M'
I_T_ATTR = lt_attr
TABLES
i_t_table = ztable_data
EXCEPTIONS
attribute_name_error = 1
iobj_not_found = 2
generate_program_error = 3
OTHERS = 4.
IF sy-subrc <> 0.
CALL FUNCTION 'MESSAGE_STORE'
EXPORTING
arbgb = 'zddd_rr'
msgty = 'E'
txtnr = '054'
msgv1 = text-033
EXCEPTIONS
OTHERS = 3.
MESSAGE e054(zddd_rr) WITH 'myprogram'.
ELSE.
CALL FUNCTION 'MESSAGE_STORE'
EXPORTING
arbgb = 'zddd_rr'
msgty = 'S'
txtnr = '053'
msgv1 = text-033
EXCEPTIONS
OTHERS = 3.
ENDIF.
*endif.
*****update Text Table
IF ztable_text IS NOT INITIAL.
CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
EXPORTING
i_iobjnm = 'myprogram'
i_tabclass = 'T'
TABLES
i_t_table = ztable_text
EXCEPTIONS
attribute_name_error = 1
iobj_not_found = 2
generate_program_error = 3
OTHERS = 4.
IF sy-subrc <> 0.
CALL FUNCTION 'MESSAGE_STORE'
EXPORTING
arbgb = 'zddd_rr'
msgty = 'E'
txtnr = '055'
msgv1 = text-033
EXCEPTIONS
OTHERS = 3.
ENDIF.
ENDIF.
ELSE.
MESSAGE s178(zddd_rr).
ENDIF.
ENDIF.
COMMIT WORK.
CALL FUNCTION 'RSD_CHKTAB_GET_FOR_CHA_BAS'
EXPORTING
i_chabasnm = 'myprogram'
IMPORTING
e_chktab = wf_tablnm
EXCEPTIONS
name_error = 1.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
****Release locks on Infoobject
CALL FUNCTION 'RSDG_IOBJ_DEQUEUE'
EXPORTING
i_objnm = 'myprogram'
i_scope = '1'.
ENDIF.
ENDIF.
PERFORM data_selection .
PERFORM update_alv_grid_display.
CALL FUNCTION 'MESSAGES_SHOW'.
ENDFORM.
Can you please let me know how I can find out the duplicate record.
You need to split the records from the flat file structure into your internal table and use DELETE ADJACENT DUPLICATES comparing the key fields:
split flat_str into wa_f1 wa_f2 wa_f3 at tab_space. -
How to delete duplicate records in 10 G.
how to delete duplicate records in 10 G.
--Here is one way to do it using a second table
create table temp1
(col1 char(1));
--Table created.
insert into temp1 (col1) values('A');
insert into temp1 (col1) values('B');
insert into temp1 (col1) values('B');
--1 row created.
--1 row created.
--1 row created.
create table temp2 as select distinct * from temp1;
--Table created.
--now you have a second table with no duplicates
--truncate your old table
truncate table temp1;
--Table truncated.
--and reload it with data from the new table
insert into temp1 select * from temp2;
--2 rows created.
--then drop the temp2 table
drop table temp2 -
Insert New Record in Master Data by Code
Hi guys,
I need to insert a new value in an infoobject by code creating:
1 new record in table P (data not time dependent)
1 new record in table S (SID table)
This code could be executed by many tasks in parallel and so it could create problems of concurrency in writing and in quality of the value of new SID selected.
The first question is:
Is there standard code that inserts a new record, also creating SIDs, managing concurrency in writing and reading?
The second (if there is no answer to the first):
This is a part of my code (a draft)... any suggestions:
* insert into table P
INSERT INTO /bic/pzck9idfl VALUES st_p_zck9idfl.
IF sy-subrc = 0.
FLAG = 1.
WHILE FLAG = 0.
SELECT MAX( sid )
INTO v_sididfl
FROM /bic/szck9idfl.
ADD 1 TO v_sid.
*record for SID table
st_zck9idfl-sid = v_sid.
st_zck9idfl-/bic/zck9idfl = v_idfl.
st_zck9idfl-chckfl = 'X'.
st_zck9idfl-datafl = 'X'.
st_zck9idfl-incfl = 'X'.
* insert record in SID Table
INSERT INTO /bic/szck9idfl VALUES st_zck9idfl.
COMMIT WORK AND WAIT.
IF Sy-subrc = 0.
SELECT SINGLE FROM /bic/szck9idfl
WHERE SID = v_SID
AND /bic/zck9idfl NE v_idfl.
IF Sy-SUBRC = 0.
FLAG = 1.
ELSE.
FLAG = 0.
ENDIF.
ELSE.
FLAG = 1.
ENDIF.
ENDWHILE.
Thanks and points to helpful answer!
ciao
C@f
Hi Claudio,
I would not recommend doing this. Please look for a standard FM to do that job, or have a look into the class library to find some methods. On first look at your code, here are my comments:
SELECT MAX( sid )
INTO v_sididfl
FROM /bic/szck9idfl.
ADD 1 TO v_sid.
Not a good idea, as there is a number range object for getting a SID for each InfoObject. If you get your SID like this, all later standard postings will fail with 'duplicate records'.
*record for SID table
st_zck9idfl-sid = v_sid.
st_zck9idfl-/bic/zck9idfl = v_idfl.
st_zck9idfl-chckfl = 'X'.
st_zck9idfl-datafl = 'X'.
st_zck9idfl-incfl = 'X'.
If you mark all these flags with 'X' you will tell the system that this record is used somewhere in master data or in a data target, and you will not be able to delete it with standard methods.
regards
Siggi