Deleting an existing non-validated record
Hello there!
I am using a delete button with the code:
if F_YES_NO('?CONFIRM_DEL_REC') then
delete_record;
end if;
post;
When I try to delete a record that is not valid (it did not pass the WHEN-VALIDATE-RECORD restrictions) but already exists in the database, Forms does not delete it and instead shows the validation error message.
Is there a way to bypass these restrictions?
Thanks in advance!
I don't know why Forms tries to validate before deleting, but it does.
Here is what I do:
In the key-delrec trigger, I start a non-repeating timer:
PLL_START_TIMER('KEY_DELREC',500); -- used by validate triggers
Delete_Record;
Delete_Timer('KEY_DELREC');
PLL_Start_Timer is a PLL library routine that creates the timer if it does not exist, or restarts it if it already exists.
Then, at the top of every validation trigger where a record can be deleted, I do this:
If not id_null(find_timer('KEY_DELREC')) then
return;
End if;
Those three lines go at the top of both the When-Validate-Record and When-Validate-Item triggers.
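PLL_Start_Timer itself is not shown in the post; here is a minimal sketch of what such a routine might look like in Forms PL/SQL (only the name and the create-or-restart behavior come from the post, the implementation details are an assumption):

```sql
PROCEDURE PLL_Start_Timer (p_name IN VARCHAR2, p_ms IN NUMBER) IS
  t Timer;
BEGIN
  t := Find_Timer(p_name);
  IF Id_Null(t) THEN
    -- Timer does not exist yet: create it as a non-repeating timer
    t := Create_Timer(p_name, p_ms, NO_REPEAT);
  ELSE
    -- Timer already exists: restart it with the new interval
    Set_Timer(t, p_ms, NO_REPEAT);
  END IF;
END;
```

The validation triggers then only need the Find_Timer/Id_Null check to know that a delete is in progress and skip their checks.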
Similar Messages
-
How can we do the validation for non visible records in table control.
Hi Experts,
I have a table control that displays a list of material details, and a button that uploads the material details from an Excel file directly into the table control. I need to validate all the entries that exist in the table, but the table control displays only 5 rows, i.e. only 5 entries are visible on the module pool screen. In the PAI event the validation is done for those 5 records only; I need to validate the rest of the records as well (the ones not visible in the table control) and display an error message if validation fails.
How can we do the validation for non visible records in table control.
Regards,
Bujji
Hi,
Try validating the material before displaying it in the table control.
Rgds/Abhhi -
Deleting an existing record from Oracle
My sender is a proxy. There is one field, say cfield, that changes. XI does not send a record if the change-field status is 0, 1, or 2; for other records it sends the data to the JDBC receiver. When the sender's cfield status changes from 0, 1, or 2 to some other value, only that corresponding record must be deleted from the receiver JDBC database. How can I do this?
Thanks,
Srinivasa
Hi Srinivasa,
Use the JDBC receiver document format with action DELETE.
1) Develop the mapping program so that it creates the statement only when the status is not 0, 1, or 2. The createIf function can be used for this.
2) To delete only the corresponding records map the primary key value to key field
chk this :
http://help.sap.com/saphelp_nw04/helpdata/en/2e/96fd3f2d14e869e10000000a155106/content.htm
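For illustration, the target message sent to the JDBC receiver would look roughly like this (the message type, table, and key field names here are placeholders; the shape follows the SAP XML-SQL document format, with the action attribute selecting the SQL operation):

```xml
<ns0:MT_Delete xmlns:ns0="urn:example">
  <StatementName>
    <dbTableName action="DELETE">
      <table>TARGET_TABLE</table>
      <key>
        <RECORD_ID>1234</RECORD_ID>
      </key>
    </dbTableName>
  </StatementName>
</ns0:MT_Delete>
```

The mapping creates the StatementName node only for records whose status is outside 0, 1, and 2, and maps the primary key into the key element so only the matching row is deleted.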
regards
Ganga -
Header ID does not exist on this record or does not match ID
Hi,
I'm trying to import sales orders using the OE_ORDER_PUB.process_order() API. If all my orders have an equal number of lines, I can import all of them in one run. But if each order has a different number of lines, I am unable to import all the orders in a single run. On the first run a certain number of records get processed, and the remaining records end up with the error 'Header ID does not exist on this record or does not match ID specified on header record. You require a valid header ID if the operation is Create'. If I re-execute the errored records, they are processed successfully.
Please help me in getting it resolved.
Find below the procedure code which I'm using to import sales order.
CREATE OR REPLACE PROCEDURE XXRT_SALES_ORDER (ERRBUF VARCHAR2, RETCODE NUMBER)
AS
l_header_rec OE_ORDER_PUB.Header_Rec_Type;
l_line_tbl OE_ORDER_PUB.Line_Tbl_Type;
l_line_adj_tbl OE_ORDER_PUB.Line_Adj_Tbl_Type;
l_return_status varchar2(200);
l_msg_count number;
l_msg_data varchar2(20000);
l_header_val_rec OE_ORDER_PUB.Header_Val_Rec_Type;
l_Header_Adj_tbl OE_ORDER_PUB.Header_Adj_Tbl_Type;
l_Header_Adj_val_tbl OE_ORDER_PUB.Header_Adj_Val_Tbl_Type;
l_Header_price_Att_tbl OE_ORDER_PUB.Header_Price_Att_Tbl_Type;
l_Header_Adj_Att_tbl OE_ORDER_PUB.Header_Adj_Att_Tbl_Type;
l_Header_Adj_Assoc_tbl OE_ORDER_PUB.Header_Adj_Assoc_Tbl_Type;
l_Header_Scredit_tbl OE_ORDER_PUB.Header_Scredit_Tbl_Type;
l_Header_Scredit_val_tbl OE_ORDER_PUB.Header_Scredit_Val_Tbl_Type;
l_line_val_tbl OE_ORDER_PUB.Line_Val_Tbl_Type;
l_Line_Adj_val_tbl OE_ORDER_PUB.Line_Adj_Val_Tbl_Type;
l_Line_price_Att_tbl OE_ORDER_PUB.Line_Price_Att_Tbl_Type;
l_Line_Adj_Att_tbl OE_ORDER_PUB.Line_Adj_Att_Tbl_Type;
l_Line_Adj_Assoc_tbl OE_ORDER_PUB.Line_Adj_Assoc_Tbl_Type;
l_Line_Scredit_tbl OE_ORDER_PUB.Line_Scredit_Tbl_Type;
l_Line_Scredit_val_tbl OE_ORDER_PUB.Line_Scredit_Val_Tbl_Type;
l_Lot_Serial_tbl OE_ORDER_PUB.Lot_Serial_Tbl_Type;
l_Lot_Serial_val_tbl OE_ORDER_PUB.Lot_Serial_Val_Tbl_Type;
l_action_request_tbl OE_ORDER_PUB.Request_Tbl_Type;
v_remarks varchar2(250);
v_count number;
v_transaction_type number;
v_cust_id number;
v_invoice_to number;
v_ship_to number;
v_itemid1 number;
v_itemid2 number;
v_itemid3 number;
v_itemid4 number;
v_itemid5 number;
v_item_type1 varchar2(15);
v_item_type2 varchar2(15);
v_item_type3 varchar2(15);
v_item_type4 varchar2(15);
v_item_type5 varchar2(15);
cursor c_data_val is select * from XXSC_SALES_ORDER_IMPORT
where process_flag='N';
cursor c_data is select * from XXSC_SALES_ORDER_IMPORT
where process_flag='N';
Begin
fnd_global.apps_initialize(1681,51411,660);
for c_data_val_rec in c_data_val
loop
v_remarks:=NULL;
select count(*) into v_count from apps.oe_order_headers_all
where cust_po_number=c_data_val_rec.iwb_no;
if v_count > 0 then
v_remarks:='Customer PO already used';
end if;
BEGIN
select transaction_type_id into v_transaction_type
from apps.OE_TRANSACTION_TYPES_tl
where name like c_data_val_rec.BILL_TO||'-DOMESTIC';
EXCEPTION
WHEN NO_DATA_FOUND THEN
v_remarks:=v_remarks||'Invalid trans type';
END;
BEGIN
select cust_account_id into v_cust_id
from apps.hz_cust_accounts hca,
apps.hz_parties hp
where hca.party_id=hp.party_id
and hp.party_name=c_data_val_rec.CUSTOMER_NAME;
EXCEPTION
WHEN NO_DATA_FOUND THEN
v_remarks:=v_remarks||'Invalid Customer name';
END;
BEGIN
select SITE_USE_ID into v_invoice_to
from apps.hz_parties hp,
apps.hz_party_sites_v hps,
apps.hz_cust_acct_sites_all hcasa,
apps.hz_cust_site_uses_all hcsua
where
party_name=c_data_val_rec.CUSTOMER_NAME
and hp.party_id=hps.party_id
and hps.address1 like '%'||c_data_val_rec.BILL_TO||'%'
and hps.site_use_type='BILL_TO'
and hps.party_site_id=hcasa.party_site_id
and hcasa.cust_acct_site_id=hcsua.cust_acct_site_id
and SITE_USE_CODE='BILL_TO'
and hcsua.Status='A';
EXCEPTION
WHEN OTHERS THEN
v_remarks:=v_remarks||'Invalid BILL_TO';
END;
BEGIN
select SITE_USE_ID into v_ship_to
from apps.hz_parties hp,
apps.hz_party_sites_v hps,
apps.hz_cust_acct_sites_all hcasa,
apps.hz_cust_site_uses_all hcsua
where
party_name=c_data_val_rec.CUSTOMER_NAME
and hp.party_id=hps.party_id
and hps.address1 like '%'||c_data_val_rec.SHIP_TO||'%'
and hps.site_use_type='SHIP_TO'
and hps.party_site_id=hcasa.party_site_id
and hcasa.cust_acct_site_id=hcsua.cust_acct_site_id
and SITE_USE_CODE='SHIP_TO'
and hcsua.Status='A';
EXCEPTION
WHEN OTHERS THEN
v_remarks:=v_remarks||'Invalid SHIP_TO';
END;
IF c_data_val_rec.item1 is not null then
BEGIN
select distinct inventory_item_id into v_itemid1
from apps.mtl_system_items_b
where segment1=c_data_val_rec.item1;
EXCEPTION
WHEN OTHERS THEN
v_remarks:=v_remarks||'Invalid ITEM1';
END;
BEGIN
select distinct Item_type into v_item_type1
from apps.mtl_system_items_b
where segment1=c_data_val_rec.item1;
EXCEPTION
WHEN OTHERS THEN
v_remarks:=v_remarks||'Invalid Itemtype1';
END;
END IF;
IF c_data_val_rec.item2 is not null then
BEGIN
select distinct inventory_item_id into v_itemid2
from apps.mtl_system_items_b
where segment1=c_data_val_rec.item2;
EXCEPTION
WHEN OTHERS THEN
v_remarks:=v_remarks||'Invalid ITEM2';
END;
BEGIN
select distinct Item_type into v_item_type2
from apps.mtl_system_items_b
where segment1=c_data_val_rec.item2;
EXCEPTION
WHEN OTHERS THEN
v_remarks:=v_remarks||'Invalid Itemtype2';
END;
END IF;
IF c_data_val_rec.item3 is not null then
BEGIN
select distinct inventory_item_id into v_itemid3
from apps.mtl_system_items_b
where segment1=c_data_val_rec.item3;
EXCEPTION
WHEN OTHERS THEN
v_remarks:=v_remarks||'Invalid ITEM3';
END;
BEGIN
select distinct Item_type into v_item_type3
from apps.mtl_system_items_b
where segment1=c_data_val_rec.item3;
EXCEPTION
WHEN OTHERS THEN
v_remarks:=v_remarks||'Invalid Itemtype3';
END;
END IF;
IF c_data_val_rec.item4 is not null then
BEGIN
select distinct inventory_item_id into v_itemid4
from apps.mtl_system_items_b
where segment1=c_data_val_rec.item4;
EXCEPTION
WHEN OTHERS THEN
v_remarks:=v_remarks||'Invalid ITEM4';
END;
BEGIN
select distinct Item_type into v_item_type4
from apps.mtl_system_items_b
where segment1=c_data_val_rec.item4;
EXCEPTION
WHEN OTHERS THEN
v_remarks:=v_remarks||'Invalid Itemtype4';
END;
END IF;
IF c_data_val_rec.item5 is not null then
BEGIN
select distinct inventory_item_id into v_itemid5
from apps.mtl_system_items_b
where segment1=c_data_val_rec.item5;
EXCEPTION
WHEN OTHERS THEN
v_remarks:=v_remarks||'Invalid ITEM5';
END;
BEGIN
select distinct Item_type into v_item_type5
from apps.mtl_system_items_b
where segment1=c_data_val_rec.item5;
EXCEPTION
WHEN OTHERS THEN
v_remarks:=v_remarks||'Invalid Itemtype5';
END;
END IF;
IF v_remarks is not null then
update XXSC_SALES_ORDER_IMPORT
set process_flag='E', remarks=v_remarks
where iwb_no=c_data_val_rec.iwb_no;
commit;
END IF;
end loop;
for c_data_rec in c_data
loop
select transaction_type_id into v_transaction_type
from OE_TRANSACTION_TYPES_tl
where name like c_data_rec.BILL_TO||'-DOMESTIC';
select cust_account_id into v_cust_id
from hz_cust_accounts hca,
hz_parties hp
where hca.party_id=hp.party_id
and hp.party_name=c_data_rec.CUSTOMER_NAME;
select SITE_USE_ID into v_invoice_to
from hz_parties hp,
hz_party_sites_v hps,
hz_cust_acct_sites_all hcasa,
hz_cust_site_uses_all hcsua
where
party_name=c_data_rec.CUSTOMER_NAME
and hp.party_id=hps.party_id
and hps.address1 like '%'||c_data_rec.BILL_TO||'%'
and hps.site_use_type='BILL_TO'
and hps.party_site_id=hcasa.party_site_id
and hcasa.cust_acct_site_id=hcsua.cust_acct_site_id
and SITE_USE_CODE='BILL_TO'
and hcsua.Status='A';
select SITE_USE_ID into v_ship_to
from hz_parties hp,
hz_party_sites_v hps,
hz_cust_acct_sites_all hcasa,
hz_cust_site_uses_all hcsua
where
party_name=c_data_rec.CUSTOMER_NAME
and hp.party_id=hps.party_id
and hps.address1 like '%'||c_data_rec.SHIP_TO||'%'
and hps.site_use_type='SHIP_TO'
and hps.party_site_id=hcasa.party_site_id
and hcasa.cust_acct_site_id=hcsua.cust_acct_site_id
and SITE_USE_CODE='SHIP_TO'
and hcsua.Status='A';
IF c_data_rec.item1 is not null then
select distinct inventory_item_id into v_itemid1
from apps.mtl_system_items_b
where segment1=c_data_rec.item1;
select distinct Item_type into v_item_type1
from apps.mtl_system_items_b
where segment1=c_data_rec.item1;
ELSE
v_itemid1:= null;
v_item_type1:= null;
END IF;
IF c_data_rec.item2 is not null then
select distinct inventory_item_id into v_itemid2
from apps.mtl_system_items_b
where segment1=c_data_rec.item2;
select distinct Item_type into v_item_type2
from apps.mtl_system_items_b
where segment1=c_data_rec.item2;
ELSE
v_itemid2:=null;
v_item_type2:= null;
END IF;
IF c_data_rec.item3 is not null then
select distinct inventory_item_id into v_itemid3
from apps.mtl_system_items_b
where segment1=c_data_rec.item3;
select distinct Item_type into v_item_type3
from apps.mtl_system_items_b
where segment1=c_data_rec.item3;
ELSE
v_itemid3:=null;
v_item_type3:=null;
END IF;
IF c_data_rec.item4 is not null then
select distinct inventory_item_id into v_itemid4
from apps.mtl_system_items_b
where segment1=c_data_rec.item4;
select distinct Item_type into v_item_type4
from apps.mtl_system_items_b
where segment1=c_data_rec.item4;
ELSE
v_itemid4:= null;
v_item_type4:= null;
END IF;
IF c_data_rec.item5 is not null then
select distinct inventory_item_id into v_itemid5
from apps.mtl_system_items_b
where segment1=c_data_rec.item5;
select distinct Item_type into v_item_type5
from apps.mtl_system_items_b
where segment1=c_data_rec.item5;
ELSE
v_itemid5:= null;
v_item_type5:= null;
END IF;
l_header_rec := OE_ORDER_PUB.G_MISS_HEADER_REC; -- Required attributes (e.g. Order Type and Customer)
l_header_rec.order_type_id := v_transaction_type;
l_header_rec.ordered_date := c_data_rec.iwb_date;
l_header_rec.sold_to_org_id := v_cust_id;
l_header_rec.price_list_id := 93174;
l_header_rec.cust_po_number := c_data_rec.iwb_no;
l_header_rec.ship_to_org_id := v_ship_to;
l_header_rec.invoice_to_org_id := v_invoice_to;
l_header_rec.CONTEXT:='566' ;
l_header_rec.ATTRIBUTE1:=c_data_rec.DELIVERY_TYPE;
-- l_header_rec.freight_term_code = NULL;
l_header_rec.operation := OE_GLOBALS.G_OPR_CREATE;
-- FIRST LINE RECORD. Initialize record to missing
if c_data_rec.item1 is not null and
c_data_rec.item2 is not null and
c_data_rec.item3 is not null and
c_data_rec.item4 is not null and
c_data_rec.item5 is not null then
l_line_tbl(1) := OE_ORDER_PUB.G_MISS_LINE_REC;
l_line_tbl(1).inventory_item_id :=v_itemid1;
l_line_tbl(1).ordered_quantity := c_data_rec.quantity1;
l_line_tbl(1).CALCULATE_PRICE_FLAG :='N';
l_line_tbl(1).UNIT_SELLING_PRICE := c_data_rec.price1;
l_line_tbl(1).UNIT_LIST_PRICE := c_data_rec.price1;
l_line_tbl(1).attribute13:= v_item_type1;
l_line_tbl(1).operation := OE_GLOBALS.G_OPR_CREATE;
--end if;
-- SECOND LINE RECORD
--elsif c_data_rec.item2 is not null then
l_line_tbl(2) := OE_ORDER_PUB.G_MISS_LINE_REC;
l_line_tbl(2).inventory_item_id := v_itemid2;
l_line_tbl(2).ordered_quantity := c_data_rec.quantity2;
l_line_tbl(2).CALCULATE_PRICE_FLAG :='N';
l_line_tbl(2).UNIT_SELLING_PRICE := c_data_rec.price2;
l_line_tbl(2).UNIT_LIST_PRICE := c_data_rec.price2;
l_line_tbl(2).attribute13:=v_item_type2;
l_line_tbl(2).operation := OE_GLOBALS.G_OPR_CREATE;
--end if;
-- THIRD LINE RECORD
--elsif c_data_rec.item3 is not null then
l_line_tbl(3) := OE_ORDER_PUB.G_MISS_LINE_REC;
l_line_tbl(3).inventory_item_id := v_itemid3;
l_line_tbl(3).ordered_quantity := c_data_rec.quantity3;
l_line_tbl(3).CALCULATE_PRICE_FLAG :='N';
l_line_tbl(3).UNIT_SELLING_PRICE := c_data_rec.price3;
l_line_tbl(3).UNIT_LIST_PRICE := c_data_rec.price3;
l_line_tbl(3).attribute13:=v_item_type3;
l_line_tbl(3).operation := OE_GLOBALS.G_OPR_CREATE;
--end if;
-- FOURTH LINE RECORD
--elsif c_data_rec.item4 is not null then
l_line_tbl(4) := OE_ORDER_PUB.G_MISS_LINE_REC;
l_line_tbl(4).inventory_item_id := v_itemid4;
l_line_tbl(4).ordered_quantity := c_data_rec.quantity4;
l_line_tbl(4).CALCULATE_PRICE_FLAG :='N';
l_line_tbl(4).UNIT_SELLING_PRICE := c_data_rec.price4;
l_line_tbl(4).UNIT_LIST_PRICE := c_data_rec.price4;
l_line_tbl(4).attribute13:=v_item_type4;
l_line_tbl(4).operation := OE_GLOBALS.G_OPR_CREATE;
--END IF;
-- FIFTH LINE RECORD
--elsif c_data_rec.item5 is not null then
l_line_tbl(5) := OE_ORDER_PUB.G_MISS_LINE_REC;
l_line_tbl(5).inventory_item_id := v_itemid5;
l_line_tbl(5).ordered_quantity := c_data_rec.quantity5;
l_line_tbl(5).CALCULATE_PRICE_FLAG :='N';
l_line_tbl(5).UNIT_SELLING_PRICE := c_data_rec.price5;
l_line_tbl(5).UNIT_LIST_PRICE := c_data_rec.price5;
l_line_tbl(5).attribute13:= v_item_type5;
l_line_tbl(5).operation := OE_GLOBALS.G_OPR_CREATE;
elsif c_data_rec.item1 is not null and
c_data_rec.item2 is not null and
c_data_rec.item3 is not null and
c_data_rec.item4 is not null then
l_line_tbl(1) := OE_ORDER_PUB.G_MISS_LINE_REC;
l_line_tbl(1).inventory_item_id :=v_itemid1;
l_line_tbl(1).ordered_quantity := c_data_rec.quantity1;
l_line_tbl(1).CALCULATE_PRICE_FLAG :='N';
l_line_tbl(1).UNIT_SELLING_PRICE := c_data_rec.price1;
l_line_tbl(1).UNIT_LIST_PRICE := c_data_rec.price1;
l_line_tbl(1).attribute13:= v_item_type1;
l_line_tbl(1).operation := OE_GLOBALS.G_OPR_CREATE;
--end if;
-- SECOND LINE RECORD
--elsif c_data_rec.item2 is not null then
l_line_tbl(2) := OE_ORDER_PUB.G_MISS_LINE_REC;
l_line_tbl(2).inventory_item_id := v_itemid2;
l_line_tbl(2).ordered_quantity := c_data_rec.quantity2;
l_line_tbl(2).CALCULATE_PRICE_FLAG :='N';
l_line_tbl(2).UNIT_SELLING_PRICE := c_data_rec.price2;
l_line_tbl(2).UNIT_LIST_PRICE := c_data_rec.price2;
l_line_tbl(2).attribute13:=v_item_type2;
l_line_tbl(2).operation := OE_GLOBALS.G_OPR_CREATE;
--end if;
-- THIRD LINE RECORD
--elsif c_data_rec.item3 is not null then
l_line_tbl(3) := OE_ORDER_PUB.G_MISS_LINE_REC;
l_line_tbl(3).inventory_item_id := v_itemid3;
l_line_tbl(3).ordered_quantity := c_data_rec.quantity3;
l_line_tbl(3).CALCULATE_PRICE_FLAG :='N';
l_line_tbl(3).UNIT_SELLING_PRICE := c_data_rec.price3;
l_line_tbl(3).UNIT_LIST_PRICE := c_data_rec.price3;
l_line_tbl(3).attribute13:=v_item_type3;
l_line_tbl(3).operation := OE_GLOBALS.G_OPR_CREATE;
--end if;
-- FOURTH LINE RECORD
--elsif c_data_rec.item4 is not null then
l_line_tbl(4) := OE_ORDER_PUB.G_MISS_LINE_REC;
l_line_tbl(4).inventory_item_id := v_itemid4;
l_line_tbl(4).ordered_quantity := c_data_rec.quantity4;
l_line_tbl(4).CALCULATE_PRICE_FLAG :='N';
l_line_tbl(4).UNIT_SELLING_PRICE := c_data_rec.price4;
l_line_tbl(4).UNIT_LIST_PRICE := c_data_rec.price4;
l_line_tbl(4).attribute13:=v_item_type4;
l_line_tbl(4).operation := OE_GLOBALS.G_OPR_CREATE;
elsif c_data_rec.item1 is not null and
c_data_rec.item2 is not null and
c_data_rec.item3 is not null then
l_line_tbl(1) := OE_ORDER_PUB.G_MISS_LINE_REC;
l_line_tbl(1).inventory_item_id :=v_itemid1;
l_line_tbl(1).ordered_quantity := c_data_rec.quantity1;
l_line_tbl(1).CALCULATE_PRICE_FLAG :='N';
l_line_tbl(1).UNIT_SELLING_PRICE := c_data_rec.price1;
l_line_tbl(1).UNIT_LIST_PRICE := c_data_rec.price1;
l_line_tbl(1).attribute13:= v_item_type1;
l_line_tbl(1).operation := OE_GLOBALS.G_OPR_CREATE;
--end if;
-- SECOND LINE RECORD
--elsif c_data_rec.item2 is not null then
l_line_tbl(2) := OE_ORDER_PUB.G_MISS_LINE_REC;
l_line_tbl(2).inventory_item_id := v_itemid2;
l_line_tbl(2).ordered_quantity := c_data_rec.quantity2;
l_line_tbl(2).CALCULATE_PRICE_FLAG :='N';
l_line_tbl(2).UNIT_SELLING_PRICE := c_data_rec.price2;
l_line_tbl(2).UNIT_LIST_PRICE := c_data_rec.price2;
l_line_tbl(2).attribute13:=v_item_type2;
l_line_tbl(2).operation := OE_GLOBALS.G_OPR_CREATE;
--end if;
-- THIRD LINE RECORD
--elsif c_data_rec.item3 is not null then
l_line_tbl(3) := OE_ORDER_PUB.G_MISS_LINE_REC;
l_line_tbl(3).inventory_item_id := v_itemid3;
l_line_tbl(3).ordered_quantity := c_data_rec.quantity3;
l_line_tbl(3).CALCULATE_PRICE_FLAG :='N';
l_line_tbl(3).UNIT_SELLING_PRICE := c_data_rec.price3;
l_line_tbl(3).UNIT_LIST_PRICE := c_data_rec.price3;
l_line_tbl(3).attribute13:=v_item_type3;
l_line_tbl(3).operation := OE_GLOBALS.G_OPR_CREATE;
end if;
-- CALL TO PROCESS ORDER
OE_Order_PUB.Process_Order(
p_api_version_number=>1.0,
p_header_rec => l_header_rec,
p_line_tbl=> l_line_tbl,
p_line_adj_tbl=> l_line_adj_tbl,
-- OUT variables,
x_header_rec =>l_header_rec
, x_header_val_rec =>l_header_val_rec
, x_Header_Adj_tbl =>l_Header_Adj_tbl
, x_Header_Adj_val_tbl =>l_Header_Adj_val_tbl
, x_Header_price_Att_tbl =>l_Header_price_Att_tbl
, x_Header_Adj_Att_tbl =>l_Header_Adj_Att_tbl
, x_Header_Adj_Assoc_tbl =>l_Header_Adj_Assoc_tbl
, x_Header_Scredit_tbl =>l_Header_Scredit_tbl
, x_Header_Scredit_val_tbl=>l_Header_Scredit_val_tbl
, x_line_tbl =>l_line_tbl
, x_line_val_tbl =>l_line_val_tbl
, x_Line_Adj_tbl =>l_Line_Adj_tbl
, x_Line_Adj_val_tbl =>l_Line_Adj_val_tbl
, x_Line_price_Att_tbl =>l_Line_price_Att_tbl
, x_Line_Adj_Att_tbl =>l_Line_Adj_Att_tbl
, x_Line_Adj_Assoc_tbl =>l_Line_Adj_Assoc_tbl
, x_Line_Scredit_tbl =>l_Line_Scredit_tbl
, x_Line_Scredit_val_tbl =>l_Line_Scredit_val_tbl
, x_Lot_Serial_tbl =>l_Lot_Serial_tbl
, x_Lot_Serial_val_tbl =>l_Lot_Serial_val_tbl
, x_action_request_tbl =>l_action_request_tbl,
x_return_status=> l_return_status,
x_msg_count=> l_msg_count,
x_msg_data=> l_msg_data);
if l_msg_count > 0 then
for l_index in 1..l_msg_count loop
l_msg_data := oe_msg_pub.get(p_msg_index => l_index, p_encoded => 'F');
update XXSC_SALES_ORDER_IMPORT
set process_flag='E',remarks=l_msg_data
where iwb_no=c_data_rec.iwb_no;
commit;
dbms_output.put_line('Order Failed.'||l_msg_data);
end loop;
end if;
-- Check the return status
if l_return_status = FND_API.G_RET_STS_SUCCESS then
update XXSC_SALES_ORDER_IMPORT
set process_flag='Y',ORDER_NUMBER=l_header_rec.order_number,remarks='SUCCESS'
where iwb_no=c_data_rec.iwb_no;
commit;
fnd_file.put_line (fnd_file.output,'Order no:'||l_header_rec.order_number||' Created for the IWB no:'||c_data_rec.iwb_no);
dbms_output.put_line('Order..'||l_header_rec.order_number);
Else
dbms_output.put_line('Order Failed.');
end if;
commit;
END LOOP;
End;
/
Your code always sends 4 line records with each header; it never checks whether a line belongs to the same header or not. Please check your code.
-
Receiver JDBC: Error while doing the Deleting and Inserting new records
Hi All,
I am doing an IDoc-to-JDBC scenario in which I collect and bundle different types of IDocs and then send them to the JDBC receiver. My requirement is to delete the existing records in the database and insert the new records. I have configured it as described in the link
Re: Combining DELETE and INSERT statements in JDBC receiver
The above link shows a single mapping. In my scenario I am using multi-mapping to collect the IDocs in BPM. If I configure a normal mapping it works fine (deleting existing records and inserting new records), but whenever I use multi-mapping I get the following error in the receiver JDBC communication channel: "Error while parsing or executing XML-SQL document: Error processing request in sax parser: No 'action' attribute found in XML document (attribute "action" missing or wrong XML structure)". Can you please tell me what the problem might be?
Thanks & Regards,
T.Purushotham
Hi!
check this out:
JDBC - No 'action' attribute found in XML document - error
JDBC receiver adapter: No 'action' attribute found in XML document
It appears that the inbound payload (the one going from XI to the JDBC adapter) does not have the required tag to specify which SQL action to execute in the receiver system. Maybe the multi-mapping is not creating the desired output message format.
Regards,
Matias. -
Importing and Updating Non-Duplicate Records from 2 Tables
I need some help with the code to import data from one table
into another if it is not a duplicate or if a record has changed.
I have 2 tables, Members and NetNews. I want to check NetNews
and import non-duplicate records from Members into NetNews and
update an email address in NetNews if it has changed in Members. I
figured it could be as simple as checking Members.MemberNumber and
Members.Email against the existence of NetNews.Email and
NetNews.MemberNumber and if a record in NetNews does not exist,
create it and if the email address in Members.email has changed,
update it in NetNews.Email.
Here is what I have from all of the suggestions received from
another category last year. It is not complete, but I am stuck on
the solution. Can someone please help me get this code working?
Thanks!
<cfquery datasource="#application.dsrepl#"
name="qryMember">
SELECT distinct Email,FirstName,LastName,MemberNumber
FROM members
WHERE memberstanding <= 2 AND email IS NOT NULL AND email
<> ' '
</cfquery>
<cfquery datasource="#application.ds#"
name="newsMember">
SELECT distinct MemberNumber
FROM NetNews
</cfquery>
<cfif
not(listfindnocase(valuelist(newsMember.MemberNumber),qryMember.MemberNumber)
AND isnumeric(qryMember.MemberNumber))>
insert into NetNews (Email_address, First_Name, Last_Name,
MemberNumber)
values ('#trim(qryMember.Email)#',
'#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
trim(qryMember.MemberNumber)#')
</cfif>
</cfloop>
</cfquery>
------------------
Dan,
My DBA doesn't have the experience to help with a VIEW. Did I
mention that these are 2 separate databases on different servers?
This project is over a year old now and it really needs to get
finished so I thought the import would be the easiest way to go.
Thanks to your help, it is almost working.
I added some additional code to check for a changed email
address and update the NetNews database. It runs without error, but
I don't have a way to test it right now. Can you please look at the
code and see if it looks OK?
I am also still getting an error on line 10 after the routine
runs. The line that has this code: "and membernumber not in
(<cfqueryparam list="yes"
value="#valuelist(newsmember.membernumber)#
cfsqltype="cf_sql_integer">)" even with the cfif that Phil
suggested.
<cfquery datasource="#application.ds#"
name="newsMember">
SELECT distinct MemberNumber, Email_Address
FROM NetNewsTest
</cfquery>
<cfquery datasource="#application.dsrepl#"
name="qryMember">
SELECT distinct Email,FirstName,LastName,MemberNumber
FROM members
WHERE memberstanding <= 2 AND email IS NOT NULL AND email
<> ' '
and membernumber not in (<cfqueryparam list="yes"
value="#valuelist(newsmember.membernumber)#"
cfsqltype="cf_sql_integer">)
</cfquery>
<CFIF qryMember.recordcount NEQ 0>
<cfloop query ="qryMember">
<cfquery datasource="#application.ds#"
name="newsMember">
insert into NetNewsTest (Email_address, First_Name,
Last_Name, MemberNumber)
values ('#trim(qryMember.Email)#',
'#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
trim(qryMember.MemberNumber)#')
</cfquery>
</cfloop>
</cfif>
<cfquery datasource="#application.dsrepl#"
name="qryEmail">
SELECT distinct Email
FROM members
WHERE memberstanding <= 2 AND email IS NOT NULL AND email
<> ' '
and qryMember.email NEQ newsMember.email
</cfquery>
<CFIF qryEmail.recordcount NEQ 0>
<cfloop query ="qryEmail">
<cfquery datasource="#application.ds#"
name="newsMember">
update NetNewsTest (Email_address)
values ('#trim(qryMember.Email)#')
where email_address = #qryEmail.email#
</cfquery>
</cfloop>
</cfif>
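For comparison, if both tables lived in the same database (they don't here, per the post, so treat this purely as a sketch), the non-duplicate insert and the e-mail refresh could be pushed down into SQL directly; table and column names are taken from the post:

```sql
-- Insert members that are not yet in NetNews
INSERT INTO NetNews (Email_address, First_Name, Last_Name, MemberNumber)
SELECT m.Email, m.FirstName, m.LastName, m.MemberNumber
FROM Members m
WHERE m.memberstanding <= 2
  AND m.Email IS NOT NULL AND m.Email <> ' '
  AND NOT EXISTS (SELECT 1 FROM NetNews n
                  WHERE n.MemberNumber = m.MemberNumber);

-- Refresh e-mail addresses that changed in Members (T-SQL UPDATE...FROM syntax)
UPDATE n
SET Email_address = m.Email
FROM NetNews n
JOIN Members m ON m.MemberNumber = n.MemberNumber
WHERE n.Email_address <> m.Email;
```

Because the two databases sit on different servers, the ColdFusion loop above (or a linked-server/replicated view) is still needed to bridge them; the SQL only shows the intended set logic.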
Thank you again for the help. -
CRM IDES Exports data into existing Non IDES System?
Hello,
Is it possible to import SAP CRM IDES data, using the IDES exports, into an existing non-IDES SAP CRM system without re-installation? The thing is that we already have some data in two clients (100 and 200) that we don't want to lose, and we recently received the IDES DVDs and would like to have that data as well. So, is there any way to get the IDES data without re-installation?
Thanks and Regards,
Vasu
No.
An installation doesn't know anything about "clients"; sapinst deletes the database if it exists and then loads the data from the export DVDs.
Markus -
How to delete existing e-mail signature for all users in outlook 2010
Dears,
How can I delete the existing e-mail signature (or reset it to none) for all users?
If anyone knows where I can find this script or another way of accomplishing this I would be very thankful.
outlook Version : 2010
PS : Not to disable.
Khaja Hameed
If you want to manage signatures for multiple users, your best bet is to use Group Policy.
See:
Setting up a Corporate Signature | HowTo-Outlook
http://www.howto-outlook.com/howto/corporatesignatures.htm
Eric Legault (MVP: Outlook)
Deleting existing payroll results.
Can we delete existing fields in standard infotypes?
And how can we delete existing payroll results from RT and CRT? Please give the procedure.
How can we identify an employee's termination date? Using infotype 41, I mean, which field? Please tell me the procedure.
If possible, please send the infotype fields to my id [email protected]
Thanks in advance.
Hi Vamsi,
If you want to Code the syntax then this might help you
DATA: loeschkey LIKE pcl2-srtfd.
* loeschkey contains Pernr and the sequence number of the Payroll Result.
* This statement can be used with a loop to delete the result if multipal records
* have to be deleted.
DELETE FROM pcl2 CLIENT SPECIFIED
WHERE relid = " ID of the Cluster .... it is country specific
AND client = sy-mandt " Client Number
AND srtfd = loeschkey. "
There are also standard reports available for deleting single or multiple payroll results.
Have a look at this <a href="http://www.sdn.sap.com/irj/sdn/profile?userid=1894412">thread</a>
~Bhawanidutt.
PS: Close the thread and reward points if solved. -
How do I add an Airport Extreme with Time Capsule to an existing non-Apple network?
How do I add an Airport Extreme with Time Capsule to an existing non-Apple network? I have an ISP provided wireless Router that has to remain as the base station. I am able to join my 1/2TB Airport Extreme and Airport Express to the network, but I can't access/use Time Machine.
One option would be to connect the Time Capsule (TC) to the ISP-provided wireless router by Ethernet. You can then configure the TC for a roaming network. Then, depending on what your goal is for the AirPort Express, you can either: 1) Add it to the roaming network, 2) Configure the TC and the Express for an "extended" network, or 3) Configure the Express to "join" the roaming network for AirPlay.
-
My iPhone 5 won't delete videos I've recorded.
My iPhone 5 won't delete videos I've recorded. when I go to select them, the blue trash can doesn't turn blue allowing me to delete them. Help?
Try This...
Close All Open Apps... Sign Out of your Account... Perform a Reset... Try again...
Reset ( No Data will be Lost )
Press and Hold the Sleep/Wake Button and the Home Button at the Same Time...
Wait for the Apple logo to Appear...
Usually takes about 15 - 20 Seconds... ( But can take Longer...)
Release the Buttons...
If no joy... Try a Restore...
1: Connect the device to Your computer and open iTunes.
2: If the device appears in iTunes, select and click Restore on the Summary pane. -
How to delete fixed number of records at a time
Hi,
I have millions of records in one table. I have to purge the first 15 days of data per day, deleting 10,000 records at a time.
Appreciate any ideas from you.
Thanks in Advance.
regards,
Suresh
Hi,
"I have millions of records in one table. I have to purge the first 15 days of data per day, deleting 10,000 records at a time. Appreciate any ideas from you."
Obviously you will need a timestamp.
I have one column which will have the record created time.
Why would you limit it to 10,000 at a time?
I am using Oracle 9i as the back end. My requirement is not to delete more than 10,000 at a time, as the load on this table will be very high.
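The 10,000-row cap can be handled with a simple PL/SQL loop; a sketch, assuming the table has an indexed created-time column (table and column names here are placeholders):

```sql
-- Purge rows older than 15 days, at most 10,000 rows per pass.
BEGIN
  LOOP
    DELETE FROM my_table
     WHERE created_time < SYSDATE - 15
       AND ROWNUM <= 10000;
    EXIT WHEN SQL%ROWCOUNT = 0;  -- stop when nothing is left to purge
    COMMIT;                      -- commit each batch so locks and undo stay small
  END LOOP;
END;
/
```

Committing per batch keeps the lock footprint and undo usage bounded, which addresses the high-load concern above.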
Can no longer open two Firefox tabs at the same time. Firefox will disappear from the screen but won't close; I have to go to Task Manager/Processes to close it, and some links won't activate on websites. As suggested in the support area, I have gone through the process of backing up my profile, deleting the existing profile, removing Firefox from the computer, reinstalling the profile in a fresh folder, and reinstalling Firefox 6. I have done this twice to no effect; it may work as it used to for a while (30 mins), then gets back to its old tricks.
It is a bit irritating.
Sometimes a problem with Firefox may be a result of malware installed on your computer that you may not be aware of.
You can try these free programs to scan for malware, which work with your existing antivirus software:
* [http://www.microsoft.com/security/scanner/default.aspx Microsoft Safety Scanner]
* [http://www.malwarebytes.org/products/malwarebytes_free/ MalwareBytes' Anti-Malware]
* [http://support.kaspersky.com/faq/?qid=208283363 TDSSKiller - AntiRootkit Utility]
* [http://www.surfright.nl/en/hitmanpro/ Hitman Pro]
* [http://www.eset.com/us/online-scanner/ ESET Online Scanner]
[http://windows.microsoft.com/MSE Microsoft Security Essentials] is a good permanent antivirus for Windows 7/Vista/XP if you don't already have one.
Further information can be found in the [[Troubleshoot Firefox issues caused by malware]] article.
Did this fix your problems? Please report back to us! -
What is the best practice for deleting large amounts of records?
Hi,
I need your suggestions on the best practice for regularly deleting large amounts of records from SQL Azure.
Scenario:
I have a SQL Azure database (P1) to which I insert data every day, to prevent the database size grow too fast, I need a way to remove all the records which is older than 3 days every day.
For on-premise SQL server, I can use SQL Server Agent/job, but, since SQL Azure does not support SQL Job yet, I have to use a Web job which scheduled to run every day to delete all old records.
To prevent the table locking when deleting too large amount of records, in my automation or web job code, I limit the amount of deleted records to
5000 and batch delete count to 1000 each time when calling the deleting records stored procedure:
1. Get total amount of old records (older then 3 days)
2. Get the total iterations: iteration = (total count/5000)
3. Call SP in a loop:
for(int i=0;i<iterations;i++)
Exec PurgeRecords @BatchCount=1000, @MaxCount=5000
And the stored procedure is something like this:
DECLARE @table TABLE ([RecordId] INT)

INSERT INTO @table
SELECT TOP (@MaxCount) [RecordId] FROM [MyTable] WHERE [CreateTime] < DATEADD(DAY, -3, GETDATE())

DECLARE @RowsDeleted INTEGER
SET @RowsDeleted = 1
WHILE (@RowsDeleted > 0)
BEGIN
    WAITFOR DELAY '00:00:01'
    DELETE TOP (@BatchCount) FROM [MyTable] WHERE [RecordId] IN (SELECT [RecordId] FROM @table)
    SET @RowsDeleted = @@ROWCOUNT
END
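The two-level batching in the procedure above (collect up to @MaxCount ids, then delete them @BatchCount at a time with commits in between) can be sketched outside T-SQL like this — Python with sqlite3 purely as illustration, with the table and column names taken from the thread and the demo counts scaled down:

```python
import sqlite3

def purge_records(conn, batch_count=1000, max_count=5000):
    """Delete up to max_count old rows, batch_count at a time.
    Returns the total number of rows deleted."""
    # Step 1: collect the ids to delete (the INSERT INTO @table step).
    ids = [r[0] for r in conn.execute(
        "SELECT RecordId FROM MyTable "
        "WHERE CreateTime < datetime('now', '-3 days') LIMIT ?",
        (max_count,),
    )]
    # Step 2: delete them in small batches, committing between batches
    # so locks are held only briefly (the WHILE loop in the procedure).
    deleted = 0
    for i in range(0, len(ids), batch_count):
        chunk = ids[i:i + batch_count]
        marks = ",".join("?" * len(chunk))
        conn.execute(f"DELETE FROM MyTable WHERE RecordId IN ({marks})", chunk)
        conn.commit()
        deleted += len(chunk)
    return deleted

# Small demo with scaled-down counts:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE MyTable (RecordId INTEGER PRIMARY KEY, CreateTime TEXT)")
conn.executemany("INSERT INTO MyTable (CreateTime) VALUES (?)",
                 [("2000-01-01 00:00:00",)] * 700)
print(purge_records(conn, batch_count=100, max_count=500))   # 500
print(conn.execute("SELECT COUNT(*) FROM MyTable").fetchone()[0])  # 200
```

The WAITFOR-style pause between batches is omitted here; in production it gives concurrent writers a chance to get in between batches.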
It basically works, but the performance is not good. For example, it took around 11 hours to delete around 1.7 million records, which is far too long.
Following is the web job log for deleting around 1.7 million records:
[01/12/2015 16:06:19 > 2f578e: INFO] Start getting the total counts which is older than 3 days
[01/12/2015 16:06:25 > 2f578e: INFO] End getting the total counts to be deleted, total count:
1721586
[01/12/2015 16:06:25 > 2f578e: INFO] Max delete count per iteration: 5000, Batch delete count
1000, Total iterations: 345
[01/12/2015 16:06:25 > 2f578e: INFO] Start deleting in iteration 1
[01/12/2015 16:09:50 > 2f578e: INFO] Successfully finished deleting in iteration 1. Elapsed time:
00:03:25.2410404
[01/12/2015 16:09:50 > 2f578e: INFO] Start deleting in iteration 2
[01/12/2015 16:13:07 > 2f578e: INFO] Successfully finished deleting in iteration 2. Elapsed time:
00:03:16.5033831
[01/12/2015 16:13:07 > 2f578e: INFO] Start deleting in iteration 3
[01/12/2015 16:16:41 > 2f578e: INFO] Successfully finished deleting in iteration 3. Elapsed time:
00:03:33.6439434
Per the log, SQL Azure takes more than 3 minutes to delete 5,000 records in each iteration, so the total time is around
11 hours.
Any suggestions to improve the delete performance?
This is one approach:
Assume:
1. There is an index on 'createtime'
2. The peak-time insert rate is N times the average. E.g., if the average is 10,000 rows per hour and peak time is 5 times that, peak is 50,000 per hour. This doesn't have to be precise.
3. The desired maximum number of records deleted per batch is 5,000; this doesn't have to be exact either.
Steps:
1. Find the count of records more than 3 days old (TotalN), say 1,000,000.
2. Dividing TotalN (1,000,000) by 5,000 gives the number of delete batches (200) if inserts are perfectly even. Since they are not, and peak inserts can be 5 times the average, set the number of delete batches to 200 * 5 = 1,000.
3. Dividing 3 days (4,320 minutes) by 1,000 gives 4.32 minutes per time slice.
4. Create a delete statement inside a loop: on iteration I (I from 1 to 1,000), delete records with creation time < ((now - 3 days) - (4,320 - 4.32 * I) minutes), so the cutoff advances by one 4.32-minute slice per iteration.
This way, the number of records deleted per batch is uneven and not known in advance, but should mostly stay within 5,000; you run many more batches, but each batch is very fast.
Frank -
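Frank's time-slicing idea — advancing the deletion cutoff by one small time slice per iteration instead of counting rows — might be sketched like this (Python with sqlite3 purely for illustration; the table name, row distribution, and a 6-day data span are all invented assumptions):

```python
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE MyTable (RecordId INTEGER PRIMARY KEY, CreateTime TEXT)")

now = datetime(2015, 1, 12)
# Simulate 6 days of perfectly even inserts, one row per minute.
conn.executemany(
    "INSERT INTO MyTable (CreateTime) VALUES (?)",
    [((now - timedelta(minutes=m)).strftime("%Y-%m-%d %H:%M:%S"),)
     for m in range(6 * 24 * 60)],
)

boundary = now - timedelta(days=3)   # keep everything at or after this
oldest = now - timedelta(days=6)     # assumed age of the oldest data
iterations = 1000
slice_width = (boundary - oldest) / iterations   # 4.32 minutes here

for i in range(1, iterations + 1):
    # Each iteration moves the cutoff forward by one small slice, so
    # each DELETE touches only a few rows and finishes quickly.
    cutoff = oldest + i * slice_width
    conn.execute("DELETE FROM MyTable WHERE CreateTime < ?",
                 (cutoff.strftime("%Y-%m-%d %H:%M:%S"),))
    conn.commit()

remaining = conn.execute("SELECT COUNT(*) FROM MyTable").fetchone()[0]
print(remaining)   # only rows at or after the 3-day boundary remain
```

The per-batch row count now depends on the insert rate during each slice rather than on a TOP/LIMIT clause, which is exactly the trade-off Frank describes.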
Best way to delete a large number of records without interfering with scheduled tlog backups
I've inherited a system with multiple databases, with DB and tlog backups that run on schedules. There is a list of tables that need a lot of records purged from them. What would be a good approach for deleting the old records?
I've been digging through old posts, reading best practices, etc., but I'm still not sure of the best way to attack it.
Approach #1
A one-time delete that does everything: delete all the old records, in batches of say 50,000 at a time.
After each run through all the tables for that DB, execute a tlog backup.
Approach #2
Create a job that does a similar process as above, except don't loop; only do the batch once. Schedule the job to start, say, on the half hour, assuming the tlog backups run every hour.
Note:
Some of these (well, most) are going to have relations on them.
Hi shiftbit,
According to your description, I have changed the type of this question to a discussion, so that more experts will see the issue and can assist you. When deleting a large number of records from tables, use bulk (batched) deletions so the transaction log does not grow until it runs out of disk space. If you can
take the table offline for maintenance, a complete reorganization is always best, because it does the delete and places the table back into a pristine state.
For more information about deleting a large number of records without affecting the transaction log:
http://www.virtualobjectives.com.au/sqlserver/deleting_records_from_a_large_table.htm
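A rough sketch of the one-bounded-batch-per-run pattern from Approach #2 — each invocation deletes a single batch, so every transaction stays small and the tlog backup between runs can truncate the log (Python with sqlite3 purely for illustration; the table name, column, and batch size are invented):

```python
import sqlite3

def purge_one_batch(conn, batch_size=50000):
    """Delete at most one batch of expired rows; return how many were deleted.
    Meant to run once per scheduled job invocation, between tlog backups,
    so each transaction (and its log impact) stays bounded."""
    cur = conn.execute(
        "DELETE FROM OldData WHERE rowid IN ("
        "  SELECT rowid FROM OldData WHERE Expired = 1 LIMIT ?)",
        (batch_size,),
    )
    conn.commit()
    return cur.rowcount

# Tiny demo with a scaled-down batch size:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE OldData (Expired INTEGER)")
conn.executemany("INSERT INTO OldData (Expired) VALUES (?)",
                 [(1,)] * 120 + [(0,)] * 10)

runs = 0
while purge_one_batch(conn, batch_size=50) > 0:
    runs += 1   # in production, each run is a separate scheduled invocation
print(runs)     # 3 runs: 50 + 50 + 20 expired rows
```

With foreign-key relations, child rows would need to be purged first (or cascading deletes enabled), which this sketch does not cover.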
Hope it can help.
Regards,
Sofiya Li
TechNet Community Support