Inserting an entry into a partition on a particular node
I have the following scenario:
(1) Initiate Cache A with a MapTrigger
(2) Initiate Cache B
(3) Insert Entry A into Cache A; this invokes the MapTrigger
(4) In the MapTrigger, insert Entry B into Cache B
I want to keep Entry A and Entry B on the same node. Data affinity does not work in this scenario, since Cache A and Cache B belong to two different cache services.
Cache A and Cache B are distributed caches.
Is there a way to specify that Entry B should be inserted on the same node the trigger is running on?
Thanks
Dasun.
Dasun.Weerasinghe wrote:
I have the below scenario
(1) Initiate Cache A with a Map Trigger
(2) Initiate Cache B
(3) Insert Entry A into Cache A. This will call the Map Trigger
(4) I am inserting Entry B into Cache B in the Map Trigger
I want to keep Entry A and Entry B in the same node. Data affinity would not work on this scenario since Cache A and Cache B are from two different cache services.
Cache A and Cache B are distributed caches.
Is there a way I can specify the Entry B to be inserted to the same node that the trigger is running ?
Thanks
Dasun.
Hi Dasun,
Coherence cannot provide hard guarantees here: it cannot ensure that you configured both cache services to run storage-enabled on exactly the same nodes, nor that both use the same partition count. Also, lifecycle-related events can cause only one of the two services to die on a particular node...
There is a best-effort PartitionAssignmentStrategy implementation (com.tangosol.net.partition.MirroringAssignmentStrategy) which assumes the configuration is appropriate, but even it cannot guarantee that the partition lifecycles stay synchronized. Therefore, even with it, the trigger should not write to the backing map of the other service. Also, if service B is overwhelmed by operations, invoking service B from within the trigger would make service A run slowly as well.
The only safe approach is to use MirroringAssignmentStrategy and offload the work from the trigger to another (single) thread which carries out the operations on service B. Those operations would hopefully be local, but even if they are not, nothing breaks. The usual idempotency concerns apply, of course, as operations on service A may be re-executed, leading to the trigger running twice.
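To make the offloading approach concrete, here is a minimal, language-agnostic sketch in Python. Plain dicts stand in for the two caches and a queue.Queue feeds the single worker thread; all names are illustrative and none of this is Coherence API.

```python
import queue
import threading

# Hypothetical stand-ins for the two caches; real code would use NamedCache.
cache_a = {}
cache_b = {}

# Work queue feeding a single worker thread, so the trigger on service A
# never calls into service B synchronously.
work = queue.Queue()

def trigger_on_a(key, value):
    """Runs on service A's thread: store the entry, then offload the
    dependent write instead of touching service B directly."""
    cache_a[key] = value
    work.put(("put", key, value))   # may be re-delivered; must be idempotent

def worker():
    while True:
        op = work.get()
        if op is None:              # shutdown sentinel
            break
        _, key, value = op
        cache_b[key] = value        # a plain put is naturally idempotent
        work.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()

trigger_on_a("order-1", {"qty": 5})
trigger_on_a("order-1", {"qty": 5})  # simulated re-execution of the trigger
work.put(None)
t.join()
print(cache_b)  # {'order-1': {'qty': 5}}
```

Because the worker applies a plain put, re-executing the trigger leaves cache B in the same state, which is the idempotency property the answer calls for.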
Best regards,
Robert
Similar Messages
-
How to insert multiple entries in table control
Hi All,
I want to insert multiple entries into a table control in a BDC program. Can anybody help in this regard?
Thanks,
Satish.
Hi,
Go through the following 2 example programs:
This is an example that uploads the bank details of a vendor, whose screen has a table control.
REPORT zprataptable2
NO STANDARD PAGE HEADING LINE-SIZE 255.
DATA : BEGIN OF itab OCCURS 0,
i1 TYPE i,
lifnr LIKE rf02k-lifnr,
bukrs LIKE rf02k-bukrs,
ekorg LIKE rf02k-ekorg,
ktokk LIKE rf02k-ktokk,
anred LIKE lfa1-anred,
name1 LIKE lfa1-name1,
sortl LIKE lfa1-sortl,
land1 LIKE lfa1-land1,
akont LIKE lfb1-akont,
fdgrv LIKE lfb1-fdgrv,
waers LIKE lfm1-waers,
END OF itab.
DATA : BEGIN OF jtab OCCURS 0,
j1 TYPE i,
banks LIKE lfbk-banks,
bankl LIKE lfbk-bankl,
bankn LIKE lfbk-bankn,
END OF jtab.
DATA : cnt(4) TYPE n.
DATA : fdt(20) TYPE c.
DATA : c TYPE i.
INCLUDE bdcrecx1.
START-OF-SELECTION.
CALL FUNCTION 'WS_UPLOAD'
EXPORTING
filename = 'C:\first1.txt'
filetype = 'DAT'
TABLES
data_tab = itab.
CALL FUNCTION 'WS_UPLOAD'
EXPORTING
filename = 'C:\second.txt'
filetype = 'DAT'
TABLES
data_tab = jtab.
LOOP AT itab.
PERFORM bdc_dynpro USING 'SAPMF02K' '0100'.
PERFORM bdc_field USING 'BDC_CURSOR'
'RF02K-KTOKK'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_field USING 'RF02K-LIFNR'
itab-lifnr.
PERFORM bdc_field USING 'RF02K-BUKRS'
itab-bukrs.
PERFORM bdc_field USING 'RF02K-EKORG'
itab-ekorg.
PERFORM bdc_field USING 'RF02K-KTOKK'
itab-ktokk.
PERFORM bdc_dynpro USING 'SAPMF02K' '0110'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFA1-LAND1'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_field USING 'LFA1-ANRED'
itab-anred.
PERFORM bdc_field USING 'LFA1-NAME1'
itab-name1.
PERFORM bdc_field USING 'LFA1-SORTL'
itab-sortl.
PERFORM bdc_field USING 'LFA1-LAND1'
itab-land1.
PERFORM bdc_dynpro USING 'SAPMF02K' '0120'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFA1-KUNNR'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_dynpro USING 'SAPMF02K' '0130'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFBK-BANKN(01)'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=ENTR'.
cnt = 0.
LOOP AT jtab WHERE j1 = itab-i1.
cnt = cnt + 1.
CONCATENATE 'LFBK-BANKS(' cnt ')' INTO fdt.
PERFORM bdc_field USING fdt jtab-banks.
CONCATENATE 'LFBK-BANKL(' cnt ')' INTO fdt.
PERFORM bdc_field USING fdt jtab-bankl.
CONCATENATE 'LFBK-BANKN(' cnt ')' INTO fdt.
PERFORM bdc_field USING fdt jtab-bankn.
IF cnt = 5.
cnt = 0.
PERFORM bdc_dynpro USING 'SAPMF02K' '0130'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFBK-BANKS(01)'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=P+'.
PERFORM bdc_dynpro USING 'SAPMF02K' '0130'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFBK-BANKN(02)'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=ENTR'.
ENDIF.
ENDLOOP.
PERFORM bdc_dynpro USING 'SAPMF02K' '0130'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFBK-BANKS(01)'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=ENTR'.
PERFORM bdc_dynpro USING 'SAPMF02K' '0210'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFB1-FDGRV'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_field USING 'LFB1-AKONT'
itab-akont.
PERFORM bdc_field USING 'LFB1-FDGRV'
itab-fdgrv.
PERFORM bdc_dynpro USING 'SAPMF02K' '0215'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFB1-ZTERM'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_dynpro USING 'SAPMF02K' '0220'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFB5-MAHNA'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_dynpro USING 'SAPMF02K' '0310'.
PERFORM bdc_field USING 'BDC_CURSOR'
'LFM1-WAERS'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_field USING 'LFM1-WAERS'
itab-waers.
PERFORM bdc_dynpro USING 'SAPMF02K' '0320'.
PERFORM bdc_field USING 'BDC_CURSOR'
'RF02K-LIFNR'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=ENTR'.
PERFORM bdc_dynpro USING 'SAPLSPO1' '0300'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=YES'.
PERFORM bdc_transaction USING 'XK01'.
ENDLOOP.
PERFORM close_group.
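The jtab loop above fills at most five bank lines per screen, then sends '=P+' to page down and resets the counter. The same chunking logic, sketched generically in Python (illustrative only, not ABAP):

```python
def chunk_rows(rows, per_screen=5):
    """Yield (screen_index, row_index_on_screen, row) the way the BDC loop
    fills a 5-line table control and pages down when it is full."""
    screen, slot = 0, 0
    for row in rows:
        slot += 1
        yield screen, slot, row
        if slot == per_screen:     # screen full -> '=P+' page-down, reset counter
            screen += 1
            slot = 0

out = list(chunk_rows(range(12)))
print(out[4])   # (0, 5, 4)  last slot on the first screen
print(out[5])   # (1, 1, 5)  first slot after paging down
```

The screen index tells you how many '=P+' page-downs have been issued, and the slot index is what goes into the field name, e.g. 'LFBK-BANKS(01)'.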
* ABAP Name    : ZMM_PR_UPLOAD_MAT
* Description  : PR Upload BDC Program (With Material)
* Created by   : Anji Reddy V
* Created on   : 04/11/2004
* Description  : This program is used to upload the Purchase
*                Requisition data using the transaction ME51N.
* Modification Log:
* Date        Programmer    Correction Description
* 04/11/2004  Anji Reddy    Initial
REPORT zmm_pr_upload_mat
NO STANDARD PAGE HEADING
LINE-SIZE 255.
* Standard Include for Selection Screen
INCLUDE bdcrecx1.
* Internal Table for Upload Data
DATA: BEGIN OF i_pr OCCURS 0,
* Header Screen
sno(3), " SNo
bsart(004), " PR Type
epstp(001), " Item Category
knttp(001), " Account Assignment
eeind(010), " Delivery Date
lpein(001), " Category of Del Date
werks(004), " Plant
lgort(004), " Storage Location
ekgrp(003), " Purchasing Group
matkl(009), " Material Group
bednr(010), " Tracking No
afnam(012), " Requisitioner
* Item Details
matnr(018), " Material No
menge(017), " Quantity
badat(010),
frgdt(010),
preis(014), " Valuation Price
waers(005), " Currency
peinh(005),
wepos(001),
repos(001),
sakto(010), " GL Account
kostl(010), " Cost Center
bnfpo(005),
END OF i_pr.
* Internal Table for Header Data
DATA: BEGIN OF it_header OCCURS 0,
sno(3), " SNo
bsart(004), " PR Type
epstp(001), " Item Category
knttp(001), " Account Assignment
eeind(010), " Delivery Date
werks(004), " Plant
lgort(004), " Storage Location
ekgrp(003), " Purchasing Group
matkl(009), " Material Group
bednr(010), " Tracking No
afnam(012), " Requisitioner
END OF it_header.
* Internal Table for Item Data
DATA: BEGIN OF it_item OCCURS 0,
sno(3), " SNo
matnr(018), " Material No
menge(017), " Quantity
preis(014), " Valuation Price
sakto(010), " GL Account
kostl(010), " Cost Center
END OF it_item.
* Data Variables & Constants
CONSTANTS : c_x VALUE 'X'. " Flag
DATA : v_l(2), " Counter
v_rowno(5), " Row No
v_2(2), " Counter
v_rows LIKE sy-srows, " Rows in TC
v_field(45). " String
* Parameters
PARAMETERS: p_file LIKE ibipparms-path. " Filename
* At selection-screen on value request for file name
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
* Get the F4 values for the file
CALL FUNCTION 'F4_FILENAME'
EXPORTING
program_name = syst-cprog
dynpro_number = syst-dynnr
IMPORTING
file_name = p_file.
* Start of Selection
START-OF-SELECTION.
* Open the BDC session
PERFORM open_group.
* Upload the file into an internal table
CALL FUNCTION 'UPLOAD'
EXPORTING
filename = p_file
filetype = 'DAT'
TABLES
data_tab = i_pr
EXCEPTIONS
conversion_error = 1
invalid_table_width = 2
invalid_type = 3
no_batch = 4
unknown_error = 5
gui_refuse_filetransfer = 6
OTHERS = 7.
IF sy-subrc <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
SORT i_pr BY sno.
LOOP AT i_pr.
MOVE-CORRESPONDING i_pr TO it_item.
APPEND it_item.
CLEAR it_item.
AT END OF sno.
READ TABLE i_pr INDEX sy-tabix.
MOVE-CORRESPONDING i_pr TO it_header.
APPEND it_header.
CLEAR it_header.
ENDAT.
ENDLOOP.
SORT it_header BY sno.
SORT it_item BY sno.
v_rows = sy-srows - 6.
* Upload the data from the internal table
LOOP AT it_header.
* Header Data
PERFORM bdc_dynpro USING 'SAPMM06B' '0100'.
PERFORM bdc_field USING 'BDC_CURSOR'
'EBAN-BEDNR'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_field USING 'EBAN-BSART'
it_header-bsart.
PERFORM bdc_field USING 'RM06B-EPSTP'
it_header-epstp.
PERFORM bdc_field USING 'EBAN-KNTTP'
it_header-knttp.
PERFORM bdc_field USING 'RM06B-EEIND'
it_header-eeind.
PERFORM bdc_field USING 'RM06B-LPEIN'
it_header-lpein.
PERFORM bdc_field USING 'EBAN-WERKS'
it_header-werks.
PERFORM bdc_field USING 'EBAN-LGORT'
it_header-lgort.
PERFORM bdc_field USING 'EBAN-EKGRP'
it_header-ekgrp.
PERFORM bdc_field USING 'EBAN-MATKL'
it_header-matkl.
PERFORM bdc_field USING 'EBAN-BEDNR'
it_header-bednr.
PERFORM bdc_field USING 'EBAN-AFNAM'
it_header-afnam.
* Item Details
v_l = 0.
* To add no. of rows
v_2 = 0 .
* As the screen is showing 13 rows, defaulted to 130
v_rowno = 130 .
LOOP AT it_item WHERE sno = it_header-sno.
v_l = v_l + 1.
IF v_l = 14 .
IF v_2 = 12 .
v_2 = 12 .
v_l = 2 .
* From the second time onwards it displays 12 rows only
v_rowno = v_rowno + 120 .
PERFORM bdc_dynpro USING 'SAPMM06B' '0106'.
PERFORM bdc_field USING 'BDC_CURSOR'
'RM06B-BNFPO'.
PERFORM bdc_field USING 'RM06B-BNFPO'
v_rowno.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
ELSE.
* v_2 initialized to 12 for second-screen purposes
v_2 = 12 .
v_l = 2 .
PERFORM bdc_dynpro USING 'SAPMM06B' '0106'.
PERFORM bdc_field USING 'BDC_CURSOR'
'RM06B-BNFPO'.
PERFORM bdc_field USING 'RM06B-BNFPO'
v_rowno .
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
ENDIF.
ENDIF.
PERFORM bdc_dynpro USING 'SAPMM06B' '0106'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
CLEAR v_field.
CONCATENATE 'EBAN-MATNR(' v_l ')' INTO v_field.
PERFORM bdc_field USING v_field it_item-matnr.
CLEAR v_field.
CONCATENATE 'EBAN-MENGE(' v_l ')' INTO v_field.
PERFORM bdc_field USING v_field it_item-menge.
PERFORM bdc_dynpro USING 'SAPMM06B' '0102'.
PERFORM bdc_field USING 'BDC_CURSOR'
'EBAN-PREIS'.
PERFORM bdc_field USING 'BDC_OKCODE'
'/00'.
PERFORM bdc_field USING 'EBAN-PREIS'
it_item-preis.
PERFORM bdc_dynpro USING 'SAPMM06B' '0505'.
PERFORM bdc_field USING 'BDC_CURSOR'
'EBKN-SAKTO'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=ENTE'.
PERFORM bdc_field USING 'EBKN-SAKTO'
it_item-sakto.
* Cost Center
PERFORM bdc_dynpro USING 'SAPLKACB' '0002'.
PERFORM bdc_field USING 'BDC_CURSOR'
'COBL-KOSTL'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=ENTE'.
PERFORM bdc_field USING 'COBL-KOSTL'
it_item-kostl.
PERFORM bdc_field USING 'BDC_OKCODE'
'=ENTE'.
ENDLOOP.
PERFORM bdc_dynpro USING 'SAPMM06B' '0106'.
PERFORM bdc_field USING 'BDC_CURSOR'
'RM06B-BNFPO'.
PERFORM bdc_field USING 'BDC_OKCODE'
'=BU'.
* Call the transaction
PERFORM bdc_transaction USING 'ME51'.
ENDLOOP.
* Close the BDC session
PERFORM close_group.
reward for useful answers.
Regards,
Anji -
How to guarantee that all the inserted data entries are shipped to the...
How to guarantee that all the inserted data entries are shipped to the correct Coherence caches?
I successfully created the Employees, Address and Phone tables and their corresponding Java classes. I also finished all the JPA and persistence work; any updated data can persist to the corresponding database tables. Now I want to create the corresponding Coherence caches for these Java objects. How do I guarantee that all the inserted data entries are shipped to the correct Coherence caches? For example, that all employee entries are added to the Employees cache and all address entries are shipped to the Address cache (and not to the Phone or Employees caches)?
Thank you
qkc wrote:
How to guarantee that all the inserted data entries are shipped to the correct Coherence caches?
I successfully created the table Employees, Address and Phone tables and their conresponding Java Classes. I also finished all the JPA and persistence works. Any updated data can persist to the corresponding database tables. Now I want to create the corresponding Coherence caches for these java objects. I want to know how to guarantee that all the inserted data entries are shipped to the correct Coherence caches? For example, guarantee all the employee data entries are added to the Employees caches and all the address data entries are shipped to the Address caches (do not ship the address data entries to the Phone or Employees caches)?
Thank you
What persistence architecture are you using?
- Cache stores with JPA/Toplink cache stores?
- JPA implementation with Coherence used as a cache provider for the JPA engine?
- TopLink grid?
- anything else?
Until you provide this bit of info, don't expect a meaningful answer...
Best regards,
Robert -
Transport request prompted when insert new entries for ADCOMP via SM30?
When I insert new entries into table ADCOMP via SM30, the system prompts me for a transport request number when I save the entries. It never asks for a transport number for any other table edited via SM30. Is this something specific to table ADCOMP?
The table maintenance generator is delivered by SAP with authorization group VC (SD User Control) assigned; could this cause the issue I have here?
The user is responsible for updating the entries in production. I created a Z transaction to run SM30 on this table, but in the end it still asks for a transport request number.
Can someone help me out on this?
Thanks for your help in advance.
I had searched OSS notes before I posted my question here. Note 726769 mentions the maintenance view on ADCOMP. As I mentioned in my last reply, I am NOT going to change anything on ADCOMP itself; I just created a custom view ZV_ADCOMP on top of it and changed the attributes on the custom view.
I tried it and it seems to work. I can enter data via SM30 on ZV_ADCOMP, and it lets me save the data without a transport request.
I think my problem has been solved. Thanks to Rob and Suhas for their replies. Points have been awarded to both. -
Best way to insert data into a non indexed partitioned table?
Hi,
I've a rather basic question that I haven't been able to find an answer to so hope that someone may be able to help.
We need to insert sales data into a partitioned table that is range-partitioned on sales date. The incoming sales data can span a number of days, e.g. 31-JUL-2007 through 04-AUG-2007 inclusive. I've come up with two approaches and would like to know which you would recommend, and why.
Approach 1: Load data into each individual partition using a statement like:
INSERT /*+ APPEND */ INTO sales PARTITION (sales_20070731) SELECT * FROM sales_new WHERE sales_date = TO_DATE('31-JUL-2007','DD-MON-YYYY')
Approach 2: Load data into the entire table using a statement like:
INSERT /*+ APPEND */ INTO sales SELECT * FROM sales_new
Thanks
You need to compare both approaches by creating a simple test case.
But there is no advantage in approach 1 unless you have an index on the sales_date column.
With an index, the approaches are comparable and you can choose whichever best fits your requirements.
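To see what range partitioning does with such rows, here is a small Python model of the per-row routing that a plain `INSERT ... SELECT` lets the database do for you. The partition names and bounds are made up for illustration:

```python
import bisect
from datetime import date

# Each partition owns sales dates strictly below its upper bound;
# the last partition catches everything else (like a MAXVALUE partition).
bounds = [date(2007, 8, 1), date(2007, 9, 1)]
names = ["sales_20070731", "sales_20070831", "sales_max"]

def target_partition(sales_date):
    """Route a row to the first partition whose bound exceeds its date."""
    return names[bisect.bisect_right(bounds, sales_date)]

print(target_partition(date(2007, 7, 31)))  # sales_20070731
print(target_partition(date(2007, 8, 4)))   # sales_20070831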
Best Regards,
Alex -
Table Partitioning - Insert on primary, select from previous partition
Hello,
I have a large table storing log data from the application. I want to query this table <TABLEA> and insert the result data to another table <TABLEB>.
At the same time another query is running selecting only from <TABLEB>.
I want to create 2 partitions on the TABLEB where the insert query will use one partition and the select query the second partition.
How can I do that?
alekons wrote:
Hello,
I have a large table storing log data from the application. I want to query this table <TABLEA> and insert the result data to another table <TABLEB>.
At the same time another query is running selecting only from <TABLEB>.
I want to create 2 partitions on the TABLEB where the insert query will use one partition and the select query the second partition.
How can i do that ?what type of partitioning was used?
Oracle support the following partitioning methods:
Range partitioning - data is mapped to partitions based on a range of column values (usually a date column)
Hash partitioning - data is mapped to partitions based on a hashing algorithm, evenly distributing data between the partitions.
List partitioning - data is mapped to partitions based on a list of discrete values.
Interval partitioning - data is mapped to partitions based on an interval
Composite partitioning - combinations of the above methods
Range-Hash partitioning - data is partitioned by range, then hashed to sub-partitions.
Range-List partitioning - data is partitioned by range, then to sub-partitions based on a value list.
Range-range partitioning
List-range partitioning
List-hash partitioning
List-list partitioning
Interval partitioning, an extension of range partitioning. New partitions are automatically added.
System partitioning, application controlled partitioning.
Reference partitioning, partitioned like a parent table (referenced in a referential constraint). -
Insert new entry on SD Document Flow
Hi,
I'm trying to insert a new entry into the SD document flow, without success. I'm trying to use the function module SD_DOCUMENT_FLOW_MAINTAIN, but it's not working and I don't know what I'm missing. I've tried RV_DOCUMENT_FLOW_ADD as well.
I have to insert a connection between the delivery and the invoice document.
Thanks a lot,
Alexandre.
Hi,
the scenario is somewhat similar to an outbound delivery created from a PO, where you then do MIGO and MIRO for billing purposes.
You can link them using tables LIKP and LIPS (delivery header and item), EKBE (PO history) and VBFA (document flow).
In LIKP you have LIFEX, which holds the PO number:
EKKO-EBELN = LIKP-LIFEX.
Once you have the PO number you can proceed to fetch the MIGO and MIRO documents from EKBE and do the necessary processing.
For reference you can check
FM SD_DELIVERY_DETERM_VIA_LIFEX
BADI LE_SHP_GOODSMOVEMENT.
Regards,
Amit -
Insert multiple entries for calendar
I am inserting CREATEDATE and DUEDATE, which could be 9/18/06 and 9/22/06, into a DB as one single entry. I would like a way to calculate the number of days between 9/18/06 and 9/22/06 and insert as many entries as there are days between these dates, so that it shows as follows:
Row 1: Entry 9/18/06
Row 2: Entry 9/19/06
Row 3: Entry 9/20/06
Row 4: Entry 9/21/06
Row 5: Entry 9/22/06
So that I have 5 unique entries in the DB as opposed to just one.
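The desired expansion amounts to one row per day in the closed date interval. A minimal sketch of that calculation (Python, illustrative only; the names are made up):

```python
from datetime import date, timedelta

def expand(start, end):
    """One entry per day from start through end, inclusive."""
    days = (end - start).days
    return [start + timedelta(days=i) for i in range(days + 1)]

rows = expand(date(2006, 9, 18), date(2006, 9, 22))
print(len(rows))          # 5
print(rows[0], rows[-1])  # 2006-09-18 2006-09-22
```

An insert statement executed once per element of that list yields the five rows shown above.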
Thanks
Emmanuel
For anybody interested, I ended up setting up the following, which works fine:
<cfset mydate1 = DateFormat(arguments.DUEDATE, "yyyy,m,dd hh:mm:ss")>
<cfset mydate2 = DateFormat(arguments.LESSONDATE, "yyyy,m,dd hh:mm:ss")>
<cfset intervaldays = "#Int(mydate2 - mydate1)#">
<cfset correctamountofdays = #REreplace(intervaldays, "-", "", "All")#>
<cfloop index="LoopCount" from="0" to="#correctamountofdays#">
  <cfset datevar = DateAdd("d", "#LoopCount#", "#mydate2#")>
  <cfquery datasource="#application.dsn#" name="updateLesson">
    INSERT INTO aip.LESSONPLANS
      (SUBJECT, LESSONDATE, OBJECTIVE, STANDARD, ASSIGNMENT, STRATEGY,
       ASSESSMENT, CREATEDATE, DUEDATE, TID, MODIFICATIONS, COLOR)
    VALUES
      (<cfqueryparam value="#arguments.SUBJECT#" cfsqltype="CF_SQL_VARCHAR">,
       <cfqueryparam value="#datevar#" cfsqltype="CF_SQL_TIMESTAMP">,
       <cfqueryparam value="#arguments.OBJECTIVE#" cfsqltype="CF_SQL_VARCHAR">,
       <cfqueryparam value="#arguments.STANDARD#" cfsqltype="CF_SQL_VARCHAR">,
       <cfqueryparam value="#arguments.ASSIGNMENT#" cfsqltype="CF_SQL_VARCHAR">,
       <cfqueryparam value="#arguments.STRATEGY#" cfsqltype="CF_SQL_VARCHAR">,
       <cfqueryparam value="#arguments.ASSESSMENT#" cfsqltype="CF_SQL_VARCHAR">,
       SYSDATE,
       <cfqueryparam value="#datevar#" cfsqltype="CF_SQL_TIMESTAMP">,
       <cfqueryparam value="#session.lessonPlanTID#" cfsqltype="CF_SQL_VARCHAR">,
       <cfqueryparam value="#arguments.MODIFICATIONS#" cfsqltype="CF_SQL_VARCHAR">,
       <cfqueryparam value="#arguments.color#" cfsqltype="CF_SQL_VARCHAR">)
  </cfquery>
</cfloop> -
URGENT -- Problems when inserting/updating entries in database tables
Hi,
I've created a custom table via SE11, and a custom program that populates entries in that table.
When I ran the program, more than 6,000 entries should have been populated, but when I checked the table, only 2,000 entries were there. I am not sure what's wrong with my table. I tried changing the size category in the technical settings to handle a larger volume of data, but that did not seem to help.
Please advise.
Hello Maria
The resizing of DB tables is done automatically by the SAP system so this is not the problem.
It is more likely that your list of records contains duplicate entries.
How does your report insert the records? By using INSERT or MODIFY? I assume you are using MODIFY.
To be sure about the true number of genuine records simply add the following statement to your coding:
SORT gt_itab BY keyfld1 keyfld2 ... . " sort itab by all key fields
DELETE ADJACENT DUPLICATES FROM gt_itab
COMPARING keyfld1 keyfld2 ... . " delete all duplicate entries having same key fields
DESCRIBE TABLE gt_itab. " fills sy-tfill with number of records
MESSAGE sy-tfill TYPE 'I'. " output of true record number
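The same effect, modeled outside ABAP: MODIFY-style semantics mean rows sharing a key overwrite each other, so 6,000 input rows can collapse to far fewer table entries. A minimal Python sketch (the key distribution is hypothetical):

```python
# Simulate MODIFY semantics: later rows with the same key replace earlier
# ones, so 6,000 input rows collapse to the number of distinct keys.
rows = [{"key": i % 2000, "val": i} for i in range(6000)]

table = {}
for r in rows:
    table[r["key"]] = r   # insert if key absent, overwrite if present

print(len(rows), len(table))   # 6000 2000
```

If the distinct-key count matches the 2,000 entries you see, duplicates are the cause rather than the size category.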
Regards
Uwe -
Dump DBIF_RSQL_INVALID_RSQL when inserting an entry into bbp_pdisc
Hi Experts,
In our SRM scenario, we have different backend systems connected to the SRM portal, and each system sends its POs to SRM. I am currently facing an issue with the SRM system: when I create a PO in some backend systems I can see the PO in the SRM box, but for POs from other systems it gives the short dump DBIF_RSQL_INVALID_RSQL.
On analysis of the dump I found that the error occurs in the INSERT statement while inserting data into table bbp_pdisc, in a standard program.
I am not sure why this is happening only for a few systems. Can anyone let me know why this happens? Any inputs on this are really helpful.
Hi,
Please check the format of table bbp_pdisc and pass the data accordingly;
if there is a data type mismatch, it will give a dump.
regards
Deepak. -
INSERTING VALUES INTO A DDIC TABLE AT A PARTICULAR INDEX VIA WORK AREA
Hi all,
Can I insert a value at a particular index of a DDIC table via a work area with a similar structure?
For example, I have a DDIC table with four fields and create an internal table and a work area with the same structure.
After fetching the values from the DDIC table into my internal table and making some changes, I want to insert a value at a particular index. Can I write:
INSERT wa INTO zmara INDEX 2.
Is that right? Or please suggest the correct code.
You can insert, or rather update, the desired table row by using the primary key combination.
A DB table has at least one primary key. So first SELECT the record to be updated, using the unique primary key combination, into a work area; then change the value of that field in the work area and use MODIFY dbtab FROM workarea. Remember that you cannot change the primary key.
If the key combination in the work area is not found in the DB table, MODIFY inserts the row into ZABC.
Code:
Consider ZABC with two fields, AA and BB, where AA is the primary key and BB a normal field.
Consider a row with AA = '1' and BB = '2'.
data: wa_ZABC type ZABC.
data: i_zabc type table of zabc with header line.
Select single * from ZABC into wa_zabc where AA = '1'.
wa_zabc-bb = '3'.
modify ZABC from wa_zabc.
If you want to change multiple lines, select the rows into an internal table, change the values in a LOOP ... ENDLOOP, and then update the DB table with MODIFY zabc FROM TABLE i_zabc.
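The select-then-modify round trip described above, modeled outside ABAP (a Python dict keyed by the primary key stands in for ZABC; purely illustrative):

```python
# ZABC modeled as {primary_key: row}; MODIFY inserts or overwrites by key.
zabc = {"1": {"AA": "1", "BB": "2"}}

def modify(table, row):
    """MODIFY semantics: insert if the key is absent, update if present."""
    table[row["AA"]] = row

wa = dict(zabc["1"])   # SELECT SINGLE * ... INTO wa WHERE aa = '1'
wa["BB"] = "3"         # change a non-key field in the work area
modify(zabc, wa)

print(zabc["1"]["BB"])                  # 3
modify(zabc, {"AA": "2", "BB": "9"})    # key not found -> row is inserted
print(len(zabc))                        # 2
```

This also shows why "insert at index 2" makes no sense for a DB table: rows are addressed by key, not by position.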
Hope this solves your query..
Regards,
Uday Desai. -
SMQ2 entries - PI dummy activities cannot have input/output nodes
Hi
I've just completed an upgrade of APO from 3.0A to SCM 5.0 SR2.
I'm trying to reconcile differences between R/3 and SCM with DELTAREPORT3 (/sapapo/ccr).
It keeps coming back with a lot of entries for planned orders.
When I try to reconcile (send to APO), the APO inbound queue (SMQ2) fills up with lots of SYSFAIL entries.
Each entry has the same error: "PI dummy activities cannot have input/output nodes".
Can anyone shed some light on why this could happen, and what this Basis guy can do to resolve it?
thanks,
Stephen
Hi Stephen,
I would suggest that there is a problem with master data. Try to work with someone with APO/R3-PP knowledge and compare the PP master data with the APO PPMs or PDSs.
Pick a particular planned order that is failing and work through it methodically, understanding what master data it used when it was generated. Compare this data to the APO master data.
Very general, I know, but quite often master data is at the bottom of these things, especially if you are using PP-PI.
Regards, Mark. -
Insert statement does not insert all records from a partitioned table
Hi
I need to insert records into a table from a partitioned table. I set up a job and, to my surprise, found that the INSERT statement is not inserting all the records from the partitioned table.
For example, when I run a SELECT statement on the partitioned table it returns 400 records, but the INSERT inserts only 100 of them.
can anyone help in this matter.
INSERT INTO TABLENAME (COLUMNS)
SELECT *
  FROM SCHEMA1.TABLENAME1
  JOIN SCHEMA2.TABLENAME2 a
    ON CONDITION
  JOIN SCHEMA2.TABLENAME2 b
    ON CONDITION AND CONDITION
 WHERE CONDITION
   AND CONDITION
   AND CONDITION
   AND CONDITION
   AND CONDITION
 GROUP BY COLUMNS
HAVING SUM(COLUMN) > 0 -
Inserting data into a partition.
Hi,
I am inserting 10 million rows into a partition and it is consuming a lot of time (>24 hrs).
I am using a simple INSERT statement.
Is there a better way to insert data into a partition?
Thanks for your help in advance.
Thanks
Manju
Hello, this has nothing to do with the fact that the target table is partitioned.
One thing you could try first off is a direct-path insert, i.e.:
INSERT /*+ APPEND */ INTO part1
SELECT ...
That will speed it up, but remember to COMMIT directly after the INSERT.
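The bulk-insert-then-single-commit pattern generalizes beyond Oracle. A minimal sketch using SQLite as a stand-in (illustrative only; this is not direct-path loading, just the set-based shape to aim for instead of row-at-a-time commits):

```python
import sqlite3

# In-memory SQLite stands in for the target table; the point is the pattern:
# one set-based insert (executemany here), one COMMIT at the end.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE part1 (id INTEGER, val TEXT)")

rows = [(i, "x") for i in range(10_000)]
conn.executemany("INSERT INTO part1 VALUES (?, ?)", rows)
conn.commit()                     # single commit after the bulk insert

count = conn.execute("SELECT COUNT(*) FROM part1").fetchone()[0]
print(count)  # 10000
```

Committing once after the whole batch avoids paying transaction overhead per row, which is usually where row-by-row loads lose their time.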
Are you sure that the SELECT isn't taking a long time in itself?
What is the value of your cursor_sharing database parameter? -
Hello experts,
I got an easy question for you. How can I insert new entries into a table that initially has no entries? If the table has entries, I can select one of them and insert new ones by debugging, but I don't know how to do it when there are none.
Thanks a lot
Regards
German
Hi,
Go to SE16.
Enter the table name you need, then click the Create Entries (white paper) icon.
Insert your entries.
Finally, click Save.