Group by problem from huge table
Hi,
I have a huge CDR table (main_cdr_table) which contains 700 million records. It is not partitioned; the only index is on the gsm_no column.
I also have another table, gsm_temp, with the following columns: gsm_no, cdr_count, last_digit.
What I want to do is get a sum from main_cdr_table for each gsm_no and store it in another table, like this:
insert into my_temp
select gsm_no, sum(price_amount)
from main_cdr_table
group by gsm_no
As you can see, this way is not efficient. What can you suggest to perform this operation?
Thanks...
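As an aside, the pattern being asked about (aggregate once, store the result in a summary table) can be sketched as below. This is a minimal illustration using SQLite in memory; the table names follow the post but the data is made up, and on Oracle at this scale one would typically drive the same statement with a parallel direct-path insert rather than a plain INSERT.

```python
import sqlite3

# Sketch of aggregating with a single full scan and storing the
# result in a summary table. Schema and data are invented for the
# example; only gsm_no / price_amount come from the post.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE main_cdr_table (gsm_no TEXT, price_amount REAL);
    CREATE TABLE my_temp (gsm_no TEXT, total_amount REAL);
    INSERT INTO main_cdr_table VALUES
        ('5551', 1.0), ('5551', 2.5), ('5552', 4.0);
""")
conn.execute("""
    INSERT INTO my_temp (gsm_no, total_amount)
    SELECT gsm_no, SUM(price_amount)
    FROM main_cdr_table
    GROUP BY gsm_no
""")
rows = dict(conn.execute("SELECT gsm_no, total_amount FROM my_temp"))
print(rows)  # {'5551': 3.5, '5552': 4.0}
```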
Alas, I'm a rather underprivileged user (the thirteenth pig, as we used to say).
Nevertheless, I could try to find out something useful (a report investigating big-table compression should look serious enough in these vacation times).
As far as I can remember, about a year ago I was investigating various table compressions, but using only a 3M-row table and collecting 100% sample statistics to check the results.
There was quite a difference when comparing the number of blocks, but a rather small one when comparing select timings. A 3M-row table seems too small for the task.
A 92M-row table might reveal new facts.
At the moment I know that collecting 100% sample statistics took 4.5 hours.
Regarding automagical things, I cannot check how they get done (I can only read about that in the documentation), but the storage is definitely data-aware: usually you don't need an index for each and every low-cardinality column, since a storage full scan seems to be smart enough to almost always fetch just the blocks where the required rows are located.
I'll let you know about the findings.
Regards
Etbin
Sorry, no further findings possible for now - ORA-01625: unable to extend temp segment by 8192 in tablespace ...
Message was edited by: Etbin
Similar Messages
-
Huge GROUP BY operation on huge table takes lot of time
Hi,
Please find below the process, which takes a lot of time in execution (approx. 5-6 hrs).
The main reasons for this are:
1) It fetches data from a huge table partition (i.e. 18GB of data per day)
2) It performs a GROUP BY operation
3) There is no index on destination_number in the WHERE clause, so it performs a full table scan
I have an idea that I need to change some parameters to make the process faster.
Can you please help with this?
create table tmp_kumar nologging as
SELECT c.series_num , subscriber_id , COUNT(1) cnt , SUM(NVL(total_currency_charge,0))total_currency_charge ,
TRUNC(disconnect_date) FROM
(select * from prepcdr.PREPCDR_MAR_P3_10 partition(disconnect_date_11) union all
select * from prepcdr.PREPCDR_MAR_P3_10 partition(disconnect_date_11_new)) b,
(SELECT series_num, des, created_dt, LENGTH (series_num) len
FROM PREPCDR.HSS_SERIES_MAST where home_ind ='Y'
UNION
SELECT cimd_number, des, created_dt, LENGTH (cimd_number)
FROM PREPCDR.HSS_CIMD_MASTER) c
WHERE b.cdr_call_type = '86'
AND SUBSTR (b.destination_number, 1, c.len) = c.series_num
AND c.len = (SELECT MAX(x.len) FROM (SELECT series_num, des, created_dt, LENGTH (series_num) len
FROM PREPCDR.HSS_SERIES_MAST where home_ind ='Y'
UNION
SELECT cimd_number, des, created_dt, LENGTH (cimd_number) len
FROM PREPCDR.HSS_CIMD_MASTER) x WHERE x.series_num = SUBSTR (b.destination_number, 1, x.len))
AND disconnect_date >= '11-MAR-2010'
AND disconnect_date < '12-MAR-2010'
GROUP BY c.series_num, TRUNC(disconnect_date), subscriber_id
This, most likely, will be more efficient:
SELECT c.series_num,
subscriber_id,
COUNT(1) cnt,
SUM(NVL(total_currency_charge,0)) total_currency_charge,
TRUNC(disconnect_date)
FROM (
select *
from prepcdr.PREPCDR_MAR_P3_10 partition(disconnect_date_11)
union all
select *
from prepcdr.PREPCDR_MAR_P3_10 partition(disconnect_date_11_new)
) b,
(
SELECT DISTINCT series_num,
des,
created_dt,
len
FROM (
SELECT series_num,
des,
created_dt,
len,
RANK() OVER(ORDER BY len DESC) rnk
FROM (
SELECT series_num,
des,
created_dt,
LENGTH(series_num) len
FROM PREPCDR.HSS_SERIES_MAST
where home_ind ='Y'
UNION ALL
SELECT cimd_number,
des,
created_dt,
LENGTH(cimd_number)
FROM PREPCDR.HSS_CIMD_MASTER
)
)
WHERE rnk = 1
) c
WHERE b.cdr_call_type = '86'
AND SUBSTR(b.destination_number,1,c.len) = c.series_num
AND disconnect_date >= DATE '2010-03-11'
AND disconnect_date < DATE '2010-03-12'
GROUP BY c.series_num,
TRUNC(disconnect_date),
subscriber_id
/
SY. -
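The core of the query above is a "longest matching prefix" lookup: each destination number is matched against a series table, keeping only the longest prefix that matches. A minimal per-row sketch of that idea, using SQLite in memory with invented table names and data (not the poster's schema), replacing the correlated MAX subquery with a window function:

```python
import sqlite3

# Per call row, rank matching prefixes by length and keep the longest.
# All names and data here are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE calls  (destination_number TEXT, charge REAL);
    CREATE TABLE series (series_num TEXT, des TEXT);
    INSERT INTO calls  VALUES ('9715551234', 2.0), ('4415550000', 3.0);
    INSERT INTO series VALUES ('971', 'UAE'), ('9715', 'UAE mobile'), ('44', 'UK');
""")
rows = conn.execute("""
    SELECT destination_number, des, charge FROM (
        SELECT c.destination_number, s.des, c.charge,
               ROW_NUMBER() OVER (
                   PARTITION BY c.rowid
                   ORDER BY LENGTH(s.series_num) DESC) rn
        FROM calls c
        JOIN series s
          ON SUBSTR(c.destination_number, 1, LENGTH(s.series_num)) = s.series_num
    ) WHERE rn = 1
""").fetchall()
print(rows)
```

Note that this ranks prefixes per call row; the rewrite in the thread ranks lengths globally, which is only equivalent when one length dominates.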
hi,
we need to delete from a huge table (~11 million records) based on a column lookup in another table. Other than a general DELETE statement, is there any better way to do a fast delete?
thanks.
SHMYG@rex> create table test (f1 varchar2(10));
SHMYG@rex> create table test1 (f1 varchar2(10));
SHMYG@rex> insert into test values ('a');
SHMYG@rex> insert into test values ('b');
SHMYG@rex> insert into test1 values ('a');
SHMYG@rex> select * from test;
F1
a
b
SHMYG@rex> select * from test1;
F1
a
SHMYG@rex> delete from test where exists (select * from test1 where test.f1 = test1.f1);
SHMYG@rex> select * from test;
F1
b -
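The transcript above can be reproduced as a self-contained sketch (SQLite in memory, same toy tables). For genuinely large tables, a commonly discussed alternative is to create a new table of the rows you keep rather than deleting row by row, but the basic statement is this:

```python
import sqlite3

# Delete rows from one table where a matching row exists in another,
# mirroring the test/test1 transcript above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE test  (f1 TEXT);
    CREATE TABLE test1 (f1 TEXT);
    INSERT INTO test  VALUES ('a'), ('b');
    INSERT INTO test1 VALUES ('a');
""")
conn.execute("""
    DELETE FROM test
    WHERE EXISTS (SELECT * FROM test1 WHERE test.f1 = test1.f1)
""")
left = [r[0] for r in conn.execute("SELECT f1 FROM test")]
print(left)  # ['b']
```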
Group by problem at pivot table
Hi,
firstly I am making a table; everything is OK.
After I click pivot table, the st_desc column writes the same value again; I couldn't understand the reason.
Please check my screenshots below,
http://www.odilibrary.com/index.php/oracle-data-integrator/99-group-by-problem
kindly advise please
That was my fault:
there is a null at the beginning of one value, so they are different,
sorry -
Selecting Max Value from Huge Table
Dear Professionals,
I have a huge table (20,000,000+ records) with the following columns:
[Time], [User], [Value]
The values in the [Value] column can recur for a single User at a Time, e.g.
2015-01-01, Me, X
2015-01-01, Me, Y
2015-01-01, Me, X
2015-01-02, Me, Z
2015-01-02, Me, X
2015-01-02, Me, Z
For each day, and for every user, I want to have the maximum recurring value (the value that occurs most often):
2015-01-01, Me, X
2015-01-02, Me, Z
to be inserted into another table.
PS: I want the MOST optimized way of achieving this functionality, because I am expecting the raw table to grow over time, so PERFORMANCE is of great consideration.
I would really appreciate it if somebody can help me.
Regards
I can think of two techniques, based on the data selectivity:
1) using the ROW_NUMBER function
2) using the CROSS APPLY operator
USE Northwind;
-- Solution 1: CROSS APPLY
SELECT S.SupplierID, S.CompanyName, CA.ProductID, CA.UnitPrice
FROM dbo.Suppliers AS S
CROSS APPLY
(SELECT TOP (10) *
FROM dbo.Products AS P
WHERE P.SupplierID = S.SupplierID
ORDER BY UnitPrice DESC, ProductID DESC) AS CA
ORDER BY S.SupplierID, CA.UnitPrice DESC, CA.ProductID DESC;
-- Solution 2: ROW_NUMBER
WITH C AS
(
SELECT S.SupplierID, S.CompanyName, P.ProductID, P.UnitPrice,
ROW_NUMBER() OVER(
PARTITION BY P.SupplierID
ORDER BY P.UnitPrice DESC, P.ProductID DESC) AS RowNum
FROM dbo.Suppliers AS S
JOIN dbo.Products AS P
ON P.SupplierID = S.SupplierID
)
SELECT SupplierID, CompanyName, ProductID, UnitPrice
FROM C
WHERE RowNum <= 10
ORDER BY SupplierID, ProductID DESC, UnitPrice DESC;
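The Northwind examples above show the general top-N-per-group patterns; applied to the actual question (most frequent value per day and user), the ROW_NUMBER technique looks like this. A hedged sketch using SQLite in memory, with the sample data from the post and invented column names:

```python
import sqlite3

# Count occurrences per (day, user, value), then keep the top-ranked
# value per (day, user) with ROW_NUMBER. Column names are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw (t TEXT, usr TEXT, val TEXT);
    INSERT INTO raw VALUES
        ('2015-01-01','Me','X'), ('2015-01-01','Me','Y'),
        ('2015-01-01','Me','X'), ('2015-01-02','Me','Z'),
        ('2015-01-02','Me','X'), ('2015-01-02','Me','Z');
""")
rows = conn.execute("""
    SELECT t, usr, val FROM (
        SELECT t, usr, val,
               ROW_NUMBER() OVER (PARTITION BY t, usr
                                  ORDER BY cnt DESC) rn
        FROM (SELECT t, usr, val, COUNT(*) AS cnt
              FROM raw GROUP BY t, usr, val)
    ) WHERE rn = 1
    ORDER BY t
""").fetchall()
print(rows)  # [('2015-01-01', 'Me', 'X'), ('2015-01-02', 'Me', 'Z')]
```

Note that ties (two values with the same count) are broken arbitrarily here; add a secondary ORDER BY term if deterministic tie-breaking matters.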
Best Regards,
Uri Dimant, SQL Server MVP
http://sqlblog.com/blogs/uri_dimant/
Hi,
I am extracting data from a table which has more than 25 million records, without using any WHERE condition.
45000 records per file, in multiple files.
What is the best way to write the select statement?
Thanks,
fract
Don't know why you're extracting 25,000,000 records, but PACKAGE SIZE would be my first choice... something like:
types: gtyp_int type i.
data: itab type table of <dbtab>.
parameters: p_pkg type i default 45000.
start-of-selection.
perform myform.
form myform.
data: file_cnt type gtyp_int.
select * from <dbtab> into table itab
package size p_pkg.
file_cnt = file_cnt + 1.
* do something with itab contents
case file_cnt.
when 1.
append lines of itab to .... "or do a table copy.
when 2.
endcase.
endselect.
endform. -
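The PACKAGE SIZE idea above is simply chunked extraction: fetch a fixed number of rows at a time and process each package (here, one file per package) instead of loading all 25 million rows at once. A minimal language-neutral sketch using SQLite in memory, with an invented table:

```python
import sqlite3

# Fetch rows in fixed-size packages, analogous to ABAP's
# SELECT ... PACKAGE SIZE. Table and sizes are invented.
def extract_in_packages(conn, package_size):
    """Yield lists of at most package_size rows from big_table."""
    cur = conn.execute("SELECT id FROM big_table ORDER BY id")
    while True:
        package = cur.fetchmany(package_size)
        if not package:
            break
        yield package  # e.g. write this package to its own file

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big_table (id INTEGER)")
conn.executemany("INSERT INTO big_table VALUES (?)",
                 [(i,) for i in range(10)])
conn.commit()
sizes = [len(p) for p in extract_in_packages(conn, 4)]
print(sizes)  # [4, 4, 2]
```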
Download problem from 'Z' table
Hi
I am facing a very strange problem.
I have created a 'Z' table which now has 2800 records, but when I download it from the database I get only 1627 records.
What might be the problem?
Technical Settings :
Data class APPL0 Master data, transparent tables
Size category 3 Data records expected: 7,700 to 31,000
Then I tried to download another standard table from the database (save as local file) which contains 5000 records, and it got all 5000 records.
So in this case, what can be done?
Please help me on this.
regards
rajan
DATA : IT_TAB TYPE STANDARD TABLE OF ZMM_OPENSEZ.
SELECT * FROM ZMM_OPENSEZ INTO TABLE IT_TAB .
DATA: LOC_FILENAME TYPE STRING.
LOC_FILENAME = P_FILE.
* CONCATENATE P_FILE '.XLS' INTO LOC_FILENAME.
CLEAR IT_FIELDNAMES.
REFRESH IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Client'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Start Date of Period'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Last Date of Period'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Plant'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Project Definition'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Material Number'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Valuation Type'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Valution Type Description'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'SerialNo'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'ITEM GROUP LONG TEXT'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'ANNEXURE NO'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'POST1'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Number of Material Document'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Posting Date in the Document'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Movement Type (Inventory Management)'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Material Description (Short Text)'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Account Number of Vendor or Creditor'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'NAME'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Unit of Entry'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Purchasing Document Number'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Unit Rate'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Opening Proj Qty'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Opening Proj Amt'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Opening CS01 Qty'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Opening CS01 Amt'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'RECEIPT QTY'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'RECEIPT AMT'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'ISSUE QTY'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'ISSUE AMT'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'TRANSFER QTY'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'TRANSFER AMT'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Consumed Return Qty'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Consumed Return Amt'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Contractor Return Qty'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Contractor Return Amt'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'CS01 Consump. Qty'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'CS01 Consump. Amt'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'CS01 Consump. Ret Qty'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'CS01 Consump. Ret Amt'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Work in Progress Qty'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Work in Progress Amt'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Closing Stock with APL'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Total Closing Stock'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Total Closing Stock Amt'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'FINALUPDATE'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Modified By'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
IT_FIELDNAMES-STRING = 'Modified on'.
APPEND IT_FIELDNAMES.CLEAR IT_FIELDNAMES.
FILE_NAME = P_FILE.
CALL FUNCTION 'GUI_DOWNLOAD'
EXPORTING
* BIN_FILESIZE =
FILENAME = FILE_NAME
FILETYPE = 'ASC'
* APPEND = ' '
WRITE_FIELD_SEPARATOR = 'X'
* HEADER = '00'
* TRUNC_TRAILING_BLANKS = ' '
* WRITE_LF = 'X'
* COL_SELECT = ' '
* COL_SELECT_MASK = ' '
* DAT_MODE = 'X'
* CONFIRM_OVERWRITE = ' '
* NO_AUTH_CHECK = ' '
* CODEPAGE = ' '
* IGNORE_CERR = ABAP_TRUE
* REPLACEMENT = '#'
* WRITE_BOM = ' '
* TRUNC_TRAILING_BLANKS_EOL = 'X'
* WK1_N_FORMAT = ' '
* WK1_N_SIZE = ' '
* WK1_T_FORMAT = ' '
* WK1_T_SIZE = ' '
* WRITE_LF_AFTER_LAST_LINE = ABAP_TRUE
* SHOW_TRANSFER_STATUS = ABAP_TRUE
* IMPORTING
* FILELENGTH =
TABLES
DATA_TAB = IT_TAB
FIELDNAMES = IT_FIELDNAMES.
* EXCEPTIONS
* FILE_WRITE_ERROR = 1
* NO_BATCH = 2
* GUI_REFUSE_FILETRANSFER = 3
* INVALID_TYPE = 4
* NO_AUTHORITY = 5
* UNKNOWN_ERROR = 6
* HEADER_NOT_ALLOWED = 7
* SEPARATOR_NOT_ALLOWED = 8
* FILESIZE_NOT_ALLOWED = 9
* HEADER_TOO_LONG = 10
* DP_ERROR_CREATE = 11
* DP_ERROR_SEND = 12
* DP_ERROR_WRITE = 13
* UNKNOWN_DP_ERROR = 14
* ACCESS_DENIED = 15
* DP_OUT_OF_MEMORY = 16
* DISK_FULL = 17
* DP_TIMEOUT = 18
* FILE_NOT_FOUND = 19
* DATAPROVIDER_EXCEPTION = 20
* CONTROL_FLUSH_ERROR = 21
* OTHERS = 22
IF SY-SUBRC = 0.
MESSAGE S229(ZMM).
ELSE.
MESSAGE S230(ZMM).
ENDIF.
it_tab contains 2800 records ..
Regards -
Data fetch problem from EBAN table
Hi All,
I have a problem in data fetching. My SQL statement is
SELECT A~BANFN
A~FRGDT
A~MATNR
A~MENGE
A~MEINS
A~AFNAM
A~EKGRP
A~PRIO_URG
A~STATU
A~RLWRT
A~EBELN
A~LOEKZ
A~EBELP
A~FRGKZ
INTO CORRESPONDING FIELDS OF TABLE ITAB_DATA
FROM EBAN AS A
WHERE A~STATU IN S_STATU
AND A~EKGRP = S_EKGRP
AND A~BANPR = '05'
AND AFNAM IN P_AFNAM
AND BEDNR IN P_BEDNR .
In the EBAN table, the data in the AFNAM field appears in mixed case, like 'Mech' and 'mech'.
Now, on the selection screen, if the user enters Mech, the system picks only the rows where 'Mech' appears, but the requirement is that it should pick all data related to any case variant ('Mech', 'mech', ...) in the AFNAM field. How do I do this?
Thanks and regards,
Rajesh Vasudeva
Hello,
What you ask is not easy, but it is feasible.
We had the same request to make a case-insensitive search on a text field.
As a reference for our development we took the following example:
[http://wiki.sdn.sap.com/wiki/display/Snippets/CaseInsensitiveSearchHelpExitforMaterialGroup|http://wiki.sdn.sap.com/wiki/display/Snippets/CaseInsensitiveSearchHelpExitforMaterialGroup]
In short: the idea is that first of all you build up a list of all possible values in a separate internal table.
Then use this separate internal table in the FOR ALL ENTRIES clause when you perform the select on the actual data.
Success.
Wim -
Printing Problem from VBFA table (Transportation Management)
Hello experts,
I am getting a problem with the printout from a Z program (execution) in which some information regarding 'Delivery No.' and 'Invoice No.' is coming.
Can anyone tell me the reason for not getting the printout?
I am not getting the print preview either.
This problem started only a few days ago; previously it was running fine.
Regards
Yogesh Sharma
thanks
-
Problems inserting a huge amount of lines in a table
Dear all:
I have a process that reads data from one table (table A), makes some calculations and writes the modified data into another table (table B).
For each line in table A, the process will write from 1 to 4 lines into table B. The calculation is VERY simple.
The process reads 1 million lines from table A and writes 4 million lines into table B, and I need to do this in 2 hours.
I have a sun/solaris server with 8 processors and an EMC storage (the machine is really fast), but I can not complete the process in a reasonable time.
The problem is that after 15 minutes the Oracle database slows down A LOT. If I query the v$session_wait view, I see "log buffer wait" events, and there is nothing I can do to solve this problem (many DBAs have tried).
Is there a special way to insert a huge amount of data into a table in a fast way?
I really appreciate any help!
I am extracting once from the source and inserting into the destination table, making calculations from the cursor variables.
I heard that the problem is heavy contention at the Oracle I/O slave. I will try to use direct load and parallelize the process; I hope this can work!
Thank you!
Josi,
Since you are extracting from one source table and inserting into a target table, and you mentioned you are generating 4 target rows for each source row, I'd like to know what your OWB mapping looks like; i.e. how did you model the "row-multiplier" in the OWB mapping editor... using a splitter like this:
                         +----> Calculate() --> Target1
                         |
Source1 ---> Splitter ---+----> Calculate() --> Target1
                         |
                         +----> Calculate() --> Target1
                         |
                         +----> Calculate() --> Target1
If you use the splitter and run the map in set-based mode, you are effectively extracting the table Source1 four times... quite costly... you will probably get better performance by running the map in row-based mode and using bulk mode... again, my recommendations are based on guesses about what your mapping looks like and where/when you do the calculation in the dataflow graph.
In 9i there is a multi-table INSERT statement which will provide you with a set-based splitter operation; we will make support for multi-table INSERT available in our upcoming release. -
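The row-multiplier being discussed (one source row feeding up to four target rows in a single pass over the source) can be sketched generically as below. This uses SQLite in memory with invented tables and a small cross-joined "multiplier" row set; it stands in for the set-based splitter / multi-table insert, not for OWB's actual generated code:

```python
import sqlite3

# One scan of src produces 4 tgt rows per src row by cross-joining
# with a 4-row "branch" set. Tables, columns, and the amount*branch
# calculation are all invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, branch INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0);
""")
conn.execute("""
    INSERT INTO tgt (id, branch, amount)
    SELECT s.id, m.branch, s.amount * m.branch
    FROM src s
    CROSS JOIN (SELECT 1 AS branch UNION ALL SELECT 2
                UNION ALL SELECT 3 UNION ALL SELECT 4) m
""")
n = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
print(n)  # 8: 2 source rows x 4 branches, from one source scan
```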
Problem fetching data from multiple tables
Hi,
I am facing a problem in fetching data from 3 different tables. I am able to get the data from 2 tables, but when I try accessing data from the third table, I get redundant data. Please help; my query is as under:
select a.*,b.*, c.BidAmount from carregister a , watchlist b, Bidmaster c where a.CarID= b.CarID and b.UserID = 23 and c.BidAmount=(select max(BidAmount) from BidMaster where carid=a.carid group by c.BidAmount);
Regards,
Ujjwal B Soni
<Software Developer>
Baroda Gujarat India
Hi,
I got the solution. The solved query is as under :
select a.*,b.*,(select max(BidAmount) from BidMaster where carid=b.carid) as MAX_AMT from carregister a , watchlist b where a.CarID= b.CarID and b.UserID=23;
Thank you all for replying me.
Warm Regards,
Ujjwal B Soni
<Software Developer>
<Baroda Gujarat India> -
Problem in Creating excel file from internal table
Hi Experts,
Here I am having a problem with creating an Excel file from an internal table which has around 15 records. When I generate the Excel file, each row gets only up to 255 characters; after that it breaks, and a new record is started in the next row. How should I get all the contents into one row of the Excel sheet? How do I resolve this issue? Can anyone give me some idea or sample code?
Thanks & Regards,
poorna
Your download to Excel is using a single field... then it will download only 255 chars... maybe from the internal table you can download the fields and the Excel sheet gets downloaded with data... check the below code:
*& Report ZETA_EXCEL_DOWNLOAD_CLIPBOARD *
report zeta_excel_download_clipboard .
include ole2incl.
data: w_cell1 type ole2_object,
w_cell2 type ole2_object.
*--- Ole data Declarations
data: h_excel type ole2_object, " Excel object
h_mapl type ole2_object, " list of workbooks
h_map type ole2_object, " workbook
h_zl type ole2_object, " cell
h_f type ole2_object, " font
gs_interior type ole2_object, " Pattern
worksheet type ole2_object,
h_cell type ole2_object,
h_cell1 type ole2_object,
range type ole2_object,
h_sheet2 type ole2_object,
h_sheet3 type ole2_object,
gs_font type ole2_object,
flg_stop(1) type c.
* Internal table Declaration
data: begin of t_excel occurs 0,
vkorg(20) type c, "Sales Org
vbtyp(20) type c, "Document Category
auart(20) type c, "Document Type
ernam(20) type c, "Created By
vbeln(20) type c, "Document Number
posnr(20) type c, "Item Number
erdat(20) type c, "Created Date
vdatu(20) type c, "Header Requested Delivery Date
reqdat(20) type c, "Request date
condat(20) type c, "Confirm date
lifsk(20) type c, "Header Block
txt30(30) type c, "Order User Status Description
lifsp(20) type c, "Line Block
dispo(20) type c, "MRP Controller
dsnam(20) type c, "MRP Controller Description
vmsta(20) type c, "Material Sales Status
kunnr(20) type c, "Sold To
cname(35) type c, "Sold To Name
regio(20) type c, "State
cufd(10) type c, "CUD
bstnk(20) type c, "PO#
bsark(20) type c, "Ordering Method
matnr(20) type c, "Material
maktx(35) type c, "Material Description
t200(20) type c, "T200
vtext(20) type c, "T200 Description
matkl(20) type c, "Material Group
zzbomind(7) type c, "BOM Indicator
ostat(20) type c, "Order Status
cmgst(20) type c, "CRD
inco1(20) type c, "Incoterms
oqty(20) type c, "Order Quantity
pqty(20) type c, "Open Quantity
unit(20) type c, "UOM
onet(20) type c, "Order Value
pnet(20) type c, "Open Value
curr(20) type c, "Currency key
so_bezei like tvkbt-bezei,"Sales Office
sg_bezei like tvgrt-bezei,"Sales Group
bname(20) type c, "Ordering Party
contact(20) type c, "Contact Name
telf1(20) type c, "Contact telf1
reqqty(20) type c, "Item Request qty
reqval(20) type c, "Item Request value
conqty(20) type c, "Item Confirm qty
conval(20) type c, "Item Confirm value
zzrev(02) type c, "Revenue recognition acceptance
bezei(20) type c, "Revenue recognition text
vgbel(20) type c, "Reference Order for RETURNS
0008text(255) type c, "Internal Order Comment Text
end of t_excel.
data: t_excel_bckord like t_excel occurs 0 with header line,
t_excel_bcklog like t_excel occurs 0 with header line,
t_excel_blkord like t_excel occurs 0 with header line.
types: data1(1500) type c,
ty type table of data1.
data: it type ty with header line,
it_2 type ty with header line,
it_3 type ty with header line,
rec type sy-tfill,
deli(1) type c,
l_amt(18) type c.
data: begin of hex,
tab type x,
end of hex.
field-symbols: <fs> .
constants cns_09(2) type n value 09.
assign deli to <fs> type 'X'.
hex-tab = cns_09.
<fs> = hex-tab.
data gv_sheet_name(20) type c .
* M A C R O Declaration
define ole_check_error.
if &1 ne 0.
message e001(zz) with &1.
exit.
endif.
end-of-definition.
t_excel_bckord-vkorg = 'ABC'.
t_excel_bckord-vbtyp = 'DEF'.
t_excel_bckord-auart = 'GHI'.
t_excel_bckord-ernam = 'JKL'.
t_excel_bckord-vbeln = 'MNO'.
t_excel_bckord-0008text = 'XYZ'.
append t_excel_bckord.
t_excel_bckord-vkorg = 'ABC1'.
t_excel_bckord-vbtyp = 'DEF1'.
t_excel_bckord-auart = 'GHI1'.
t_excel_bckord-ernam = 'JKL1'.
t_excel_bckord-vbeln = 'MNO1'.
t_excel_bckord-0008text = 'XYZ1'.
append t_excel_bckord.
t_excel_bckord-vkorg = 'ABC2'.
t_excel_bckord-vbtyp = 'DEF2'.
t_excel_bckord-auart = 'GHI2'.
t_excel_bckord-ernam = 'JKL2'.
t_excel_bckord-vbeln = 'MNO2'.
t_excel_bckord-0008text = 'XYZ2'.
append t_excel_bckord.
t_excel_bcklog-vkorg = 'ABC'.
t_excel_bcklog-vbtyp = 'DEF'.
t_excel_bcklog-auart = 'GHI'.
t_excel_bcklog-ernam = 'JKL'.
t_excel_bcklog-vbeln = 'MNO'.
t_excel_bcklog-0008text = 'XYZ'.
append t_excel_bcklog.
t_excel_bcklog-vkorg = 'ABC1'.
t_excel_bcklog-vbtyp = 'DEF1'.
t_excel_bcklog-auart = 'GHI1'.
t_excel_bcklog-ernam = 'JKL1'.
t_excel_bcklog-vbeln = 'MNO1'.
t_excel_bcklog-0008text = 'XYZ1'.
append t_excel_bcklog.
t_excel_bcklog-vkorg = 'ABC2'.
t_excel_bcklog-vbtyp = 'DEF2'.
t_excel_bcklog-auart = 'GHI2'.
t_excel_bcklog-ernam = 'JKL2'.
t_excel_bcklog-vbeln = 'MNO2'.
t_excel_bcklog-0008text = 'XYZ2'.
append t_excel_bcklog.
t_excel_bcklog-vkorg = 'ABC3'.
t_excel_bcklog-vbtyp = 'DEF3'..
t_excel_bcklog-auart = 'GHI3'.
t_excel_bcklog-ernam = 'JKL3'.
t_excel_bcklog-vbeln = 'MNO3'.
t_excel_bcklog-0008text = 'XYZ3'.
append t_excel_bcklog.
t_excel_blkord-vkorg = 'ABC'.
t_excel_blkord-vbtyp = 'DEF'.
t_excel_blkord-auart = 'GHI'.
t_excel_blkord-ernam = 'JKL'.
t_excel_blkord-vbeln = 'MNO'.
t_excel_blkord-0008text = 'XYZ'.
append t_excel_blkord.
t_excel_blkord-vkorg = 'ABC1'.
t_excel_blkord-vbtyp = 'DEF1'.
t_excel_blkord-auart = 'GHI1'.
t_excel_blkord-ernam = 'JKL1'.
t_excel_blkord-vbeln = 'MNO1'.
t_excel_blkord-0008text = 'XYZ1'.
append t_excel_blkord.
t_excel_blkord-vkorg = 'ABC2'.
t_excel_blkord-vbtyp = 'DEF2'.
t_excel_blkord-auart = 'GHI2'.
t_excel_blkord-ernam = 'JKL2'.
t_excel_blkord-vbeln = 'MNO2'.
t_excel_blkord-0008text = 'XYZ2'.
append t_excel_blkord.
t_excel_blkord-vkorg = 'ABC3'.
t_excel_blkord-vbtyp = 'DEF3'..
t_excel_blkord-auart = 'GHI3'.
t_excel_blkord-ernam = 'JKL3'.
t_excel_blkord-vbeln = 'MNO3'.
t_excel_blkord-0008text = 'XYZ3'.
append t_excel_blkord.
t_excel_blkord-vkorg = 'ABC4'.
t_excel_blkord-vbtyp = 'DEF4'..
t_excel_blkord-auart = 'GHI4'.
t_excel_blkord-ernam = 'JKL4'.
t_excel_blkord-vbeln = 'MNO4'.
t_excel_blkord-0008text = 'XYZ4'.
append t_excel_blkord.
loop at t_excel_bckord.
concatenate
t_excel_bckord-vkorg
t_excel_bckord-vbtyp
t_excel_bckord-auart
t_excel_bckord-ernam
t_excel_bckord-vbeln
t_excel_bckord-posnr
t_excel_bckord-erdat
t_excel_bckord-vdatu
t_excel_bckord-reqdat
t_excel_bckord-condat
t_excel_bckord-lifsk
t_excel_bckord-txt30
t_excel_bckord-lifsp
t_excel_bckord-dispo
t_excel_bckord-dsnam
t_excel_bckord-vmsta
t_excel_bckord-kunnr
t_excel_bckord-cname
t_excel_bckord-regio
t_excel_bckord-cufd
t_excel_bckord-bstnk
t_excel_bckord-bsark
t_excel_bckord-matnr
t_excel_bckord-maktx
t_excel_bckord-t200
t_excel_bckord-vtext
t_excel_bckord-matkl
t_excel_bckord-zzbomind
t_excel_bckord-ostat
t_excel_bckord-cmgst
t_excel_bckord-inco1
t_excel_bckord-oqty
t_excel_bckord-pqty
t_excel_bckord-unit
t_excel_bckord-onet
t_excel_bckord-pnet
t_excel_bckord-curr
t_excel_bckord-so_bezei
t_excel_bckord-sg_bezei
t_excel_bckord-bname
t_excel_bckord-contact
t_excel_bckord-telf1
t_excel_bckord-reqqty
t_excel_bckord-reqval
t_excel_bckord-conqty
t_excel_bckord-conval
t_excel_bckord-zzrev
t_excel_bckord-bezei
t_excel_bckord-vgbel
t_excel_bckord-0008text
into it
separated by deli.
append it.
clear it.
endloop.
loop at t_excel_bcklog.
concatenate
t_excel_bcklog-vkorg
t_excel_bcklog-vbtyp
t_excel_bcklog-auart
t_excel_bcklog-ernam
t_excel_bcklog-vbeln
t_excel_bcklog-posnr
t_excel_bcklog-erdat
t_excel_bcklog-vdatu
t_excel_bcklog-reqdat
t_excel_bcklog-condat
t_excel_bcklog-lifsk
t_excel_bcklog-txt30
t_excel_bcklog-lifsp
t_excel_bcklog-dispo
t_excel_bcklog-dsnam
t_excel_bcklog-vmsta
t_excel_bcklog-kunnr
t_excel_bcklog-cname
t_excel_bcklog-regio
t_excel_bcklog-cufd
t_excel_bcklog-bstnk
t_excel_bcklog-bsark
t_excel_bcklog-matnr
t_excel_bcklog-maktx
t_excel_bcklog-t200
t_excel_bcklog-vtext
t_excel_bcklog-matkl
t_excel_bcklog-zzbomind
t_excel_bcklog-ostat
t_excel_bcklog-cmgst
t_excel_bcklog-inco1
t_excel_bcklog-oqty
t_excel_bcklog-pqty
t_excel_bcklog-unit
t_excel_bcklog-onet
t_excel_bcklog-pnet
t_excel_bcklog-curr
t_excel_bcklog-so_bezei
t_excel_bcklog-sg_bezei
t_excel_bcklog-bname
t_excel_bcklog-contact
t_excel_bcklog-telf1
t_excel_bcklog-reqqty
t_excel_bcklog-reqval
t_excel_bcklog-conqty
t_excel_bcklog-conval
t_excel_bcklog-zzrev
t_excel_bcklog-bezei
t_excel_bcklog-vgbel
t_excel_bcklog-0008text
into it_2
separated by deli.
append it_2.
clear it_2.
endloop.
loop at t_excel_blkord.
concatenate
t_excel_blkord-vkorg
t_excel_blkord-vbtyp
t_excel_blkord-auart
t_excel_blkord-ernam
t_excel_blkord-vbeln
t_excel_blkord-posnr
t_excel_blkord-erdat
t_excel_blkord-vdatu
t_excel_blkord-reqdat
t_excel_blkord-condat
t_excel_blkord-lifsk
t_excel_blkord-txt30
t_excel_blkord-lifsp
t_excel_blkord-dispo
t_excel_blkord-dsnam
t_excel_blkord-vmsta
t_excel_blkord-kunnr
t_excel_blkord-cname
t_excel_blkord-regio
t_excel_blkord-cufd
t_excel_blkord-bstnk
t_excel_blkord-bsark
t_excel_blkord-matnr
t_excel_blkord-maktx
t_excel_blkord-t200
t_excel_blkord-vtext
t_excel_blkord-matkl
t_excel_blkord-zzbomind
t_excel_blkord-ostat
t_excel_blkord-cmgst
t_excel_blkord-inco1
t_excel_blkord-oqty
t_excel_blkord-pqty
t_excel_blkord-unit
t_excel_blkord-onet
t_excel_blkord-pnet
t_excel_blkord-curr
t_excel_blkord-so_bezei
t_excel_blkord-sg_bezei
t_excel_blkord-bname
t_excel_blkord-contact
t_excel_blkord-telf1
t_excel_blkord-reqqty
t_excel_blkord-reqval
t_excel_blkord-conqty
t_excel_blkord-conval
t_excel_blkord-zzrev
t_excel_blkord-bezei
t_excel_blkord-vgbel
t_excel_blkord-0008text
into it_3
separated by deli.
append it_3.
clear it_3.
endloop.
if h_excel-header = space or h_excel-handle = -1.
* start Excel
create object h_excel 'EXCEL.APPLICATION'.
endif.
PERFORM err_hdl.
*--- get list of workbooks, initially empty
call method of h_excel 'Workbooks' = h_mapl.
PERFORM err_hdl.
set property of h_excel 'Visible' = 1.
* add a new workbook
call method of h_mapl 'Add' = h_map.
PERFORM err_hdl.
*GV_SHEET_NAME = '1st SHEET'.
gv_sheet_name = 'Back Orders'.
get property of h_excel 'ACTIVESHEET' = worksheet.
set property of worksheet 'Name' = gv_sheet_name .
*--Formatting the area of additional data 1 and doing the BOLD
call method of h_excel 'Cells' = w_cell1
exporting
#1 = 1
#2 = 1.
call method of h_excel 'Cells' = w_cell2
exporting
#1 = 1
#2 = 50.
call method of h_excel 'Range' = h_cell
exporting
#1 = w_cell1
#2 = w_cell2.
*CALL METHOD OF gs_cells 'Select' .
get property of h_cell 'Font' = gs_font .
set property of gs_font 'Bold' = 1 .
data l_rc type i.
call method cl_gui_frontend_services=>clipboard_export
importing
data = it[]
changing
rc = l_rc
exceptions
cntl_error = 1
error_no_gui = 2
not_supported_by_gui = 3
others = 4.
call method of h_excel 'Cells' = w_cell1
exporting
#1 = 1
#2 = 1.
call method of h_excel 'Cells' = w_cell2
exporting
#1 = 1
#2 = 1.
PERFORM err_hdl.
call method of h_excel 'Range' = range
exporting
#1 = w_cell1
#2 = w_cell2.
call method of range 'Select'.
PERFORM err_hdl.
call method of worksheet 'Paste'.
PERFORM err_hdl.
* CALL METHOD OF h_excel 'QUIT'. " do not quit here, or the remaining sheets cannot be built
*GV_SHEET_NAME = '2ND SHEET'.
gv_sheet_name = 'Backlog'.
get property of h_excel 'Sheets' = h_sheet2 .
call method of h_sheet2 'Add' = h_map.
set property of h_map 'Name' = gv_sheet_name .
get property of h_excel 'ACTIVESHEET' = worksheet.
*--Formatting the area of additional data 1 and doing the BOLD
call method of h_excel 'Cells' = w_cell1
exporting
#1 = 1
#2 = 1.
call method of h_excel 'Cells' = w_cell2
exporting
#1 = 1
#2 = 50.
call method of h_excel 'Range' = h_cell
exporting
#1 = w_cell1
#2 = w_cell2.
get property of h_cell 'Font' = gs_font .
set property of gs_font 'Bold' = 1 .
call method cl_gui_frontend_services=>clipboard_export
importing
data = it_2[]
changing
rc = l_rc
exceptions
cntl_error = 1
error_no_gui = 2
not_supported_by_gui = 3
others = 4.
call method of h_excel 'Cells' = w_cell1
exporting
#1 = 1
#2 = 1.
call method of h_excel 'Cells' = w_cell2
exporting
#1 = 1
#2 = 1.
PERFORM err_hdl.
call method of h_excel 'Range' = range
exporting
#1 = w_cell1
#2 = w_cell2.
call method of range 'Select'.
PERFORM err_hdl.
call method of worksheet 'Paste'.
*GV_SHEET_NAME = '3rd SHEET'.
gv_sheet_name = 'Blocked Orders'.
get property of h_excel 'Sheets' = h_sheet3 .
call method of h_sheet3 'Add' = h_map.
set property of h_map 'Name' = gv_sheet_name .
get property of h_excel 'ACTIVESHEET' = worksheet.
*--Formatting the area of additional data 1 and doing the BOLD
call method of h_excel 'Cells' = w_cell1
exporting
#1 = 1
#2 = 1.
call method of h_excel 'Cells' = w_cell2
exporting
#1 = 1
#2 = 50.
call method of h_excel 'Range' = h_cell
exporting
#1 = w_cell1
#2 = w_cell2.
get property of h_cell 'Font' = gs_font .
set property of gs_font 'Bold' = 1 .
call method cl_gui_frontend_services=>clipboard_export
importing
data = it_3[]
changing
rc = l_rc
exceptions
cntl_error = 1
error_no_gui = 2
not_supported_by_gui = 3
others = 4.
call method of h_excel 'Cells' = w_cell1
exporting
#1 = 1
#2 = 1.
call method of h_excel 'Cells' = w_cell2
exporting
#1 = 1
#2 = 1.
PERFORM err_hdl.
call method of h_excel 'Range' = range
exporting
#1 = w_cell1
#2 = w_cell2.
call method of range 'Select'.
PERFORM err_hdl.
call method of worksheet 'Paste'.
*--- disconnect from Excel
free object h_zl.
free object h_mapl.
free object h_map.
free object h_excel. -
Problem reading data from two tables
Hi experts,
I'm developing a JDBC - IDOC scenario that needs to read data from two Oracle tables. I have created a BPM that has an initial fork for the two channels, and it works fine.
The problem is that I need to read data from the first table, the second, or both, depending on whether there is data to read. If there is data in both tables it works, but if there is data in only one of the two tables, I have read problems. I have tried setting the 'necessary branches' to 1, but this is a problem when I have data in both tables.
Any idea?
Best Regards,
Alfredo Lagunar.
Hi,
Have your fork step inside a block, then right-click the block to insert a deadline branch into your BPM process and specify the time after which the process should be cancelled. If you get data from both tables within that time, your BPM will work okay; otherwise, once the time is over, the BPM process will be cancelled.
Regards,
Rajeev Gupta -
Problem removing focused object from a table
Hi.
I have a problem removing a focused component from a table: the whole row is removed except a single component (a JSpinner) in one row.
My table has three columns; the first two are String fields (not editable) and the last is a JSpinner. Since there is no default cell renderer or cell editor for JSpinner, I had to create them.
So, if no element of the row is focused, then calling the function:
public void clearTable() {
    setVisible(false);
    dataVector.removeAllElements(); // clear data vector
    setVisible(true);
}
works just fine,
but if a spinner in a row is focused, then that spinner is left alone "floating" on an otherwise empty table.
Questions:
1) Does anyone have an idea of what might be the problem?
2) Do I have to play with focus to solve this? (I hate the idea, since I have always been told to avoid messing with focus because of the problems it normally brings.)
Thanks in advance for the help,
Rgds,
Jorge
The problem is that you need to properly stop the editing operation first:
if( myTable.isEditing() ) {
    // If you want to throw away any edits use...
    myTable.getCellEditor().cancelCellEditing();
    // If you want to apply any edits instead, use...
    myTable.getCellEditor().stopCellEditing();
}
-
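Following the reply above, here is a minimal self-contained sketch of the fix. The poster's actual table model and panel classes aren't shown, so the names below (`ClearTableDemo`, a plain `DefaultTableModel`) are illustrative; the key point is terminating the active editor before clearing the rows.

```java
import javax.swing.JTable;
import javax.swing.table.DefaultTableModel;

public class ClearTableDemo {

    // Stop or cancel any active cell editor before removing the rows,
    // so no editor component (e.g. a JSpinner) is left floating over
    // the emptied table.
    static void clearTable(JTable table) {
        if (table.isEditing()) {
            // cancelCellEditing() throws the pending edit away;
            // use stopCellEditing() instead to commit it first.
            table.getCellEditor().cancelCellEditing();
        }
        ((DefaultTableModel) table.getModel()).setRowCount(0);
    }

    public static void main(String[] args) {
        DefaultTableModel model = new DefaultTableModel(
                new Object[][] { { "a", "b", 1 }, { "c", "d", 2 } },
                new Object[] { "Name", "Type", "Count" });
        JTable table = new JTable(model);
        clearTable(table);
        System.out.println(table.getRowCount()); // prints 0
    }
}
```

Once the editor is properly terminated this way, the setVisible(false)/setVisible(true) pair around the clear should no longer be needed.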
LDAP in weblogic. Need additional GROUP from External Table
I have LDAP authentication in WebLogic, and I need to get GROUP information from an external table as well, since I have some more groups in a table apart from the LDAP groups. How can I do that?
I tried using the GROUP variable in the RPD, but it didn't work.
Please let me know if anyone has faced this issue in OBIEE 11g.
Hi,
As far as I know, an OBIEE user should be authenticated from only one source: either database authentication or LDAP authentication. We can't associate multiple initialization blocks with the single system variable USER, so you should ask the client to insert the groups/users in LDAP.
I hope this helps you.
Thanks
Jay.