DB2: data compression active?
Hi forum,
How can I find out whether data compression is active in our DB2 database with SAP BW?
Kind regards
S0007336574
Hello Olaf,
To check whether you are licensed for compression, use the following:
db2licm -l
Check for the storage optimization feature.
To check whether any tables have the compression flag enabled:
db2 "select count(*) from syscat.tables where compression = 'R' or compression = 'B'"
To get the names of these tables:
db2 "select tabname, compression from syscat.tables where compression = 'R' or compression = 'B'"
"R" means row compression, "B" means both Row and Value compression.
To check the actual compression savings for tables, you can use this:
--> Be aware that it may run for a long time
--> Replace SAPHJZ with your own SAPSID
SELECT substr(TABNAME,1,25) as TABLE_NAME,
DICT_BUILDER,
COMPRESS_DICT_SIZE,
EXPAND_DICT_SIZE,
PAGES_SAVED_PERCENT,
BYTES_SAVED_PERCENT
FROM SYSIBMADM.ADMINTABCOMPRESSINFO
WHERE tabschema='SAPHJZ'
AND COMPRESS_DICT_SIZE > 0
order by 1 asc
Hope this helps.
Hans-Jürgen
Similar Messages
-
DB2 Row compression activated - Any changes during System Copy process?
Hello All,
I have activated DB2 Row Compression.
Now I want to do a system copy.
Are there any changes due to this feature in the regular system copy procedure? Like using the R3load fastload COMPRESS options?
Thanks,
Bidwan
Hello Bidwan,
Please see the following blog regarding row compression:
/people/johannes.heinrich/blog/2006/09/05/new-features-in-db2-udb-v9--part-4
R3load has the new option '-loadprocedure fast COMPRESS'. Started with this option, it loads part of the data, performs a REORG to create the compression dictionary, and then loads the rest of the data. This way the table does not grow to its full size before it is compressed. For more details see OSS note 886231.
Regards,
Paul -
Hi All
I need some help with IBM DB2 version 9.5 to implement deep compression on very large tables, compressing all the data in the tables, not just data added after enabling compression. The problem is that these tables are very big, one as large as 2 TB, which would take too long to REORG.
I came across the IBM DB2 Deep Compression best practice guide, which describes a method of creating a 10% sample of the table using the BERNOULLI function in a new tablespace, activating the compression flag on this smaller table, running a REORG on it to generate the compression dictionary, then migrating all the data from the original table into the smaller table, thus compressing the full content using the dictionary created from the 10% sample, and finally swapping the table names.
Code example in the guide looks like this:
CREATE TABLE T2 LIKE T1 IN TABLESPACE_B;
DECLARE LOAD_CUR1 CURSOR FOR SELECT * FROM T1 TABLESAMPLE BERNOULLI(10);
LOAD FROM LOAD_CUR1 OF CURSOR INSERT INTO T2;
REORG TABLE T2 RESETDICTIONARY;
DECLARE LOAD_CUR2 CURSOR FOR SELECT * FROM T1;
LOAD FROM LOAD_CUR2 OF CURSOR REPLACE INTO T2;
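The core idea of the guide's method — that a dictionary built from a small sample compresses the full data almost as well as one built from everything — can be illustrated outside DB2. This sketch uses zlib's preset-dictionary support as a stand-in for DB2's table-level compression dictionary; it is an analogy only, not DB2's actual row-compression algorithm, and the row layout is invented:

```python
import zlib

# Hypothetical "table rows" with the kind of redundancy typical of SAP data.
rows = [f"SAPHJZ|DOC{i % 50:04d}|STATUS_{i % 7}|EUR|2013-11-30" for i in range(1000)]

# Build the dictionary from every 10th row, mimicking TABLESAMPLE BERNOULLI(10)
# followed by REORG ... RESETDICTIONARY. zlib preset dictionaries cap at 32 KB.
zdict = "\n".join(rows[::10]).encode()[:32768]

def compressed_size(data: bytes, d: bytes = b"") -> int:
    """Deflate `data`, optionally priming the compressor with dictionary `d`."""
    c = zlib.compressobj(zdict=d) if d else zlib.compressobj()
    return len(c.compress(data) + c.flush())

row = rows[123].encode()
# The sample-derived dictionary shrinks even a row that was never sampled,
# because the sample already contains the row's repeating substrings.
assert compressed_size(row, zdict) < compressed_size(row)
```

This is why a 10% sample is usually enough in practice: the dictionary captures the repeating column prefixes and values, so the LOAD ... REPLACE of the full table compresses nearly as well as with a dictionary built from all rows.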
I would like to know if this is supported by SAP and if so how to handle the indexes and index table spaces linked to the data tables and data table spaces.
I used the SAP DB2 Compression Tool in SAP Note 980067 to identify the candidates for compression but the tables identified are the very big ones.
If you have any creative idea's I will be grateful for the help.
Thanks and regards,
Hello Amanda,
As far as I know, Deep Compression is supported by any SAP system that supports DB2 version 9.1 and higher. This should be supported from SAP kernel level 6.40 (SP18) upwards. But have a look at SAP notes 900487, 1089568, or 1351160, depending on the DB2 version. Also have a look at [this page|http://www.sdn.sap.com/irj/scn/weblogs;jsessionid=%28J2EE3417800%29ID0062928350DB00623116135828352293End?blog=/pub/wlg/4330].
Compression can save a lot of storage, depending on your workload. It also improves I/O performance because less data has to be moved. The only disadvantage known to me is that some extra CPU is needed to compress and decompress data.
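How much the trade-off pays off depends heavily on how redundant the data is. A rough illustration with Python's zlib — a generic compressor, not DB2's row-compression algorithm, and the payloads here are invented:

```python
import random
import zlib

# Typical business data repeats codes, currencies and dates heavily...
repetitive = ("SAPHJZ|ORDER|2013-11-30|EUR|" * 1000).encode()
# ...while already-random data offers a compressor nothing to exploit.
rnd = random.Random(42)
random_like = bytes(rnd.getrandbits(8) for _ in range(len(repetitive)))

ratio_rep = len(zlib.compress(repetitive)) / len(repetitive)
ratio_rnd = len(zlib.compress(random_like)) / len(random_like)

assert ratio_rep < 0.05   # the repetitive payload shrinks to a few percent
assert ratio_rnd > 0.95   # the random payload barely shrinks at all
print(f"repetitive: {ratio_rep:.1%} of original, random: {ratio_rnd:.1%}")
```

The CPU cost is paid either way, which is why compression advisors look at the expected savings per table before recommending it.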
Regards,
Marcel -
Using Data Compression on Microsoft SQL 2008 R2
We have a very large database which keeps growing and growing. This has made our upgrade process extremely troublesome because the upgrade wizard seems to require close to 3 times the database size of free space to even start.
As such, we are considering activating DATA COMPRESSION at the PAGE level in Microsoft SQL Server 2008 R2. This technology is native to SQL Server and compresses the rows and pages so that they do not take up more space than necessary.
Traditionally, each row takes up the space of the maximum of all its fields, even when only part of a field is filled with data.
[Blog about Data Compression|http://blogs.msdn.com/b/sqlserverstorageengine/archive/2007/11/12/types-of-data-compression-in-sql-server-2008.aspx]
Our idea is to use this initially on the Axxx tables (historic data) to minimize the space they take by using for example:
ALTER TABLE [dbo].[ADO1] REBUILD PARTITION = ALL
WITH (DATA_COMPRESSION = PAGE)
On a test database we have seen tables go from 6 GB of space to around 1.5 GB, which is a significant saving.
MY QUESTION: Is this allowed from an SAP point of view? The technology is completely transparent, but it does involve a rebuild of the table, as demonstrated above.
Thanks.
Best regards,
Mike
We are using the Simple recovery model, so our log files are pretty small.
Our database itself is about 140GB now and it keeps growing.
We've also reduced the history size to about 10 versions.
Still, some of our tables are 6-10GB.
One of the advantages of data compression is that it also improves disk I/O, at the cost of slightly higher CPU usage, which we are pretty sure our server can handle.
Mike -
DB2 UDB Compression vs performance improvements benchmarks
Hi
Are there any benchmark results on the effect of DB2 UDB data compression? Though in theory this is supposed to improve performance, even after considering the trade-off of data compression, I have not seen any real-life results on this.
If there are any, particularly for the BI 7.0, please let me know.
Thanks
Raj
Hello,
>> http://www-306.ibm.com/software/success/cssdb.nsf/CS/STRD-6Z2KE3?OpenDocument&Site=corp&cty=en_us
First of all, I want to say that I have no experience and no knowledge of DB2, but when I read statements like the following, I do not wonder about some management discussions/decisions:
"The move from our existing version of DB2 to DB2 9 was very simple, so we didn't need any external help. Because DB2 9 is backwards-compatible, there was no need to upgrade our SAP software."
"With the STMM, we can tune the buffer pool automatically, which saves considerable time for our database administrators, reducing their workload by around 10 per cent."
"Our database is now 43 per cent smaller than before, and some of the largest tables have been reduced by up to 70 per cent."
"With DB2 9, our two-person IT team can handle database administration on top of all their other work, even without much specialist knowledge."
Please correct me if I am wrong and DB2 can really handle all this:
1) Is the upgrade to DB2 9 really a big step, or anything really hard to handle? I don't think so...
2) Are memory settings really so time-consuming in DB2 that an SAP administrator spends 4 hours per week on them (if he works 40 hours per week)? I don't think so...
3) I have read some documents about the compression feature and it really seems to be great, but maybe a reorg would also yield a benefit of around 10-20%? These aspects are not mentioned anywhere.
4) Database administration without special DB2 know-how? I cannot believe that... not if you want a fast and stable environment.
I am also very interested in some performance facts and data on compression and performance issues in DB2, but I am searching for real data and not just some "manager" statements.
What I am searching for is a benchmark with defined load simulations and analysis, like this one from Oracle:
http://regmedia.co.uk/2007/05/20/ibmpower6.pdf
Does anyone have some links to that kind of benchmarks?
Regards
Stefan -
DSO upload and no data in Active data table
Hi Experts,
I have a strange problem. I have loaded data to a DSO from a DataSource in BI7, with a further upload to a cube. I activated the DSO; it was successful and a request ID was generated. Added and transferred records are available, about 150,000 records, since I did a full upload. Strangely, I cannot see any data in the active data table.
Please advise how I can check the data, in case I am making some mistake. I have data mart status for this DSO. Could the deletion of the DSO and reloading cause the data not to be visible in the DSO even after activation?
Pls advise.
Tati
Hi,
I believe this has something to do with the display settings. After displaying the data, go into the settings menu and look for the Layout option --> display --> see if there is any default setting applied. Change this setting to something else, or create a setting with all the columns dragged and dropped. These are the options you can try.
If this does not resolve it, please try displaying the data from transaction SE16 and post the results.
Thanks,
Naren -
I am trying to compare a DB2 date format to a date variable from SSIS. I have tried setting the variable as datetime and as string. I have also cast the SQL date as date in the data flow task. I have tried a number of combinations of date formats, but no luck yet. Does anyone have any insights on how to set a date (without the time) variable and use it in the data flow task SQL? There has to be an easy way to accomplish that. I get the following error:
An invalid datetime format was detected; that is, an invalid string representation or value was specified. SQLSTATE=22007".
Thanks!
Hi Marcel,
Based on my research, in DB2, we use the following function to convert a string value to a date value:
Date(To_Date('String', 'DD/MM/YYYY'))
So, you can set the variable type to String in the package, and try the following query:
ACCOUNT_DATE BETWEEN '11/30/2013' AND Date(To_Date(?, 'DD/MM/YYYY'))
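Note that the literal '11/30/2013' in the query above is month-first while the 'DD/MM/YYYY' picture passed to To_Date is day-first; mixing the two in one predicate can itself trigger SQLSTATE 22007. The distinction is easy to demonstrate with Python's datetime (an analogy for DB2's format pictures, not DB2 itself):

```python
from datetime import datetime

# Day-first vs month-first format pictures parse the same date from
# differently ordered strings.
day_first = datetime.strptime("30/11/2013", "%d/%m/%Y")    # like 'DD/MM/YYYY'
month_first = datetime.strptime("11/30/2013", "%m/%d/%Y")  # like 'MM/DD/YYYY'
assert day_first == month_first == datetime(2013, 11, 30)

# Feeding a month-first string to a day-first picture fails, just as DB2
# rejects a value that does not match the supplied To_Date format.
try:
    datetime.strptime("11/30/2013", "%d/%m/%Y")
    parse_failed = False
except ValueError:
    parse_failed = True
assert parse_failed
```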
References:
http://stackoverflow.com/questions/4852139/converting-a-string-to-a-date-in-db2
http://www.dbforums.com/db2/1678158-how-convert-string-time.html
Regards,
Mike Yin
TechNet Community Support -
Buffer busy waits on UNDO data in Active Dataguard
Oracle Version: 11.1.0.7.0
Active Dataguard
Statspack has been configured for Active Dataguard on Primary database.
We got an spike of Buffer busy waits for about 5 min in Active Dataguard, this was causing worse Application SQL's response time during this 5 min window.
Below is what i got from statspack report for one hour
Snapshot Snap Id Snap Time Sessions Curs/Sess Comment
~~~~~~~~ ---------- ------------------ -------- --------- -------------------
Begin Snap: 18611 21-Feb-13 22:00:02 236 2.2
End Snap: 18613 21-Feb-13 23:00:02 237 2.1
Elapsed: 60.00 (mins)
Top 5 Timed Events Avg %Total
~~~~~~~~~~~~~~~~~~ wait Call
Event Waits Time (s) (ms) Time
buffer busy waits 2,359 2,133 904 76.2
reliable message 7,210 179 25 6.4
parallel recovery control message reply 8,831 109 12 3.9
CPU time 100 3.6
latch free 13 85 6574 3.1
Host CPU (CPUs: 16)
~~~~~~~~ Load Average
Begin End User System Idle WIO WCPU
1.07 0.82 0.68 0.39 98.88 0.00
Since this is an 11g version, I was able to drill down to the segment on which the buffer busy waits were occurring by using v$active_session_history on Active Dataguard.
SQL> select count(*),p1,p2 from v$active_session_history where event='buffer busy waits' and sample_time between to_date('21-FEB-2013 21:55:00','DD-MON-YYYY HH24:MI:SS') and to_date('21-FEB-2013 22:09:00','DD-MON-YYYY HH24:MI:SS') group by p1,p2
COUNT(*) P1 P2
2136 3 99405
17 3 7293
SQL> select owner,segment_name,segment_type from dba_extents where file_id = 3 and 99405 between block_id AND block_id + blocks - 1
OWNER SEGMENT_NAME SEGMENT_TYPE
SYS _SYSSMU14_1303827994$ TYPE2 UNDO
SQL> select owner,segment_name,segment_type from dba_extents where file_id = 3 and 7293 between block_id AND block_id + blocks - 1;
OWNER SEGMENT_NAME SEGMENT_TYPE
SYS _SYSSMU11_1303827994$ TYPE2 UNDO
I then thought to check which SQL_IDs were waiting on these buffer busy waits.
SQL> select count(*),sql_id,session_state from v$active_session_history where event='buffer busy waits' and sample_time between to_date('21-FEB-2013 21:55:00','DD-MON-YYYY HH24:MI:SS') and to_date('21-FEB-2013 22:09:00','DD-MON-YYYY HH24:MI:SS') group by sql_id,session_state order by 1;
COUNT(*) SQL_ID SESSION_STATE
1 cvypjyh0mm56x WAITING
1 02dtz82as4y42 WAITING
1 80gz2r4hx1wrj WAITING
2 6tfk1t4mwt7hu WAITING
9 0q63qhsbqmpf0 WAITING
12 0jgnx96ur0bmb WAITING
12 7pguapqcc6372 WAITING
14 4t6hqk5r2zbqs WAITING
18 1qwt0qkd59xj3 WAITING
23 5phgg8btvhh6p WAITING
23 banp2v6yttym7 WAITING
30 a1kdmb1x084yh WAITING
30 8hxuagk22f8jz WAITING
30 9r0nysyp360hn WAITING
31 cackx62yu477k WAITING
32 40zxqg1qrdvuh WAITING
32 0jqrd56ds1rbm WAITING
32 7009zmuhvac54 WAITING
38 1jb37ryn1c871 WAITING
60 aum74caa623rs WAITING
63 cr8mv0wawhak9 WAITING
63 3xgk3vsh3nm08 WAITING
86 3k9cq3jv0c3rg WAITING
95 0sy9vjuutgwqu WAITING
122 bhn2kk76wpg12 WAITING
134 4pkfqgyt7rh34 WAITING
139 1sbzsw7y88c7t WAITING
146 92y0ha2nqd6zj WAITING
163 djjqcp1sg2twb WAITING
173 arxq6au12zazw WAITING
256 fa0gzxmgyyxj2 WAITING
282 2f17qywcgu751 WAITING
So the top 10 sql_ids were on tables TAB1 and TAB2 under schemas SCHEMA1 to SCHEMA8.
I checked the DMLs that occurred on primary using the dba_tab_modifications view, since the last stats job on these tables ran about 10 hours before the issue occurred on Active Dataguard.
SQL> select TABLE_OWNER,TABLE_NAME,INSERTS,UPDATES,DELETES from dba_tab_modifications where TABLE_NAME='TAB1' order by 3;
TABLE_OWNER TABLE_NAME INSERTS UPDATES DELETES
SCHEMA1 TAB1 4448 0 3728
SCHEMA2 TAB1 4547 0 4022
SCHEMA3 TAB1 4612 0 4152
SCHEMA4 TAB1 4628 0 3940
SCHEMA5 TAB1 4719 0 4258
SCHEMA6 TAB1 4809 0 4292
SCHEMA7 TAB1 4853 0 4356
SCHEMA8 TAB1 5049 0 4536
SQL> select TABLE_OWNER,TABLE_NAME,INSERTS,UPDATES,DELETES from dba_tab_modifications where TABLE_NAME='TAB2' order by 3;
TABLE_OWNER TABLE_NAME INSERTS UPDATES DELETES
SCHEMA1 TAB2 25546 0 26360
SCHEMA2 TAB2 26728 0 27565
SCHEMA3 TAB2 27403 0 27763
SCHEMA4 TAB2 27500 0 28149
SCHEMA5 TAB2 28408 0 30440
SCHEMA6 TAB2 30453 0 31906
SCHEMA7 TAB2 31469 0 31988
SCHEMA8 TAB2 32875 0 34670
But I am confused about why there could be a sudden spike of demand on UNDO data in Active Data Guard. Could anyone please shed some light on finding the reason for this issue?
It's been interesting. Does the job run only on ADG?
Even if it is only for reporting purposes and runs only select statements, are you sure that the issue is only because of this job?
Moreover, I am interested to know: how were you able to monitor at the same time? Using EM?
What jobs ran on primary at the same time?
Then, is it possible to run the job on primary and see what the response is?
I suggest you run the same job again on standby, check the elapsed time of the job, and also gather a statspack report to check whether you get the same buffer busy waits or not.
What storage are you using for primary and standby? In terms of I/O and performance, are they the same?
Did you get a chance to take a statspack report on the primary database as well?
Which parameters differ between primary and standby?
Also check this note; a workaround is provided here.
*Resolving Intense and "Random" Buffer Busy Wait Performance Problems [ID 155971.1]* -
How to save HR data in Active Directory using ABAP, i.e. through the LDAP Connector
Hi All,
Can anyone please help me out with how to save HR data in Active Directory using the LDAP Connector?
Please help ASAP as it is very urgent.
Thanks
Jitendra
There are hundreds of such scripts online.
Here are a few tips and links; you will find more.
https://gallery.technet.microsoft.com/scriptcenter/Feeding-data-to-Active-0227d15c
http://blogs.technet.com/b/heyscriptingguy/archive/2012/10/31/use-powershell-to-modify-existing-user-accounts-in-active-directory.aspx
http://powershell.org/wp/forums/topic/ad-import-csv-update-attributes-script/
Please mark this as answer if it helps -
How to save HR data in Active Directory using ABAP
Hi all
Can anyone please help me out with how to save HR data in Active Directory using the LDAP connector?
Please help, as this is a very urgent requirement.
Thanks in advance.
Thanks
Chanti
What form do you have the user's name in?
ANTIPODES\alberte
String searchFilter = "(&(objectClass=user)(samAccountName=alberte))";
[email protected]
String searchFilter = "(&(objectClass=user)(userPrincipalName=[email protected]))";
Albert Einstein
String searchFilter = "(&(objectClass=user)(givenName=Albert)(sn=Einstein))";
or using Ambiguous Name Resolution (anr)
String searchFilter = "(&(objectClass=user)(anr=Albert Einstein))";
or it's even clever enough to use
String searchFilter = "(&(objectClass=user)(anr=Einstein Albert))"; -
I'd like to extend the file adapter behavior to add data compression features, like unzip after reading a file and zip before writing a file. I read Oracle's file adapter documentation but didn't find any extension point.
If it's Java mapping, just create a DT with any structure you wish, e.g.:
DT_Dummy
|__ Dummy_field
Java mapping does not validate the XML against the DT you created -
Accessing DB2 data through DBLINK in Java program recieves ORA-02063 error
Hello,
Is it possible to access a mainframe DB2 table through a Java JDBC connection utilizing a DB link / synonym? Using Toad or SQL Developer, I can select the DB2 data just fine. Also, PL/SQL programs can access the data without issues. When I try to select data from the DB2 table in a Java program, I get ORA-01002: fetch out of sequence -- ORA-02063: preceding line from DB2D errors.
The code I am using to test is:
(Note: TBL_LOC_ADR is the synonym that points to the mainframe DB2 table)
// Requires java.sql.Connection, Statement and ResultSet imports.
public static void main(String[] args) {
    Connection con = null;
    Statement stmt = null;
    String query = "SELECT * FROM tbl_loc_adr";
    ResultSet rs;
    try {
        System.out.println("Getting connection");
        con = BatchDBConnection.getDBConnection();
        TestConnection tc = new TestConnection();
        System.out.println("Creating ps");
        stmt = con.createStatement();
        System.out.println("Calling qds");
        rs = stmt.executeQuery(query);
        System.out.println(" Returned data ");
    } catch (Exception e) {
        System.out.println("Error " + e.getMessage());
        e.printStackTrace();
    }
}
The error occurs on the rs = stmt.executeQuery(query); line.
Thanks for any input into this error. It's been driving me nuts all day.
Chris
Chris,
What's your Oracle version? You might be hitting a bug as identified in this Metalink doc ID; see if it helps. The resolution posted in Metalink is to apply patches, or to confirm that you have applied all the required patches.
Doc ID: 456815.1 ORA-01002 ORA-2063 When Profiling DB2/400 Table Accessed by View via Transparent Gateway for DRDA
Regards -
CONTROL DATA FOR ACTIVITY TYPE NOT MAINTAINED
dear all
I get this error when calculating production order cost:
control data for activity type 1000\4230\1420 not maintained for fiscal year 2007
I have maintained the data in KP26.
Can anyone please help? I am learning SAP PP and have version 4.7.
Thanks and regards
Dear Anshuman,
My problem started during cost calculation in the production order. The first error was "version 0 not maintained for fiscal year 2007"; I then maintained it along the path you described (transaction OKEQ).
But after that, the error changed to "control data for activity type not maintained (1000/4230/1420)".
I maintained the data in KP26, but the error still appears.
Please suggest a solution, as I am not able to proceed further.
Thanks and regards
Raj -
Creating DB2 data source in weblogic 9.2
Hello,
What are the steps to create a DB2 data source in WebLogic 9.2, and especially, which driver should be used?
Please look at the Oracle docs:
Configuring JDBC Data Sources
Supported Database Configurations
Create JDBC data sources
Thanks,
Souvik. -
How can we find out when master data was activated
Hello Friends,
We are loading master data for one InfoObject directly from R/3 to the InfoObject, without PSA.
1) The green request in RSMO only tells us that the data was loaded successfully, but it does not give any information about activation. Is that right?
2) How can I find out the history of when one particular InfoObject (master data) was activated? Any tables or transaction codes?
Thanks
Tony
Hi,
1) The green request in RSMO only tells us that the data was loaded successfully, but it does not give any information about activation. Is that right?
Answer: yes, it gives loading details only.
2) How can I find out the history of when one particular InfoObject (master data) was activated? Any tables or transaction codes?
Answer: in SM37, check whether an attribute change run has completed; or in RSA1, Tools -> Apply attribute change and hierarchy -> you can see the finished and scheduled jobs.
You can also check the master data table for any M-version records.
Ramesh