Row compression test
Hello, all,
I am researching and testing row compression on DB2 9.7.
I have one BW table with 67 M rows on our production system (DB2 9.1.7) that holds 793558 pages.
I created the same plain table (ODSTAB) and a copy of it (ODSTAB_COMP) with COMPRESS YES on a sample DB2 9.7 database.
Next, I loaded both tables with the 67 M rows.
After LOAD and RUNSTATS, the table sizes look strange compared to the original uncompressed 9.1 table.
The table size is estimated from the SYSIBMADM.ADMINTABINFO view (DATA_OBJECT_P_SIZE column):
9.1.7 - original uncompressed table - 12698528
9.7.0 - ODSTAB (uncompressed table) - 14281344 (Wow!!!)
9.7.0 - ODSTAB_COMP (compressed table) - 12344928
The compression ratio for ODSTAB_COMP is reported as 76% (in SYSIBMADM.ADMINTABCOMPRESSINFO), but the allocated data pages do not bear this out (why?).
In the sample 9.7 database, each table is created in its own AUTORESIZE DMS table space with 16k pagesize, extentsize 2, and INCREASESIZE 5M.
I have two questions:
1. Why does the native table size differ by ~1.5 GB between 9.1 and 9.7 (12698528 vs 14281344)?
2. Why is the real compression ratio so small (14281344 vs 12344928), and how can I increase it (maybe some best practices or tricks are available)?
Thank you for any advice!
With best regards, Dmitry
The second question is solved.
After REORG ... RESETDICTIONARY, the allocated pages were reduced by 74%.
I think it is related to the automatic dictionary creation when the table was LOADed (ROWS_SAMPLED was only 10132).
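For reference, a minimal sketch of the fix and the verification queries, assuming a placeholder schema name DMITRY (REORG ... RESETDICTIONARY and the SYSIBMADM administrative views are standard in DB2 9.7):
-- Rebuild the compression dictionary from the full data set
-- (LOAD built it from only ~10000 sampled rows)
REORG TABLE DMITRY.ODSTAB_COMP RESETDICTIONARY;
RUNSTATS ON TABLE DMITRY.ODSTAB_COMP;
-- Check the dictionary sample size and the reported savings
SELECT TABNAME, ROWS_SAMPLED, PAGES_SAVED_PERCENT
  FROM SYSIBMADM.ADMINTABCOMPRESSINFO
 WHERE TABNAME = 'ODSTAB_COMP';
-- Compare the allocated data pages afterwards
SELECT TABNAME, DATA_OBJECT_P_SIZE
  FROM SYSIBMADM.ADMINTABINFO
 WHERE TABNAME = 'ODSTAB_COMP';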
With best regards, Dmitry
Similar Messages
-
Page compression vs. row compression - 30 % savings
According to
Note 1143246 - R3load: row compression as default for SQL 2008
R3load uses row compression (starting with a certain patchlevel).
I built up a system using TDMS (ERP 6.0 with EHP4) with row compression; the target system size was ~550 GB. When I built the same system using page compression (by changing DBMSS.TPL during R3load), the system used only 300 GB.
As of now, is page compression supported, and will there be a standard way of building systems with page compression (e.g. an R3load option)?
Markus
Hi Markus,
Please check the following note:
Note 991014 - Row Compression/Vardecimal on SQL Server
> Page Compression:
>
> With SQL Server 2008 "Page Compression" was introduced, as well. Here the whole data page will be compressed, not
> just row by row as with "Row Compression".
> Page compression is not the standard for SAP and it's currently still under investigation.
> Page compression is supported by the ABAP/4 dictionary, and we did not see and do not expect any problems in that area.
> But you should still first test page compression for your application before you go productive with it.
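For anyone who wants to test page compression on a single table first, this is the standard SQL Server 2008 syntax (the table name is just an example):
-- Rebuild one table with page compression
ALTER TABLE [dbo].[MYTABLE] REBUILD WITH (DATA_COMPRESSION = PAGE);
Indexes take the same clause via ALTER INDEX ... REBUILD WITH (DATA_COMPRESSION = PAGE).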
Regards,
Federico Biavati -
Ideas for tension and compression test stand
Hello,
I currently work for a company that manufactures weigh scale devices. A majority of the devices are either patient weight scales (the kind you stand on at the doctor's) or a tension load cell (displays patient weight when on a lifter). Typically the devices can weigh up to 1000 lbs full scale.
My job usually consists of performing engineering test/evaluation of these devices. A majority of the testing involves me slinging massive amounts of weight all day… I usually have to do repeatability testing, linearity runs (ascending/descending), etc. Currently this technique is purely physical… I am looking for ideas to make it a LabVIEW-controlled technique that is relatively CHEAP.
Desired:
1) Relatively accurate at 1000 lbs: accurate AND REPEATABLE to 0.05 lbs
2) Once at a particular load it must remain stable to 0.00 lbs (some air compression devices oscillate)
3) Can be configured for tension loads or compression loads (configuration can be manual)
4) Automation – Would like to run either tension testing OR compression testing, but be able to just walk away and have the system automagically perform the repeatability/linearity runs, etc.
Ideas so far:
A) A lever arm whose fulcrum point changes based on a linear motor and a known load.
B) Some kind of linear motor that can be securely mounted vertically; it would have to have various types of attachments to configure tension/compression testing.
I haven’t even begun to research linear motors… I don’t even know whether they are appropriate for the application.
Any brainstorming ideas would be greatly appreciated.
Thank you for your time
Personally, I would not try to reinvent the wheel. The first thing to enter my mind is compressive/tensile testing with off-the-shelf equipment from Instron or a similar manufacturer. They have highly capable automation/data-collection software too. Not cheap, as you wanted, but precise, repeatable, traceable, and pretty much an out-of-the-box solution, less custom fixturing. Realize you are requesting 0.005% repeatability at 1000 lbs full scale; keep in mind the time, cost, and labor you may incur trying to design, build, test, and certify an in-house solution.
If not, maybe you can get some ideas by looking at how such equipment works.
Voice-coil mass drivers may be a means to apply forces.
May the force be with you
~~~~~~~~~~~~~~~~~~~~~~~~~~
"It’s the questions that drive us.”
~~~~~~~~~~~~~~~~~~~~~~~~~~ -
DB2 Row compression activated - Any changes during System Copy process?
Hello All,
I have activated DB2 Row Compression.
Now I want to do a System Copy.
Are there any changes to the regular System Copy procedure due to this feature, like using the R3load fastload COMPRESS options?
Thanks,
Bidwan
Hello Bidwan,
Please see the following blog regarding row compression:
/people/johannes.heinrich/blog/2006/09/05/new-features-in-db2-udb-v9--part-4
R3load has the new option '-loadprocedure fast COMPRESS'. Started with this option, it loads part of the data, does a REORG to create the compression dictionary, and then loads the rest of the data. This way the table does not grow to its full size before it is compressed. For more details see OSS note 886231.
Regards,
Paul -
ROW COMPRESSION - table /SAPAPO/BOP
Hello guys,
I have a big performance problem with table /SAPAPO/BOP in an SCM system. I run /SAPAPO/BOP_DELETE to delete entries older than 7 days. I see many executions in the database (sequential reads, updates, and deletes) during SPP and GATP backorder processing. As I have ROW COMPRESSION active for this table, I would like to know whether this could contribute to the performance issue.
Could you please advise about this?
Many thanks,
Carlos.
Hi,
Please check the note below; it may help you:
Note 1416044 - BOP performance improvement for updates back from erp to scm
Thanks
Sadiq -
Labview interface with Instron 3360 Series (for compressive test)
Hello Forums!
I have to write LabVIEW code to read the Instron 3360 while it compresses a thin-walled cylinder. The cylinder is expected to withstand about 500 lb. How do I go about doing this?
- Thanks!
- gibiv (high school student)
Can you tell me a little more about your application? More specifically:
1. What is the input/output of the instrument you are using to measure force?
2. Are you using NI Hardware to communicate with this device?
3. What drivers are you using?
4. What version of LabVIEW are you using?
I have included some general information below about configuring a data acquisition system in LabVIEW.
Introduction to Data Acquisition: http://www.ni.com/white-paper/3536/en/
R. Brill
Applications Engineer
National Instruments -
VStest.console.exe Query for a Data Row Result DURING Test Execution Status
I started with the following thread and was asked to create a new thread.
I have the following test that reads a .csv as a data source.
/// <summary>
/// Summary description for Test
/// </summary>
[TestCategory("LongTest"),
TestCategory("Visual Studio 2013"), TestMethod]
[DeploymentItem("data.csv")]
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV",
"|DataDirectory|\\data.csv",
"data#csv",
DataAccessMethod.Sequential),
DeploymentItem("data.csv")]
public void Create_And_Build_All_Templates()
{
    testmethodname = "Create And Build All Templates ";
    LaunchVisualStudio2013();
    // ... (rest of the test body)
}
When I run the test from VStest.console.exe, I see the following:
vstest.console.exe /testcasefilter:"TestCategory=LongTest" /settings:"C:\testing\CodedUI.testsettings" /logger:TRX /logger:CodedUITestLogger C:\AppTests\CodedUITest.dll
Microsoft (R) Test Execution Command Line Tool Version 12.0.31101.0
Copyright (c) Microsoft Corporation. All rights reserved.
Running tests in C:\testing\bin\debug\TestResults
Starting test execution, please wait...
I want to report on the status of the iterations DURING the test run from VStest.console.exe, like how the test explorer window does this.
How can I achieve the output below (notice the "(Data Row …)" values)?
vstest.console.exe
/testcasefilter:"TestCategory=LongTest"
/settings:"C:\testing\CodedUI.testsettings"
/logger:TRX
/logger:CodedUITestLogger
C:\AppTests\CodedUITest.dll
Microsoft (R) Test Execution Command Line Tool Version 12.0.31101.0
Copyright (c) Microsoft Corporation. All rights reserved.
Running tests in C:\testing\bin\debug\TestResults
Starting test execution, please wait...
Test Passed - Create_And_Build_All_Templates (Data Row 1)
Test Passed - Create_And_Build_All_Templates (Data Row 2)
Test Failed - Create_And_Build_All_Templates (Data Row 3)
Test Passed - Create_And_Build_All_Templates (Data Row 4)
Ian Ceicys
Jack, again, the results are printed to standard output AFTER the test data row has completed. Is there a way to query VSTest.console and find out which test/data row is being executed, so it can be written out to the console DURING the test run?
I put together the following screencast showing the issue:
http://www.screencast.com/t/IrxxfhGlzD
Also here is the github repo with the source code that I included in the screen cast:
https://github.com/ianceicys/VisualStudioSamples2015
Take a look at LongRunningDataDrivenTest.sln
Unit Test
using System;
using System.Threading;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class UnitTest
{
    public static int _executedTests = 0;
    public static int _passedTests = 0;

    public void IncrementTests()
    {
        _executedTests++;
    }

    public void IncrementPassedTests()
    {
        _passedTests++;
    }

    [TestInitialize]
    public void TestInitialize()
    {
        IncrementTests();
        Console.WriteLine("Total tests Row executed: {0}", _executedTests);
    }

    [TestCleanup]
    public void TestCleanup()
    {
        if (TestContext.CurrentTestOutcome == UnitTestOutcome.Passed)
        {
            IncrementPassedTests();
            Console.WriteLine("Total passed tests: {0}", _passedTests);
        }
    }

    private TestContext testContextInstance;

    /// <summary>
    /// Long Running Test
    /// </summary>
    [TestCategory("Long Running Test Example"), TestMethod]
    [DeploymentItem("data.csv")]
    [DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV",
        "|DataDirectory|\\data.csv",
        "data#csv",
        DataAccessMethod.Sequential),
    DeploymentItem("data.csv")]
    public void LongRunning_TestMethod_Takes_Over_30_Seconds_to_Run()
    {
        WaitTime(Convert.ToInt32(TestContext.DataRow["WaitTime"].ToString()));
    }

    public void WaitTime(int waittime)
    {
        Thread.Sleep(waittime);
    }

    public TestContext TestContext
    {
        get { return testContextInstance; }
        set { testContextInstance = value; }
    }
}
data.csv
WaitTime,
13000
19000
10000
11000
15000
Ian Ceicys -
11.2.0.3.3 impdp compress table
Hi ML:
Source database: 10.2.0.3, with compressed tables.
Target: 11.2.0.3.3. When impdp imports the source's compressed tables, are they compressed tables on the target side?
Previously, when importing into a 10g database directly via an impdp dblink, I found that the imported tables had to be compressed manually with a MOVE COMPRESS.
The MOS document's test says that starting with 10g, import automatically maintains compressed tables:
Oracle Server - Enterprise Edition - Version: 9.2.0.1 to 11.2.0.1 - Release: 9.2 to 11.2
Information in this document applies to any platform.
Symptoms
The original import utility bypasses table compression, i.e. it does not compress even if the table is precreated as compressed. The following example demonstrates this.
connect / as sysdba
create tablespace tbs_compress datafile '/tmp/tbs_compress01.dbf' size 100m;
create user test identified by test default tablespace tbs_compress temporary tablespace temp;
grant connect, resource to test;
connect test/test
-- create compressed table
create table compressed
(
  id number,
  text varchar2(100)
) pctfree 0 pctused 90 compress;
-- create non-compressed table
create table noncompressed
(
  id number,
  text varchar2(100)
) pctfree 0 pctused 90 nocompress;
-- populate compressed table with data
begin
  for i in 1..100000 loop
    insert into compressed values (1, lpad ('1', 100, '0'));
  end loop;
  commit;
end;
/
-- populate non-compressed table with identical data
begin
  for i in 1..100000 loop
    insert into noncompressed values (1, lpad ('1', 100, '0'));
  end loop;
  commit;
end;
/
-- compress the table COMPRESSED (previous insert doesn't use the compression)
alter table compressed move compress;
Let's now take a look at data dictionary to see the differences between the two tables:
connect test/test
select dbms_metadata.get_ddl ('TABLE', 'COMPRESSED') from dual;
DBMS_METADATA.GET_DDL('TABLE','COMPRESSED')
CREATE TABLE "TEST"."COMPRESSED"
( "ID" NUMBER,
"TEXT" VARCHAR2(100)
) PCTFREE 0 PCTUSED 90 INITRANS 1 MAXTRANS 255 COMPRESS LOGGING
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "TBS_COMPRESS"
1 row selected.
SQL> select dbms_metadata.get_ddl ('TABLE', 'NONCOMPRESSED') from dual;
DBMS_METADATA.GET_DDL('TABLE','NONCOMPRESSED')
CREATE TABLE "TEST"."NONCOMPRESSED"
( "ID" NUMBER,
"TEXT" VARCHAR2(100)
) PCTFREE 0 PCTUSED 90 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "TBS_COMPRESS"
1 row selected.
col segment_name format a30
select segment_name, bytes, extents, blocks from user_segments;
SEGMENT_NAME BYTES EXTENTS BLOCKS
COMPRESSED 2097152 17 256
NONCOMPRESSED 11534336 26 1408
2 rows selected.
The table COMPRESSED needs less storage space than the table NONCOMPRESSED. Now, let's export the tables using the original export utility:
#> exp test/test file=test_compress.dmp tables=compressed,noncompressed compress=n
About to export specified tables via Conventional Path ...
. . exporting table COMPRESSED 100000 rows exported
. . exporting table NONCOMPRESSED 100000 rows exported
Export terminated successfully without warnings.
and then import them back:
connect test/test
drop table compressed;
drop table noncompressed;
#> imp test/test file=test_compress.dmp tables=compressed,noncompressed
. importing TEST's objects into TEST
. . importing table "COMPRESSED" 100000 rows imported
. . importing table "NONCOMPRESSED" 100000 rows imported
Import terminated successfully without warnings.
Verify the extents after original import:
col segment_name format a30
select segment_name, bytes, extents, blocks from user_segments;
SEGMENT_NAME BYTES EXTENTS BLOCKS
COMPRESSED 11534336 26 1408
NONCOMPRESSED 11534336 26 1408
2 rows selected.
=> The table compression is gone.
Cause
This is an expected behaviour. Import is not performing a bulk load/direct path operations, so the data is not inserted as compressed.
Only Direct path operations such as CTAS (Create Table As Select), SQL*Loader Direct Path will compress data. These operations include:
•Direct path SQL*Loader
•CREATE TABLE and AS SELECT statements
•Parallel INSERT (or serial INSERT with an APPEND hint) statements
Solution
The way to compress data after it is inserted via a non-direct operation is to move the table and compress the data:
alter table compressed move compress;
Beginning with Oracle version 10g, DataPump utilities (expdp/impdp) perform direct path operations and so the table compression is maintained, like in the following example:
- after creating/populating the two tables, export them with:
#> expdp test/test directory=dpu dumpfile=test_compress.dmp tables=compressed,noncompressed
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "TEST"."NONCOMPRESSED" 10.30 MB 100000 rows
. . exported "TEST"."COMPRESSED" 10.30 MB 100000 rows
Master table "TEST"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
and re-import after deletion with:
#> impdp test/test directory=dpu dumpfile=test_compress.dmp tables=compressed,noncompressed
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "TEST"."NONCOMPRESSED" 10.30 MB 100000 rows
. . imported "TEST"."COMPRESSED" 10.30 MB 100000 rows
Job "TEST"."SYS_IMPORT_TABLE_01" successfully completed at 12:47:51
Verify the extents after DataPump import:
col segment_name format a30
select segment_name, bytes, extents, blocks from user_segments;
SEGMENT_NAME BYTES EXTENTS BLOCKS
COMPRESSED 2097152 17 256
NONCOMPRESSED 11534336 26 1408
2 rows selected.
=> The table compression is kept.
===========================================================
1. Does 11.2.0.3 actually support impdp automatically maintaining compressed tables when importing via a dblink (NETWORK_LINK)?
2. Regarding this part of the note:
This is an expected behaviour. Import is not performing a bulk load/direct path operations, so the data is not inserted as compressed.
Only Direct path operations such as CTAS (Create Table As Select), SQL*Loader Direct Path will compress data. These operations include:
•Direct path SQL*Loader
•CREATE TABLE and AS SELECT statements
•Parallel INSERT (or serial INSERT with an APPEND hint) statements
Solution
The way to compress data after it is inserted via a non-direct operation is to move the table and compress the data:
The above seems to mean that before 10g, the manual approach above was required to get compressed tables on the target side, and that automatic compression is supported starting with 10g? It looks like 10g also needed a manual move.
ODM TEST:
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
SQL> create table nocompres tablespace users as select * from dba_objects;
Table created.
SQL> create table compres_tab tablespace users as select * from dba_objects;
Table created.
SQL> alter table compres_tab compress 3;
Table altered.
SQL> alter table compres_tab move ;
Table altered.
select bytes/1024/1024 ,segment_name from user_segments where segment_name like '%COMPRES%'
BYTES/1024/1024 SEGMENT_NAME
3 COMPRES_TAB
9 NOCOMPRES
C:\Users\ML>expdp maclean/oracle dumpfile=temp:COMPRES_TAB2.dmp tables=COMPRES_TAB
Export: Release 11.2.0.3.0 - Production on Fri Sep 14 12:01:12 2012
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "MACLEAN"."SYS_EXPORT_TABLE_01": maclean/******** dumpfile=temp:COMPRES_TAB2.dmp tables=COMPRES_TAB
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 3 MB
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "MACLEAN"."COMPRES_TAB" 7.276 MB 75264 rows
Master table "MACLEAN"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
Dump file set for MACLEAN.SYS_EXPORT_TABLE_01 is:
D:\COMPRES_TAB2.DMP
Job "MACLEAN"."SYS_EXPORT_TABLE_01" successfully completed at 12:01:20
C:\Users\ML>impdp maclean/oracle remap_schema=maclean:maclean1 dumpfile=temp:COMPRES_TAB2.dmp
Import: Release 11.2.0.3.0 - Production on Fri Sep 14 12:01:47 2012
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "MACLEAN"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "MACLEAN"."SYS_IMPORT_FULL_01": maclean/******** remap_schema=maclean:maclean1 dumpfile=temp:COMPRES_TAB2.dmp
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "MACLEAN1"."COMPRES_TAB" 7.276 MB 75264 rows
Job "MACLEAN"."SYS_IMPORT_FULL_01" successfully completed at 12:01:50
1* select bytes/1024/1024 ,segment_name from user_segments where segment_name like '%COMPRES%'
SQL> /
BYTES/1024/1024 SEGMENT_NAME
3 COMPRES_TAB
SQL> drop table compres_tab;
Table dropped.
C:\Users\ML>exp maclean/oracle tables=COMPRES_TAB file=compres1.dmp
Export: Release 11.2.0.3.0 - Production on Fri Sep 14 12:03:19 2012
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export done in ZHS16GBK character set and AL16UTF16 NCHAR character set
About to export specified tables via Conventional Path ...
. . exporting table COMPRES_TAB 75264 rows exported
Export terminated successfully without warnings.
C:\Users\ML>
C:\Users\ML>imp maclean/oracle fromuser=maclean touser=maclean1 file=compres1.dmp
Import: Release 11.2.0.3.0 - Production on Fri Sep 14 12:03:45 2012
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export file created by EXPORT:V11.02.00 via conventional path
import done in ZHS16GBK character set and AL16UTF16 NCHAR character set
. importing MACLEAN's objects into MACLEAN1
. . importing table "COMPRES_TAB" 75264 rows imported
Import terminated successfully without warnings.
SQL> conn maclean1/oracle
Connected.
1* select bytes/1024/1024 ,segment_name from user_segments where segment_name like '%COMPRES%'
SQL> /
BYTES/1024/1024 SEGMENT_NAME
8 COMPRES_TAB
My understanding: a direct load always preserves compression.
However, imp defaults to the conventional path, i.e. it imports using ordinary INSERTs through the buffer cache, so it cannot preserve compression.
impdp, on the other hand, preserves compression regardless of whether the access_method is external_table or direct_path.
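To answer question 1 empirically, the same test can be repeated in network mode, where impdp pulls data over a database link instead of a dump file; SRC_LINK below is a hypothetical database link name:
C:\Users\ML>impdp maclean/oracle network_link=SRC_LINK tables=COMPRES_TAB remap_schema=maclean:maclean1
Afterwards, check user_segments on the target as above: if the segment is still ~3 MB, compression was also maintained over the dblink.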
Enabling GZIP compression on the HTTPBC or a suggested alternative approach
Hi,
I posted this on the openesb-users group, but I haven't come up with a good solution yet.
Any takers?
Thanks,
Rob
http://openesb-users.794670.n2.nabble.com/gzip-compression-support-in-HTTP-BC-tp5366301.html
Does anybody know whether the HTTP BC shares all of the same features that the Glassfish HTTP listener supports such as gzip compression? If so, where are those parameters specified for the HTTP BC? (Compression of the XML can of course make a tremendous bandwidth savings for the typical XML passed around via SOAP.)
Please see:
http://docs.sun.com/app/docs/doc/820-4338/abhco?a=view
compression - on
Specifies use of HTTP/1.1 GZIP compression to save server bandwidth. Allowed values are:
off Disables compression.
on Compresses data.
force Forces data compression in all cases.
positive integer Specifies the minimum amount of data required before the output is compressed.
If the content-length is not known, the output is compressed only if compression is set to on or force.
compressableMimeType text/html,text/xml,text/plain
Specifies a comma-separated list of MIME types for which HTTP compression is used.
If there isn't a way to configure the HTTPBC, what would be the best way to enable gzip compression of the XML for the HTTPBC or the JBI framework in general?
My first approach might be to just have a Glassfish HTTP-Listener-based WebService delegate to the HTTPBC, or more directly to the JBI SE components if possible. Maybe a filter on the HTTPBC to gzip the payload? Are there any plans to refactor the HTTPBC to use the Glassfish HTTP Listener code? It appears to have more capability and options.
Rob
I tried it; it did not work.
Below are the steps which I did:
Test2: Table level compression test
===================================
CREATE TABLE SCOTT.TEST_PMT_TAB_COMP
(
  PMT_ID INTEGER NOT NULL,
  PMT_AGCY_NBR CHAR(7 BYTE),
  PMT_SBAGCY_NBR CHAR(7 BYTE),
  PMT_PAY_DT DATE,
  PMT_POL_NBR CHAR(14 BYTE),
  PMT_POL_SEQ_NBR CHAR(2 BYTE),
  PMT_PRM_INSD_NM VARCHAR2(30 BYTE),
  PMT_PAY_TYP_DESC VARCHAR2(20 BYTE),
  PMT_PAY_AMT NUMBER(11,2),
  PMT_POST_DT DATE
)
COMPRESS FOR ALL OPERATIONS;
Inserting records into the test table:
SQL> insert into SCOTT.TEST_PMT_TAB_COMP select * from SCOTT.TEST_PMT;
5051013 rows created.
SQL> commit;
Commit complete.
SQL> select count(*) from scott.TEST_PMT_TAB_COMP;
COUNT(*)
5051013
Checking size:
SQL> select bytes/1024/1024, segment_name, owner, segment_type, tablespace_name from dba_segments where segment_name='TEST_PMT_TAB_COMP';
BYTES/1024/1024 SEGMENT_NAME OWNER SEGMENT_TYPE TABLESPACE_NAME
776 TEST_PMT_TAB_COMP SCOTT TABLE USERS_DATA01
SQL> select owner, table_name, COMPRESSION, COMPRESS_FOR from dba_tables where table_name='TEST_PMT_TAB_COMP';
OWNER TABLE_NAME COMPRESS COMPRESS_FOR
SCOTT TEST_PMT_TAB_COMP ENABLED OLTP
Now it is occupying more space.
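A hedged suggestion for this test: COMPRESS FOR ALL OPERATIONS (OLTP compression) compresses blocks only as they fill, so a size comparison right after a conventional insert can be misleading. It is worth comparing against a direct-path insert, which compresses rows as they are written, or rebuilding the segment afterwards:
-- Direct-path insert: rows are compressed on the way in
INSERT /*+ APPEND */ INTO SCOTT.TEST_PMT_TAB_COMP
SELECT * FROM SCOTT.TEST_PMT;
COMMIT;
-- Or recompress the rows already in the segment
ALTER TABLE SCOTT.TEST_PMT_TAB_COMP MOVE;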
Real experience with compression
Hi all,
has anyone actually done a 2008 compression exercise yet? How much space did you save, and what were the implications?
We are 'fighting' 6.5TB, growing by 80GB/week and would like to hear from others who have done it yet.
Thanks & Regards,
Andi
Hi,
since I wanted to test the revised Stored Procedure sp_use_db_compression (see above) I installed an IDES SR3 + imported a bunch of SAP Support Packages to closely match our other systems. Then I created the Stored Procedure and ran this statement against the DB (the job finished a couple minutes ago):
sp_use_db_compression PAGE, @maxdop=4, @online='OFF', @data_only='ALL', @verbose=1, @schemas='ALL';
The results were (in GB):
Files - Reserved - Used before Compression - Used after Compression
DATA1 - 64,45 - 62,70 - 32,11
DATA2 - 64,45 - 62,74 - 32,03
DATA3 - 64,45 - 62,84 - 32,14
The overall transaction log volume was about 65 GB (backup every 10 minutes). I used PAGE and index compression, which - I think - is not fully supported yet (check Note 991014). With ROW compression we achieved savings of 25 to 40%. Set up a test run and post your results.
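Before compressing a whole SAP database, the expected savings can be estimated per table with the built-in SQL Server 2008 procedure; the schema and table names below are only examples:
-- Estimate PAGE-compression savings without changing anything
EXEC sp_estimate_data_compression_savings
     @schema_name = 'dbo',
     @object_name = 'MYTABLE',
     @index_id = NULL,
     @partition_number = NULL,
     @data_compression = 'PAGE';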
Sven -
Problem in .rtf template with 2 dimensions in rows section
There is a problem with the standard .rtf template: if we select two dimensions in rows, the table header (with months) moves left to the position of the second column of the row dimension, as if there were only one dimension in rows. (I tested even with 4 dimensions in rows; the header still starts from the position of the second column of the row dimension.) In any case, the default template acts as if there is only one dimension in the rows section when a report is formed. If anybody has already faced this problem, maybe you could share advice on how to fix it?
p.s. We usually use 1 or 2 dimensions in rows.
I have a suggestion that there is a problem with the colspan parameter in the XML file that is built when HP starts creating the PDF report. Could anybody advise where I can find this XML that is used for creating the report?
Edited by: s0uLr1pPeR on 18.05.2012 16:31 -
Local, Testing server, Remote server uploading
I have my web files in a local directory, say c:\webfiles\myweb. I also have a local testing server at c:\Apache\htdocs\myweb, and finally a production site at www.myweb.com. The problem is that when I edit a file in my local directory it updates the testing server and not the production remote, which is what I expect. My problem is that I don't know how to update the production server, as there does not seem to be any menu choice for this. I am sure I am doing something stupid, but I can't figure out what's wrong. Basically I don't quite get how the relationship between the 3 file locations works. Any help would be sincerely appreciated as it's driving me crazy.
Expand the Files panel. On the toolbar, there are three icons in a row - Remote | Testing | Site Map. Select the Remote icon, and you will now be connecting with the Remote site for PUT/GET. The testing server will now only be used for Previews or Live Data displays.
Murray --- ICQ 71997575
Adobe Community Expert
(If you *MUST* email me, don't LAUGH when you do so!)
==================
http://www.dreamweavermx-templates.com - Template Triage!
http://www.projectseven.com/go - DW FAQs, Tutorials & Resources
http://www.dwfaq.com - DW FAQs, Tutorials & Resources
http://www.macromedia.com/support/search/ - Macromedia (MM) Technotes
==================
"johnbreda" <[email protected]> wrote in
message
news:f2sahm$r4g$[email protected]..
>I have my web files in a local directory, say c:
webfiles\myweb. I also
>have a
> local testing server at c:\Apache\htdocs\myweb, and
finally a production
> site
> at www.myweb.com. The problem is that when I edit a file
in my local
> directory
> it updated the testing server and not the production
remote - which is
> what I
> expect. My problem is that I don't know how to update
the production
> server as
> there does not seem to be any menu choice for this. I am
sure I am doing
> something stupid, but I can't figure out whats wrong.
Basically I don't
> quite
> get how the relationship between the 3 file locations
interact with each
> other.
> Any help would be sincerely appreciated as its driving
me crazy.
> -
How can I set alternating colors for table rows
Dear All,
Please can anyone help me set alternating colors for table rows?
I created a theme where I set the alternating background color to brown, and I set the table design property to alternating, but it is not reflected.
Hi,
The design property in Table properties should work for your requirement. Select "alternating" value for design.
Please see the API below:
design
Determines the appearance of the table. The property design can take the following values and is represented by enumeration type WDTableDesign.
alternating - The table rows are displayed alternately in a different color.
standard - The table background has one color. The individual table rows are displayed with grid net lines.
transparent - The table background is transparent. The individual table rows are displayed without grid net lines.
Check whether you have changed the right property. Also, the table should contain more than one row to test this scenario.
Regards,
Jaya.
Edited by: VJR on Jun 17, 2009 6:43 PM -
Update multiple rows & columns using a single statement
I have the following table with table name emp:
first_name   last_name   age
aaa          bbb         31
(null)       (null)      56
(null)       (null)      78
ggg          hhh         36
The 2nd and 3rd rows contain null values (no data) in the first_name and last_name columns. I want to update those two rows with data using a single statement. How do I do it?
I was thinking of maybe something like the following:
UPDATE emp
SET first_name= , last_name=
CASE
WHEN age = 56 THEN 'ccc', 'ddd'
WHEN age = 78 THEN 'eee', 'fff'
ELSE first_name, last_name
END
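A minimal sketch of what the question seems to call for: one simple CASE expression per column, so both columns are updated in a single statement (values taken from the attempt above):
UPDATE emp
SET first_name = CASE age WHEN 56 THEN 'ccc' WHEN 78 THEN 'eee' ELSE first_name END,
    last_name  = CASE age WHEN 56 THEN 'ddd' WHEN 78 THEN 'fff' ELSE last_name END
WHERE age IN (56, 78);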
-----------------------------------------------
Can you give an example of a nested decode statement?
test@ora>
test@ora>
test@ora> --
test@ora> drop table t;
Table dropped.
test@ora> create table t as
2 select rownum x, cast(null as varchar2(10)) y from all_objects
3 where rownum <= 10;
Table created.
test@ora>
test@ora> select x, y from t;
X Y
1
2
3
4
5
6
7
8
9
10
10 rows selected.
test@ora>
test@ora> -- You want to change the values of y to 'a' through 'j' for x = 1 through 10
test@ora> -- and y = 'X' otherwise
test@ora> --
test@ora> -- Let's say the limit on the number of components were 12
test@ora> -- Then your decode statement would've been:
test@ora> --
test@ora> update t
2 set y = decode(x,
3 1, 'a',
4 2, 'b',
5 3, 'c',
6 4, 'd',
7 5, 'e',
8 'f');
10 rows updated.
test@ora>
test@ora>
test@ora> select x, y from t;
X Y
1 a
2 b
3 c
4 d
5 e
6 f
7 f
8 f
9 f
10 f
10 rows selected.
test@ora>
test@ora> -- As you can see you are unable to:
test@ora> --
test@ora> -- change y to 'g' if x = 7
test@ora> -- change y to 'h' if x = 8
test@ora> -- change y to 'i' if x = 9
test@ora> -- change y to 'j' if x = 10
test@ora> -- change y to 'X' otherwise
test@ora> --
test@ora>
test@ora> -- What you would do then is -
test@ora> -- (i) Let the 11 components remain as they are and
test@ora> -- (ii) Introduce a nested decode function *AS THE 12TH COMPONENT*
test@ora> --
test@ora>
test@ora> rollback;
Rollback complete.
test@ora>
test@ora> --
test@ora> update t
2 set y = decode(x,
3 1, 'a',
4 2, 'b',
5 3, 'c',
6 4, 'd',
7 5, 'e',
8 decode(x,
9 6, 'f',
10 7, 'g',
11 8, 'h',
12 9, 'i',
13 10, 'j',
14 'X')
15 );
10 rows updated.
test@ora>
test@ora> select x, y from t;
X Y
1 a
2 b
3 c
4 d
5 e
6 f
7 g
8 h
9 i
10 j
10 rows selected.
test@ora>
test@ora>
HTH
isotope
Extrapolate that to 255 components and you get 254 + 255 = 509 components. And so on...
Message was edited by:
isotope -
Hi all,
I am facing a problem generating the test result Word document after successful execution of TestStand.
The problem is:
I want to add rows formatted as table headings, so that the table headings are repeated when a table spans more than one page (marked in red).
Example:
Page No. 1
| Test case Number | Test Step number |
| 100 | 100 |
Page No. 2
| Test case Number | Test Step number |
| 200 | 300 |
The test result Word document should be generated with the table headings (marked in red) on every page of the document, but I am not getting output like the example above.
Please throw some light on this.
Regards,
Susa.
Hi Santiago,
Thank you very much for your valuable reply.
I want to generate an MS Word report for TestStand after successful testing, using MS Word 2000.
The test report contains actual values, expected values, and pass/fail status.
In my program I have customized all the fields, and I am able to generate a test report which contains the verification engineer name, test mode, test date, start time, end time, actual values, expected values, pass/fail status, etc.
To put all the values (test case number, test step number, actual values, expected values, and pass/fail status) into the table, I insert a row into the table every time values arrive. Once the table exceeds the page size it moves to the next page; the next page should start with the table row header, but it starts with the values of the parameters above.
So I am not able to repeat the table row header on each page.
Please find the attached files for your reference.
Attached file expected.doc: this file shows the MS Word report I want to generate. Here the table row header "Test Case Number and Test Step Number" is repeated on the second page.
Attached file "Actual output from source code.doc": this report was generated from the source code. Here the table row header "Test Case Number and Test Step Number" is not repeated on the second page.
Do you know of any property to set "repeat as header row at the top of each page" using MS Word ActiveX in LabWindows/CVI?
I think this information is sufficient; still, if you need more information, please ask me.
Thanks
Susa.
Attachments:
Actual output from source code.doc 25 KB
expected.doc 26 KB