Excluding tables in user export
Does anybody know of a way to exclude certain tables from a user (schema) level export in 9i? I know I can use pattern matching to export certain tables, but if a schema has 100 tables, and I wanted to exclude 3 of them from exporting how could I do this (without listing the other 97 in a par file)?
I am hoping to find a way to exclude certain tables instead of resorting to listing all of them out directly. I am working on a project to migrate this database from one data center to another and upgrade it to 10g. This is a dev database, so tables could be added or dropped in the next couple of months, and I didn't want to miss any tables that were added. The tables I want to exclude are very large, and I am exporting them separately to keep the dmp file down to a manageable size (even after gzipping it).
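One workaround, since classic exp has no EXCLUDE parameter: generate the TABLES clause of the par file from the data dictionary just before each export, so newly added tables are picked up automatically. A minimal sketch, assuming the full table list has been spooled from `select table_name from user_tables` into a text file (the function name and file names are hypothetical):

```shell
#!/bin/sh
# build_tables_line ALL_TABLES_FILE EXCLUDE_FILE
# Reads one table name per line from ALL_TABLES_FILE, drops every name
# listed (one per line) in EXCLUDE_FILE, and prints a tables=(...) line
# ready to be appended to an exp par file.
build_tables_line() {
  grep -v -x -F -f "$2" "$1" | paste -sd, - | sed 's/^/tables=(/; s/$/)/'
}
```

Write the three large table names into the exclude file, append the emitted line to the par file, then run exp with parfile= as usual; the big tables can still be exported separately afterwards.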
Similar Messages
-
How to exclude tables while doing import in Traditional import/export
Hi OTN,
Please advise: when using Oracle Data Pump we can exclude tables as I mentioned below. If I use traditional export/import, how can I exclude tables?
impdp tech_test10/[email protected] directory=EAMS04_DP_DIR dumpfile=EXP_DEV6_05092012.dmp logfile=IMP_TECH_TEST10_29MAY.log REMAP_SCHEMA=DEV6:TECH_TEST10 remap_tablespace=DEV6_DATA:DATA_TECH_TEST10 remap_tablespace=DEV6_INDX:INDEX_TECH_TEST10 EXCLUDE=TABLE:\"IN \(\'R5WSMESSAGES_ARCHIVE\',\'R5WSMESSAGES\',\'R5AUDVALUES\'\)\"
Your suggestions will help me find the way.
You cannot exclude tables with the traditional utilities, but you can use the TABLES parameter to list all the tables that need to be exported.
exp scott/tiger file=emp.dmp tables=(emp,dept)
similarly for import
Edited by: Kiran on Jul 18, 2012 2:20 AM -
Exporting while excluding table
Hi!
I am trying to export a schema and exclude 2 tables. I am working on 10g on RHEL5.
I am using
expdp schemas=BARRY exclude=TABLE:"IN ('TBL1','TBL2')" dumpfile=barry.dmp logfile=barry.log
and I get the following errors:
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, Real Application Clusters, OLAP, Data Mining
and Real Application Testing options
ORA-39001: invalid argument value
ORA-39071: Value for EXCLUDE is badly formed.
ORA-00936: missing expression
Where am I going wrong?
Thanks Satish!
I created the parameter file and just moved all the attributes to that. It worked like a charm.
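For reference, the command-line failure is typically the shell stripping the double quotes around the IN list; in a parameter file no escaping is needed at all. A sketch of such a parfile, using the names from the post:

```
# barry.par - sketch; file names from the post
schemas=BARRY
dumpfile=barry.dmp
logfile=barry.log
exclude=TABLE:"IN ('TBL1','TBL2')"
```

Then run expdp with parfile=barry.par.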
Aman!
Thanks for your input. I had to use IN as I have to exclude two tables.
Anyway, thanks for your advice! -
EXPORT at schema level, but exclude some tables within the export
I have been searching, but had no luck in finding the correct syntax for my situation.
I'm simply trying to export at the schema level, but I want to omit certain tables from the export.
exp cltest/cltest01@clprod file=exp_CLPROD092508.dmp log=exp_CLPROD092508.log statistics=none compress=N
Thanks!
Hi,
Think simple first: you can use the TABLES clause.
Example.
exp scott/tiger file=empdept.expdat tables=(EMP,DEPT) log=empdept.log
That works if your schema contains only a small number of tables.
If you have a large number of tables (hundreds in the schema), an alternative workaround:
Create a new schema and, using CTAS, copy the tables you want to skip into it, so they can be removed from the current schema before the export.
Do the export, and once the job is done, recreate the skipped tables from the new schema
and import them into the destination DB.
- Pavan Kumar N -
Note:304522.1 How to Move Queue Tables without using export import
Trying to use the pkg available in Metalink "Note:304522.1 How to Move Queue Tables without using export import"
Using the 10.1.0.x and upwards Package, I'm getting the following error on a single consumer queue table with an xmltype payload:
SQL> exec move_aqt.move('XFORM_TEST_INT','INTERFACE_XML_QUEUE','SMALLBLOCK');
BEGIN move_aqt.move('XFORM_TEST_INT','INTERFACE_XML_QUEUE','SMALLBLOCK'); END;
ERROR at line 1:
ORA-20000: ORA-06502: PL/SQL: numeric or value error
ORA-06512: at "SYS.MOVE_AQT", line 652
ORA-06512: at line 1
We've tried in multiple environments, always with the same results.
Trace file shows:
*** 2006-11-08 10:06:47.154
*** SERVICE NAME:(SYS$USERS) 2006-11-08 10:06:47.147
*** SESSION ID:(379.954) 2006-11-08 10:06:47.147
qtable move procedure starting execution at 08-11-2006 10:06:47 for queue table XFORM_TEST_INT.INTERFACE_XML_QUEUE
qtable move procedure experienced an exception at 08-11-2006 10:06:47
qtable move error message ORA-06502: PL/SQL: numeric or value error
qtable move procedure ended execution at 08-11-2006 10:06:47
Can anyone help with this? Has anyone used this before, successfully or not? We urgently need this working today so we can test moving our queue table into a tablespace with a smaller block size, for performance reasons in production.
Thanks for the help!
Tony
Thank you,
Yes we've done that. They've confirmed a problem with the links/scripts on the note. The 10.1 and up version was not really 10.1 and up.
As they would not have a new version available in time for the move we needed to perform, we switched our approach to using dbms_redefinition instead.
Thanks for the reply,
Tony -
Hi Friends,
I need to exclude two tables in impdp; please help me with this.
impdp system directory=exportdump tables=SYNC.cin_document query=SYNC.CIN_DOCUMENT:'"where rownum<10"' dumpfile=satexpdpdump.dmp logfile=1.log CONTENT=DATA_ONLY EXCLUDE=TABLE:"SYNC.CIN_DETAIL,SYNC.CIN_DOCUMENT_GTIN"
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39071: Value for EXCLUDE is badly formed.
ORA-00920: invalid relational operator
Thanks friends, I got the solution.
Solution:
Instead of excluding, I mentioned only one table name in the tables=SYNC.cin_document parameter and removed the EXCLUDE option. It is working fine.
The following command is what I used:
impdp system directory=exportdump tables=SYNC.cin_document query=SYNC.CIN_DOCUMENT:'"where rownum<2570000"' dumpfile=satexpdpdump.dmp logfile=2570000d000gih.log CONTENT=DATA_ONLY
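For completeness, the original EXCLUDE attempt was rejected because EXCLUDE=TABLE expects a name expression, not a comma-separated list of schema-qualified names. A sketch of how the parameter could look in a parfile (values taken from the post; note the unqualified names inside the IN list):

```
# sketch of an impdp parameter file; values from the post
directory=exportdump
dumpfile=satexpdpdump.dmp
logfile=1.log
content=DATA_ONLY
exclude=TABLE:"IN ('CIN_DETAIL','CIN_DOCUMENT_GTIN')"
```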
For your reference.:
The following scenario shows three tables exported under SCOTT, with only two of them imported afterwards. Output follows.
step1:
=============
create directory data_pump_dir as '/u01/oracle/my_dump_dir';
grant read,write on directory data_pump_dir to system;
expdp system/sys DIRECTORY=data_pump_dir DUMPFILE=tables.dmp logfile=table.log TABLES=SCOTT.DEPT,SCOTT.EMP,SCOTT.BONUS
output:
[oracle@linux1 ~]$ expdp system/sys DIRECTORY=data_pump_dir DUMPFILE=tables.dmp logfile=table.log TABLES=SCOTT.DEPT,SCOTT.EMP,SCOTT.BONUS
Export: Release 10.2.0.1.0 - Production on Tuesday, 10 May, 2011 19:40:00
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "SYSTEM"."SYS_EXPORT_TABLE_01": system/******** DIRECTORY=data_pump_dir DUMPFILE=tables.dmp logfile=table.log TABLES=SCOTT.DEPT,SCOTT.EMP,SCOTT.BONUS
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 128 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
. . exported "SCOTT"."DEPT" 5.656 KB 4 rows
. . exported "SCOTT"."EMP" 7.820 KB 14 rows
. . exported "SCOTT"."BONUS" 0 KB 0 rows
Master table "SYSTEM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
Dump file set for SYSTEM.SYS_EXPORT_TABLE_01 is:
/u01/app/oracle/product/10.2.0/db_1/rdbms/log/tables.dmp
Job "SYSTEM"."SYS_EXPORT_TABLE_01" successfully completed at 19:40:22
[oracle@linux1 ~]$
Step 2:
SQL> alter table DEPT rename to dept1;
Table altered.
SQL> alter table EMP rename to emp1;
Table altered.
SQL> select * from tab;
TNAME TABTYPE CLUSTERID
BONUS TABLE
SALGRADE TABLE
EMP1 TABLE
DEPT1 TABLE
==============================================================================
[oracle@linux1 ~]$ impdp system/sys DIRECTORY=data_pump_dir DUMPFILE=tables.dmp TABLES=SCOTT.DEPT,SCOTT.EMP
output:
Import: Release 10.2.0.1.0 - Production on Tuesday, 10 May, 2011 19:48:51
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_TABLE_01": system/******** DIRECTORY=data_pump_dir DUMPFILE=tables.dmp TABLES=SCOTT.DEPT,SCOTT.EMP
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "SCOTT"."DEPT" 5.656 KB 4 rows
. . imported "SCOTT"."EMP" 7.820 KB 14 rows
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
ORA-31684: Object type INDEX:"SCOTT"."PK_DEPT" already exists
ORA-31684: Object type INDEX:"SCOTT"."PK_EMP" already exists
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
ORA-31684: Object type CONSTRAINT:"SCOTT"."PK_DEPT" already exists
ORA-31684: Object type CONSTRAINT:"SCOTT"."PK_EMP" already exists
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
ORA-39083: Object type REF_CONSTRAINT failed to create with error:
ORA-02270: no matching unique or primary key for this column-list
Failing sql is:
ALTER TABLE "SCOTT"."EMP" ADD CONSTRAINT "FK_DEPTNO" FOREIGN KEY ("DEPTNO") REFERENCES "SCOTT"."DEPT" ("DEPTNO") ENABLE
Job "SYSTEM"."SYS_IMPORT_TABLE_01" completed with 5 error(s) at 19:48:54
[oracle@linux1 ~]$
SQL> select * from tab;
TNAME TABTYPE CLUSTERID
BONUS TABLE
SALGRADE TABLE
EMP1 TABLE
DEPT1 TABLE
DEPT TABLE
EMP TABLE
6 rows selected.
SQL> -
Shell creation: error in step "Modify Control Files with Exclude Table Information"
I am trying to create a shell using TDMS, and I am getting the below error while executing the step "Modify Control Files with Exclude Table Information":
No such file or directory. Message no.: CNV_TDMS_13_SHELL000
I am using an NFS common mount as the install directory between CI and DB. Please can you help me?
Thanks
Ramesh
It appears that the directory for shell installation is not accessible from the CI (TDMS execution server). Do the following -
Copy the entire Shell installation directory (having TPL files) temporarily to the CI app. server and maintain the path of this directory in the TDMS package.
Then execute the activity which is currently failing. Most probably it will finish successfully.
Now copy the DDLORA.TPL file from this directory to the original directory on the DB app server and continue with your exports.
I hope this helps. -
Exclude tables list option in exp/imp?
Hi,
Is there any option to exclude tables during exp/imp, and if so, from which version?
Thanks.
Actually, in 10.1 and later, the Data Pump versions of export and import allow you to specify an EXCLUDE parameter to exclude one or more tables.
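When EXCLUDE is given on the command line rather than in a parfile, the quotes and parentheses must be escaped for the shell. A sketch (schema and table names are placeholders):

```
# Data Pump schema-level export excluding two tables (10.1 and later).
# Special characters escaped for a POSIX shell.
expdp scott/tiger schemas=SCOTT dumpfile=scott.dmp logfile=scott.log \
  exclude=TABLE:\"IN \(\'BIG_TAB1\',\'BIG_TAB2\'\)\"
```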
Justin
Distributed Database Consulting, Inc.
http://www.ddbcinc.com/askDDBC -
How to exclude schema name from exported files (PL SQL Developer)
Dear all,
Just one question: I am using PL/SQL Developer. My goal is to export some data (as .sql and .dmp files) from one database and import it into another database (both databases have an identical structure - test and production - just different database names and schema names). To make this possible, I thought I needed to exclude the schema name from the generated export file, and I believe it is possible to do this automatically by setting PL/SQL Developer parameters. How?
Thank you in advance,
Kindest regards,
Dragana
In the meantime, I have found the answer to my own question:
Actually, the initial idea (excluding the schema name from the exported files) was wrong. No intervention is needed.
The trick is: the schema name can be changed during the import of the exported files (during import, PL/SQL Developer offers "From User" (old schema) and "To User" (new schema)).
Hope that this will be useful info for others.
Dragana -
How to bind the data from user table into user report
Hi All,
Please assist me in binding the data from a user table into a user report. I created a user table with data and a user report template (using Query Print Layout). How can I display my data in the report format which I created before? Is there any sample program or document I can refer to?
Platform: SAPB1 2005A
Add On Language: VB.Net 2003
Thanks.
rgds
ERIC
Hi Ibai,
Thanks for your feedback. I'll give you an example.
Let's say I want to print the employee list, so I would go:
1. Main Menu -> Reports -> HR -> Employee List
2. Choose the Selection Criteria -> OK
3. Matrix will display (Employee List)
4. I can print the report by clicking on the print button
5. Printing report
My target
1. Main Menu -> Eric_SubMenu -> Employee List
2. Matrix will display (Employee List)
3. Print button
4. Print report
My problem
Now I would like to use my own report format; that means I want to add my logo or do some customization within the employee report. How am I going to do that? I am only able to display the employee list in a matrix. How do I create a new report format and display it?
Thanks.
rgds
ERIC -
In which table deleted user information is stored
Hi all,
I have created a user ZTEST in SAP through SU01. Its details are stored in USR01.
When I deleted this user, its details were deleted from the table USR01.
After deletion, in which table is the deleted user's information stored?
Is any BAPI available which gives the list of deleted users?
Thanks & regards
Hi
You can get the current database status using the following BAPIs:
BAPI_USER_EXISTENCE_CHECK
BAPI_USER_GETLIST
BAPI_USER_GET_DETAIL
Also check the report RPUAUD00 in which you can find out new infotype creation/modification etc.
Regards -
Hi All,
Does anyone know the Table for User Parameter IDs?
Thanks in advance
Hi Duke,
USR01 User master record (runtime data)
USR02 Logon data
USR03 User address data
USR04 User master authorizations
USR05 User Master Parameter ID
Regards,
Ashok -
In which table deleted user list is stored
Hi all,
I have created a user ZTEST in SAP through SU01. Its details are stored in USR01.
When I deleted this user, its details were deleted from the table USR01.
After deletion, in which table is the deleted user's information stored?
Is any BAPI available which gives the list of deleted users?
Thanks & regards
Hi Lolo,
There is no table available in SAP to view deleted user information, but you can connect to SQL and look there.
I think the command "SQL> show recyclebin;" will show you the deleted user information.
Regards,
Mahesh Gattu -
How does an FCP7 user export a QT movie to uncompressed DNxHD AVI format?
Dear sir or madam,
How does an FCP7 user export or convert a QuickTime .mov to uncompressed AVI format?
Thank you.
Not in Classic. Use this forum guide to learn where to post:
https://discussions.apple.com/docs/DOC-2463 -
Invalid table name "USERS" specified at position 21
Hi, I'm getting this error when I try to execute a SQL statement like this:
* @jc:sql statement="SELECT username FROM users WHERE username = {user}"
public String checkLogin(String user) throws SQLException;
and using it:
try {
    String answer = shop.checkLogin(message1);
} catch (SQLException ex) {
    ex.printStackTrace();
}
the full exception is here:
java.sql.SQLException: Invalid table name "USERS" specified at position 21.
        at com.pointbase.net.netJDBCPrimitives.handleResponse(Unknown Source)
        at com.pointbase.net.netJDBCPrimitives.handleJDBCObjectResponse(Unknown Source)
        at com.pointbase.net.netJDBCConnection.prepareStatement(Unknown Source)
        at weblogic.jdbc.common.internal.ConnectionEnv.makeStatement(ConnectionEnv.java:1190)
        at weblogic.jdbc.common.internal.ConnectionEnv.getCachedStatement(ConnectionEnv.java:932)
        at weblogic.jdbc.wrapper.Connection.prepareStatement(Connection.java:359)
        at weblogic.jdbc.wrapper.JTSConnection.prepareStatement(JTSConnection.java:544)
        at com.bea.wlw.runtime.core.control.DatabaseControlImpl.getStatement_v2(DatabaseControlImpl.jcs:1676)
        at com.bea.wlw.runtime.core.control.DatabaseControlImpl.invoke(DatabaseControlImpl.jcs:2567)
        ... (WebLogic Workshop dispatcher and container frames omitted) ...
        at shop.ShopController.Login(ShopController.jpf:111)
Can someone tell me why this is happening? I'm getting this error even though the MySQL server is shut down.
Thanks
Is the USERS table in the PointBase DB or in SQL Server? From the exception it looks like it is looking in the PointBase DB for the USERS table. Check the @jc:connection data-source-jndi-name of your database control.