Data Pump API -- how to exclude constraints?

Hi,
We wrote a PL/SQL procedure using the Data Pump API to perform a schema export, as in http://www.oracle-base.com/articles/10g/OracleDataPump10g.php#DataPumpAPI
We would like our export to exclude constraints and indexes, but I can't find an example of what the syntax would be.
Could anybody please give me an example of what this metadata_filter would look like?
Thanks,
Nora

In DBMS_DATAPUMP.metadata_filter, use the "name" parameter with EXCLUDE_PATH_LIST or EXCLUDE_PATH_EXPR.
Check the following link for details:
http://download-west.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_datpmp.htm#sthref2182
Virag
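
A minimal sketch of what such a filter call might look like (not from the thread; the handle, job and file names are placeholders, and the exact object path names for your version can be checked in the DATABASE_EXPORT_OBJECTS / SCHEMA_EXPORT_OBJECTS views):

DECLARE
  l_dp_handle NUMBER;  -- handle returned by DBMS_DATAPUMP.OPEN (placeholder name)
BEGIN
  l_dp_handle := DBMS_DATAPUMP.OPEN('EXPORT', 'SCHEMA', NULL, 'EXP_NO_CONS_IDX');
  DBMS_DATAPUMP.ADD_FILE(l_dp_handle, 'schema_no_cons_idx.dmp', 'DATA_PUMP_DIR');
  -- Exclude constraints and indexes from the exported metadata.
  DBMS_DATAPUMP.METADATA_FILTER(
    handle => l_dp_handle,
    name   => 'EXCLUDE_PATH_EXPR',
    value  => 'IN (''CONSTRAINT'', ''REF_CONSTRAINT'', ''INDEX'')');
  DBMS_DATAPUMP.START_JOB(l_dp_handle);
  DBMS_DATAPUMP.DETACH(l_dp_handle);
END;
/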

Similar Messages

  • How to exclude statistics using the Data Pump API?

    How to exclude all statistics while exporting data using Oracle Data Pump API (DBMS_DATAPUMP package)?

    You would call the metadata filter API like this:
    dbms_datapump.METADATA_FILTER(
      handle => your_handle_here,
      name   => 'EXCLUDE_PATH_LIST',
      value  => 'STATISTICS');
    Hope this helps.
    Dean

  • How to select tables when importing using the Data Pump API

    Hi,
    Sorry for the trivial question. I export the data using the Data Pump API in "TABLE" mode,
    so all tables are exported into one .dmp file.
    My question is: how do I then import only a few tables using the Data Pump API? How do I define the "TABLES" property like in the command-line interface?
    Should I use the DATA_FILTER procedures? If yes, how?
    Really, thanks in advance.
    Regards,
    Kahlil

    Hi,
    You should use the metadata_filter procedure for this.
    e.g.:
    dbms_datapump.metadata_filter
                (handle1
                 ,'NAME_EXPR'
                 ,'IN (''TABLE1'', ''TABLE2'')'
                );
    Regards
    Anurag

  • Oracle Export Data pump API - job already exists

    Hi all,
    I use a stored procedure with the Data Pump API for export and import. I use the simple schema export from http://download.oracle.com/docs/cd/B14117_01/server.101/b10825/dp_api.htm . It works on the first execution, but on the second run
    I get the error ORA-31634: job already exists.
    After the first run I checked the job state in the DBA_DATAPUMP_JOBS master table, and it showed DEFINING after the process completed.
    Why does this happen?
    Really thanks in advance.
    Regards,
    Mr.K

    My questions go in this direction:
    1. Did I do something wrong with the Data Pump utility?
    I do not think so, but maybe something is wrong with the existing schema :-(
    2. What is the correct way to export and import sequences so that they keep their actual values?
    If the sequences exist in the target before the import, Oracle does not drop and recreate them. So you need to ensure that the sequences do not already exist in the target, or that the existing ones are dropped before the import.
    3. If the behaviour described here is correct, how can I correct the values so that they start again from the last value that was used in the source database?
    You can either refresh with the import after the above correction, or drop and manually recreate the sequences to START WITH the next value of the source sequences.
    The easier way is to generate a script from the source, if you know how to do it.
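
    A minimal sketch of such a script generator (not from the thread; run it on the source, the schema name is a placeholder, and for cached sequences LAST_NUMBER reflects the cache high-water mark, which is safe because it is never lower than the current value):

    SELECT 'CREATE SEQUENCE ' || sequence_owner || '.' || sequence_name ||
           ' START WITH ' || (last_number + increment_by) ||
           ' INCREMENT BY ' || increment_by || ';' AS ddl
    FROM   dba_sequences
    WHERE  sequence_owner = 'SCOTT';  -- placeholder schema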

  • Data Pump API Question

    Hi,
    Is there a way to use the Data Pump API to export tables from multiple schemas in the same job? I can't figure out what the filters would be. It seems like I can either specify many tables from 1 schema only, or I can specify multiple schemas but not limit the tables I want to export.
    I keep running into this error: ORA-31655: no data or metadata objects selected for job
    I'd like to do something like this:
    --METADATA FILTER: SPECIFY TABLES TO EXPORT
    dbms_datapump.metadata_filter(
      handle => hdl,
      name   => 'NAME_EXPR',
      value  => 'IN(''schema1.table1'',''schema2.table2'')');
    This does not seem to be possible.
    Any help would be appreciated.
    Thanks,
    Nora

    A user that has the EXP_FULL_DATABASE role should be able to do what you want.
    Search here for that role: http://students.kiv.zcu.cz/doc/oracle/server.102/b14215/dp_export.htm#i1007837
    It seems like you could do what you want by using that role in
    combination with the EXCLUDE and INCLUDE parameters: http://students.kiv.zcu.cz/doc/oracle/server.102/b14215/dp_export.htm#i1009903
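
    One pattern that is sometimes used for this (not from the thread, so treat it as a sketch to test; job, file, schema and table names are placeholders): open a TABLE-mode job as a user with EXP_FULL_DATABASE and combine a SCHEMA_EXPR filter with a NAME_EXPR filter. Note that the two filters combine independently, so each listed table name is picked up in every listed schema where it exists:

    DECLARE
      hdl NUMBER;
    BEGIN
      hdl := DBMS_DATAPUMP.OPEN('EXPORT', 'TABLE', NULL, 'MULTI_SCHEMA_TAB_EXP');
      DBMS_DATAPUMP.ADD_FILE(hdl, 'multi_schema_tables.dmp', 'DATA_PUMP_DIR');
      -- Restrict the job to the wanted schemas, then to the wanted table names.
      DBMS_DATAPUMP.METADATA_FILTER(hdl, 'SCHEMA_EXPR', 'IN (''SCHEMA1'',''SCHEMA2'')');
      DBMS_DATAPUMP.METADATA_FILTER(hdl, 'NAME_EXPR', 'IN (''TABLE1'',''TABLE2'')');
      DBMS_DATAPUMP.START_JOB(hdl);
      DBMS_DATAPUMP.DETACH(hdl);
    END;
    /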

  • While using data pump (impdp) how to rename references within objects?

    using 10g;
    What I want to accomplish is to change schema & tablespace ownership using the Data Pump method via the command line; I have had success using the command line for expdp / impdp. The problem is that there are objects that reference the old schemas that DO NOT get updated (e.g. a procedure may reference usr1.table1 in a PL/SQL statement), and this is where I have been unsuccessful. Does anyone know of a way to change references from the old schema to the new schema name in objects (procedures, views, etc.) via the command line?
    This is what I currently use; it works to change the schema and tablespace, but will not change references within my objects:
    expdp system/<pass> schemas=usr1,usr2 DIRECTORY=dp_dir DUMPFILE=dataPump_BothSchemas.dmp LOGFILE=expdpAllSchema.log parallel=2
    impdp system/<pass> DIRECTORY=dp_dir DUMPFILE=dataPump_BothSchemas.dmp LOGFILE=impbothSchToEE.log remap_schema=usr1:newUsr1,usr2:newUsr2 remap_tablespace=old_ts_tables:new_ts_tables full=y
    Thanks!
    P.S. I have accomplished this using Enterprise Manager.

    (e.g. procedure may reference usr1.table1 in the PL/SQL statement)
    If you hard-coded such references in a stored procedure, you have to correct them manually. Consider using synonyms if your stored procedures reference other schemas' objects.
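
    A hedged sketch of the synonym idea (not from the thread; schema and table names are placeholders): the PL/SQL code references an unqualified name, and the synonym decides which schema it resolves to, so a later remap only needs the synonym redefined rather than the PL/SQL source edited.

    -- In the schema that owns the procedure, point TABLE1 at the real owner.
    CREATE OR REPLACE SYNONYM newusr2.table1 FOR newusr1.table1;
    -- The procedure then refers to just TABLE1 instead of usr1.table1.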

  • DATA PUMP API returning ORA-31655

    Hello Gurus,
    I am using the code below to import a table (EMP) from one database to another over a network link.
    set serveroutput on;
    DECLARE
      ind NUMBER;              -- Loop index
      spos NUMBER;             -- String starting position
      slen NUMBER;             -- String length for output
      h1 NUMBER;               -- Data Pump job handle
      percent_done NUMBER;     -- Percentage of job complete
      job_state VARCHAR2(30);  -- To keep track of job state
      le ku$_LogEntry;         -- For WIP and error messages
      js ku$_JobStatus;        -- The job status from get_status
      jd ku$_JobDesc;          -- The job description from get_status
      sts ku$_Status;          -- The status object returned by get_status
    BEGIN
      h1 := DBMS_DATAPUMP.OPEN('IMPORT','TABLE','DBLINK',NULL,'LATEST');
      DBMS_DATAPUMP.METADATA_FILTER(h1,'NAME_EXPR','IN (''SCOTT.EMP'')','TABLE');
      DBMS_DATAPUMP.SET_PARAMETER(h1,'TABLE_EXISTS_ACTION','REPLACE');
      DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_SCHEMA','SCOTT','SCOTT');
      --DBMS_DATAPUMP.METADATA_FILTER(h1,'INCLUDE_PATH_LIST','like''TABLE''');
      DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_TABLESPACE','USERS','USERS');
      DBMS_DATAPUMP.SET_PARALLEL(h1,8);
      begin
        dbms_datapump.start_job(h1);
        dbms_output.put_line('Data Pump job started successfully');
      exception
        when others then
          if sqlcode = dbms_datapump.success_with_info_num
          then
            dbms_output.put_line('Data Pump job started with info available:');
            dbms_datapump.get_status(h1,
                                     dbms_datapump.ku$_status_job_error,0,
                                     job_state,sts);
            if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
            then
              le := sts.error;
              if le is not null
              then
                ind := le.FIRST;
                while ind is not null loop
                  dbms_output.put_line(le(ind).LogText);
                  ind := le.NEXT(ind);
                end loop;
              end if;
            end if;
          else
            raise;
          end if;
      end;
      -- The import job should now be running. In the following loop, we will
      -- monitor the job until it completes. In the meantime, progress
      -- information is displayed.
      percent_done := 0;
      job_state := 'UNDEFINED';
      while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
        dbms_datapump.get_status(h1,
                                 dbms_datapump.ku$_status_job_error +
                                 dbms_datapump.ku$_status_job_status +
                                 dbms_datapump.ku$_status_wip,-1,job_state,sts);
        js := sts.job_status;
        -- If the percentage done changed, display the new value.
        if js.percent_done != percent_done
        then
          dbms_output.put_line('*** Job percent done = ' ||
                               to_char(js.percent_done));
          percent_done := js.percent_done;
        end if;
        -- Display any work-in-progress (WIP) or error messages that were
        -- received for the job.
        if (bitand(sts.mask,dbms_datapump.ku$_status_wip) != 0)
        then
          le := sts.wip;
        else
          if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
          then
            le := sts.error;
          else
            le := null;
          end if;
        end if;
        if le is not null
        then
          ind := le.FIRST;
          while ind is not null loop
            dbms_output.put_line(le(ind).LogText);
            ind := le.NEXT(ind);
          end loop;
        end if;
      end loop;
      -- Indicate that the job finished and detach from it.
      dbms_output.put_line('Job has completed');
      dbms_output.put_line('Final job state = ' || job_state);
      dbms_datapump.detach(h1);
    -- Any exceptions that propagated to this point will be captured. The
    -- details will be retrieved from get_status and displayed.
    exception
      when others then
        dbms_output.put_line('Exception in Data Pump job');
        dbms_datapump.get_status(h1,dbms_datapump.ku$_status_job_error,0,
                                 job_state,sts);
        if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
        then
          le := sts.error;
          if le is not null
          then
            ind := le.FIRST;
            while ind is not null loop
              spos := 1;
              slen := length(le(ind).LogText);
              if slen > 255
              then
                slen := 255;
              end if;
              while slen > 0 loop
                dbms_output.put_line(substr(le(ind).LogText,spos,slen));
                spos := spos + 255;
                slen := length(le(ind).LogText) + 1 - spos;
              end loop;
              ind := le.NEXT(ind);
            end loop;
          end if;
        end if;
    END;
    When I run the same code with the mode changed for a SCHEMA-level import, it works fine.
    But when I want to import a single table it gives the error below:
    Data Pump job started successfully
    Starting "SCOTT"."SYS_IMPORT_TABLE_06":
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    ORA-31655: no data or metadata objects selected for job
    *** Job percent done = 100
    Job "SCOTT"."SYS_IMPORT_TABLE_06" completed with 1 error(s) at 20:56:23
    Job has completed
    Final job state = COMPLETED
    Can you give any suggestions on this error?
    Thanks in advance

    Did you check this link?
    http://arjudba.blogspot.com/2009/01/ora-31655-no-data-or-metadata-objects.html
    Does it provide any help?
    Regards.
    Satyaki De.
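
    For what it is worth, in a TABLE-mode job the NAME_EXPR filter is normally matched against table names only, with the schema restricted by a separate SCHEMA_EXPR filter, so passing 'IN (''SCOTT.EMP'')' can select nothing and raise ORA-31655. A hedged sketch of the two filter calls (my assumption, not from the thread; test on your version):

    -- Filter by schema and by table name separately instead of 'SCOTT.EMP'.
    DBMS_DATAPUMP.METADATA_FILTER(h1,'SCHEMA_EXPR','IN (''SCOTT'')');
    DBMS_DATAPUMP.METADATA_FILTER(h1,'NAME_EXPR','IN (''EMP'')','TABLE');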

  • Help needed with Export Data Pump using API

    Hi All,
    I am trying to do a Data Pump export using the API.
    While the export as well as the import works fine from the command line, it is failing with the API.
    This is the command line program:
    expdp pxperf/dba@APPN QUERY=dev_pool_data:\"WHERE TIME_NUM > 1204884480100\" DUMPFILE=EXP_DEV.dmp tables=PXPERF.dev_pool_data
    Could you help me with how I should achieve the same as above with the Oracle Data Pump API?
    DECLARE
      h1 NUMBER;
    BEGIN
      h1 := dbms_datapump.open('EXPORT','TABLE',NULL,'DP_EXAMPLE10','LATEST');
      dbms_datapump.add_file(h1,'example3.dmp','DATA_PUMP_TEST',NULL,1);
      dbms_datapump.add_file(h1,'example3_dump.log','DATA_PUMP_TEST',NULL,3);
      dbms_datapump.metadata_filter(h1,'NAME_LIST','(''DEV_POOL_DATA'')');
    END;
    Also, in the API I want to know how to export and import multiple tables (selective tables only) using one single criterion like "WHERE TIME_NUM > 1204884480100".

    Yes, I have read the Oracle doc.
    I was able to proceed as below, but it gives an error.
    ============================================================
    SQL> SET SERVEROUTPUT ON SIZE 1000000
    SQL> DECLARE
    2 l_dp_handle NUMBER;
    3 l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    4 l_job_state VARCHAR2(30) := 'UNDEFINED';
    5 l_sts KU$_STATUS;
    6 BEGIN
    7 l_dp_handle := DBMS_DATAPUMP.open(
    8 operation => 'EXPORT',
    9 job_mode => 'TABLE',
    10 remote_link => NULL,
    11 job_name => '1835_XP_EXPORT',
    12 version => 'LATEST');
    13
    14 DBMS_DATAPUMP.add_file(
    15 handle => l_dp_handle,
    16 filename => 'x1835_XP_EXPORT.dmp',
    17 directory => 'DATA_PUMP_DIR');
    18
    19 DBMS_DATAPUMP.add_file(
    20 handle => l_dp_handle,
    21 filename => 'x1835_XP_EXPORT.log',
    22 directory => 'DATA_PUMP_DIR',
    23 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    24
    25 DBMS_DATAPUMP.data_filter(
    26 handle => l_dp_handle,
    27 name => 'SUBQUERY',
    28 value => '(where "XP_TIME_NUM > 1204884480100")',
    29 table_name => 'ldev_perf_data',
    30 schema_name => 'XPSLPERF'
    31 );
    32
    33 DBMS_DATAPUMP.start_job(l_dp_handle);
    34
    35 DBMS_DATAPUMP.detach(l_dp_handle);
    36 END;
    37 /
    DECLARE
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3688
    ORA-06512: at line 25
    ============================================================
    I have a table called LDEV_PERF_DATA and it is in schema XPSLPERF.
    value => '(where "XP_TIME_NUM > 1204884480100")'
    The above is the condition on which I want to filter the data.
    However, the below snippet works fine.
    ============================================================
    SET SERVEROUTPUT ON SIZE 1000000
    DECLARE
      l_dp_handle NUMBER;
      l_last_job_state VARCHAR2(30) := 'UNDEFINED';
      l_job_state VARCHAR2(30) := 'UNDEFINED';
      l_sts KU$_STATUS;
    BEGIN
      l_dp_handle := DBMS_DATAPUMP.open(
        operation   => 'EXPORT',
        job_mode    => 'SCHEMA',
        remote_link => NULL,
        job_name    => 'ldev_may20',
        version     => 'LATEST');
      DBMS_DATAPUMP.add_file(
        handle    => l_dp_handle,
        filename  => 'ldev_may20.dmp',
        directory => 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.add_file(
        handle    => l_dp_handle,
        filename  => 'ldev_may20.log',
        directory => 'DATA_PUMP_DIR',
        filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      DBMS_DATAPUMP.start_job(l_dp_handle);
      DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    ============================================================
    I don't want to export all contents as above; I want to export data based on some conditions and only for selected tables.
    Any help is highly appreciated.
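
    For what it is worth, the ORA-39001 in the failing block most likely comes from the SUBQUERY value: the double quotes turn the predicate into a quoted identifier, and the table name is passed in lowercase. A hedged sketch of the corrected call (names taken from the post; the exact formatting is my assumption):

    -- Plain WHERE clause, no double quotes, uppercase table name.
    DBMS_DATAPUMP.data_filter(
      handle      => l_dp_handle,
      name        => 'SUBQUERY',
      value       => 'WHERE XP_TIME_NUM > 1204884480100',
      table_name  => 'LDEV_PERF_DATA',
      schema_name => 'XPSLPERF');

    To apply the same condition to several selected tables, repeat the DATA_FILTER call once per table.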

  • How can I use the Data Pump export from an external client?

    I am trying to export a bunch of tables from a DB but I can't figure out how to do it.
    I don't have access to a shell terminal on the server itself; I can only log in using TOAD.
    I am trying to use TOAD's Data Pump Export utility but I keep getting this error:
    ORA-39070: Unable to open the log file.
    ORA-39087: directory name D:\TEMP\ is invalid
    I don't understand if it's because I am setting up the parameter file wrong, or if the utility is trying to find that directory on the server, whereas I am thinking it's going to dump to my local filesystem where that directory exists.
    I'd hate to have to use SQL Loader to create ctl files for each and every table...
    Here is my parameter file:
    DUMPFILE="db_export.dmp"
    LOGFILE="exp_db_export.log"
    DIRECTORY="D:\temp\"
    TABLES=ACCOUNT
    CONTENT=ALL
    (just trying to test it on one table so far...)
    P.S. Oracle 11g
    Edited by: trant on Jan 13, 2012 7:58 AM

    ORA-39070: Unable to open the log file.
    ORA-39087: directory name D:\TEMP\ is invalid
    The directory here should not be a physical location; it is a logical representation (a directory object on the database server).
    For that you have to create a directory at the SQL level, e.g. CREATE DIRECTORY exp_dp ...,
    and then you have to use that directory as DIRECTORY=exp_dp.
    HTH
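
    A hedged sketch of that setup (directory name from the reply; the OS path and grantee are placeholders, and the path must exist on the database server, not on the TOAD client machine):

    -- Run as a privileged user on the database.
    CREATE OR REPLACE DIRECTORY exp_dp AS 'D:\oracle\dp_exports';
    GRANT READ, WRITE ON DIRECTORY exp_dp TO your_user;

    The parameter file would then use DIRECTORY=exp_dp instead of the OS path.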

  • How to find tables with columns that are not supported by Data Pump network_link

    Hi Experts,
    We are trying to import a database into a new DB using Data Pump with network_link.
    As the Oracle documentation states, tables with columns that are object types are not supported in a network export. An ORA-22804 error will be generated and the export will move on to the next table. To work around this restriction, you can manually create the dependent object types within the database from which the export is being run.
    My question: how do I find these tables with columns of object types that are not supported in a network export?
    We have LOB columns and the Oracle Spatial SDO_GEOMETRY object type. Our database size is about 300 GB; normally exp takes 30 hours.
    We are trying to use Data Pump with network_link to speed up the export process.
    How do we fix the Oracle Spatial SDO_GEOMETRY type issue during Data Pump?
    Our system is 32-bit Windows 2003 with a 10gR2 database.
    Thanks
    Jim
    Edited by: user589812 on Nov 3, 2009 12:59 PM

    Hi,
    I remember there being issues with SDO_GEOMETRY and Data Pump. You may want to contact Oracle Support about this issue.
    Dean
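
    As a starting point for finding candidate tables, a hedged sketch (not from the thread): columns whose type has an owner recorded in the dictionary are user-defined or object types, which is the group the network-export restriction is about. The query also lists harmless cases (e.g. XMLTYPE), so review the output rather than treating it as definitive.

    SELECT owner, table_name, column_name, data_type_owner, data_type
    FROM   dba_tab_columns
    WHERE  data_type_owner IS NOT NULL                   -- object / user-defined column types
    AND    owner NOT IN ('SYS','SYSTEM','MDSYS','XDB')   -- skip dictionary schemas
    ORDER  BY owner, table_name, column_name;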

  • How to consolidate data files using data pump when migrating 10g to 11g?

    We have one 10.2.0.4 database to be migrated to a new box running 11.2.0.1. The 10g database has too many data files scattered across too many file systems. I'd like to consolidate the data files into one or two large chunks in one file system. Both OSs are RHEL 5. How should I do that using Data Pump Export/Import? I know there is a "REMAP" option that could be used, but it's only a one-to-one mapping. How can I map multiple old data files into one new data file?

    Hi,
    Data Pump is terribly slow; make sure you have as much memory as possible allocated for Oracle, but the bottleneck can be I/O throughput.
    Use the PARALLEL option, and also set these:
    * DISK_ASYNCH_IO=TRUE
    * DB_BLOCK_CHECKING=FALSE
    * DB_BLOCK_CHECKSUM=FALSE
    set high enough to allow for maximum parallelism:
    * PROCESSES
    * SESSIONS
    * PARALLEL_MAX_SERVERS
    more:
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_perf.htm
    that's it, patience welcome ;-)
    P.S.
    For maximum throughput, do not set PARALLEL to much more than twice the number of CPUs (two workers for each CPU).
    Edited by: g777 on 2011-02-02 09:53
    P.S.2
    breaking news ;-)
    I am playing now with storage performance and I turned the disk cache option (also called write-back cache) ON (it goes at least along with RAID 0 and RAID 5, and by setting it you don't lose any data on that volume) - and it gave me a 1.5 to 2 times speed-up!
    Some say there's a risk of losing more data when an outage happens, but there's always such a risk, even though you could lose less. Anyway, if you can afford it (and with an import it's OK, as it is not production at that moment) - I recommend trying it. It takes 15 minutes, but you can gain 2.5 hours out of 10 of normal importing.
    Edited by: g777 on 2011-02-02 14:52
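
    A hedged example of the command-line shape this usually takes (directory, file and log names are placeholders; for PARALLEL to help, the dump file set should consist of multiple files, e.g. produced with %U on the export side):

    impdp system/<pass> DIRECTORY=dp_dir DUMPFILE=mydb_%U.dmp LOGFILE=impdp_mydb.log PARALLEL=8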

  • Data Pump - how can I use it in Oracle 8i?

    Hi
    How can I use Data Pump in Oracle 8i?
    Is there another way to export and import data?
    I have to export only one table and some data from one Oracle database to another Oracle database.

    Or use the spool method:
    set head off
    set pagesize 0
    set echo off
    set feedback off
    spool yourdump.file
    select columnA||','||columnB||','||columnC from mytable;
    spool off
    -- assuming your delimiter is comma.
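
    Since Data Pump itself only exists from Oracle 10g onward, on 8i the classic export utility is the usual alternative to the spool approach; a hedged single-table example (connect string and names are placeholders, and a QUERY parameter can restrict the rows if only some of the data is needed):

    exp scott/tiger@orcl8i TABLES=emp FILE=emp.dmp LOG=emp.log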

  • Data Pump - how can I take a hot export of a database?

    Hi,
    I have a running database, and I want to kick off an export using Data Pump. I don't want any transactions that take place after I kick off the export to be in the export file. Is this how Data Pump works?
    Thanks.

    You can get a consistent set of data, but with either exp or expdp you can't get a consistent set of metadata. For example:
    If you use the flashback parameter, the data in the dump file will be data that was entered before your flashback point. This means that if your flashback equates to SCN 1234 and someone enters data at 1235, the dump file will not have that new data. But if someone creates a new table at SCN 1236, that table may or may not be in the dump file. This is true for both exp and expdp.
    Hope this helps.
    Dean
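
    A hedged example of requesting a consistent data snapshot with expdp (schema, directory and file names are placeholders; the SCN could be taken from DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER just before starting the job, and FLASHBACK_TIME is an alternative):

    expdp system/<pass> SCHEMAS=scott DIRECTORY=dp_dir DUMPFILE=scott_consistent.dmp FLASHBACK_SCN=1234567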

  • How-to list the contents of a Data Pump Export file?

    How can I list the contents of a 10gR2 Data Pump Export file? I'm looking at the Syntax Diagram for Data Pump Import and can't see a list-only option.
    Regards,
    Al Malin

    Use the SQLFILE parameter of impdp, which writes all the SQL DDL to the specified file.
    http://download-west.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm
    SQLFILE
    Default: none
    Purpose
    Specifies a file into which all of the SQL DDL that Import would have executed, based on other parameters, is written.
    Syntax and Description
    SQLFILE=[directory_object:]file_name
    The file_name specifies where the import job will write the DDL that would be executed during the job. The SQL is not actually executed, and the target system remains unchanged. The file is written to the directory object specified in the DIRECTORY parameter, unless another directory_object is explicitly specified here. Any existing file that has a name matching the one specified with this parameter is overwritten.
    Note that passwords are not included in the SQL file. For example, if a CONNECT statement is part of the DDL that was executed, it will be replaced by a comment with only the schema name shown. In the following example, the dashes indicate that a comment follows, and the hr schema name is shown, but not the password.
    -- CONNECT hr
    Therefore, before you can execute the SQL file, you must edit it by removing the dashes indicating a comment and adding the password for the hr schema (in this case, the password is also hr), as follows:
    CONNECT hr/hr
    For Streams and other Oracle database options, anonymous PL/SQL blocks may appear within the SQLFILE output. They should not be executed directly.
    Example
    The following is an example of using the SQLFILE parameter. You can create the expfull.dmp dump file used in this example by running the example provided for the Export FULL parameter. See FULL.
    impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp SQLFILE=dpump_dir2:expfull.sql
    A SQL file named expfull.sql is written to dpump_dir2.
    Message was edited by:
    Ranga

  • Data Pump - How to avoid exporting/importing dbms_scheduler jobs?

    Hi,
    I am using Data Pump to export a user's objects. When I import them, it also imports any jobs that the user has created with dbms_scheduler - how can I avoid this? I tried EXCLUDE=JOBS but no luck.
    Thanks,
    Jon.
    Here are my export and import parameter files:
    Export parameter file:
    DIRECTORY=dpump_dir1
    DUMPFILE=reveal.dmp
    CONTENT=METADATA_ONLY
    SCHEMAS=REVEAL
    EXCLUDE=TABLE_STATISTICS
    EXCLUDE=INDEX_STATISTICS
    LOGFILE=reveal.log
    Import parameter file:
    DIRECTORY=dpump_dir1
    DUMPFILE=reveal.dmp
    CONTENT=METADATA_ONLY
    SCHEMAS=reveal
    REMAP_SCHEMA=reveal:reveal_backup
    TRANSFORM=SEGMENT_ATTRIBUTES:n
    EXCLUDE=TABLE_STATISTICS
    EXCLUDE=INDEX_STATISTICS
    LOGFILE=reveal.log

    Sorry for the reply to an old post.
    It seems that now (10.2.0.4) JOB is included in the list of SCHEMA_EXPORT_OBJECTS.
    SQL> SELECT OBJECT_PATH FROM SCHEMA_EXPORT_OBJECTS WHERE object_path LIKE '%JOB%';
    OBJECT_PATH
    JOB
    SCHEMA_EXPORT/JOB
    Unfortunately, EXCLUDE=JOB still generates an invalid argument error on my schema imports. I also don't know whether these are old-style jobs or scheduler jobs. I don't see anything for object_path LIKE '%SCHED%', which is my real interest anyway.
    The Data Pump is so rich already, I hate to ask for more, but ... may we please have even more?? scheduler_programs, scheduler_jobs, scheduler etc.
    Thanks
    Steve
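
    One approach that is often suggested for this (not from the thread, so treat it as something to test on your release): DBMS_SCHEDULER jobs, programs and schedules are exported under the procedural-object path, so excluding that path in the import parameter file keeps them out:

    EXCLUDE=PROCOBJ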
