Data Pump and SGA / system memory on Windows 2003

Hi Experts,
I have a question about Oracle 10g Data Pump. Based on the Oracle documentation,
all Data Pump ".dmp" and ".log" files are created on the Oracle server, not the client machine.
Does that mean Data Pump needs to use the Oracle server's SGA or other system memory? Is that true?
Or must Data Pump be supported by the Oracle server's memory?
We use Oracle 10g R4 on 32-bit Windows 2003 and have a memory issue, so we need to take care on this point.
At present we use exp to do the export job. Our DB is 280 GB in size; the SGA is 2 GB on Windows.
For testing, I can see Data Pump messages in the alert file. I have not seen an export job break the DB or the data replication job.
Does anyone have experience with this?
Thanks,
JIM

Hi Jim,
user589812 wrote:
I have a question about Oracle 10g Data Pump. Based on the Oracle documentation, all Data Pump ".dmp" and ".log" files are created on the Oracle server, not the client machine.
Yes, they are, but you can always point them at a shared location on another server.
Does that mean Data Pump needs to use the Oracle server's SGA or other system memory? Is that true? Or must Data Pump be supported by the Oracle server's memory?
Either way, the SGA is used for a conventional export. You can reduce the overhead to some extent with a direct-path export (direct=y), but the server's resources will still be in use.
We use Oracle 10g R4 on 32-bit Windows 2003 and have a memory issue, so we need to take care on this point.
If you have Windows Enterprise Edition, why don't you enable PAE to use more memory, provided the server has more than 3 GB of RAM?
At present we use exp to do the export job. Our DB is 280 GB in size; the SGA is 2 GB on Windows.
Relative to the size of the database, your SGA is too small.
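As an illustration only (the share path below is a placeholder, and this assumes the Oracle service account can write to it): the dump and log location is controlled by a server-side directory object, which on Windows can point at a UNC share on another machine, and the current carve-up of the SGA can be checked before starting a job.
CREATE DIRECTORY dp_share AS '\\fileserver\oracle_dumps';  -- placeholder UNC path
GRANT READ, WRITE ON DIRECTORY dp_share TO system;
-- How is the SGA currently divided up?
SELECT component, ROUND(current_size/1024/1024) AS size_mb
FROM   v$sga_dynamic_components
ORDER  BY component;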
Hope it helps.
Regards
Z.K.

Similar Messages

  • Rendering and Low System Memory issues in PE9.

    Hello again.
    N.B: I am an ex-PE3 user.
    I am having a few issues with PE9.
    1. My main issue is that I am regularly getting the 'low system memory; please proceed with caution' message. I have googled it and read people's suggestions, but am not sure where to go from here. I have tried saving my file under another name, but am already up to the '....E' (fifth) version of the file I've been working on this afternoon. The total time of the project is only approx 45 minutes, although I will say that this 'low system memory' issue seems to be worse today, as I'm adding more still photos.
    2. One of the suggestions made by Steve G on one of the forums was to make sure projects are regularly rendered. When I hit 'Enter' to render, the little render window pops up for a split second and then disappears. And I would assume from the red line above my timeline that my file is not rendered.
    3. When I switch to 'Sceneline' view, my horizontal scroll-bar across the bottom will only drag so far (to a certain point in my project), definitely not to the end of my project. It is really frustrating. I can sometimes fix it by flicking between Sceneline and Timeline, but I have to do this multiple times... and it seems a ridiculous way to do it.
    It would be fantastic if someone could help me with these issues, please!
    System details:
    AMD processor 3.2 Ghz
    4 gig RAM (3.12 gig usable)
    Windows 7 Ultimate

    Hi everyone:
    Thank you for all your suggestions.
    Sorry, Bill, but both those links give me an error message. I was all set to read them, but they won’t work, unfortunately.
    What is meant by 'page file managed', sorry? And do you mean 'hard disk drives' by HDD, or does it stand for something else in this context? I'm not sure what you mean by how I have them allocated. Again, apologies for my ignorance. I do have only one HDD, but it is partitioned.
    The image sizes all seem to have a depth of 24, but vary in dimensions. I can see that some are quite large (around 3 MB in size, with dimensions of 4288x2848, for example). Is the actual MB size relevant, or only the pixel width and height?
    I have tried to look at the frame size of my project, but it seems to be greyed out. In pale digits, there is 720x576 written.
    Just before I sent this, I thought of one other thing to try first.
    I tried starting a ‘New Project’. I then inserted just 6 still photos, for experimentation purposes. I then hit ‘Enter’ and the Rendering window came up and appears to have rendered correctly. So I then tried looking at one of my other projects (that hadn’t previously been rendered) and tried to render it. In this case, as in for the project I am currently working hard on, the Rendering window flashes up for about ¼ of a second and then disappears.
    What is going on???
    From: Bill Hunt [email protected]
    Sent: Friday, 11 March 2011 2:00 AM
    To: rachel jewiss
    Subject: Rendering and Low System Memory issues in PE9.
    although I will say that this 'low system memory' issue seems to be worst today, as I'm adding more still photos.
    What are the pixel x pixel dimensions of those still images?
    What is the Frame Size of your Project?
    How is your Page File Managed, and what is the size? This http://forums.adobe.com/thread/632449?tstart=30 will give you some background and tips.
    Also, tell us about your HDD's, and how you have them allocated.
    For more discussion on Rendering, please see this http://forums.adobe.com/thread/794719?tstart=0.
    Good luck,
    Hunt

  • How to install Rescue and Recovery 4.21 on Windows 2003

    There are probably a good number of users running Windows 2003 on their ThinkPads for various reasons. As much as I tried, I could not get Rescue and Recovery's MSI to install on my X61 running Win2003 R2 SP2, despite attempts to modify the Setup.ini file.
    The changes made in setup.ini:
    [Win2003Server]
    MajorVer=5
    MinorVer=2
    MinorVerMax=3
    BuildNo=3790
    PlatformId=2
    The returned installer error is "The current operating system is not supported for running Rescue and Recovery. Supported operating systems includes Microsoft Windows 2000 with Service Pack 3 or Windows XP with Service Pack 1 or later".
    This is such a shame, as I'm pretty sure RR would run just fine on Win2003, since both WinXP and Win2003 are based on the same underlying technologies and code. Please advise if there's any method, such as a command switch for the MSI installer to suppress the OS detection error, any modification that can be made to setup.ini to get it to install on Win2003, or any utility that can be used to modify MSI files and databases? Thanks!

    Alrighty... I've managed to google a hack for this problem of getting Rescue and Recovery to install on Windows 2003. So let this post be a guide on how to get it done:
    Getting the tools to modify Rescue and Recovery MSI (Windows Installer)
    Download and install Orca MSI editor from http://astebner.sts.winisp.net/Tools/Orca.zip
    Review http://ravi1337.blogspot.com/2007/05/orca-msi-editor-make-any-software.html, this guide should give you a good idea what you'll be doing with this tool.
    Download Rescue and Recovery™ 4.21 from Lenovo
     Download Rescue and Recovery™ 4.21 for Windows XP (32-bit) or Windows 2000  from http://www-307.ibm.com/pc/support/site.wss/license.do?filename=thinkvantage_en/z652_setup_rnr42_016c...
    Ensure you are downloading the full install with all language packs, to avoid any missing installation files due to regional settings.
    Run the installer and see that all files are extracted accordingly.
    Modification of Rescue and Recovery Installer
    Delete all files in C:\Documents and Settings\(User Account Name)\Local Settings\Temp\; this will prevent confusion about which files are unpacked and created by the RR installer.
    Extract z652_setup_rnr42_016c.exe into a folder named z652_setup_rnr42_016c.
    Execute tvtrnr42.exe
    The MSI installer is then unpacked to the following default directory: C:\Documents and Settings\(User Account Name)\Local Settings\Temp\. Note that the installer creates two directories for the installation of RR: {F151F2B3-0C32-44D3-90E2-E639B8024622} and _isXXXX (where XXXX is 4 randomly generated digits).
    The folder _isXXXX contains Rescue and Recovery.msi.
    On the first run of the installer, the following error will be shown: "The current operating system is not supported for running Rescue and Recovery. Supported operating systems includes Microsoft Windows 2000 with Service Pack 3 or Windows XP with Service Pack 1 or later". DO NOT CLICK OK! This will cause the installer to purge all installation files and related directories in the temp directory. Just leave the error window as it is without clicking the OK button!
    Copy all TVT files from folder z652_setup_rnr42_016c into both directories
    Modify setup.ini entry for Win2003Server with notepad and save changes:
    [Win2003Server]
    MajorVer=5
    MinorVer=2
    MinorVerMax=3
    BuildNo=3790
    PlatformId=2
    Right-click on Rescue and Recovery.msi and edit with Orca
    Go to LaunchCondition Table
    Drop the following rows and save changes:
    (((VersionNT=501 AND ServicePackLevel>=1)OR (VersionNT=500 AND ServicePackLevel>=3)) AND WINSERVER=0) OR Installed- The current operating system is not supported for running [ProductName].  Supported operating systems include Microsoft(R) Windows(R) 2000 with Service Pack 3 or later and Windows XP with Service Pack 1 or later.
    IEEXISTS=1 OR Installed - [ProductName] requires Microsoft(R) Internet Explorer version 5.5 or higher. You cannot install the Rescue and Recovery program without this version of Internet Explorer.
    INCOMPAPPNAME=0 OR Installed - [INCOMPAPPNAME] is installed and is incompatible with this version of [ProductName].  This application cannot be installed.  To install [ProductName], uninstall [INCOMPAPPNAME] and then restart this setup.
    Not SGEWRONGVERSION OR Installed - [INCOMPAPPNAME] is installed and is incompatible with this version of [ProductName].  This application cannot be installed.  To install [ProductName], uninstall [INCOMPAPPNAME] and then restart this setup.
    When you're done with these modifications, re-run Rescue and Recovery.msi without closing the previous error window. Rescue and Recovery should now install without a glitch! Good luck!

  • Folder+Exclamation point, Disk mode, and "Serious system error" in Windows

    Hey there,
    A couple days ago, I made the mistake of running with my 4th gen 40gb iPod. After playing for a while, it froze; whatever, it's done this before, and come back. This time, I keep getting the folder and exclamation point icon. Following the FAQ on the support part of the site, I put it into disk mode.
    Here's where the frustrating part begins: When I open iTunes and plug in my iPod (either via USB or FireWire), one of two things happens:
    In the first instance, iTunes tells me the iPod is corrupt. I click "Restore" and after a while, it tells me that either the necessary data cannot be found, or that an error (1418) has occurred. The farthest I have ever gotten was a progress meter and "Restoring iPod", shortly followed by an error.
    In the second, and decidedly more frustrating, case: after waiting a couple of minutes for iTunes to recognize my iPod (my computer does so instantaneously), my computer shuts off. Upon restarting, I get a system message from Windows telling me that a serious error has occurred.
    I went to an Apple store today and handed one of the workers there my iPod. When HE turned it on, everything seemed to be working...I brought it home, tried to play a song, and found myself back at square one.
    Can I restore this thing? I can get it into disk mode, my computer just won't cooperate. Will reformatting my computer help? I am using Windows XP.
    Thanks,
    Ted

    Hi Martha,
    Try doing it slightly differently: open Disk Utility, go to the PARTITION tab, and do a delete. Then go back to the ERASE tab and click Erase with the Mac OS Extended (Journaled) format option. Once it has completed, open the iPod Updater and click RESTORE.

  • Data Pump and LOB Counts

    I was recently tasked with copying three schemas from one 10gR2 database to another 10gR2 database on the same RHEL server. I decided to use Data Pump in network mode.
    After performing the transfer, the count of database objects for the schemas were the same EXCEPT for LOBs. There were significantly fewer on the target (new) database as opposed to the source (old) database.
    To be certain, I retried the operation using an intermediate dump file. Again, everything ran without error. The counts were the same as before - fewer LOBs. I also compared row counts of the tables, and they were identical on both systems.
    Testing by the application user seemed to indicate everything was fine.
    My assumption is that consolidation is going on when the data is imported, resulting in fewer LOBs ... haven't worked much (really, at all) with them before. Just looking for confirmation that this is the case and that nothing is "missing".
    Here are the results:
    ORIGINAL SOURCE DATABASE
    OWNER                          OBJECT_TYPE         STATUS         CNT
    COGAUDIT_DEV                   INDEX               VALID            6
                                   LOB                 VALID           12
                                   TABLE               VALID           21
    COGSTORE_DEV                   INDEX               VALID          286
                                   LOB                 VALID          390
                                   SEQUENCE            VALID            1
                                   TABLE               VALID          200
                                   TRIGGER             VALID            2
                                   VIEW                VALID            4
    PLANNING_DEV                   INDEX               VALID           37
                                   LOB                 VALID           15
                                   SEQUENCE            VALID            3
                                   TABLE               VALID           31
                                   TRIGGER             VALID            3
    14 rows selected.
    Here are the counts on the BOPTBI (target) database:
    NEW TARGET DATABASE
    OWNER                          OBJECT_TYPE         STATUS         CNT
    COGAUDIT_DEV                   INDEX               VALID            6
                                   LOB                 VALID            6
                                   TABLE               VALID           21
    COGSTORE_DEV                   INDEX               VALID          286
                                   LOB                 VALID           98
                                   SEQUENCE            VALID            1
                                   TABLE               VALID          200
                                   TRIGGER             VALID            2
                                   VIEW                VALID            4
    PLANNING_DEV                   INDEX               VALID           37
                                   LOB                 VALID           15
                                   SEQUENCE            VALID            3
                                   TABLE               VALID           31
                                   TRIGGER             VALID            3
    14 rows selected.
    We're just curious ... thanks for any insight on this!
    Chris

    OK, here is the SQL that produced the object listing:
    break on owner skip 1;
    select   owner,
             object_type,
             status,
             count(*) cnt
    from     dba_objects
    where    owner in ('COGAUDIT_DEV', 'COGSTORE_DEV', 'PLANNING_DEV')
    group by owner,
             object_type,
             status
    order by 1, 2;
    Here is the export parameter file:
    # cog_all_exp.par
    userid = chamilton
    content = all
    directory = xfer
    dumpfile = cog_all_%U.dmp
    full = n
    job_name = cog_all_exp
    logfile = cog_all_exp.log
    parallel = 2
    schemas = (cogaudit_dev, cogstore_dev, planning_dev)
    Here is the import parameter file:
    # cog_all_imp.par
    userid = chamilton
    content = all
    directory = xfer
    dumpfile = cog_all_%U.dmp
    full = n
    job_name = cog_all_imp
    logfile = cog_all_imp.log
    parallel = 2
    reuse_datafiles = n
    schemas = (cogaudit_dev, cogstore_dev, planning_dev)
    skip_unusable_indexes = n
    table_exists_action = replace
    The above parameter files were for the dumpfile version. For the original network-link version, I omitted the dumpfile parameter and substituted "network_link = boptcog_xfer".
    Chris
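    As a follow-up diagnostic (just a sketch; run it on both databases, or against each over a database link), listing the LOB columns table by table from dba_lobs shows exactly which tables account for the difference, rather than relying on the aggregate count from dba_objects:
    SELECT owner, table_name, column_name, segment_name
    FROM   dba_lobs
    WHERE  owner IN ('COGAUDIT_DEV', 'COGSTORE_DEV', 'PLANNING_DEV')
    ORDER  BY owner, table_name, column_name;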

  • Data Pump and Data Manipulation (DML)

    Hi all,
    I am wondering if there is a method to manipulate table column data as part of a Data Pump export? For example, is it possible to make the following changes during a DP export:
    update employee
    set firstname = 'some new string';
    We are looking for a way to protect sensitive data columns. We already have a script to update column data (i.e. scramble text to protect personal information), but we would like to automate the changes as we export the database schema, rather than having to export, import, run DML and then export again.
    Any ideas or advice would be great!
    Regards,
    Leigh.

    Thanks Francisco!
    I haven't read the entire document yet, but at first glance it appears to give me what I need. Do you know if this functionality for Data Pump is available in Oracle Database versions 10.1.0.4 and 10.2.0.3 (the versions we currently use)?
    Regards,
    Leigh.
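    For what it's worth, here is a minimal, hypothetical sketch of the kind of scrambling function that could back either approach (all names are illustrative): in 11.1 and later it could be referenced from the expdp REMAP_DATA parameter, while on 10.1/10.2 the same function can drive the post-import UPDATE described above.
    -- Hypothetical masking package; replaces a value with a meaningless token.
    CREATE OR REPLACE PACKAGE scramble_pkg AS
      FUNCTION mask_name (p_value IN VARCHAR2) RETURN VARCHAR2;
    END scramble_pkg;
    /
    CREATE OR REPLACE PACKAGE BODY scramble_pkg AS
      FUNCTION mask_name (p_value IN VARCHAR2) RETURN VARCHAR2 IS
      BEGIN
        -- Deterministic but meaningless token; not cryptographically secure.
        RETURN 'X' || TO_CHAR(DBMS_UTILITY.GET_HASH_VALUE(p_value, 1, 1048576));
      END mask_name;
    END scramble_pkg;
    /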

  • Data Pump and Physical Standby

    I want to perform a Data Pump export on a schema attached to a physical standby database. Is this possible, and what are the steps?

    Thanks for the information, I will give it a try. Actually, I just realized it might not work, because Data Pump needs to update the Data Pump job information.
    What's the concern with not running the export on the primary?
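    As background (a sketch, not a full set of steps): a Data Pump job has to create and update a master (job) table in the database it runs against, which is the "job information" mentioned above and why a read-only physical standby cannot host the job itself. Running jobs and their owners are visible with:
    SELECT owner_name, job_name, operation, job_mode, state
    FROM   dba_datapump_jobs;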

  • Data structure and out of memory error

    I have a program that needs to load data files and store the float data values in 3 two-dimensional arrays. These data files are generated from biological experiments and are quite large. For example, when I tried to load 59 files, each file had 409,600 rows, and I stored these data in 3 Float[][] arrays, like the following:
    pixel: Float[59][409600]
    signal: Float[59][409600]
    std: Float[59][409600]
    I know that variables of the float data type occupy 4 bytes in memory; is this also true for objects of the Float type (the wrapper class for float)?
    After calculation, the total memory needed for the three 2D arrays is:
    4 * 409600 * 59 * 3 bytes ≈ 276.6 MB
    I'm using JBuilder 9 Personal and have already set the -Xmx parameter to 800 MB, but I still get an out of memory error in the middle of loading. Could you give me a hand with this issue?
    thanks a lot!

    ok, you mean the Float object will take 24 bytes to store in memory instead of the 4 bytes needed for a float variable, right?
    Yes, it's about 20 bytes per Float object: 4 for the actual float and about 16 in overhead. Then there are 4 bytes for the reference to the object in the array, so each Float occupies a total of about 24 bytes, whereas a float takes just 4. (At that rate the three arrays come to roughly 59 * 409600 * 3 * 24 bytes, about 1.7 GB, well beyond an 800 MB heap.)
    Would you please give me a clue where you got this information?
    Well, an object stores the actual data and has an overhead of about 16 bytes, so it's just a matter of adding it up :-) I use this as a rule of thumb; I don't remember anymore where I got it in the first place.
    The purpose of storing Float objects is to display the data directly in a JTable, to reduce the overhead of displaying data in the table.
    Yes, but it's better to minimize the storage of the raw bulk data. You can always have getter methods that return Float objects to the rest of the program "on demand".
    I have a class that creates the table model by subclassing AbstractTableModel. Based on the loaded raw data, I have to do other expensive operations using different analysis methods. So maybe using the built-in data type is more efficient in my later calculations.
    It's always a tradeoff. Use float for fast calculations and raw storage, and Float otherwise for maximum object orientation.

  • HELP! Problems using large memory on Windows 2003 (32-bit) and Oracle 11g

    i have Oracle 11g installed on windows 2003 enterprise server 32bit with 6GB RAM.
    And I followed the steps to enable the Very Large Memory support:
    1. add /3GB /PAE in c:\boot.ini
    2. add the AWE_WINDOW_MEMORY entry in the registry and set its value to 209715200
    3. as for init.ora:
    add:
    dbblock_lru_latches = 64
    db_block_size = 4096
    db_block_buffers = 262144
    remove the parameters: db_cache_size,sga_max_size
    it does work when I use Oracle 10g, but for Oracle 11g, there are some errors:
    ORA-00371: not enough shared pool memory, should be atleast **** bytes
    OR
    ORA-27102: out of memory
    Can anybody share their experience with Oracle 11g?
    Much thanks!

    Increase your shared_pool_size parameter.
    It looks like you have set the buffer cache to 1 GB after setting up the /PAE and /3GB switches:
    4096 * 262144 = 1 GB...
    Then why did you set /3GB?
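    A rough sketch (the size is purely illustrative, and this assumes an spfile is in use): give the shared pool an explicit floor, then after the restart check how much of it is actually free.
    ALTER SYSTEM SET shared_pool_size = 400M SCOPE = SPFILE;
    -- After the restart:
    SELECT ROUND(bytes/1024/1024) AS free_mb
    FROM   v$sgastat
    WHERE  pool = 'shared pool' AND name = 'free memory';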

  • Data Pump and Java Classes

    I tried importing the data from one schema to another but the Java classes that we use (via Java Stored Procedures) are showing up as invalid. Is there any reason why? We are using Oracle 10g R2. I tried resolving them by running the following sql, but that didn't work either:
    ALTER JAVA CLASS "<java_clss>" RESOLVER (("*" <schema_name>)(* PUBLIC)) RESOLVE;
    Any thoughts will be appreciated.

    There are two ways to instantiate a target's data. One is to use a native data loader or utility. In Oracle's case, Oracle Data Pump (not the "data pump" in GG) or SQL*Loader, as an example. The other way is to use GoldenGate's data pump.
    You can configure DDL synchronization for Oracle, but you have to turn off the recycle bin. See Chapter 13 in the admin guide.
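    Back to the original question about the invalid Java classes, a small sketch (the schema name is a placeholder): bulk-recompiling the imported schema is often enough, and anything still invalid can then be listed.
    BEGIN
      DBMS_UTILITY.compile_schema(schema => 'TARGET_SCHEMA', compile_all => FALSE);
    END;
    /
    SELECT object_name, object_type, status
    FROM   dba_objects
    WHERE  owner = 'TARGET_SCHEMA' AND status = 'INVALID';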

  • Query on data block and Operating system block

    Hi ,
    Does the database request data in terms of data blocks or operating system blocks? If data blocks, how does a data block map onto operating system blocks? What are the advantages of separating the data blocks from the operating system blocks? Can anyone please explain what happens once the database requests the data?
    Thank You

    sybrand_b wrote:
    This place is called 'Oracle Forum' It does not offer the Oracle University curriculum for free, nor does it offer free abstracts of the Oracle Concepts Manual.
    Kindly read the Concepts Manual of your unmentioned version yourself.
    Thank you.
    Sybrand Bakker
    Senior Oracle DBA
    Dear Sybrand,
    As you said, this place is called a Forum. If you know the answer, then answer the question. If you know the documentation, then refer to the documentation. If you don't know the answer, then don't answer in such a rough manner! Just stop. Just don't type anything. Just be polite.
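    To the original question, a brief illustration rather than a complete answer: the database reads and writes in units of its own block size, which is normally a multiple of the operating system block size, and that size can be checked with:
    SELECT name, value
    FROM   v$parameter
    WHERE  name = 'db_block_size';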

  • The old 'allowing Oracle 3GB of virtual memory on Windows 2003 server' question?

    Hi all,
    I have a 9.2.0.7.0 database on windows 2003 server (32bit). 4 dual core cpu's and 16GB of Ram.
    I have set .use_indirect_data_buffers=TRUE and *._enable_NUMA_optimization=FALSE.
    The db_block_buffers is set at 4GB.
    Shared_pool 200MB
    Large_pool 56MB
    Java pool 56MB
    PGA is currently at 400MB
    AWE_WINDOW_MEMORY is set at 720MB (which is the lowest I can set it)
    At the moment there are various jobs running on the database and no users logged on and the virtual memory is at 1.8GB.
    The problem is as soon as the users logs onto the DB the virtual memory goes over 2GB and we see the error 'ORA-04030: out of process memory when trying to allocate 2416 bytes'.
    Now I have edited the boot.ini with the following parameters:
    multi(0)disk(0)rdisk(0)partition(1)\WINNT="UBS WSP - Wintel Server Platform with PAE" /fastdetect /PAE /4GT
    because the /3GB parameter switch started the server with lots of errors.
    The oracle.exe process still seems unable to go over the 2GB limit, and we keep getting 'ORA-04030' errors.
    Can someone tell me if I have missed anything? Or a reason why oracle is not using the switch?
    TIA
    Eddy

    Satish,
    see metalink Note:1036312.6 : Utilizing Up to 3GB Virtual Memory on Windows.
    it says :
    "The 32-bit versions of the Windows® 2000 Advanced Server and Windows NT Server 4.0, Enterprise Edition, operating systems were the first versions of Windows to provide applications with a 3-GB flat virtual address space, with the kernel and executive components using only 1 GB. In response to customer requests, Microsoft has expanded the availability of this support to the 32-bit version of Windows XP Professional and all 32-bit versions of Windows Server™ 2003."
    of course, this mustn't be confused with the AWE switch (/PAE), which is not available for Standard Edition.
    http://support.microsoft.com/kb/291988 ==> tells more about what can be supported
    regards,
    marc
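    Back to Eddy's ORA-04030: that error concerns process (PGA) memory rather than the SGA, so once users connect it can help to see which server processes are consuming the address space. A minimal sketch:
    SELECT p.spid,
           ROUND(p.pga_used_mem/1024/1024)  AS pga_used_mb,
           ROUND(p.pga_alloc_mem/1024/1024) AS pga_alloc_mb
    FROM   v$process p
    ORDER  BY p.pga_alloc_mem DESC;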

  • Oracle Services are not removed from system (10g, on Windows 2003)

    Hi,
    I haven't used Oracle on Windows Os before and not sure what to do.
    Here is what happened
    1- I installed 10g on windows 2003 server, created a database with service name OPER
    2- I was unhappy with the installation and removed it.
    3- then I installed Oracle 10g software again, creating a database again which has name IZROPER
    Now there are two of every Oracle service in services.msc.
    Service names containing the first DB's name (OPER) are still showing up in Services; how can I remove them?
    ORACLEDBCONSOLEoper
    OracleJobSchedulerOper
    OracleServiceOper
    etc..

    Start --> Run --> regedit
    In regedit, go to HKLM --> SYSTEM --> CurrentControlSet --> Services; the Oracle services are listed there, and you can delete them from there.

  • New 'create desktop shortcut' GPO is causing 1030 and 1058 errors on a Windows 2003 terminal server

    Primary DC: Win2012.  BDC: Win2008. Clients: Win7, Win8.1.  Maintaining an old Win2003 terminal server to run legacy software.
    Everything worked fine for a long time, until I added a new GPO to create a desktop shortcut on the client machines. Now Windows 2003 runs into Error 1030 (Cannot query list of GPOs) and Error 1058 (Cannot access the file gpt.ini for GPO; the system cannot find the path specified), and GP processing is aborted.
    I tried changing the GPO to target Win7 and 8.1 only. I tried creating the shortcut using Computer Config settings only as well as User Config settings only. I tried deleting the GPO and creating it from scratch. Nothing works; as soon as the GPO is in place, it prevents the Win2003 server from processing any other GPO.
    No issue with the client machines or other Windows servers. 
    Thanks in advance for any help.
    Roget Luo

    Hi Roget,
    Thanks for posting here.
    Error 1030 and 1058 could be caused by many things.
    Firstly, please run the command dcdiag at the command prompt of a DC to check if there are any errors.
    Secondly, make sure the DFS service is started on the domain controller, and set its Startup type to Automatic.
    What's more, make sure the TCP/IP Helper service is started on the domain controller; set it to Automatic and start it.
    For your information, please refer to the following link to get more help:
    Userenv errors occur and events are logged after you apply Group Policy to computers that are running Windows Server 2003, Windows XP, or Windows 2000
    http://support.microsoft.com/kb/887303/en-us
    How to solve event 1058 and 1030 on Windows Server 2003 SP2 member server in a domain?
    http://social.technet.microsoft.com/Forums/en-US/ceec95fb-3efa-4016-8dd5-7003909abba4/how-to-solve-event-1058-and-1030-on-windows-server-2003-sp2-member-server-in-a-domain?forum=winserverDS
    Windows 2003 Server USERENV 1058 & Group Policy Errors
    http://social.technet.microsoft.com/Forums/en-US/fad9cd47-5091-481f-8bda-0e7b10b2c814/windows-2003-server-userenv-1058-group-policy-errors?forum=winservergen
    Group Policy - Event ID Errors 1030 & 1058
    http://social.technet.microsoft.com/Forums/en-US/6c08278d-c3e3-434a-bcda-18d411a8e9fa/group-policy-event-id-errors-1030-1058?forum=winserverGP
    Hope it helps.
    Best Regards,
    Elaine

  • Data Pump export of a full RAC database into a Windows single-instance DB by network_link

    Hi Experts,
    I have a 32-bit 10.2 database on Windows.
    I am trying to export a full RAC database (350 GB, same version as the Windows DB) into the Windows single-instance database over a database link.
    The expdp syntax is:
    expdp salemanager/********@sale FULL=y DIRECTORY=dataload NETWORK_LINK=sale.net DUMPFILE=sale20100203.dmp LOGFILE=salelog20100203.log
    I created a database link fixed to instance 3. It was working for two days and then displayed the message:
    ORA-31693: Table data object "SALE_AUDIT"."AU_ITEM_IN" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEPOPULATE callout
    ORA-01555: snapshot too old: rollback segment number with name "" too small
    ORA-02063: preceding line from sale.net
    I stopped the export and checked the Windows target alert log.
    I saw some messages such as:
    kupprdp: master process DM00 started with pid=16, OS id=4444
    to execute - SYS.KUPM$MCP.MAIN('SYS_EXPORT_FULL_02', 'SYSTEM', 'KUPC$C_1_20100202235235', 'KUPC$S_1_20100202235235', 0);
    Tue Feb 02 23:56:12 2010
    The value (30) of MAXTRANS parameter ignored.
    kupprdp: master process DM00 started with pid=17, OS id=4024
    to execute - SYS.KUPM$MCP.MAIN('SYS_EXPORT_FULL_01', 'SALE', 'KUPC$C_1_20100202235612', 'KUPC$S_1_20100202235612', 0);
    kupprdp: worker process DW01 started with worker id=1, pid=18, OS id=2188
    to execute - SYS.KUPW$WORKER.MAIN('SYS_EXPORT_FULL_01', 'SALE');
    In the RAC instance alert.log, I saw a message such as:
    SELECT /*+ NO_PARALLEL ("KU$") */ "ID","RAW_DATA","TRANSM_ID","RECEIVED_UTC_DATE","RECEIVED_FROM","ACTION","ORAUSER","ORADATE" FROM RELATIONAL("SALE_AUDIT"."AU_ITEM_IN") "KU$"
    How can I fix this error?
    Should I add more undo tablespace space in RAC instance 3, or in the Windows database?
    Thanks
    Jim

    I usually increase undo space. Is your undo retention set smaller than the time it takes to run the job? If it is, I would think you would need to increase it; if not, then I would think it is the space. You were in the process of exporting data when the job failed, which is what I would have expected. Basically, Data Pump wants to export each table consistent with itself. Let's say that one of your tables is partitioned and has a large partition and a smaller partition. Data Pump attempts to export the larger partitions first, and it remembers the SCN for that partition. When the smaller partitions are exported, it uses that SCN to get the data from each partition as it would have looked when the first partition was exported. If you don't have partitioned tables, then do you know whether some of the tables in the export job (I know it's a full export, so that includes just about all of them) are having data added to them or removed from them? I can't think of anything else that would need undo while exporting data.
    Dean
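    To make Dean's point concrete, a hedged sketch (tablespace name and sizes are illustrative; the ALTER SYSTEM assumes an spfile and the datafile clause assumes Oracle-managed files): on the source RAC database, undo_retention and the undo tablespace have to cover the whole multi-day run.
    SELECT name, value
    FROM   v$parameter
    WHERE  name IN ('undo_management', 'undo_retention', 'undo_tablespace');
    -- Roughly two days, expressed in seconds:
    ALTER SYSTEM SET undo_retention = 172800 SID='*';
    -- And enough undo space to match:
    ALTER TABLESPACE undotbs3 ADD DATAFILE SIZE 4G AUTOEXTEND ON;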
