Import into primary instance in Standby environment

Hi,
We have a production standby setup in which both the primary and standby instances are running on version
10.2.0.4, the platform being RHEL 5. We are planning some maintenance activities on the primary, which comprise the following:
1) Export 2 schemas (250 GB)
2) Drop them in the primary
3) Import these schemas back into the primary
I would like to know whether any of these activities would affect the standby synchronisation. Are there any precautionary steps we need to take before starting this task?
Regards,
Pavan

Hi,
I opened an SR with Oracle about this too, but there has been no reply for two days :(
If I am doing a schema-level export, should the tables already be created on the standby before the import,
since the drop of the tables is also reflected on the standby?
I have a 10.2.0.3 database on AIX 5.3.
It is a logical standby, not a physical standby.
Does the same apply? Can I drop the schemas and run an import, and will it work? Or do I need to take any additional steps?
Thanks
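(Whatever tool ends up being used for the export and import, one precaution that is often suggested for maintenance like this is simply to confirm that the standby is keeping up before and after the window. A rough sketch for a physical standby, run on the standby instance:)
    -- Is managed recovery running, and which log sequence is it applying?
    SELECT process, status, thread#, sequence#
    FROM   v$managed_standby
    WHERE  process LIKE 'MRP%';
    -- Compare the highest sequence received with the highest sequence applied
    SELECT thread#, MAX(sequence#) AS last_received
    FROM   v$archived_log
    GROUP  BY thread#;
    SELECT thread#, MAX(sequence#) AS last_applied
    FROM   v$archived_log
    WHERE  applied = 'YES'
    GROUP  BY thread#;
(For the logical standby case in the second post, DBA_LOGSTDBY_PROGRESS gives a rough equivalent of the applied-versus-received picture.)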

Similar Messages

  • Dbstart/dbstop for Primary or Physical Standby instance

    We have established a physical Data Guard instance on Oracle 9i RAC, Red Hat 2.
    In our environment we switch over from the primary to the standby site about every 6 months.
    DBSTART is called after a system restart, and its goal is to bring the database up automatically.
    DBSTART works fine when the instance is a primary instance, but not when the instance is a standby database.
    As supplied by Oracle, DBSTART puts the instance into NOMOUNT status when the instance is a standby instance.
    I would like to put the database into "Managed Recovery Mode" when the instance is a standby instance.
    I know that I can modify DBSTART to always put the instance into managed recovery, but then I would need to remember to modify DBSTART again after a switchover.
    Has anyone modified DBSTART so that it detects a "Standby Control File" and issues the appropriate statements?
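    (For what it's worth, a rough and untested sketch of the kind of check being asked about: once the instance is mounted, look at the controlfile type and start managed recovery only when it is a standby. This is an illustration, not the stock DBSTART logic:)
        -- Run as SYSDBA from the startup script, after the instance has been
        -- mounted (V$DATABASE is not available at NOMOUNT).
        DECLARE
          v_type v$database.controlfile_type%TYPE;
        BEGIN
          SELECT controlfile_type INTO v_type FROM v$database;
          IF v_type = 'STANDBY' THEN
            -- standby controlfile detected: start managed recovery
            EXECUTE IMMEDIATE
              'ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION';
          END IF;
        END;
        /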

    Hi Richard,
    I'd suggest you read the documentation as a starting point; answering your questions here would be quite lengthy:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14239/toc.htm
    In a nutshell, you don't need downtime to maintain a standby: you can create one from a hot backup of your primary and have it recovering while the primary is open. In terms of which type of standby to use (physical or logical), that depends on your requirements for the standby ... as I said, read the doco (chapter 2 has a section on the benefits of each type).
    HTH
    Paul
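    (The "hot backup" route Paul mentions is typically driven through RMAN; a very rough sketch, with connect strings as placeholders and assuming the primary's backups are reachable from the standby host:)
        RMAN> CONNECT TARGET sys/password@prod
        RMAN> CONNECT AUXILIARY sys/password@stby
        RMAN> DUPLICATE TARGET DATABASE FOR STANDBY DORECOVER;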

  • Creating a single-instance physical standby for a RAC primary

    Hi
    I am creating a single-instance physical standby database for a RAC primary and getting this error:
    sql statement: alter database mount standby database
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03002: failure of Duplicate Db command at 01/17/2008 23:05:38
    RMAN-03015: error occurred in stored script Memory Script
    RMAN-03009: failure of sql command on clone_default channel at 01/17/2008 23:05:38
    RMAN-11003: failure during parse/execution of SQL statement: alter database mount standby database
    ORA-01103: database name 'PROD' in control file is not 'DPROD'
    Any help on this?
    Regards
    Satish

    The problem here is probably with your standby init.ora file.
    When you create a standby database, the db_name parameter must NOT change. It has to match the primary database. So in your case, db_name ='PROD' and your db_unique_name='DPROD'...
    -peter
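    (In other words, the standby parameter file would carry something like the following; every other parameter is omitted from this sketch:)
        # standby init.ora excerpt (illustration only)
        *.db_name='PROD'           # must match the primary's db_name
        *.db_unique_name='DPROD'   # unique per database in the configuration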

  • Is it possible to export user settings and import into same environment?

    Hi guys,
    I exported a user's profile settings into a file and tried to import the saved file into the same environment I exported from, but I am getting the following error:
    "Login name already in use"
    I followed the same process for restoring groups: I exported the groups into a file and imported the saved file into the same environment, and the groups were restored successfully. But I am failing to restore the user settings.
    Any help or suggestions on this issue are greatly appreciated.
    Thanks in advance!

    Importing into the same environment is not supported.
    Instead, you could create all the test users in production with all the roles and then deactivate the users, so they are not usable.
    After you overwrite the non-production DBs with the production DB, you could then run a script (you will have to create it first) to activate all the users.
    Will this work for you?
    Segal

  • DBMS_DATAPUMP import into another database instance

    Hi there,
    I have a quick question, as I didn't find an answer in my Oracle documentation...
    I use a PL/SQL program in an Oracle Application Express (APEX) application to import a dump file via the DBMS_DATAPUMP API. It's no problem to import it into the database that APEX runs on and that I'm connected to in my application, but now I want to start an import of the dump file into another database. I know that in the command-line interface of the Data Pump tool it is possible to import a dump file from one database instance to another with a connect identifier in the connect string.
    Could someone tell me where I have to put this information in the Data Pump API?
    Fennek

    Hi,
    Thanks for the replies...
    I think damorgan is right; in the meantime I read somewhere else that you apparently can't start an import into another database instance from PL/SQL...
    Well, maybe I will try to use DBMS_SCHEDULER to start an external script which starts the import tool, like apex_disco wrote...
    Fennek
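    (A minimal sketch of the workaround Fennek mentions: let DBMS_SCHEDULER run an external wrapper script that calls the impdp command-line client against the other instance. The job name, script path, and connect identifier are hypothetical:)
        BEGIN
          DBMS_SCHEDULER.create_job(
            job_name   => 'RUN_REMOTE_IMPDP',
            job_type   => 'EXECUTABLE',
            -- hypothetical wrapper script containing something like:
            --   impdp system/***@OTHERDB directory=DP_DIR dumpfile=export.dmp
            job_action => '/u01/app/oracle/scripts/run_impdp.sh',
            enabled    => TRUE,
            auto_drop  => TRUE,
            comments   => 'Start a Data Pump import on another instance via the CLI client');
        END;
        /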

  • How does Captivate 4 function once imported into and now within Flash 4 (AS3 environment)?

    I have a specific need to try to get a handle on a certain section of the timeline to add a textbox with certain text in it.  if (label.name == "Slide_3") {then... display the textbox and the text } does not work.  I am trying to use the addChild method to display the textbox and its text in the displayList during the Slide_3 section of the timeline.  This may not be the way to go as I can't seem to get it to work...
    My Goal: 
    Using AS3, working from within Frame1 coding area (Timeline scripting area), have some text appear in a textbox but only while Slide_3 is playing.
    I already have written the code that pulls text in from an XML document and stuffs it into a dynamically created textbox, and I am trying to take some of the text and have it appear only during a certain part of the timeline.
    Assumptions:
    The standard/native way that a Captivate.cp file is imported into Flash CS4 results in a timeline that has labels (Slide_1, Slide_2, Slide_3, etc.)
    I don't want to manually create a textbox and manually put the text into it on the Actions frame of the Slide_3 label; instead I would like to put the text into the text property of the textField programmatically from Frame1 (Actions frame in Frame1 + F9).
    I have tried using the addChild method but that adds the textbox to the WHOLE timeline. (Creating a textbox is not a problem, but instead getting some text into the textbox only during Slide_3 is the issue)
    The Problem:
    How can I get the text to only appear during the playing of the Slide_3 label frames? (But NOT display during the rest of the timeline)
    When I run code on the Action frame of Slide_3 that outputs the displayList, I get the following:
    root1 [object MainTimeline]
         info_txt [object TextField]
    // this is a textfield that i created
          instance3 [object TextField]
          Slide_3 [object Slide_3_7]
               instance15 [object Shape]
               instance16 [object MovieClip]
                    instance17 [object Shape]
               instance18 [object clickbox4_9]
                    click_zone [object SimpleButton]
                    instance19 [object Shape]
               instance20 [object Shape]
               instance21 [object Shape]
               instance22 [object Shape]
          SlideClip3 [object MovieClip]
    Yet, when I trace the keyword *this* (from within the Action frame in Slide_3) Flash outputs [Global Object].
    I am wondering how I can display the specific text for Slide_3 only during the playback of Slide_3. I'm guessing that, if I could only find what the Global Object is, I could addChild to its display list property.
    I also tried using some of the Static variables to no avail (that were automatically created when the import occurred) such as:
    var ref=this;
    var nCurSlideIndex:Number;
    var currentSlide_mc:MovieClip;
    var quizPlaybackController;
    var scrubberChanged:Boolean;
    var slideNewFrame:Number;
    var mcLoader:Loader = new Loader();
    var loading_sub_mc:MovieClip;
    var m_preloaderLoaded:Boolean = false;
    Any guidance, suggestions, or ideas that you have would be very welcome!
    Some Things that I've Tried:
    Incidentally, when I write code in the main timeline (Frame1 and press F9) I get some of the following behavior:
    //this.Slide_1.addChild(info_txt); //displays the textbox with text but only during "slide 1"
    //this.addChild(info_txt); //displays textbox and text during the whole timeline
    //SlideClip1.addChild(info_txt); //displays textbox and text only during "slide 1"
    //SlideClip3.addChild(info_txt); //nothing displays --TypeError: Error #1009: Cannot access a property or method of a null object reference.
    //ref.addChild(info_txt); //displays textbox and text all the time
    //addChild(info_txt); //displays textbox and text all the time
    //ref["Slide_3"].addChild //no luck here either

    I don't have any insights into this particular problem, but we are having strange things happen with Captivate and imported AS3 animations. We never had problems with the AS2 animations, so we are having to recreate everything and republish as AS2.

  • Switch Standby into Primary and reverse.

    Hi group,
    I have been asked to write a document describing the procedure to promote a standby database to primary, going from ((1) primary -> (2) standby) to ((1) failed db -> (2) primary), and after the testing to switch them back to the first state ((1) primary -> (2) standby). All this is a test, just in case a disaster occurs. To be honest, I have never been in this kind of situation, which is why I ask for your help. The procedure must contain steps for 3 databases:
    1. Oracle 10g R2 Enterprise Edition (Data Guard real-time apply) (40 GB data size) (OS: AIX)
    2. Oracle 9i R2 Standard Edition (manual standby) (400 GB data size) (OS: AIX)
    3. Oracle 9i R2 Standard Edition (manual standby) (80 GB data size) (OS: AIX)
    I would appreciate a lot your help.
    Thanks in advance.

    This is a general procedure that we have followed for a similar test:
    Here is the setup we had: 10gR2, approx. 2 TB, physical standby database with a Data Guard broker configuration.
    1. We connected to the Data Guard command line using "dgmgrl".
    2. Connect to the database - and verify status: "show configuration;"
    3. Issue command: "SWITCHOVER TO <standby database name>;"
    4. Exit from the dgmgrl utility.
    5. Perform whatever verification you need now that your standby has become the primary. NOTE: the primary and standby databases will stay in sync, and any changes you make will be applied to both databases, so you should not perform any destructive actions against this database.
    6. Ideally you can run your entire application pointing to the original standby database.
    7. Follow steps 1 to 4 again.
    8. You will have your original primary back as the primary database.
    Hope this will help you.
    Regards.
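    (The DGMGRL session described above, roughly; database names and credentials are placeholders:)
        $ dgmgrl sys/password@prod
        DGMGRL> SHOW CONFIGURATION;
        DGMGRL> SWITCHOVER TO 'standby_db';
        DGMGRL> EXIT;
    (Repeating the same SWITCHOVER against the original primary afterwards, as in steps 7 and 8, returns the roles to their initial state.)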

  • Single-instance manual standby for a three-node RAC database

    Hi all,
    I am wondering how it is possible to create a manual standby database for a RAC primary database.
    The Oracle version is Standard Edition 11.1.
    I have experience creating and managing manual standbys for single-instance databases; I am wondering if and how I can instruct the single-instance standby database to discover and apply the three redo threads that I will copy from the primary site to the standby site.
    Should I use RMAN to catalog all the archived logs on the standby site?
    Should I configure several LOG_ARCHIVE_DEST_n parameters pointing to different paths, one for each primary instance (i.e. one for each redo thread), and let RMAN automatically discover all the archived logs?
    Should I configure only LOG_ARCHIVE_DEST_1, put archived logs from all threads in the same folder, and let RMAN automatically discover all the archived logs?
    Thanks for every answer!
    Andrea

    Hi,
    Indeed, I catalog all transferred archive logs before the recover step.
    During the tests I believe (IIRC) I was unable to recover from transferred archivelogs without cataloging them first. But this is from memory; I am not 100% sure about it, sorry.
    Some more details.
    Because it is Standard Edition, I have to use ASM on the RAC side, so our normal scripts to transfer archivelogs from primary to standby are obsolete.
    So I decided to put the whole standby mechanism into the database. Now the primary database uses external scheduler jobs to do the work. The standby side is not using ASM, so there is no need to handle the logs in any special way.
    The steps are:
    1. Get SCN from primary and standby
    2. Transfer the logs for the gap from ASM to "normal" filesystem.
    3. Transfer (and compress, if not in LAN) to standby archive dest
    4. Catalog on standby side
    5. Recover on standby side
    6. Delete on normal filesystem (on both sides, but on standby with a delay of 2 days just to be sure)
    The steps on the standby side both use the primary SCN from step 1 as a parameter, to avoid error messages during the catalog or recover call.
    Hth
    Joerg
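    (Roughly what steps 4 and 5 could look like on the standby side; the archive destination and SCN are placeholders, and the exact commands depend on how the standby is mounted:)
        RMAN> CATALOG START WITH '/u01/standby_arch/' NOPROMPT;
        RMAN> RECOVER DATABASE UNTIL SCN 1234567;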

  • How do the application servers connect to the new database after failing over from the primary DB to the standby DB

    How do the application servers connect to the new database after failing over from the primary DB to the standby DB?
    We have set up a DR environment with a standalone primary server and a standalone physical standby server on RHEL Linux 6.4. Now our application team would like to know:
    When the primary DB server crashes, the standby DB server will take over the primary role through the Data Guard fast failover. The applications were previously connected via the primary DB's IP, and the physical standby currently uses a different IP and listener. If this happens, they would need to stop their application servers and re-configure their connections so that they connect to the new DB server; they cannot tolerate that workaround.
    Does Oracle have a better solution for this, so that the application can automatically learn of the role transition and switch to the new IP without re-configuring any connections or shutting down the application?
    Oracle Support provided us with the following answer:
    ==================================================================
    Applications connected to a primary database can transparently failover to the new primary database upon an Oracle Data Guard role transition. Integration with Fast Application Notification (FAN) provides fast failover for integrated clients.
    After a failover, the broker publishes Fast Application Notification (FAN) events. These FAN events can be used in the following ways:
    Applications can use FAN without programmatic changes if they use one of these Oracle integrated database clients: Oracle Database JDBC, Oracle Database Oracle Call Interface (OCI), and Oracle Data Provider for .NET ( ODP.NET). These clients can be configured for Fast Connection Failover (FCF) to automatically connect to a new primary database after a failover.
    JAVA applications can use FAN programmatically by using the JDBC FAN application programming interface to subscribe to FAN events and to execute event handling actions upon the receipt of an event.
    FAN server-side callouts can be configured on the database tier.
    FAN events are published using Oracle Notification Services (ONS) and Oracle Streams Advanced Queuing (AQ).
    =======================================================================================
    Does anyone have experience with this, related documentation, or other solutions? We are not familiar with the concept of FAN.
    Thanks very much in advance.
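    (One commonly used pattern, not taken from this thread: publish a dedicated service that is started only when the database is in the PRIMARY role, and point clients at that service through a connect descriptor that lists both hosts. A rough, untested sketch; the service name app_rw is a placeholder:)
        -- Create the service once on the primary (the definition is propagated to the standby).
        BEGIN
          DBMS_SERVICE.create_service(service_name => 'app_rw', network_name => 'app_rw');
        END;
        /
        -- Start the service only when this database is currently the primary.
        CREATE OR REPLACE TRIGGER manage_app_rw_service
        AFTER STARTUP ON DATABASE
        DECLARE
          v_role v$database.database_role%TYPE;
        BEGIN
          SELECT database_role INTO v_role FROM v$database;
          IF v_role = 'PRIMARY' THEN
            DBMS_SERVICE.start_service('app_rw');
          END IF;
        END;
        /
    (Clients then connect to the app_rw service through an address list naming both servers, so that after a failover new connections land on whichever database has the service started.)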

    Hi mesbeg,
    Thanks a lot.
    For example, for an application JBoss server connecting to the DB, we just added the standby IP as another connection URL in the datasource configuration file, without adding a service on the DB side, as follows:
            <subsystem xmlns="urn:jboss:domain:datasources:1.0">
            <datasources>
                    <datasource jta="false" jndi-name="java:/jdbc/idserverDatasource" pool-name="IDServerDataSource" enabled="true" use-java-context="true">
                        <connection-url>jdbc:oracle:thin:@<primary DB IP>:1521:testdb</connection-url>
                        <connection-url>jdbc:oracle:thin:@<standby DB IP>:1521:testdb</connection-url>
                        <driver>oracle</driver>
                        <pool>
                            <min-pool-size>2</min-pool-size>
                            <max-pool-size>10</max-pool-size>
                            <prefill>true</prefill>
                        </pool>
                        <security>
                            <user-name>TEST_USER</user-name>
                            <password>Password1</password>
                        </security>
                        <validation>
                            <valid-connection-checker class-name="org.jboss.jca.adapters.jdbc.extensions.oracle.OracleValidConnectionChecker"/>
                            <validate-on-match>false</validate-on-match>
                            <background-validation>false</background-validation>
                            <use-fast-fail>false</use-fast-fail>
                            <stale-connection-checker class-name="org.jboss.jca.adapters.jdbc.extensions.oracle.OracleStaleConnectionChecker"/>
                            <exception-sorter class-name="org.jboss.jca.adapters.jdbc.extensions.oracle.OracleExceptionSorter"/>
                        </validation>
                    </datasource>
                    <drivers>
                        <driver name="oracle" module="com.oracle.jdbc">
                            <xa-datasource-class>oracle.jdbc.OracleDriver</xa-datasource-class>
                        </driver>
                    </drivers>
                </datasources>
            </subsystem>
    If the failover occurs, JBoss will automatically be pointed to the standby DB. Additional actions are not needed.

  • Import into Consolidation system failed

    Hello,
    We are on NW04s and trying to import the ESS component and other standard components like SAP_JEE and SAP_BUILDT into the Consolidation system, and the import fails partway through with the following CBS log:
    Build number assigned: 7851
    Change request state from QUEUED to PROCESSING
    INTERNAL BUILD request in Build Space "D77_ESS11ADC_C" at Node ID: 778,177,350
         [id: 7,845; parentID: 7,539; type: 32]
    REQUEST PROCESSING started at 2007-12-14 23:25:37.149 GMT
    ===== Pre-Processing =====
    Calculate all combinations of components and variants to be built...
         "sap.com/ess/jp/fam" variant "default"
    Prepare build environment in the file system... started at 2007-12-14 23:25:37.227 GMT
         Synchronize development configuration... finished at 2007-12-14 23:25:37.227 GMT and took 0 ms
    Development line state verification started at 2007-12-14 23:25:41.336 GMT
    Verification of the development line [ws/ESS11ADC/sap.com_SAP_ESS/cons/active/] SUCCEEDED
    Development line state verification finished at 2007-12-14 23:25:41.352 GMT and took 16 ms
    Cache verification, level 2 (Comparison of attributes) started at 2007-12-14 23:25:41.352 GMT
    Verification of the following object:
         [DC: sap.com/ess/jp/fam, group: 0] SUCCEEDED
    Cache verification finished at 2007-12-14 23:25:41.368 GMT and took 16 ms
         Synchronize component definitions... finished at 2007-12-14 23:25:41.446 GMT and took 4 s 219 ms
         Synchronize sources...
    ===== Pre-Processing =====  finished at 2007-12-14 23:25:50.227 GMT and took 13 s 78 ms
    Change request state from PROCESSING to FAILED
    ERROR! The following error occurred during request processing:java.lang.OutOfMemoryError
    Unknown reason (java.lang.OutOfMemoryError)
    REQUEST PROCESSING finished at 2007-12-14 23:25:50.243 GMT and took 13 s 94 ms
    What can be done to fix the OutOfMemoryError?
    Any help would be highly appreciated.

    Hi Shikhil,
    Please do the following:
    1. Increase the memory size by checking the CBS settings in the Visual Administrator -> Services -> Component Build Service -> 'Properties' tab.
    2. Refer to this thread: CMS Error importing into Consolidation System
    3. See SAP Note 723909.
    Thanks
    Pankaj

  • Error While Importing into The Target Server

    I imported the export dump into the target server.
    While running the import, I am getting the following errors in the transport set import log file:
    13-JUL-05 15:00:11][ERROR] id = NULL context = style_pre_check user = ORCLADMIN The Unique Identifier of the source style KNPC_PORTAL does not match the target style KNPC_PORTAL
    Precheck failed for the whole KNPC_Page group and HRMs page group.
    Precheck failed for LOV REPORTS_COMPONENT_LOV
    Precheck failed for RWSVR EISDB_REPORT_SERVER
    Precheck failed for DYNAMIC DYN_0418073018
    Precheck failed for DATAPTL COMPONENT1
    Precheck failed for DATAPTL DATA_0418075629
    Precheck failed for AREPORT DETAILS
    Precheck failed for AREPORT EMPLOYEEDETAILS
    Precheck failed for HIERARCH HIE_0604143208
    Any help is highly appreciated.
    Thanks & Regards,
    Manoj
    (All the prerequisites were carried out on the target server.)

    The message
    13-JUL-05 15:00:11][ERROR] id = NULL context = style_pre_check user = ORCLADMIN The Unique Identifier of the source style KNPC_PORTAL does not match the target style KNPC_PORTAL
    indicates that a style with the same name has been created directly in the target, or imported from another instance (other than the one from which the transport set is attempting to import).
    For the pre-check to pass, you will need to either locate the object(s) in the target, rename (or delete) them, and reattempt the import, or rename them in the source, re-export (recreate the dumps), and import the new transport set.

  • Error importing *.SCA file into NWDS:  Import (into source SC) not possible

    All,
    When attempting to import an SCA into a new project in NWDS, an "Import (into source SC) not possible: SCA doesn't contain DC sources" error occurs. Any assistance in fixing this issue is appreciated.
    Thanks,
    Lee

    You can indeed import SCAs into local development configurations in NWDS (Eclipse). You have to have a pre-existing software component of the same name and vendor which may or may not be empty.
    In the Development Infrastructure perspective in NWDS 7.1, just right-click on the software component you want to overwrite with the import and choose "Import".
    To import SCAs into a CMS track in the NWDI, you have to go to http://<NWDI server>:5<nn instance no>00/devinf/ and import the SCAs from the transport inbox directory of the NWDI server. Don't forget to maintain the SLD if necessary.
    Cheers,
    Thorsten

  • Do I need to compress video before importing into DVD SP?

    Hello,
    this is probably a stupid question, but I would like to know if I need to compress my video before I import it into DVD SP as an asset. It looks like everyone uses Compressor before importing into DVD SP (setting the number of passes, bit rates, and so on), and then does it again when setting up preferences in DVD SP (setting 1 pass or 2 passes, bit-rate values).
    So isn't the video actually compressed twice? Once in Compressor and once in DVD SP?
    And also, what video formats are actually OK to import into DVD SP? (QT, AVI, FCP movie, etc)
    Does DVD SP take them all, or does it only have to be QT?
    Thank you, and I apologize for these novice questions.

    Welcome to the Boards
    Madagascar wrote:
    So is not the video actually compressed twice? Once in Compressor and once in DVD SP?
    And also, what video formats are actually OK to import into DVD SP? (QT, AVI, FCP movie, etc)
    Does DVD SP take them all, or does it only have to be QT?
    Thank you and I appologize for these novice questions.
    m2v will not be recompressed when placed on tracks (though in some instances it may recompress on menus). Ultimately it is better to compress outside of DVD SP for control of the encodes (and for making AC3 files), and also some formats may not be supported when importing into DVD SP directly.
    Some information to take a look at
    http://dvdstepbystep.com/faqs_3.php
    http://dvdstepbystep.com/faqs_7.php
    http://dvdstepbystep.com/qc.php
    http://dvdstepbystep.com/fasttrackover.php (middle section discusses Compressor a bit more)

  • Re Importing into Lightroom 4.1

    My photographic equipment consists of a Nikon D7000 (set to RAW) and a P5000 (JPEG). My computer is an iMac with Lightroom 4.1 installed. Until recently it has been possible to import from the SD cards directly into Lightroom via the reader in the side of the Mac. Of late the program will let me see the previews, but when I attempt to import the images a message pops up that the files cannot be read! I can import them into iPhoto, but this is not where I want them! Can anyone please suggest what has caused this new problem? I eagerly await any suggestions. With thanks.

    Dear tlsteinb, thanks for your reply. I tried ejecting the external drive as you suggested, to no avail! It is still a mystery to me. I do not know if I have to change a setting on the camera, the computer, or the program! I tried to send a 'chat' message to Adobe directly. How successful that will be, we will wait and see! I shall keep you posted. Regards, Rae Dall
    tlsteinb wrote, in the message being replied to:
    Just this morning I've begun experiencing the same thing. I can get the photos off the card and onto the iMac desktop for instance, but they won't import into Lightroom 4 from either the card reader or the desktop. So my assumption is that my card reader is working fine, but Lightroom is not. Any help would be greatly appreciated.

  • Ideas on how to export from Oracle (11g) for import into a MSSQL server?

    Hi folks,
    Crazy question, yes... but I have some devs that need to export some schemas/tables from Oracle to somehow import into MSSQL.
    Are there any utilities out there, hopefully from Oracle, that would allow for this? So far, all I can think of is to generate the DDL from the Oracle instance and then write some scripts to output the data in a CSV-type format...
    Anything better than this?
    Any links or suggestions greatly appreciated.
    Thank you in advance,
    cayenne

    cayenne wrote:
    EdStevens wrote:
    cayenne wrote:
    sybrand_b wrote:
    Heterogeneous Services using ODBC or OLEDB.
    No need for stupid csv files.
    Sybrand Bakker
    Senior Oracle DBA
    Thank you for the prompt reply. I'm looking into this and am finding some Oracle documentation, but just off the top of my head, this approach seems to assume the two database servers can see or talk to each other.
    What if the Oracle box is on one network that cannot connect to the network the MSSQL box is on? For example, running some test data from one company to another company with networks that do not cross?
    Assuming some acceptable export/import utility, how do you plan on getting the file from the source to the target if the two servers are on networks that cannot talk to each other?
    Well, I'm still trying to gather facts; this is just a question coming in from some developers.
    From what I gather, they can scp from site to site, but there is no direct connectivity between the database servers at the sites... isolated, maybe a DMZ... I don't know everything yet.
    Well, you either have connectivity between the two machines or you don't. Binary decision tree.
    IF  net_connectivity_exists
         you can use oracle sqlnet enabled capabilities
         OR
         you can transfer an export file via ftp, sftp, scp, etc.
    ELSE
         you have to transfer an export file via sneakernet
    END-IF
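    (If the sneakernet/CSV route is what remains, the flat-file step the original poster describes is usually just a SQL*Plus spool, one script per table. A rough sketch; the table and columns are placeholders:)
        SET PAGESIZE 0 LINESIZE 32767 FEEDBACK OFF HEADING OFF TRIMSPOOL ON
        SPOOL /tmp/employees.csv
        SELECT employee_id || ',' || last_name || ',' || TO_CHAR(hire_date, 'YYYY-MM-DD')
        FROM   hr.employees;
        SPOOL OFF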
