Exp/imp related problem

Hi all,
I have a problem with Oracle 9.2.0.8 on RHEL4. I want to use exp/imp to update one table from one server to another.
Suppose I have a table abc on server A with around 1 million records, and server B has the same number of records. Some of the records have been changed on server A, and I want to import those changed records to server B. Is there any parameter in IMP for this?
Please suggest.

Hi,
Kamran Agayev A. wrote:
user00726 wrote:
but how would I know which table has been updated or modified?
Using the MERGE statement, you can merge two tables. This command will update the changed rows and insert the missing rows into the second table, making it the same as the first table.
SQL> create table azar(pid number,sales number,status varchar2(20));
Table created.
SQL> create table azar01(pid number,sales number,status varchar2(20));
Table created.
SQL> insert into azar01 values(1,12,'CURR');
1 row created.
SQL> insert into azar01 values(2,13,'NEW');
1 row created.
SQL> insert into azar01 values(3,15,'CURR');
1 row created.
SQL> insert into azar values(2,24,'CURR');
1 row created.
SQL> insert into azar values(3,0,'OBS');
1 row created.
SQL> insert into azar values(4,42,'CURR');
1 row created.
SQL> commit;
Commit complete.
SQL> select * from azar01;
PID SALES STATUS
1 12 CURR
2 13 NEW
3 15 CURR
SQL> select * from azar;
PID SALES STATUS
2 24 CURR
3 0 OBS
4 42 CURR
SQL> merge into azar01 a using azar b on (a.pid=b.pid) when matched
2 then update set a.sales=a.sales + b.sales, a.status=b.status
3 delete where a.status='OBS'
4 when not matched
5 then insert values(b.pid,b.sales,'NEW');
3 rows merged.
SQL> select * from azar01;
PID SALES STATUS
1 12 CURR
2 37 CURR
4 42 NEW
Hello Sir, is this correct?
Regards
S.Azar
DBA
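For completeness: classic imp has no parameter that merges changed rows into an existing table (IGNORE=Y only re-attempts plain inserts), so a common pattern is to import into a staging schema and MERGE from there. A minimal sketch, with all user, object, and column names hypothetical:
On server A:
exp scott/tiger tables=abc file=abc.dmp log=abc_exp.log
On server B, import into a staging schema and merge into the real table:
imp system/manager file=abc.dmp fromuser=scott touser=stage ignore=y
SQL> merge into scott.abc t using stage.abc s on (t.id = s.id)
  2  when matched then update set t.col1 = s.col1
  3  when not matched then insert (t.id, t.col1) values (s.id, s.col1);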

Similar Messages

  • Exp/imp problems

    Hi,
    Kindly Help.
    I plan to export the full database and then import it back into the same database; my purpose is to fix the fragmentation of the tables/tablespaces.
    1. Before importing the export file, should I do anything to the database (e.g. truncate the tables, delete the contents)?
    2. Does the exp/imp utility depend on the SGA for its performance? If not the SGA, what parameter should I change to speed up the exp/imp?
    Thank you

    I plan to export the full database then I'll import it back to the same database, my purpose is to fix the fragmentation of the tables/tablespaces.
    Why are you using export/import for de-fragmentation? There are other ways to do the same thing, like rebuilding indexes, using locally managed tablespaces (LMT), etc.
    Virag
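    For reference, a minimal sketch of those in-place alternatives (object names are hypothetical; ALTER TABLE ... MOVE is another common option not named above). Moving a table invalidates its indexes, so rebuild them afterwards:
    ALTER TABLE scott.big_table MOVE;
    ALTER INDEX scott.big_table_pk REBUILD;
    ALTER INDEX scott.big_table_ix1 REBUILD ONLINE;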

  • Need help for full db exp-imp

    Hi,
    My database has undo segment corruption. I have considered and tried a lot of things to get out of this situation, but didn't get any positive result. So I have decided to take a full database export, rename the database, create a new database on the same system, and import the full database into the new database. Will you please tell me the steps to do this (full db import)? Please include suitable syntax for the full db exp-imp.
    Thanks a lot

    If your only identified problem was sudden undo corruption to an online instance and you followed the steps referenced on the Burleson site (switch to manual undo management, create a new undo tablespace, drop the old undo tablespace, and switch back to automatic undo management) and then the very next day encountered undo corruption again, I would recommend two simultaneous courses of action:
    1) Get the hardware analyzed; this could be a signal of hard drive or controller problems.
    2) Scour Metalink for any bug references that may be related (or just go ahead and open an iTAR).
    If you persist in importing to create the database back on this same system and if you must use the same storage, I would at least try to get the file system or volume that the corruption was on taken offline and recreated.
    Just an alternate view of things ... good luck!
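    For what it's worth, a rough sketch of the two pieces discussed above; all file names, tablespace names, and paths are examples only.
    Full export / import:
    exp system/manager full=y file=fulldb.dmp log=fullexp.log
    imp system/manager full=y file=fulldb.dmp log=fullimp.log ignore=y
    Undo tablespace replacement (the undo_management changes take effect only after an instance restart):
    ALTER SYSTEM SET undo_management = MANUAL SCOPE = SPFILE;
    CREATE UNDO TABLESPACE undotbs2 DATAFILE '/u01/oradata/db/undotbs2_01.dbf' SIZE 500M;
    ALTER SYSTEM SET undo_tablespace = UNDOTBS2 SCOPE = SPFILE;
    DROP TABLESPACE undotbs1 INCLUDING CONTENTS AND DATAFILES;
    ALTER SYSTEM SET undo_management = AUTO SCOPE = SPFILE;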

  • Running OMBPlus and EXP/IMP in mixed version environment

    OWB Mixed Environment Guru's
    Current environment:
    OWB Client: 10.1.0.2.0 on Windows XP Professional
    OWB Server side: 10.1.0.2.0 on UNIX (AIX 5.2)
    Repository: Oracle 9.2.0.4 on UNIX (AIX 5.2)
    UNIX Listener: 9.2.0.4 on UNIX (AIX 5.2)
    Runtime Repository: Oracle 9.2.0.4 on UNIX (AIX 5.2)
    I call this a mixed environment since my OWB stuff is 10g and my database stuff is 9.2.
    Issues:
    1- I can't get the command-line exp.sh script to connect to the repository; it returns the famous 'ORA-12154, TNS:listener does not currently know of service requested in connect descriptor'. It looks like the 'owbsetenv.sh' script is changing the value of $ORACLE_HOME to point to the 10g areas. Could that then be causing the system to look for a 10g listener, which doesn't exist since all my databases are 9.2.0.4?
    2- I have the same issue trying to run OMBPlus.sh.
    I am ultimately trying to set up a promotion process using the UNIX command line programs (exp/imp and OMBPlus) to get objects from the TEST environment into the PRODUCTION environment which is a separate repository and target schema on a different machine.
    Any advice on how to successfully operate in this 'mixed' environment is most welcomed.
    Many thanks!
    Gary

    Well it looks like I did it again!
    Total brain fart.
    The problem turned out that I wasn't specifying the entire SERVICE_NAME for the repository database. I had been leaving off the domain information. Must be a habit from not having to use it in the TNSNAMES.ORA files.
    I was able to complete my test export and connect to OMBPlus, and will now try my test import.
    Sorry to clutter the forum but if it helps anyone else with the same affliction I seem to have frequently, I guess that's a small reward.
    Until next time.
    Gary
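    For anyone hitting the same symptom, a hypothetical tnsnames.ora entry showing a fully qualified service name (host, port, and names are examples only):
    OWBREP =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = aixhost01)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = owbrep.mycompany.com))
      )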

  • Best way of using exp/imp

    Dear all,
    I want to migrate a database from 8i to 11g (8.1.5 to 11.1.0). I am going with the exp/imp method. Which is the best way of doing this task: a full export and import, or schema-wise export and import? Is there any chance of missing objects or rows while doing this task? If yes, how do I avoid it? Please help me make the best decision. I don't want any problems after migration.
    The approach is: (take an exp of 8.1.5 and imp into 9.2.0), then (exp from 9.2.0 and imp into 11.1.0.6).
    OS is HP Unix
    Nishant Santhan

    Have you not yet completed this task? We already answered your question a couple of days back.
    Take a look at the similar duplicate thread created by you.
    Re: Migrating from 8i to 11g
    Regards,
    Sabdar Syed.
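    For reference, a rough sketch of that two-hop approach (connect strings and file names are examples; run each exp with the source release's export utility and each imp with the target release's import utility):
    exp system/manager@db815 full=y file=full815.dmp log=exp815.log
    imp system/manager@db920 full=y file=full815.dmp log=imp920.log ignore=y
    exp system/manager@db920 full=y file=full920.dmp log=exp920.log
    imp system/manager@db111 full=y file=full920.dmp log=imp111.log ignore=y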

  • How to exp/imp both diff character set tables between in DB1 and DB2?

    On Solaris 2.7, the Oracle 8i database DB1 has the NLS_CHARACTERSET ZHS16CGB231280 and NLS_NCHAR_CHARACTERSET ZHS16CGB231280 character sets.
    On another system, running Linux 7.2, the Oracle 8i database DB2 is installed with the NLS_NCHAR_CHARACTERSET US7ASCII and NLS_CHARACTERSET US7ASCII character sets.
    The table contents of DB1 include some Chinese text. I want to exp/imp tables from DB1 into DB2, but the Chinese characters don't display correctly in the SQLWheet tool. How do I do the exp/imp operation? Please help me. Thanks.

    The supported way to store GB231280-encoded characters is to use a ZHS16CGB231280 database or a database created with a superset of GB231280, such as UTF8. Can you not upgrade your target database from US7ASCII to ZHS16CGB231280?
    With US7ASCII and NLS_LANG set to US7ASCII, you are using the garbage-in, garbage-out (GIGO) approach. This may seem to work, but there are many hidden problems:
    1. Invalid SQL string function behaviour - LENGTH(), SUBSTR(), INSTR()
    2. Data can be corrupted when it is loaded into another database, e.g. EXP/IMP, DB links
    3. Communication with other clients will generate incorrect results, e.g. other Oracle products (Oracle Text, Forms), Java, HTML, etc.
    4. Linguistic sorts are not available
    5. Queries using the standard WHERE clause may return incorrect results
    6. Extra coding overhead in handling character conversions manually
    I recommend you check out the FAQ and the DB character set migration guide on the Globalization Support forum on OTN.
    Nat.
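    As an illustration only, the usual pattern is to set NLS_LANG to the export database's character set before running exp and imp so the client does no conversion (values and connect strings are examples):
    NLS_LANG=AMERICAN_AMERICA.ZHS16CGB231280
    export NLS_LANG
    exp scott/tiger@db1 owner=scott file=db1_scott.dmp
    imp scott/tiger@db2 fromuser=scott touser=scott file=db1_scott.dmp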

  • Tablespace exp imp 11G

    I have a problem.
    I want to imp file.dmp into a schema user. The schema user schemauser has the default tablespace USERS.
    I exported schemauser from my PC and named the dump file file.dmp.
    Then, when I import it from another PC, I have prepared a new user schemausernew with the default tablespace tbspace.
    When I import the data, I am surprised that my default tablespace tbspace is not used; the imported objects end up in USERS instead.
    So my question is: how do I import automatically into my default tablespace tbspace?
    I don't want to alter the tablespace of every table (and index) one by one.
    Thanks for your support.

    Exp/Imp with tablespace autocreate?
    imp user/pass tablespaces=dbtothis FULL=Y
    Why does it still not go into the tablespace dbtothis? It still ends up in the tablespace USERS.
    Thanks for helping me.
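    Classic imp has no tablespace remapping (REMAP_TABLESPACE is a Data Pump impdp feature). A common workaround, sketched here with the user and tablespace names from the post, is to remove the user's quota on USERS before importing so that segments fall back to the default tablespace (this generally works for plain tables; LOB or partitioned objects may still need to be pre-created):
    ALTER USER schemausernew DEFAULT TABLESPACE tbspace;
    REVOKE UNLIMITED TABLESPACE FROM schemausernew;
    ALTER USER schemausernew QUOTA 0 ON users;
    ALTER USER schemausernew QUOTA UNLIMITED ON tbspace;
    imp system/manager file=file.dmp fromuser=schemauser touser=schemausernew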

  • EXP/IMP with NCLOB/BLOB

    Hello all,
    I have to move a schema from one database to another that has been recreated like the first one (same schema name, same rights, same tablespaces, same everything). Both are Oracle Database 11.2.0.2 Enterprise 64-bit on 64-bit Linux. Is there anything I have to take into account, as there are objects with NCLOB and BLOB types? I've tried the export and it ran OK without any problem, but will there be any issue when I import it into the second database?
    All this is to move the database from the physical server it is on now to a virtual one (OVM).
    Thanks for your help!!

    It would probably be better if you used Data Pump (expdp/impdp) instead of original exp/imp. It has support for more objects and features that you may have used.
    Dean
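    A minimal Data Pump sketch of that schema copy (directory object, path, schema, and file names are examples; a directory object is needed on both databases):
    SQL> create directory dp_dir as '/u01/exports';
    expdp system/manager directory=dp_dir schemas=app_owner dumpfile=app_owner.dmp logfile=exp_app_owner.log
    impdp system/manager directory=dp_dir schemas=app_owner dumpfile=app_owner.dmp logfile=imp_app_owner.log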

  • Does data pump really replace exp/imp?

    hi guys,
    I've read some people saying we should be using Data Pump instead of exp/imp. But as far as I can see, if I have a database behind a firewall somewhere else, cannot connect to that database directly, and need to get some data across, then Data Pump is useless for me and I can only exp and imp the data.

    OracleGuy777 wrote:
    ...and i guess this means that data pump does not replace exp and imp.
    Well, depending on your database version, it does.
    "+Original Export is desupported for general use as of Oracle Database 11g. The only supported use of original Export in 11g is backward migration of XMLType data to a database version 10g release 2 (10.2) or earlier. Therefore, Oracle recommends that you use the new Data Pump Export and Import utilities, except in the following situations which require original Export and Import:+
    +* You want to import files that were created using the original Export utility (exp).+
    +* You want to export files that will be imported using the original Import utility (imp). An example of this would be if you wanted to export data from Oracle Database 10g and then import it into an earlier database release.+"
    http://download.oracle.com/docs/cd/E11882_01/server.112/e10701/original_export.htm#SUTIL3634
    Coming back to your problem, as already suggested, you have the NETWORK_LINK parameter: you can pull data from the source and import it directly into your target DB without needing any intermediate file.
    Nicolas.
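    For completeness, a hypothetical sketch of that approach, run on the target database (link name, credentials, and schema are examples; the target still needs SQL*Net access to the source):
    SQL> create database link src_link connect to scott identified by tiger using 'SRCDB';
    impdp scott/tiger network_link=src_link schemas=scott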

  • Problem description: I have ordered a new iMac but my current one seems to be running very slowly.  I believe it to be a memory related problem but can't seem to identify what is causing it.  Help is greatly appreciated.

    Problem description:
    I have ordered a new iMac but my current one seems to be running really slowly.  I believe it to be a memory related problem but can’t seem to identify what is causing it.  Your help is greatly appreciated.
    EtreCheck version: 2.1.8 (121)
    Report generated March 7, 2015 at 4:01:26 PM PST
    Download EtreCheck from http://etresoft.com/etrecheck
    Click the [Click for support] links for help with non-Apple products.
    Click the [Click for details] links for more information about that line.
    Hardware Information: ℹ️
        iMac (24-inch, Early 2008) (Verified)
        iMac - model: iMac8,1
        1 2.8 GHz Intel Core 2 Duo CPU: 2-core
        2 GB RAM Upgradeable
            BANK 0/DIMM0
                1 GB DDR2 SDRAM 800 MHz ok
            BANK 1/DIMM1
                1 GB DDR2 SDRAM 800 MHz ok
        Bluetooth: Old - Handoff/Airdrop2 not supported
        Wireless:  en1: 802.11 a/b/g/n
    Video Information: ℹ️
        ATI Radeon HD 2600 Pro - VRAM: 256 MB
            iMac 1920 x 1200
    System Software: ℹ️
        OS X 10.10 (14A389) - Time since boot: 23 days 18:5:35
    Disk Information: ℹ️
        ST3320820AS_Q disk0 : (320.07 GB)
            EFI (disk0s1) <not mounted> : 210 MB
            Macintosh HD (disk0s2) / : 319.21 GB (195.05 GB free)
            Recovery HD (disk0s3) <not mounted>  [Recovery]: 650 MB
    USB Information: ℹ️
        HP Officejet 6000 E609n
        Apple, Inc. Keyboard Hub
            Primax Electronics Apple Optical USB Mouse
            Apple, Inc Apple Keyboard
        Apple Inc. Built-in iSight
        Western Digital External HDD 500.11 GB
            My Passport (disk1s1) /Volumes/My Passport : 500.11 GB (200.60 GB free)
        Apple Inc. BRCM2046 Hub
            Apple Inc. Bluetooth USB Host Controller
        Apple Computer, Inc. IR Receiver
    Gatekeeper: ℹ️
        Mac App Store and identified developers
    Kernel Extensions: ℹ️
            /Applications/Popcorn 4/Popcorn.app
        [not loaded]    com.roxio.TDIXController (2.0) [Click for support]
            /System/Library/Extensions
        [not loaded]    com.aliph.driver.jstub (1.1.2 - SDK 10.7) [Click for support]
    Problem System Launch Agents: ℹ️
        [killed]    com.apple.accountsd.plist
        [killed]    com.apple.AirPlayUIAgent.plist
        [killed]    com.apple.bird.plist
        [killed]    com.apple.CalendarAgent.plist
        [killed]    com.apple.CallHistoryPluginHelper.plist
        [killed]    com.apple.CallHistorySyncHelper.plist
        [killed]    com.apple.cloudd.plist
        [killed]    com.apple.cmfsyncagent.plist
        [killed]    com.apple.coreservices.appleid.authentication.plist
        [killed]    com.apple.coreservices.uiagent.plist
        [killed]    com.apple.EscrowSecurityAlert.plist
        [killed]    com.apple.nsurlsessiond.plist
        [killed]    com.apple.pluginkit.pkd.plist
        [killed]    com.apple.printtool.agent.plist
        [killed]    com.apple.rcd.plist
        [killed]    com.apple.recentsd.plist
        [killed]    com.apple.sbd.plist
        [killed]    com.apple.scopedbookmarkagent.xpc.plist
        [killed]    com.apple.secd.plist
        [killed]    com.apple.security.cloudkeychainproxy.plist
        [killed]    com.apple.spindump_agent.plist
        [killed]    com.apple.telephonyutilities.callservicesd.plist
        22 processes killed due to memory pressure
    Problem System Launch Daemons: ℹ️
        [killed]    com.apple.AssetCacheLocatorService.plist
        [killed]    com.apple.awdd.plist
        [killed]    com.apple.ctkd.plist
        [killed]    com.apple.diagnosticd.plist
        [killed]    com.apple.emond.aslmanager.plist
        [killed]    com.apple.GSSCred.plist
        [killed]    com.apple.icloud.findmydeviced.plist
        [killed]    com.apple.ifdreader.plist
        [killed]    com.apple.nehelper.plist
        [killed]    com.apple.nsurlsessiond.plist
        [killed]    com.apple.periodic-daily.plist
        [killed]    com.apple.periodic-monthly.plist
        [killed]    com.apple.periodic-weekly.plist
        [killed]    com.apple.softwareupdate_download_service.plist
        [killed]    com.apple.softwareupdated.plist
        [killed]    com.apple.spindump.plist
        [killed]    com.apple.tccd.system.plist
        [killed]    com.apple.wdhelper.plist
        [killed]    com.apple.xpc.smd.plist
        [killed]    org.cups.cupsd.plist
        20 processes killed due to memory pressure
    Launch Daemons: ℹ️
        [loaded]    com.adobe.fpsaud.plist [Click for support]
        [loaded]    com.sonos.smbbump.plist [Click for support]
    User Login Items: ℹ️
        iTunesHelper    UNKNOWN Hidden (missing value)
        Jawbone Updater    Application  (/Applications/Jawbone Updater.app)
        Dropbox    Application  (/Applications/Dropbox.app)
    Internet Plug-ins: ℹ️
        Photo Center Plugin: Version: Photo Center Plugin 1.1.2.2 [Click for support]
        FlashPlayer-10.6: Version: 16.0.0.305 - SDK 10.6 [Click for support]
        Default Browser: Version: 600 - SDK 10.10
        Flash Player: Version: 16.0.0.305 - SDK 10.6 [Click for support]
        QuickTime Plugin: Version: 7.7.3
        OfficeLiveBrowserPlugin: Version: 12.3.6 [Click for support]
        CitrixICAClientPlugIn: Version: 11.2.0 [Click for support]
        iPhotoPhotocast: Version: 7.0
    3rd Party Preference Panes: ℹ️
        Citrix Online Plug-in  [Click for support]
        Flash Player  [Click for support]
        UE Smart Radio  [Click for support]
    Time Machine: ℹ️
        Auto backup: YES
        Volumes being backed up:
            Macintosh HD: Disk size: 319.21 GB Disk used: 124.16 GB
        Destinations:
            Time Machine Backups [Local]
            Total size: 3.00 TB
            Total number of backups: 98
            Oldest backup: 2013-08-28 05:45:36 +0000
            Last backup: 2014-12-30 19:39:32 +0000
            Size of backup disk: Excellent
                Backup size 3.00 TB > (Disk size 319.21 GB X 3)
    Top Processes by CPU: ℹ️
             5%    com.apple.WebKit.Plugin.64
             3%    WindowServer
             2%    sysmond
             0%    AppleSpell
             0%    com.apple.WebKit.Networking
    Top Processes by Memory: ℹ️
        41 MB    Finder
        41 MB    iTunes
        37 MB    Safari
        32 MB    com.apple.WebKit.Plugin.64
        32 MB    mds
    Virtual Memory Information: ℹ️
        53 MB    Free RAM
        501 MB    Active RAM
        463 MB    Inactive RAM
        398 MB    Wired RAM
        78.46 GB    Page-ins
        3.04 GB    Page-outs
    Diagnostics Information: ℹ️
        Mar 7, 2015, 12:41:34 PM    /Library/Logs/DiagnosticReports/Inkjet4_2015-03-07-124134_[redacted].crash
        Mar 5, 2015, 07:42:29 PM    /Library/Logs/DiagnosticReports/com.apple.WebKit.WebContent_2015-03-05-194229_[ redacted].crash

    That certainly looks like a low-memory issue; 2 GB of RAM is barely enough to run Mac OS X versions from Lion through Yosemite. The high number of page-outs and killed processes shows very high memory pressure.
    Try adding memory, but be sure it is high-quality, Mac-certified memory such as that from OWC (http://www.macsales.com) or Crucial (http://www.crucial.com).

  • Exp/Imp alternatives for large amounts of data (30GB)

    Hi,
    I've come into a new role where various test databases are to be 'refreshed' each night with cleansed copies of production data. They have been using the exp/imp utilities with 10g R2. The export process is OK, but what's killing us is the time it takes to transfer, unzip, and import 32 GB .dmp files. I'm looking for suggestions on what we can do to reduce these times. Currently the import takes 4 to 5 hours.
    I haven't used Data Pump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities. Are 'Transportable Tablespaces' the next logical solution? I've been reading up on them and could start prototyping/testing the process next week. What else in Oracle's toolbox should I be considering?
    Thanks
    brian

    Hi,
    I haven't used datapump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities
    Data Pump will be faster for a couple of reasons. It uses direct path to unload the data. Data Pump also supports parallel processes, so while one process is exporting metadata, the other processes can be exporting the data. In 11g you can also compress the dump files as you are exporting (both data and metadata compression are available in 11g; I think metadata compression is available in 10.2). This will remove your zip step.
    As far as transportable tablespaces go, yes, they are an option. There are some requirements, but if it works for you, all you will be exporting is the metadata, not the data. The data is copied from the source to the target by way of the datafiles. One of the biggest requirements is that the tablespaces need to be read-only while the export job is running. This is true for both exp/imp and expdp/impdp.
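    For illustration, a possible Data Pump run using the parallel feature mentioned above (directory object, schema, and file names are examples):
    expdp system/manager directory=dp_dir schemas=app_owner parallel=4 dumpfile=app_%U.dmp logfile=exp_app.log
    impdp system/manager directory=dp_dir schemas=app_owner parallel=4 dumpfile=app_%U.dmp logfile=imp_app.log
    On 11g you could add compression=all to the expdp command and drop the zip step entirely; on 10.2 only metadata compression is available.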

  • Export / import  exp / imp commands Oracle 10gXE on Ubuntu

    I have Oracle 10g XE installed on Linux 2.6.32-28-generic #55-Ubuntu, and I need some help on how to export/import the database with the exp/imp commands. The commands seem to be installed in the /usr/lib/oracle/xe/app/oracle/product/10.2.0/server/bin directory, but I cannot execute them.
    The error message I got:
    No command 'exp' found, did you mean:
    Command 'xep' from package 'pvm-examples' (universe)
    Command 'ex' from package 'vim' (main)
    Command 'ex' from package 'nvi' (universe)
    Command 'ex' from package 'vim-nox' (universe)
    Command 'ex' from package 'vim-gnome' (main)
    Command 'ex' from package 'vim-tiny' (main)
    Command 'ex' from package 'vim-gtk' (universe)
    Command 'axp' from package 'axp' (universe)
    Command 'expr' from package 'coreutils' (main)
    Command 'expn' from package 'sendmail-base' (universe)
    Command 'epp' from package 'e16' (universe)
    exp: command not found
    Is there something I have to do ?

    Hi,
    You have not set environment variables correctly.
    http://download.oracle.com/docs/cd/B25329_01/doc/install.102/b25144/toc.htm#BABDGCHH
    And of course that script has a small hiccup, so see
    http://ubuntuforums.org/showpost.php?p=7838671&postcount=4
    Regards,
    Jari
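    On a default 10g XE install that usually comes down to something like the following (paths are from the standard XE layout; verify against your own system):
    . /usr/lib/oracle/xe/app/oracle/product/10.2.0/server/bin/oracle_env.sh
    # or set the variables by hand:
    export ORACLE_HOME=/usr/lib/oracle/xe/app/oracle/product/10.2.0/server
    export ORACLE_SID=XE
    export PATH=$ORACLE_HOME/bin:$PATH
    exp system/manager owner=myschema file=myschema.dmp log=exp_myschema.log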

  • Exp/Imp content Area Portlet..?

    How do I exp/imp a "Content area folder published as a portlet" which resides in "WWSBR_SITEBUILDER_PROVIDER"?
    Anybody from Oracle..?
    Thanks.
    Rakesh

    I assume you are using the 3.0.9 version of Portal. It is possible to export an entire content area or a page, but not granular items. By granular I mean just the folder in this context. If you wish to export/import the entire content area/page, you can run the pageexp/pageimp/contexp/contimp scripts, which reside in the WWU directory.

  • Exp/Imp In Oracle 10g client

    Hi All,
    I want to take a schema export from the Oracle 10g client. I am new to Oracle 10g.
    In Oracle 9i, we could use the exp/imp commands for export and import from the client itself.
    I heard that in Oracle 10g, we can use expdp/impdp on the server only. Is there any possibility of running exp/imp from the client as well?
    Pls help me...
    Cheers,
    Moorthy.GS

    To add to this, for expdp from an Oracle client:
    NETWORK_LINK
    Default: none
    Purpose
    Enables an export from a (source) database identified by a valid database link. The data from the source database instance is written to a dump file set on the connected database instance.
    Syntax and Description
    NETWORK_LINK=source_database_link
    The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by the source_database_link, retrieves data from it, and writes the data to a dump file set back on the connected system.
    The source_database_link provided must be the name of a database link to an available database. If the database on that instance does not already have a database link, you or your DBA must create one. For more information about the CREATE DATABASE LINK statement, see Oracle Database SQL Reference.
    If the source database is read-only, then the user on the source database must have a locally managed tablespace assigned as the default temporary tablespace. Otherwise, the job will fail. For further details about this, see the information about creating locally managed temporary tablespaces in the Oracle Database Administrator's Guide.
    Restrictions
    When the NETWORK_LINK parameter is used in conjunction with the TABLES parameter, only whole tables can be exported (not partitions of tables).
    The only types of database links supported by Data Pump Export are: public, fixed-user, and connected-user. Current-user database links are not supported.
    Example
    The following is an example of using the NETWORK_LINK parameter. The source_database_link would be replaced with the name of a valid database link that must already exist.
    expdp hr/hr DIRECTORY=dpump_dir1 NETWORK_LINK=source_database_link DUMPFILE=network_export.dmp LOGFILE=network_export.log
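    For reference, creating such a link could look like this (user, password, and connect string are examples only):
    SQL> create database link source_database_link connect to hr identified by hr using 'SOURCEDB';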

  • GUI related problem in solaris 8 02/04

    Hello all,
    I am facing a GUI-related problem in software running on Solaris after upgrading from Solaris 8 02/02 to 8 02/04.
    The GUI, which was working fine on 02/02, has a menu-selection problem on 02/04: in GUI applications I am unable to select items after double-clicking them in Solaris 8 02/04, while this worked fine in Solaris 8 02/02.
    Is this an OS-related problem?
    Thanks and regards,
    Ankit

    Wolfgang073 wrote:
    Hi guys and ladies!
    I added the LD_LIBRARY_PATH,
    Why are you doing that? It's probably a bad idea.
    but the Openwindows does not run OK (e.g. a cmdtool window starts without the csh % prompt, etc.).
    That's very possible. Don't make global changes to LD_LIBRARY_PATH (by putting it in your login files, for example). Instead, if you need it to run a particular program, make a wrapper that sets the variable and then calls the program. That will limit the scope of those changes.
    vi .login
    # Erasing LD_LIBRARY_PATH
    setenv LD_LIBRARY_PATH
    # Important Paths set
    setenv LD_LIBRARY_PATH /usr/ucblib:$LD_LIBRARY_PATH
    Don't do that.
    Darren
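    A sketch of the wrapper approach Darren describes (program path and library directory are examples only):
    #!/bin/sh
    # wrapper: set LD_LIBRARY_PATH only for this one program
    LD_LIBRARY_PATH=/usr/ucblib
    export LD_LIBRARY_PATH
    exec /opt/legacy/bin/legacy_app "$@"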
