Can we store Oracle DATA and LOG files in separate file systems

I am installing NW2004s on an HP-UX server with an Oracle database.
I have selected the Custom option.
I want the SAP data files to be stored in the /sapdata file system
and the log files in the /oraclelog file system.
During the installation, SAP only asks me for the location of the SAP data files; it does not ask for the location of the log files.
(It does ask me for the location of the control files.)
Hence, as per my understanding, the Oracle log files can only be saved in the same directory as the Oracle data files.
Please confirm.
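
A side note (a hedged sketch using standard Oracle SQL, not an SAPinst option; the group numbers, member names and sizes below are placeholders): even if the installer only asks for the data file location, the online redo logs can be relocated to /oraclelog after the installation, one group at a time.

ALTER DATABASE ADD LOGFILE GROUP 4 ('/oraclelog/log_g4_m1.dbf') SIZE 200M;
ALTER DATABASE ADD LOGFILE GROUP 5 ('/oraclelog/log_g5_m1.dbf') SIZE 200M;
-- switch logs until the old group is INACTIVE (check V$LOG), then drop it
ALTER SYSTEM SWITCH LOGFILE;
ALTER DATABASE DROP LOGFILE GROUP 1;
-- repeat for each remaining original group, then remove the old files at the OS level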

I have heard reports of people who set up some kind of synchronization between AD and OID, but I have no hands-on experience with this. There are some notes on Metalink that describe the process, but I try to stay away from AD as far as possible (I usually work in Unix or Linux environments, and those don't really mix with AD).

Similar Messages

  • Is iCloud a backup system? Is it limited to only photos and music, or can I store programs, data and so on?

    Can I use iCloud like a local network backup system?
    I'm thinking of replacing my Western Digital backup.
    Thanks for the info.

    Thanks Ralph
    I don't have more than 5 GB of storage, but I have files, projects and so forth on an iMac, a PC, a netbook and an iPad, as well as an iPod.
    I was just wondering whether the types of things you can save are limited to photos and music. Can you just connect to iCloud and copy the files you are working on into a folder, as I would do using my existing WD disk?

  • How can I save my data and the date,the time into the same file when I run this VI at different times?

    I use a translation stage for the experiment. For each user in the lab the stage position (to start an experiment) is different. I defined one end of the stage as zero. I want to save the position, date and time of the stage with respect to zero. I want all of these in one file, and every time I run the VI it should save to the same file, like this:
    2/12/03 16:04 13567
    2/13/03 10:15 35678
    So I will track the position from day to day. If anybody can help, I'd appreciate it. Thanks.

    Hi,
    The function "Write to Spreadsheet File.vi" can append data to a file. You can use "Concatenate Strings" to build a string containing the date and time as well as the data... Hope this helps.
    Regards,
    celery

  • Steps to move data and log files for a clustered SQL Server

    Hi guys,
    we have an active/passive SQL 2008 R2 cluster environment.
    I am looking for steps to move the data and log files of user databases and system databases for a SQL Server clustered instance.
    Currently the data and log files reside on the same drive for both user and system databases.
    Thanks
    Please Mark As Answer if it is helpful. \\Aim To Inspire Rather to Teach A.Shah

    Try the link below:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/468de435-3432-45c2-a50b-23519cd2686e/moving-the-system-databases-in-a-sql-cluster?forum=sqldisasterrecovery
    -Prashanth
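
    For a user database, the general pattern is roughly the following (a hedged T-SQL sketch, not from the linked article; 'MyDB', the logical file names and the paths are placeholders, and the target drives must be clustered disks that are dependencies of the SQL Server resource). System databases, especially master, need the additional steps described in the link above.
    -- point the catalog at the new file locations
    ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB_Data, FILENAME = N'E:\SQLData\MyDB.mdf');
    ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB_Log,  FILENAME = N'F:\SQLLogs\MyDB_log.ldf');
    ALTER DATABASE MyDB SET OFFLINE;
    -- copy the physical .mdf/.ldf files to the new drives at the OS level, then:
    ALTER DATABASE MyDB SET ONLINE;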

  • Do we need to format data and log files with a 64K cluster size for SQL Server 2012?

    Do we need to format data and log file volumes with a 64K cluster size for SQL Server 2012?
    Does this best practice still apply to SQL Server 2012 and 2014?

    Yes.  The extent size of SQL Server data files, and the max log block size have not changed with the new versions, so the guidance should remain the same.
    Microsoft SQL Server Storage Engine PM

  • Change the Data and Log file locations in livecache

    Hi
    We have installed liveCache on a Unix system under the /sapdb mount directory, where the installer created the sapdata and sapdblog directories. But the Unix team had already created two mount points:
    /sapdb/LC1/lvcdata and /sapdb/LC1/lvclog.
    While installing liveCache we had selected these locations for creating the DATA and LOG volumes. Now they are asking us to move the DATA and LOG volumes created in the sapdata and saplog directories to these mount points. How do we move the data and log files and keep the database consistent? Is there a procedure to move the files to the mount point directories and change the liveCache pointers to these locations?
    regards
    bala

    Hi Lars,
    Thanks for the link. I will try it and let you know.
    But this is a liveCache database (even though it uses MaxDB) which was created by sapinst; moreover, is there anything to be adjusted in SCM, as well as any modification to be done at the DB level?
    regards
    bala

  • Can we have Oracle database and Gateway for SQL Server on the same machine

    Can we install Oracle database and Gateway for SQL Server on the same machine?
    If yes, what do the listener files in the gateway home (e.g. C:\product\11.2.0\NETWORK\ADMIN) and the Oracle home (e.g. C:\app\Administrator\product\11.2.0\dbhome_1\NETWORK\ADMIN) look like?
    Where do we give the dg4msql details in the tnsnames and listener files?

    Here is the output of starting both listeners. This is the output when I appended the listener1 settings to the ORACLE_HOME listener.ora file.
    C:\Users\Administrator>lsnrctl start listener
    LSNRCTL for 64-bit Windows: Version 11.2.0.2.0 - Production on 03-DEC-2012 04:46:15
    Copyright (c) 1991, 2010, Oracle. All rights reserved.
    Starting tnslsnr: please wait...
    TNSLSNR for 64-bit Windows: Version 11.2.0.2.0 - Production
    System parameter file is C:\app\Administrator\product\11.2.0\dbhome_1\network\admin\listener.ora
    Log messages written to C:\app\Administrator\diag\tnslsnr\WIN-77CAQGHJSA2\listener\alert\log.xml
    Listening on: (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=localhost)(PORT=1521)))
    Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=localhost)(PORT=1521)))
    STATUS of the LISTENER
    Alias                     listener
    Version                   TNSLSNR for 64-bit Windows: Version 11.2.0.2.0 - Production
    Start Date                03-DEC-2012 04:46:20
    Uptime                    0 days 0 hr. 0 min. 5 sec
    Trace Level               off
    Security                  ON: Local OS Authentication
    SNMP                      OFF
    Listener Parameter File   C:\app\Administrator\product\11.2.0\dbhome_1\network\admin\listener.ora
    Listener Log File         C:\app\Administrator\diag\tnslsnr\WIN-77CAQGHJSA2\listener\alert\log.xml
    Listening Endpoints Summary...
    (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=localhost)(PORT=1521)))
    Services Summary...
    Service "test" has 1 instance(s).
    Instance "test", status UNKNOWN, has 1 handler(s) for this service...
    The command completed successfully
    C:\Users\Administrator>lsnrctl start listener1
    LSNRCTL for 64-bit Windows: Version 11.2.0.2.0 - Production on 03-DEC-2012 04:46:23
    Copyright (c) 1991, 2010, Oracle. All rights reserved.
    Starting tnslsnr: please wait...
    TNSLSNR for 64-bit Windows: Version 11.2.0.2.0 - Production
    System parameter file is C:\app\Administrator\product\11.2.0\dbhome_1\network\admin\listener.ora
    Log messages written to D:\product\11.2.0\tg_2\diag\tnslsnr\WIN-77CAQGHJSA2\listener1\alert\log.xml
    Listening on: (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=localhost)(PORT=1522)))
    Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=localhost)(PORT=1522)))
    STATUS of the LISTENER
    Alias                     listener1
    Version                   TNSLSNR for 64-bit Windows: Version 11.2.0.2.0 - Production
    Start Date                03-DEC-2012 04:46:29
    Uptime                    0 days 0 hr. 0 min. 5 sec
    Trace Level               off
    Security                  ON: Local OS Authentication
    SNMP                      OFF
    Listener Parameter File   C:\app\Administrator\product\11.2.0\dbhome_1\network\admin\listener.ora
    Listener Log File         D:\product\11.2.0\tg_2\diag\tnslsnr\WIN-77CAQGHJSA2\listener1\alert\log.xml
    Listening Endpoints Summary...
    (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=localhost)(PORT=1522)))
    Services Summary...
    Service "dg4msql" has 1 instance(s).
    Instance "dg4msql", status UNKNOWN, has 1 handler(s) for this service...
    The command completed successfully
    C:\Users\Administrator>sqlplus /nolog
    SQL*Plus: Release 11.2.0.2.0 Production on Mon Dec 3 04:46:34 2012
    Copyright (c) 1982, 2010, Oracle. All rights reserved.
    SQL> conn system/Manager1@star
    Connected.
    SQL> create public database link test2 connect to "sa" identified by "pwd"
    using 'dg4msql';
    Database link created.
    SQL> select count(*) from prefer@test2;
    select count(*) from prefer@test2
    ERROR at line 1:
    ORA-28545: error diagnosed by Net8 when connecting to an agent
    Unable to retrieve text of NETWORK/NCR message 65535
    ORA-02063: preceding 2 lines from TEST2
    Output of tnsping from ORACLE_HOME:
    C:\Users\Administrator>tnsping dg4msql
    TNS Ping Utility for 64-bit Windows: Version 11.2.0.2.0 - Production on 03-DEC-2012 05:01:28
    Copyright (c) 1997, 2010, Oracle. All rights reserved.
    Used parameter files:
    C:\app\Administrator\product\11.2.0\dbhome_1\network\admin\sqlnet.ora
    Used TNSNAMES adapter to resolve the alias
    Attempting to contact (DESCRIPTION = (ADDRESS = (PROTOCOL = TCP)(HOST = localhost)(PORT = 1522)) (CONNECT_DATA = (SERVICE_NAME = dg4msql)) (HS=OK))
    OK (0 msec)
    tnsnames.ora file at D:\product\11.2.0\tg_2\NETWORK\ADMIN
    dg4msql =
    (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = localhost)(PORT = 1522))
    (CONNECT_DATA = (SERVICE_NAME = dg4msql))
    (HS=OK)
    tnsnames.ora file at C:\app\Administrator\product\11.2.0\dbhome_1\NETWORK\ADMIN
    LISTENER_test=
    (ADDRESS = (PROTOCOL = TCP)(HOST = localhost)(PORT = 1521))
    dg4msql =
    (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = localhost)(PORT = 1522))
    (CONNECT_DATA = (SERVICE_NAME = dg4msql))
    (HS=OK)
    test=
    (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = localhost)(PORT = 1521))
    (CONNECT_DATA =
    (SERVER = DEDICATED)
    (SERVICE_NAME = test)
    )
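
    For reference, the gateway itself is normally registered in the listener.ora of the gateway home; a generic sketch based on the standard dg4msql setup follows (the gateway home path is taken from the log output above, the rest is the documented template rather than this poster's actual file):
    SID_LIST_LISTENER1 =
      (SID_LIST =
        (SID_DESC =
          (SID_NAME = dg4msql)
          (ORACLE_HOME = D:\product\11.2.0\tg_2)
          (PROGRAM = dg4msql)
        )
      )
    With such an entry, listener1 serves the dg4msql SID on port 1522, and the dg4msql aliases in both tnsnames.ora files point at that port with (HS=OK), matching the entries shown above.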

  • How can I store the data from the list to MS-Access?

    Hi all,
    We all know that from ALV it is possible to export data to MS Excel, Word or a text file. Can anyone tell me how I can store the data from ALV or reports in MS Access?
    Thanks in Advance,
    Abhijit

    Hi,
    If you want to do that without code, I suppose there is an option in MS Access to import data from Excel, so you would first export to Excel and then import into Access.
    If you want to code it yourself, have a look at the function module TABLE_EXPORT_TO_MSACCESS, as described in Exporting Data from R/3 to MS ACCESS.
    You can also see the search results in SCN for [export data to MS-Access (45 Results)|https://www.sdn.sap.com/irj/sdn/advancedsearch?query=exportdatato+MS-Access&cat=sdn_all].
    Regards
    Karthik D

  • How to defrag after Shrinking the data and log every night

    OK, my shop has a shrink job that runs every night. It shrinks the data and log files with DBCC SHRINKFILE (filename, 1). We use the SIMPLE recovery model.
    Having read that SHRINK is horrible for the data file because of fragmentation, I am looking to stop the daily job. But this job has been running every day for years, right after the ETL. We have a big nightly ETL process that (1) truncates several hundred SQL tables and then BULK INSERTs from the AS400, and (2) for really large tables just brings in the last X number of days from the AS400 and inserts those records into the SQL table. We have very few indexes, but do have some on the really large tables. FYI - the SQL Servers are at the clients' sites.
    What kind of fragmentation has the SHRINK caused on the data file? How would it affect the ETL and retrieval of data during the day? How can the fragmentation be fixed?

    Hello,
    Databases suffer from fragmentation of indexes, as explained here, but they also suffer from physical fragmentation at the storage level.
    When the shrink process shrinks a data file and returns space to the disk, and the data file later needs to grow again as part of your ETL process, the new portion of the data file will rarely be contiguous with the rest of the data file on disk. If this shrink process happens many times, the data file may end up spread across many tracks of the disk. This increases the disk requests required to write data to a fragmented data file, and the disk requests required to create new data files. In general, fragmentation means that what should be a single I/O request has to be broken into many disk requests, making disk activity and performance less predictable and disk queues larger.
    My suggestion is to stop the SQL Server services during a maintenance window, defragment the disk using the Windows Defrag tool or a third-party tool, start the SQL Server services again when it finishes, and then defragment the indexes in your databases.
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com
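
    For the index part, a hedged T-SQL sketch (the table and index names are placeholders; the 5%/30% thresholds follow the commonly cited guidance and should be adjusted for your environment):
    -- list fragmented indexes in the current database
    SELECT OBJECT_NAME(ips.object_id) AS table_name,
           i.name                     AS index_name,
           ips.avg_fragmentation_in_percent
    FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
    JOIN sys.indexes AS i
      ON i.object_id = ips.object_id AND i.index_id = ips.index_id
    WHERE ips.avg_fragmentation_in_percent > 5
    ORDER BY ips.avg_fragmentation_in_percent DESC;
    -- then, per index: REORGANIZE for moderate fragmentation (5-30%), REBUILD above that
    ALTER INDEX IX_MyIndex ON dbo.MyLargeTable REORGANIZE;
    ALTER INDEX IX_MyIndex ON dbo.MyLargeTable REBUILD;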

  • How to recover data and log

    Hello, everyone
    I re-installed my Windows XP due to a virus, with the MaxDB data and log files left intact. After that I reinstalled MaxDB 7.6. How can I recover the database with the existing data and log volumes? I did not back up the database and cannot figure out a way. Please help me.
    Chu

    Hi Chu,
    creating a MaxDB instance from the existing data and log volumes is not difficult, but some requirements have to be fulfilled:
    - you still have the parameter file left
    - you still know the password of the database manager operator (DBM in the open-source version or CONTROL for SAP instances)
    That given, all you have to do is re-register the database via the db_create command:
    dbmcli -R <installation path> db_create <DBNAME> <dbm operator>,<password>
    So, let's go through an example.
    I have a database called DB77; the DBM operator is called control, and the password is also control.
    I'll drop the database but keep the files (just like you have it... hopefully) and re-register the database afterwards:
    >dbmcli db_enum
    OK
    DB77    C:\sapdb\db77\db                        7.7.02.16       fast    offline
    DB77    C:\sapdb\db77\db                        7.7.02.16       slow    offline
    >dir c:\sapdb\data\config
    Volume in drive C is Boot
    Volume Serial Number is 08C7-EC59
    Directory of c:\sapdb\data\config
    24.03.2008  13:37    <DIR>          .
    24.03.2008  13:37    <DIR>          ..
    25.10.2007  14:11            12.739 .M770216
    25.10.2007  14:11            97.900 .M770216.pah
    25.10.2007  14:11             1.536 .M770216.upc
    14.02.2008  02:21            20.579 .M770323
    14.02.2008  02:21           201.850 .M770323.pah
    14.02.2008  02:21             1.536 .M770323.upc
    14.03.2008  02:23            20.795 .UMDB
    14.02.2008  02:24            20.795 .UMDB.01
    25.10.2007  14:24            20.795 .UMDB.02
    14.03.2008  02:23                26 .UMDB.cfg
    14.02.2008  02:24           207.625 .UMDB.pah
    25.10.2007  14:24             2.048 .UMDB.upc
    14.03.2008  09:23            13.405 DB77  
    13.03.2008  23:51            13.405 DB77.01
    11.03.2008  20:10            13.405 DB77.02
    14.02.2008  02:21            13.405 DB77.03
    14.02.2008  02:17            13.405 DB77.04
    24.01.2008  19:06            13.405 DB77.05
    11.01.2008  18:57            13.405 DB77.06
    31.12.2007  01:40            13.405 DB77.07
    26.12.2007  14:40            13.404 DB77.08
    26.12.2007  14:20            13.181 DB77.09
    26.12.2007  14:01            13.181 DB77.10
    14.03.2008  09:23                75 DB77.cfg
    02.12.2007  18:24               648 DB77.mmm
    11.03.2008  20:16           343.475 DB77.pah
    24.03.2008  13:37             2.048 DB77.upc
    14.02.2008  02:20    <DIR>          install
                  27 File(s)      1,101,476 bytes
                   3 Dir(s)  20,064,083,968 bytes free
    Ok, the database is currently correctly registered and the parameter file is present. Now let's drop the database:
    >dbmcli -d db77 -u control,control db_drop withoutfiles
    OK
    >dir c:\sapdb\data\config
    Volume in drive C is Boot
    Volume Serial Number is 08C7-EC59
    Directory of c:\sapdb\data\config
    24.03.2008  13:37    <DIR>          .
    24.03.2008  13:37    <DIR>          ..
    25.10.2007  14:11            12.739 .M770216
    25.10.2007  14:11            97.900 .M770216.pah
    25.10.2007  14:11             1.536 .M770216.upc
    14.02.2008  02:21            20.579 .M770323
    14.02.2008  02:21           201.850 .M770323.pah
    14.02.2008  02:21             1.536 .M770323.upc
    14.03.2008  02:23            20.795 .UMDB
    14.02.2008  02:24            20.795 .UMDB.01
    25.10.2007  14:24            20.795 .UMDB.02
    14.03.2008  02:23                26 .UMDB.cfg
    14.02.2008  02:24           207.625 .UMDB.pah
    25.10.2007  14:24             2.048 .UMDB.upc
    14.03.2008  09:23            13.405 DB77
    13.03.2008  23:51            13.405 DB77.01
    11.03.2008  20:10            13.405 DB77.02
    14.02.2008  02:21            13.405 DB77.03
    14.02.2008  02:17            13.405 DB77.04
    24.01.2008  19:06            13.405 DB77.05
    11.01.2008  18:57            13.405 DB77.06
    31.12.2007  01:40            13.405 DB77.07
    26.12.2007  14:40            13.404 DB77.08
    26.12.2007  14:20            13.181 DB77.09
    26.12.2007  14:01            13.181 DB77.10
    14.03.2008  09:23                75 DB77.cfg
    02.12.2007  18:24               648 DB77.mmm
    11.03.2008  20:16           343.475 DB77.pah
    24.03.2008  13:37             2.048 DB77.upc
    14.02.2008  02:20    <DIR>          install
                  27 File(s)      1,101,476 bytes
                   3 Dir(s)  20,064,083,968 bytes free
    >dbmcli db_enum
    OK
    As we can see, the database is gone - but due to the withoutfiles flag all files are left in place. Therefore we can re-register the database right away:
    >dbmcli -R C:\sapdb\db77\db db_create DB77 control,control
    OK
    >dbmcli db_enum
    OK
    DB77    C:\sapdb\db77\db                        7.7.02.16       fast    offline
    DB77    C:\sapdb\db77\db                        7.7.02.16       slow    offline
    >dir c:\sapdb\data\config
    Volume in drive C is Boot
    Volume Serial Number is 08C7-EC59
    Directory of c:\sapdb\data\config
    24.03.2008  13:42    <DIR>          .
    24.03.2008  13:42    <DIR>          ..
    25.10.2007  14:11            12.739 .M770216
    25.10.2007  14:11            97.900 .M770216.pah
    25.10.2007  14:11             1.536 .M770216.upc
    14.02.2008  02:21            20.579 .M770323
    14.02.2008  02:21           201.850 .M770323.pah
    14.02.2008  02:21             1.536 .M770323.upc
    14.03.2008  02:23            20.795 .UMDB
    14.02.2008  02:24            20.795 .UMDB.01
    25.10.2007  14:24            20.795 .UMDB.02
    14.03.2008  02:23                26 .UMDB.cfg
    14.02.2008  02:24           207.625 .UMDB.pah
    25.10.2007  14:24             2.048 .UMDB.upc
    14.03.2008  09:23            13.405 DB77
    13.03.2008  23:51            13.405 DB77.01
    11.03.2008  20:10            13.405 DB77.02
    14.02.2008  02:21            13.405 DB77.03
    14.02.2008  02:17            13.405 DB77.04
    24.01.2008  19:06            13.405 DB77.05
    11.01.2008  18:57            13.405 DB77.06
    31.12.2007  01:40            13.405 DB77.07
    26.12.2007  14:40            13.404 DB77.08
    26.12.2007  14:20            13.181 DB77.09
    26.12.2007  14:01            13.181 DB77.10
    14.03.2008  09:23                75 DB77.cfg
    02.12.2007  18:24               648 DB77.mmm
    11.03.2008  20:16           343.475 DB77.pah
    24.03.2008  13:42             2.048 DB77.upc
    14.02.2008  02:20    <DIR>          install
                  27 File(s)      1,101,476 bytes
                   3 Dir(s)  20,064,083,968 bytes free
    >dbmcli -d db77 -u control,control db_online
    OK
    Database re-registered, all files are OK, database online. We're done.
    Anyhow, you should in any case perform a backup before you reinstall your machine next time. I don't know whether you still have all the files I mentioned, but if not, it's not possible to get your database back easily, if at all.
    Hope that helps,
    Lars

  • After getting the dreaded gray/blue screen, I tried to run disk repair on the internal disk. I got an error message saying "Disk Utility can't repair this disk and restore your backed-up files. The volume Macintosh HD could not be verified completely."

    After getting the dreaded gray/blue screen, I tried to run disk repair on the internal disk. I got an error message saying "Disk Utility can't repair this disk and restore your backed-up files. The volume Macintosh HD could not be verified completely." What do I do now? This is an iMac and I'm running 10.6.8.

    Clean Install of Snow Leopard
    Be sure to make a backup first because the following procedure will erase
    the drive and everything on it. See below for how to clone a drive.
         1. Boot the computer using the Snow Leopard Installer Disc or the Disc 1 that came
             with your computer.  Insert the disc into the optical drive and restart the computer.
             After the chime press and hold down the  "C" key.  Release the key when you see
             a small spinning gear appear below the dark gray Apple logo.
         2. After the installer loads select your language and click on the Continue
             button. When the menu bar appears select Disk Utility from the Utilities menu.
             After DU loads select the hard drive entry from the left side list (mfgr.'s ID and drive
             size.)  Click on the Partition tab in the DU main window.  Set the number of
             partitions to one (1) from the Partitions drop down menu, click on Options button
             and select GUID, click on OK, then set the format type to MacOS Extended
             (Journaled, if supported), then click on the Apply button.
         3. When the formatting has completed quit DU and return to the installer.  Proceed
             with the OS X installation and follow the directions included with the installer.
         4. When the installation has completed your computer will Restart into the Setup
             Assistant. Be sure you configure your initial admin account with the exact same
             username and password that you used on your old drive. After you finish Setup
             Assistant will complete the installation after which you will be running a fresh
             install of OS X.  You can now begin the update process by opening Software
             Update and installing all recommended updates to bring your installation current.
    Download and install Mac OS X 10.6.8 Update Combo v1.1.
    You may be able to backup your data if you have an erased external drive you can use. Before you do the above but after you have opened Disk Utility you can try to clone your drive:
    Clone using Restore Option of Disk Utility
      1. Open Disk Utility.
      2. Select the destination volume from the left side list.
      3. Click on the Restore tab in the DU main window.
      4. Select the destination volume from the left side list and drag
           it to the Destination entry field.
      5. Select the source volume from the left side list and drag it to
          the Source entry field.
      6. Double-check you got it right, then click on the Restore button.
    Destination means the external backup drive. Source means the internal startup drive.
    Now this will only work if the drive is accessible and can be cloned by Disk Utility. Otherwise, you would need to access your drive from another Mac that you can connect via Firewire - Target Disk Mode.

  • Date and Time in the flat file

    Hi All,
    I am trying to design a flow which will get data from a flat file. The file has a field which contains both the time and the date. How can I handle this in BW? Do I need to create two InfoObjects and split the flat file field in the start routine or transfer rule? Or is there a standard InfoObject which can hold both the date and the time?
    Also, the client asked me whether I want a .CSV file or an .XLS file. Which one is better for uploading into BW? Any pros and cons?
    Best Regards,

    Hi,
    There is no single data type which holds both the date and the time. One option is to load the field as CHAR; otherwise an update routine/formula is the best approach.
    In the update rule or transfer rule use a formula.
    Let the InfoObjects be
    0DATE
    0TIME
    The transfer rule / update rule formulas would be
    0DATE --> LEFT( 8, 'DateTime' )
    0TIME --> RIGHT( 6, 'DateTime' )
    Then CSV is the best option, as the data is ready for upload.
    Regards
    Happy Tony

  • How to write the oracle data as XML format. (.XML file)

    -- writes one line per matching EMP row to filename.txt in the directory object 'dirc'
    create or replace procedure pro (p_number in emp.empno%type)
    is
      cursor c1 is select * from emp where empno = p_number;
      v_file utl_file.file_type;
    begin
      v_file := utl_file.fopen('dirc', 'filename.txt', 'w');
      for i in c1 loop
        utl_file.put_line(v_file, i.ename || i.empno || i.job);
      end loop;
      utl_file.fclose(v_file);
    end;
    Now my client wants .xml files instead of the .txt file; the file should contain XML tags. Can anyone help with this, ideally with an example of how to write the Oracle data in XML format (a .XML file)?

    Hi,
    I hope this example helps...
    SQL> select employee_id, first_name, last_name, phone_number
    2 from employees where rownum < 6
    EMPLOYEE_ID FIRST_NAME LAST_NAME PHONE_NUMBER
    100 Steven King 515.123.4567
    101 Neena Kochhar 515.123.4568
    102 Lex De Haan 515.123.4569
    103 Alexander Hunold 590.423.4567
    104 Bruce Ernst 590.423.4568
    SQL> select dbms_xmlgen.getxml('select employee_id, first_name,
    2 last_name, phone_number from employees where rownum < 6') xml
    3 from dual;
    <?xml version="1.0"?>
    <ROWSET>
    <ROW>
    <EMPLOYEE_ID>100</EMPLOYEE_ID>
    <FIRST_NAME>Steven</FIRST_NAME>
    <LAST_NAME>King</LAST_NAME>
    <PHONE_NUMBER>515.123.4567</PHONE_NUMBER>
    </ROW>
    <ROW>
    <EMPLOYEE_ID>101</EMPLOYEE_ID>
    <FIRST_NAME>Neena</FIRST_NAME>
    <LAST_NAME>Kochhar</LAST_NAME>
    <PHONE_NUMBER>515.123.4568</PHONE_NUMBER>
    </ROW>
    <ROW>
    <EMPLOYEE_ID>102</EMPLOYEE_ID>
    <FIRST_NAME>Lex</FIRST_NAME>
    <LAST_NAME>De Haan</LAST_NAME>
    <PHONE_NUMBER>515.123.4569</PHONE_NUMBER>
    </ROW>
    <ROW>
    <EMPLOYEE_ID>103</EMPLOYEE_ID>
    <FIRST_NAME>Alexander</FIRST_NAME>
    <LAST_NAME>Hunold</LAST_NAME>
    <PHONE_NUMBER>590.423.4567</PHONE_NUMBER>
    </ROW>
    <ROW>
    <EMPLOYEE_ID>104</EMPLOYEE_ID>
    <FIRST_NAME>Bruce</FIRST_NAME>
    <LAST_NAME>Ernst</LAST_NAME>
    <PHONE_NUMBER>590.423.4568</PHONE_NUMBER>
    </ROW>
    </ROWSET>
    Ask if you want more assistance.
    Thanks.
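
    If the requirement is specifically to produce a physical .xml file from a procedure, one possible approach is sketched below (a hedged example, not tested on the poster's system: it reuses the 'dirc' directory object from the procedure above, the procedure and file names are made up, and it simply combines DBMS_XMLGEN with UTL_FILE):
    create or replace procedure emp_to_xml (p_number in emp.empno%type)
    is
      v_xml  clob;
      v_file utl_file.file_type;
      v_pos  pls_integer := 1;
      v_amt  constant pls_integer := 32000;
      v_len  pls_integer;
    begin
      -- generate <ROWSET>/<ROW> XML for the selected employee
      v_xml := dbms_xmlgen.getxml('select * from emp where empno = ' || p_number);
      v_len := dbms_lob.getlength(v_xml);
      -- write the CLOB to the file in chunks, since UTL_FILE works on VARCHAR2 pieces
      v_file := utl_file.fopen('dirc', 'emp_' || p_number || '.xml', 'w', 32767);
      while v_pos <= v_len loop
        utl_file.put(v_file, dbms_lob.substr(v_xml, v_amt, v_pos));
        v_pos := v_pos + v_amt;
      end loop;
      utl_file.fclose(v_file);
    end;
    /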

  • Aperture was not able to adjust the date and time of the master file "2008-06-22 at 14-41-14.jpg" because it has a format that does not permit date modification.

    I can't seem to find a way to update the date/time of a master file. There is a format issue but I can't seem to figure out how to correct it.
    Aperture was not able to adjust the date and time of the master file “2008-06-22 at 14-41-14.jpg” because it has a format that does not permit date modification.

    Can you post an ipconfig /all from the server and the DC?
    Robert Pearman SBS MVP

  • Can you export to Excel the date and user that appear in a Comments field of a tracking list?

    Hi everyone,
    Can you export to Excel the date and user that appear in a Comments field of a tracking list?
    When I export a tracking list with a Comments field in the content type (the screen where you enter the data for an item), the export just puts the text of the comment into the Excel file.
    Besides the comment text, the Comments field also shows the user who added the comment and the date when it was added.
    Is there a way to export the user and the date as well?
    Thanks
    Wim

    Create another comment field that doesn't use 'append changes', and through a SharePoint Designer workflow keep updating that field with the new comments (prepend the field with date/username/new comment). Hide the field on all forms, but put it in the view you need to export to Excel.
    Please refer to a few more links; I hope they help:
    http://www.nothingbutsharepoint.com/2009/04/16/versioning-append-changes-to-existing-text-view-entries-aspx/
    http://sympmarc.com/2011/02/07/showing-all-versions-of-append-changes-to-existing-text-in-a-data-view-web-part-dvwp/comment-page-3/
    https://mossipqueen.wordpress.com/2013/03/06/display-all-appending-field-entries-in-a-single-list-view/
    http://community.office365.com/en-us/f/154/t/278560.aspx
    Please 'propose as answer' if it helped you, also 'vote helpful' if you like this reply.
