Performing a copy of a "live" data file?

Hello -
Is there a risk (to the source data file) of performing a copy command on a live data file? My sys admin informed me he was copying a data file. He was not concerned about the file on the destination drive being corrupted. He just had to fill up another file system to perform a test. My concern was any potential corruption to the "live" DB file.
Thanks,
mike

Copying a file poses essentially zero risk of corruption to the source datafile.
Let's think about this. How does a conventional hot backup work? We put the tablespace in backup mode, and then use O/S-level tools to make the copy. All that putting the tablespace in backup mode does is freeze the datafile header of each datafile in the tablespace and increase I/O by writing full block images to the redo log on first change. After that, the O/S-level copy simply copies the datafiles. If someone comes along and just copies the datafiles, the O/S copy mechanism is exactly the same as in a conventional backup. There is no difference.
So, if you believe that copying a file can damage the source file, then how would you reliably back up your database files without resorting to a cold backup?
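For concreteness, here is a minimal sketch of that hot-backup sequence (the tablespace name, paths, and file names are examples only; archivelog mode is assumed):
# freeze the datafile headers, copy with an ordinary O/S tool, then thaw
echo "ALTER TABLESPACE users BEGIN BACKUP;" | sqlplus -s / as sysdba
cp /u01/oradata/orcl/users01.dbf /backup/users01.dbf
echo "ALTER TABLESPACE users END BACKUP;" | sqlplus -s / as sysdba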
Copying files doesn't damage the source file. Having said that, I agree that the sysadmin should not be messing with Oracle datafiles: today he uses the files as a source to copy from; tomorrow he removes or renames them. Teach him to use dd to create large files instead.
# writes 2 GB of zeros (2048 one-MiB blocks) to create a filler file
dd if=/dev/zero of=/tmp/my_2gb_file bs=1048576 count=2048
Hope that helps,
-Mark

Similar Messages

  • My videos won't play on my PC because they copied over as .dat files. Help!

    I followed the steps on how to copy my 5G nano videos over to my PC. They copied over; however, I cannot play them in Media Player because the file copied over as a .dat file. And when I play them in QuickTime, there's a lag between voice and video. Any suggestions?

    See if they play with VLC; it might be that you don't have the codec needed to play them.
    http://www.videolan.org/

  • If data files are deleted?

    If data files are deleted, can they be recovered from the control file?

    It depends on what is in the datafiles: whether they actually contain data, whether they are the only datafiles for the tablespace, and so on.
    Let's assume that they are the only datafiles for the tablespace and segments have allocated and used space in them.
    If the segments are index segments, you can recreate the indexes from the data that currently exists in the table.
    If the segments are table segments, you will need to recreate the data either by hand or extracting it from another source.
    If no segment has allocated space in the datafile, you should be able to take the datafile offline. If this works, it is a good idea to migrate the segments to a new tablespace and then drop the old tablespace (unless it is SYSTEM), as Oracle does not like offline datafiles and they become a maintenance hassle.
    If you do not have a backup, do you have a recent export?
    Once you have addressed the immediate issues of reconstructing the data, you need to address the issues of backups (you should at least backup the database periodically even if you don't run in archivelog mode) and security (no one should be dropping live data files).
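    As a rough sketch of that triage (the file number 7 below is hypothetical; verify on a test system first):
    # list the segments that have extents in the lost datafile
    echo "SELECT DISTINCT owner, segment_name, segment_type
            FROM dba_extents WHERE file_id = 7;" | sqlplus -s / as sysdba
    # if no rows come back, the file can be taken offline
    # (FOR DROP is for noarchivelog mode; plain OFFLINE works with archivelog)
    echo "ALTER DATABASE DATAFILE 7 OFFLINE FOR DROP;" | sqlplus -s / as sysdba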
    Regards,
    Daniel Fink

  • Backup MDC and ADIC license.dat file

    My system is two G5 Xserves, one as the MDC and one as the backup controller, with about 2.7 TB of storage, plus a Solaris 10 StorNext FX client that does most of my sharing to the rest of my network through NFS (everyone is still running version 1.3).
    This morning I restarted the MDC, and while it was down the backup controller took control of the SAN.
    My problem is that the Sun StorNext client gave an error about being unable to renew the license for the filesystem, which I suspect is because I neglected to put a copy of the license.dat file into /Library/Filesystems/Xsan/config on the backup controller. I wasn't able to get the Sun to "refind" the SAN until I had restarted the backup MDC (to make control jump back to the primary MDC) and then restarted the Sun (a bit of a pain).
    OK, my question is: I just put a copy of the license.dat file from the MDC into the same place on the backup MDC, but will it work? Or do I need to do something else? (I'd really hate to have to test this by rebooting the world again.)
    thanks much
    Mike

    My current installs have one license.dat file for the primary MDC and a separate license.dat file for the backup MDCs, based on the cvfsid of each of the MDCs.
    If you are using the exact same license.dat file for both MDCs, then I'm not sure that it will work.
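    If in doubt, you can compare the host identifier each license was generated for: the cvfsid utility that ships with StorNext/Xsan prints it (a rough sketch; the hostname below is hypothetical and the binary's location varies by install):
    # run on the primary MDC, then on the backup, and compare the output
    cvfsid
    ssh backup-mdc cvfsid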
    Drew

  • Download of performance data file

    Hi,
    I ran the performance analysis for the program in question in transaction SE30. I tried to download the performance data file using the procedure below:
    1. From the Performance data file group box, choose Other file…
    2. On the next screen, enter your selection (for example, All).
    3. Under File user, select the required file.
    4. Choose Copy to local file…
    5. Enter the local path name and choose Copy.
    When I try to open the local file, I get a message that the file is corrupted, and it shows junk values.
    Could anybody please help me resolve this problem?
    Thanks in advance!

    Hi Promad,
    I usually do it like this: run the analysis, but instead of clicking the Other file button, choose the Evaluate button. Now you see a graph; click the Hit List button (or F5), and the result is displayed in an ALV grid. Here you can simply use the standard ALV functionality to download the overview: press the Local File button (or F9) and choose the required file format.
    This works fine for me!
    Have a nice day!

  • Concatenate TDMS file data to live data

    I am monitoring 20 or 30 channels at any given time and storing all data to a TDMS file (LV 8.6). At any given time we display only 3 channels on a waveform chart. The chart shows 5 minutes of data. Say for the first minute I am viewing channels 1, 2, and 3, then I switch over to 4 through 6; in addition to the live data coming in, I want to see what has happened since the test started a minute ago (or up to 5 minutes ago), as if I had been viewing these three channels all along.
    What I have tried is to read the stored data from the TDMS file and copy the most recent data (up to 15K samples = 5 min at 50 Hz). Then I take the live acquired waveforms, break out the array of Y values, concatenate them with the Y values from the file, and rebuild the waveform. But it isn't working. It seems like it should, and I don't know what is wrong.
    Any ideas?
    Lawrence M. David Jr.
    Certified LabVIEW Architect
    cell: 516.819.9711
    http://www.aleconsultants.com
    [email protected]
    Attachments:
    extendedWaveforms.vi 15 KB

    Well, I tried using the "ignore attributes" property, both above the 'value' property and below (top-down ordering); it didn't make any difference. I also verified that the data is being concatenated: normally we add one point to the plot every 50 ms, and when we toggle to a different set of channels the Y component of the waveform is not a one-element array but (in this shot anyway) over 9000 elements. The screen seems to flash with a block of data, and then the plot resumes filling one point at a time from the present timestamp. Maybe I can set an attribute of the X axis to an earlier timestamp based on the number of points in the Y array?
    Lawrence M. David Jr.
    Certified LabVIEW Architect
    cell: 516.819.9711
    http://www.aleconsultants.com
    [email protected]
    Attachments:
    propertyNode.JPG 11 KB
    probes.JPG 45 KB

  • MySQL data files: where do they live

    I am building an online system using MySQL. I have two (I hope) simple questions.
    1. Where do the data files live, e.g. what do I have to back up to be sure my data is safe?
    2. I built the system on a Mac OS 10.3 machine (not 10.3 Server). How do I transfer the MySQL database structures to the (now included) version of MySQL running in Mac OS 10.4 Server?
    Any advice will be much appreciated.
    Ray

    On OS X Server, the standard location for the MySQL data files is: /var/mysql. Any databases you create will wind up in there. If you install a fresh copy of MySQL onto OS X from a MySQL package, they go into /usr/local/mysql/data.
    You have a couple of options for the transfer. 1) Using MySQL, use mysqldump to create SQL files of your database tables, both structure and data. 2) Using something like phpMyAdmin, you can export the tables into a structure that can then be imported on your 10.4 server. 3) Copy the relevant database directories from your /var/mysql directory on OS X 10.3 into the /var/mysql directory on your 10.4 Server. Either stop MySQL on 10.3 first or lock the tables before you do a copy. Then, after plopping them into the 10.4 directory, restart MySQL on 10.4 and you should be good to go.
    If you are interested in knowing how to use the tools like mysqldump or how to import from a text file, MySQL's site has a section of documentation on these: mysqldump / mysqlimport
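    A minimal sketch of option 1 (the database name mydb and the root account are placeholders; run the dump on the 10.3 machine and the import on the 10.4 server):
    # on 10.3: dump both structure and data into one SQL file
    mysqldump -u root -p mydb > mydb.sql
    # copy mydb.sql across, then on the 10.4 server:
    mysqladmin -u root -p create mydb
    mysql -u root -p mydb < mydb.sql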

  • OraRRP Error with "Unable to copy data file;Error code 2, check disk space"

    Hi,
    Some users get the message "Unable to copy data file; Error code 2, check disk space" when they run a report with orarrp, but most users do not.
    I checked the free space on both the server and client side; it is more than sufficient.
    I also checked that the directory for the REPORTXX_TMP variable exists.
    My users call reports via URL (rwservlet), and it occurs for all reports.
    How can I solve this problem?
    Thanks in advance.
    Tawatchai R.

    Hi,
    I have the same problem now. One user temporarily has problems downloading .rrpa files via a URL (rwservlet) request, with the error "Unable to copy data file; Error code 2, check disk space". Did you find a solution?
    Thanks in advance. Axel

  • How can I copy my data files from a Windows backup to a Mac mini

    I recently purchased a Mac mini and am waiting on its arrival. I am converting from Windows XP, and my current system is not working at the moment (started smelling smoke one day and the PC hasn't displayed video since, so it's been shut down). I can easily fix whatever the issue is with the PC; however, my question is this: I perform periodic backups of my data files to an external drive with Windows Backup. Can I restore my files from the Windows Backup to the Mac mini? If I don't have to fix the PC, I can save that cost and buy a CD drive for the mini (or something else).
    I haven't investigated the real problem on the PC; I just know it's hardware related (not the OS).
    My last backup was at least 1 month ago, so I know I might lose a couple of things, but not very much. If it's simpler for the migration process to fix the PC, then that is the answer, but I'd rather extract from the external 1 TB Seagate if I can. I think I only have 50 GB of data (iTunes, pictures, etc.).
    Thanks

    Maybe by:
    Switch Basics: Migrate your Windows files or system to your Mac
    It depends upon the format of the disk. You will likely have to enable NTFS support manually:
    http://reviews.cnet.com/8301-13727_7-57588773-263/how-to-manually-enable-ntfs-read-and-write-in-os-x/
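    If the Seagate drive is NTFS-formatted, OS X mounts it read-only out of the box, which is enough for copying files off it; you can confirm the format before deciding whether you need the write-support trick in the article (the volume name below is hypothetical):
    # shows the filesystem type of the mounted backup drive
    diskutil info /Volumes/Seagate1TB | grep 'File System'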

  • Live Data / Problem with file permissions

    Just trying out an old version of Dreamweaver MX 2004. I am using my webhosting service for remote server/testing server duties. It is running PHP 4.3.10 and MySQL 3.23.58. I was able to set up the database connection and test-retrieve a recordset with no problems. In following the tutorial I found that Live Data wouldn't work; it just gave me a warning about file permissions being wrong. It turns out that when Dreamweaver creates a temporary file of the work-in-progress to upload to the remote server, the file is created on the server with owner=rw, group=rw, world=r, which explains why it won't run: group has to be set to group=r. The file is created on the fly and then immediately deleted by Dreamweaver, so it is impossible to manually set the permission on the server, and probably fairly pointless too.
    I tried just saving the file and previewing in the browser, which again causes it to be uploaded to the remote server. The first time, this resulted in the browser offering a file download box instead of running the page. The reason is, again, that Dreamweaver is setting the uploaded file permissions to include group=rw. If I manually set the permission for group to group=r, it runs fine.
    It turns out that Dreamweaver is always setting the file permissions on file uploads (checked PHP and HTML) to the remote/testing server to include group=rw. Once I set it manually on the remote/testing server to group=r for a PHP file, everything is fine and subsequent uploads of the same file do not change it again.
    I checked with the webhosting company, and their second line have reported back to me that the default file permission they set on uploaded files includes group=r, so it must be DW that is causing the problem by setting group=rw the first time. I confirmed this by using WS_FTP to upload the same file (renamed) to the same target directory, and the permissions set were owner=rw, group=r, world=r.
    So: can anyone please tell me how to change the permissions DW sets on files written to a remote server? I have spent countless hours on it without success. From looking at other posts in this forum, it could be that other users are hitting the same kind of problem with DW8.

    Stop using Live Data with a hosting account. Set up PHP and MySQL locally on your machine. That is how it's supposed to work. You shouldn't test files on the fly on a host as you write them. Change your test account in DW to use the local server. Upload your files to your remote server after they are fully tested.
    Tom Muck
    http://www.tom-muck.com/
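    For anyone hitting the same thing: owner=rw, group=rw, world=r is octal mode 664, and the owner=rw, group=r, world=r the host expects is 644, so until there is a real fix you can clear the group-write bit by hand after an upload (the file name below is hypothetical):
    # 664 -> 644: remove the group-write bit the server objects to
    chmod 644 page.php
    # or sweep every PHP file under the site root
    find . -name '*.php' -exec chmod 644 {} +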
    "nigelssuk" <[email protected]> wrote in
    message
    news:[email protected]...
    > Just trying out an old version of Dreamweaver MX 2004. I
    am using my
    > webhosting
    > service for remote server/testing server duties. It is
    running PHP 4.3.10
    > and
    > MySQL 3.23.58. I was able to set up the database
    connection and
    > test-retrieve a
    > recordset with no problems. In following the tutorial I
    found that
    > Livedata
    > wouldn't work, it just giving me a warning about file
    permissions being
    > wrong.
    > It turns out that when Dreamweaver creates a temporary
    file of the
    > work-in-progress to upload to the remote server the file
    is created on the
    > server with owner=rw, group=rw, world=r which explains
    why it won't run -
    > group
    > has to be set to group=r. The file is created on the fly
    and then
    > immediately
    > deleted by Dreamweaver so it is impossible to manually
    set the permission
    > on
    > the server and probably fairly pointless too.
    >
    > I tried just saving the file and previewing in the
    browser which again
    > causes
    > it to be uploaded to the remote server. The first time
    this resulted in
    > the
    > browser offering a file download box instead of running
    the page. The
    > reason is
    > - again - that Dreamweaver is setting the uploaded file
    permissions to
    > include
    > group=rw. If I manually set the permission for group to
    group=r it runs
    > fine.
    >
    > It turns out that Dreamweaver is always setting the file
    permissions on
    > file
    > uploads (checked php and html) to the remote/testing
    server to include
    > group=rw. Once I set it manually on the remote/testing
    server to group=r
    > for a
    > php file everything is fine and subsequent uploads of
    the same file do not
    > change it again.
    >
    > I checked with the webhosting company and their
    second-line have reported
    > back
    > to me that the default file permission they set on
    uploaded files includes
    > group=r so it must be DW that is causing the problem by
    setting group=rw
    > the
    > first time. I confirmed this by using WS-FTP to upload
    the same file
    > (renamed)
    > to the same target directory and the permissions set
    were owner=rw,
    > group=r,
    > world=r.
    >
    > So
    >
    > Can anyone please tell me how to change the permissions
    DW sets on files
    > written to a remote server because I have spent
    countless hours on it
    > without
    > success. From looking at other posts in this forum it
    could be that other
    > users
    > are hitting the same kind of problem with DW8
    >

  • Impact of data file size on DB performance

    Hi,
    I have a general query regarding size of data files.
    Considering DB performance, which of the two options below is better?
    1. Bigger data file size but less number of files (ex. 2 files with size 8G each)
    2. Smaller data file size but more number of files (ex. 8 files with size 2G each)
    I am working on a DB where I have noticed very high I/O.
    I understand there might be many reasons for this.
    However, I am checking for the possibility of improving DB performance through optimizing data file sizes (including TEMP/UNDO tablespaces).
    Kindly share your experiences with determining optimal file size.
    Please let me know in case you need any DB statistics.
    Few details are as follows:
    OS: Solaris 10
    Oracle: 10gR2
    DB Size: 80G (Approx)
    Data Files: UserData - 6 (15G each), UNDO - 2 (8G each), TEMP - 2 (4G each)
    Thanks,
    Ullhas

    Ullhas wrote:
    I have a general query regarding size of data files. Considering DB performance, which of the two options below is better?
    Size or number really does not matter, assuming the other variables are held constant. More files result in more open file handles, but at your database size it matters not.
    Ullhas wrote:
    I am working on a DB where I have noticed very high I/O. I understand there might be many reasons for this. However, I am checking for the possibility of improving DB performance through optimizing data file sizes (including TEMP/UNDO tablespaces).
    Remember this when tuning I/O: the fastest I/O is the one that never takes place! High I/O may very well be a symptom of unnecessary full table scans or poor execution plans. Validate this first before tuning I/O and you will be much better off.
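    A rough way to start that validation, assuming SELECT access to the v$ views (10gR2 names; note the \$ escape needed inside shell double quotes):
    # top 10 statements by disk reads
    echo "SELECT * FROM (
            SELECT sql_id, disk_reads, executions
              FROM v\$sql
             ORDER BY disk_reads DESC
          ) WHERE ROWNUM <= 10;" | sqlplus -s / as sysdba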
    Regards,
    Greg Rahn
    http://structureddata.org

  • When I send out mail from MS Outlook enterprise account in the office to my Mac at home they are received as "winmail.dat" files.  Even if I perform a "save as" to the correct file name the file format is still not recognized.  Why is this happening!?


    http://www.joshjacob.com/mac-development/tnef.php
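    That link is to TNEF's Enough, a free Mac utility that opens winmail.dat (TNEF) containers and extracts the real attachments. If you prefer the command line, the open-source tnef tool does the same job (a sketch; assumes it is installed, e.g. via MacPorts):
    # unpack the attachments hidden inside the TNEF container
    tnef winmail.dat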

  • Error data file is empty in standard Copy package

    Hi,
    We have an appset that consists of four applications, and we can't successfully run the Data Manager Copy package in one of them. It launches the following tasks: Dump, Convert (execution fails in this step with the error "Data file is empty"), and Load. The SSIS configuration in BIDS is the default, and we haven't set any parameters.
    We have figured out that this error appears when we select any member of a dimension in the "COPYMOVEINPUT" prompt, except for the Time dimension. Previously there was a custom Copy package based on the standard BPC one, and it only filtered by the Time dimension. Perhaps this error is related to the application having been configured for the custom package.
    The package script is:
    INFO(%TEMPFILE1%,%TEMPPATH%%RANDOMFILE%)
    INFO(%TEMPFILE2%,%TEMPPATH%%RANDOMFILE%)
    TASK(DUMP,APPSET,%APPSET%)
    TASK(DUMP,APP,%APP%)
    TASK(DUMP,USER,%USER%)
    TASK(DUMP,FILE,%TEMPFILE1%)
    TASK(DUMP,SQL,%SQLDUMP%)
    TASK(DUMP,DATATRANSFERMODE,2)
    TASK(CONVERT,INPUTFILE,%TEMPFILE1%)
    TASK(CONVERT,OUTPUTFILE,%TEMPFILE2%)
    TASK(CONVERT,CONVERSIONFILE,%CONVERSION_INSTRUCTIONS%)
    TASK(LOAD,APPSET,%APPSET%)
    TASK(LOAD,APP,%APP%)
    TASK(LOAD,USER,%USER%)
    TASK(LOAD,FILE,%TEMPFILE2%)
    TASK(LOAD,DATATRANSFERMODE,4)
    TASK(LOAD,DMMCOPY,1)
    TASK(LOAD,CLEARDATA,%CLEARDATA%)
    TASK(LOAD,RUNTHELOGIC,%RUNLOGIC%)
    TASK(LOAD,CHECKLCK,%CHECKLCK%)
    Variables such as %CONVERSION_INSTRUCTIONS% aren't defined anywhere. Is this a system constant?
    Thanks.

    Hi Roberto,
    Thanks for having a look at my question.
    We're using .act files to upload data from SAP BW into SAP BPC.
    This is the content of the .act file that I'm trying to upload:
    ACTUAL
    1
    1
    GCN.CZN,2621.LC_.EUR,100.5000
    GCN.CZN,2621.TC_.CZK,7050.0000
    [Screenshots of the transformation file and of the conversion files for Time, Category, Entity, Counterpart, RCCinterco, IntercoCurr, and Transcurrency were attached.]
    In case of any other info needed, please let me know.
    Thanks a lot in advance,
    Wai Yee Kong

  • How to speed up the loading of live data into a Flash file

    How can I speed up the loading of live data into a Flash file when the SWF file is 1.5 MB? The Flash file uses 20 web service connections to load the live data.

    Hello,
    I am also facing a similar problem, wherein the SWF file takes time to load the refreshed data in InfoView, i.e. after exporting the .xlf file to the BusinessObjects platform. Currently I am using Xcelsius Engage/Enterprise 2008 SP3 Windows (file name: 51038212.ZIP), version 5.3.0.0, build number 12,3,0,670. The SWF file is approximately 2 MB in size and it uses 42 Live Office connections.
    Please suggest a solution to decrease the time it takes to refresh the Live Office connections.

  • Copying data files to separate partition

    I am using Oracle 9i Release 2 with Windows XP and I would like to store the tablespace or data files on a separate raw partition. I was told that this could increase performance. How would I go about doing this?
    Thank you,
    Malina

    First, to the best of my knowledge, Windows doesn't support raw partitions. I've only ever heard of using raw partitions on Unix.
    Second, this is one of those things that if you have to ask the question, it's probably not appropriate. Raw disk partitions are significantly harder to manage than normal partitions, so if you don't have the storage administration background to deal with this added complexity, it's really not worth it.
    Third, the performance difference between raw disk and normal partitions is pretty small and affects only certain operations. The difference for most systems will be small; unless you have run out of other things to tune and really need a couple of percent boost, I wouldn't even consider this.
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC
