Moving files by date

I have Oracle installed on Linux. I want to move my trace files to a backup directory. My trace files are generated irregularly, so I need to move them by the date they were generated, keeping the recently generated files in the present directory.
So I need a command to move files by date.
Can anyone help me?
Thanks in advance.

There are several ways of doing this.
If you want to do it all yourself, use the 'find' command. There is the '-ctime' test (based on when the file's inode was last changed; often mistaken for creation time) and '-mtime' (based on the last modification time).
You might also want to look into logrotate (available on all Unix and Linux platforms).
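For example, a minimal sketch (TRACE_DIR, BACKUP_DIR and the 7-day cutoff below are assumptions; adjust them for your environment):

#!/bin/sh
# Move trace files not modified in the last 7 days to a backup
# directory; paths and the -mtime cutoff are examples only.
TRACE_DIR=/u01/app/oracle/admin/orcl/udump
BACKUP_DIR=/u01/backup/trace
find "$TRACE_DIR" -name '*.trc' -mtime +7 -exec mv {} "$BACKUP_DIR" \;

'-mtime +7' matches files last modified more than 7 days ago, so the recently generated trace files stay in the present directory.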

Similar Messages

  • ORA-03297: file contains used data beyond requested RESIZE value

    DB 10.2.0.4
    AIX 5.2
    PROD
I have dropped many big tables after moving them to another database, thinking it would free space on the disk, because right now there is no provision to extend the disk and it is already 90% occupied. The dropped tables totalled about 150 GB, but when I try to resize the datafiles I get the error below, even though the datafiles have free space (example: one file has 8000 MB free out of a total 20000 MB, and resizing it to 14000 MB still gives the error).
I tried the same for many datafiles of a particular tablespace; all of them give the same error except one.
    ERROR at line 1:
    ORA-03297: file contains used data beyond requested RESIZE value
    Thanks

This error occurs when some object has blocks "adjacent" to the High Water Mark.
    In order to RESIZE, the object needs to be identified & then moved;
    preferably to a different tablespace.
After doing so, you can RESIZE by at least a modest amount.
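As a rough sketch of how to identify such an object, you can look in DBA_EXTENTS for extents that end beyond the requested size (the file_id, the 14000 MB target and the 8 KB block size below are assumptions; substitute your own values):

sqlplus -s / as sysdba <<'EOF'
-- Segments with extents beyond the proposed resize point;
-- these must be moved before the file can shrink.
SELECT owner, segment_name, segment_type
FROM   dba_extents
WHERE  file_id = 7
AND    block_id + blocks - 1 > 14000 * 1024 * 1024 / 8192;
EOF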

• ORA-3297: file contains used data beyond requested resize value

    Hi All,
I am trying to resize a datafile and getting the ORA-3297 error.
From the dynamic views I verified that there is more than 10 GB of free space in the INDEX_TB tablespace, which contains only one datafile whose present size is > 19 GB.
alter database datafile '/data/oracle/oradata/testdb/INDEX_TB.dbf' resize 17G;
ORA-3297: file contains used data beyond requested resize value.
Oracle 10.1.0 on AIX 5.3
    Details are below,
tablespace_name: INDEX_TB
size_mb: 19797.84375
free_mb: 11947
free%: 60.34
How do I find the high water mark of the datafile?
    ~Thanks

Thank you all for the comments.
Here I have an odd situation: according to the dynamic views, 56-60% of the DATA_TB and INDEX_TB tablespaces is free.
But the SQL from Doc 130866.1 gives the output below. I tried coalescing the tablespaces, but it had no effect.
    Tablespace: DATA_TB Datafile:
    /data/oracle/oradata/testdb/DATA_TB.dbf
    Can not be resized, no free space at end of file.
    Tablespace: DATA_TB Datafile:
    /data/oracle/oradata/testdb/DATA_TB02.dbf
    Can not be resized, no free space at end of file.
    Tablespace: INDEX_TB Datafile:
    /data/oracle/oradata/testdb/INDEX_TB.dbf
    Can not be resized, no free space at end of file.
    ~Thanks
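On the question of finding the high water mark: a minimal sketch is to take the end of the last allocated extent in each file, which is the lowest size the file can be resized to (assumes an 8 KB block size; adjust the arithmetic to your db_block_size):

sqlplus -s / as sysdba <<'EOF'
-- Approximate high water mark per datafile: a file cannot be
-- resized below the end of its last allocated extent.
SELECT file_id,
       MAX(block_id + blocks - 1) * 8192 / 1024 / 1024 AS hwm_mb
FROM   dba_extents
GROUP  BY file_id
ORDER  BY file_id;
EOF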

  • Bursting entire report as a single file by using Data Template

    Hi,
I have created a report by using a data template, which consists of 3 queries.
Is it possible to burst the entire report into one single file, without splitting based on any criteria?
Something like using a constant value key so that the report splits into one file ONLY.
Please advise.
    Regards
    Muarli.

You are right in saying that I don't want to use the burst option, but the problem with FTP delivery is that I can't pass the date parameter in the file name; only a fixed single file name is provided there.
The workaround for this is to use the bursting option to pass the corresponding parameter value into the output file name.
The problem I'm facing is that, with a data template source, I can't burst in Enterprise Edition based on a SQL statement.
    Getting the below error:
    oracle.apps.xdo.servlet.scheduler.ProcessingException: java.lang.NullPointerException
         at oracle.apps.xdo.servlet.scheduler.XDOJob.runBurstingReport(XDOJob.java:2116)
         at oracle.apps.xdo.servlet.scheduler.XDOJob.execute(XDOJob.java:358)
         at org.quartz.core.JobRunShell.run(JobRunShell.java:195)
         at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:520)
    Caused by: java.lang.NullPointerException
         at oracle.apps.xdo.servlet.scheduler.XDOJob.runBurstingReport(XDOJob.java:2018)
         ... 3 more
    Bursting Definitions I used are as below.
         Bursting Node          /SCHEME_SETT_SUMMARY/LIST_G_VALUE_DATE/G_VALUE_DATE/ssc_code
         Delivery Node           /SCHEME_SETT_SUMMARY/LIST_G_VALUE_DATE/G_VALUE_DATE/ssc_code
But the same report runs, and I can schedule it successfully with FTP delivery and no bursting option.
    Please assist in resolving this issue.
    Regards
    Murali.

• How to split files when using DATA UNLOAD (External Table, 10g)?

    Hi,
I am running 10gR2 and need to export partitions using the data pump driver
via dmp files (external tables).
Now, as a requirement, the created files cannot exceed the 2 GB mark.
Some of the partitions are larger than that.
How could I split a partition so that I can create files smaller than 2 GB? Is there a parameter I am not aware of, or do I need to do SELECT COUNT(*) FROM source_table PARTITION(partiton_01);
and then work with ROWNUM?
This example works fine for all partitions smaller than 2 GB:
CREATE TABLE partiton_01_tbl
ORGANIZATION EXTERNAL
(
  TYPE ORACLE_DATAPUMP
  DEFAULT DIRECTORY def_dir1
  LOCATION ('inv_xt1.dmp')
)
PARALLEL 3
AS SELECT * FROM source_table PARTITION(partiton_01);

You could specify multiple destination files in the LOCATION parameter (the number of files should match the degree of parallelism specified). I am not aware of an option that would allow the external table to automatically add new data files as the partition size increases, so you'd likely have to do some sort of computation about the expected size of the dump file in order to figure out how many files to create.
    Justin
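For instance, a hedged variant of the example above with three dump files to match PARALLEL 3 (the extra file names are illustrative), so each file holds roughly a third of the partition:

sqlplus -s / as sysdba <<'EOF'
-- One dump file per parallel slave; rows are spread across the files.
CREATE TABLE partiton_01_tbl
ORGANIZATION EXTERNAL
(
  TYPE ORACLE_DATAPUMP
  DEFAULT DIRECTORY def_dir1
  LOCATION ('inv_xt1.dmp', 'inv_xt2.dmp', 'inv_xt3.dmp')
)
PARALLEL 3
AS SELECT * FROM source_table PARTITION (partiton_01);
EOF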

  • Reading a text file and using data to plot a graph

    Dear Friends,
    I have the following problem...
I have text data looking like this:
    100000000
    1003ff001
    1010013ff
    1000003ff
    1023fe001
    102000000
    1023ff3ff
    1010013fd
    0ff0033fc
    0ff002001
    0fe3fd3ff
    0ff000002
    100000000
    1003ff001
    1010013ff
    1000003ff
    1023fe001
    102000000
    1023ff3ff
    1010013fd
    0ff0033fc
    0ff002001
    0fe3fd3ff
    0ff000002
My code should have a button; when you click that button, it should ask you to choose a txt file to open from a specific place. The indicators on the front panel should show the ID, Name and Date. In the data:
    100000000
    1003ff001
    1010013ff
    1000003ff
    1023fe001
    102000000
    1023ff3ff
    1010013fd
the first 3 hex digits belong to the X axis, the next 3 to the Y axis, and the last 3 to the Z axis. I have to convert those values to the decimal system and plot them on a graph 10 times a second, which means that after converting and plotting 10 values, one second is over.
I would appreciate it very much if you could help me in this case.
    Thanks in advance.

    Some questions a teacher might ask:
    What's the display setting for the "\n" string diagram constant? Why?
    Is it better to read the file one line at a time or all at once?
    What would you do to be able to stop the chart display before it runs out of data?
    What would be different if we combine the two FOR loops into one?
    How would you display the header as a multiline string instead of an array of strings?
    Why is the scan format %03x instead of %3x? Would it make a difference?
    What is the purpose of the two items of the property node?
    What would you need to change to plot the data every 50ms? (change in two places!)
    Why is the error output of the property node wired to the first FOR loop? (After all, the error value is not really used anywhere there!)
    Instead of the two "array subset" operations, we could do it with one icon. Which one? (two possibilities!)
    So... be prepared!
LabVIEW Champion. Do more with less code and in less time.
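Not LabVIEW, but as a language-neutral sketch of just the parsing step: each 9-character line splits into three 3-hex-digit fields that are converted to decimal (the file name data.txt is an assumption):

# Split each 9-hex-digit line into X/Y/Z fields and print the decimals.
while read -r line; do
  x=$((16#${line:0:3})); y=$((16#${line:3:3})); z=$((16#${line:6:3}))
  printf '%d %d %d\n' "$x" "$y" "$z"
done < data.txt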

  • How do you open recovered files after using data rescue II?

I installed Data Rescue II because I had to restart my hard drive and my information did not transfer properly to the external hard drive. I'm pretty sure Data Rescue recovered all of my files, but the manual and help windows will not scroll down, so I have no idea how to get my files back onto my computer. Please tell me how to use this program and get everything back to normal.

    What is your Reader version?  Is that all you get: "about ASCII filter"?

  • Dmp file created using data pump

    Hi All,
I have created a dmp file using the Oracle Data Pump utility and noticed that even after deleting the dmp file from the hard disk, no storage is released. Is there any specific way to get the space freed from the disk?
    Best Regards,
    Abida

    986655 wrote:
In other words:
    - there are physical files on the file system
    - you delete these files
    - file system does not show an increase in free space
    Just how is this an Oracle issue, never mind a SQL and PL/SQL language issue?
    Oracle physical (datapump) files are no different than Excel spreadsheet files, are no different than mp4 movie files, are no different from any other file...
If deleting a physical file from a file system does not release the allocated space and increase the file system's free space, then you need to ask that question of the file system. Not Oracle. Not SQL. Not PL/SQL.
What is the o/s and what is the file system? Then use that, and your issue, as criteria for a stfw exercise.
And yes, this is not an uncommon issue on Unix-based file systems, as an in-use file can be "deleted" - which means the file entry is removed from the file catalog, but the inodes containing the file's contents are only wiped (and the space released) when all open file handles for that file are closed.
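A hedged sketch of checking for that case on Linux (lsof output varies between versions; the PID and fd numbers are examples only):

# List files that are deleted but still held open (link count 0).
lsof +L1

# Restarting the owning process releases the space. On Linux you can
# also truncate the deleted file through its open descriptor, e.g.
# for PID 1234, fd 5:
# : > /proc/1234/fd/5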

• Issue in moving files from data folder to processed folder in background

    Hi All,
I am facing an issue in moving files from a data folder to a processed folder in case of background execution.
When I execute the program in the foreground, I can move the file from the data folder to the processed folder. I am using the SXPG_COMMAND_EXECUTE FM to move the file, and I can see the file in the processed folder once the program is executed.
But when executing the same program in the background, it gives the error "Failed to move the file to processed folder" in the spool of SM37, and I can see the file still lying in the data folder.
I checked other programs that access the same folder, and they are able to move the file to the processed folder successfully in both foreground and background mode.
    Please help me in resolving this issue.
    Thanks,
    Deepa

Hi Sanu,
Please use the following code to move the file from the source folder to the target folder. It shows how to build and execute the UNIX move command from ABAP (the same pattern works for cp, the UNIX copy command).

PARAMETERS:
* Input file path
  p_input TYPE localfile,
* Processed file path
  p_proc  TYPE localfile.

* Types for holding the output of the OS command
TYPES: BEGIN OF l_x_output,
         sys(200),
       END OF l_x_output.

* Internal table to store the command output
DATA: l_i_output TYPE STANDARD TABLE OF l_x_output WITH HEADER LINE.

* Variable for the UNIX command
DATA: l_v_unix_comm(255) TYPE c.

* Build the move command of UNIX, for example
* mv '/data/interfaces/input/input_file' '/data/interfaces/processed/processed_file'
* (mv moves/renames a file; cp copies a file)
CONCATENATE 'mv' p_input p_proc
  INTO l_v_unix_comm SEPARATED BY space.

* Execute the UNIX command; this moves the file from the input
* path to the processed path
CALL 'SYSTEM' ID 'COMMAND' FIELD l_v_unix_comm
              ID 'TAB'     FIELD l_i_output-sys.

IF sy-subrc EQ 0.
  WRITE: 'File moved successfully using a UNIX command in ABAP'.
ENDIF.

• I moved files off my computer to an external hard drive; now Time Machine won't back up like it used to. It says there is not enough room.

    My hard drive was completely full (mostly with pictures) so I moved several years worth of photos from the hard drive to an external hard drive I regularly use with the laptop. (Basically my master copies of all my photos back to 1997)  I have always had Time Machine set up to back up both the laptop AND that external drive.  It's always worked seamlessly until I moved those files.  Time Machine has been full for a while now and regularly gives me a message that the oldest files will be deleted.  That's fine. 
I don't understand why moving files from the laptop to the external hd would now cause the following message (not an exact quote): Time Capsule cannot complete backup.  You need 233 gb to backup and time capsule only has 107 gb available. If time capsule was already set to back up both the laptop and the external hd, why would moving those photos from one hd to the other cause a space issue?  It should equal out.   And now when I look at the files on the time capsule, it only shows me a recent backup and a partial incomplete backup.  None of the older backups are visible anymore. I have not deleted anything directly from the time capsule so I'm thinking somehow the index got damaged.  How can I fix this?  The error message says to change the settings but I really don't know what settings to change.  Help!!

It does not immediately delete the files that are missing; it merely tries to back up the new files it has discovered on the external drive. Since you moved the files there, the backup will for a while contain both copies, and that is why it doesn't have enough space to back up.
Fixing the problem: Pondini is the expert. You should be able to delete the backup of the photos from the existing backups; there are instructions for doing it, but it is a very poor way to do things.
Much better: archive off the existing backup. This is long and slow but worth it; you need a USB drive of the same size as the TC drive. Then erase the TC and start a fresh set of backups.
The alternative is to use the USB drive as a new target, but it is much slower than the TC internal drive.
    http://pondini.org/TM/12.html

• Trying to save data with a Write To Measurement File VI on an NI PXI-1042Q in real-time mode: it is not working, but when I run it without deploying to the PXI, it saves to the file

Hi, I am trying to save data with a Write To Measurement File VI using an NI PXI-1042Q and a DAQ NI PXI-6229 in real-time mode, but it is not working; when I run it without deploying it to the PXI, it saves to the file. Please find my VI attached.
    Attachments:
PWMs.vi 130 KB

The other problem is that the DAQmx channel only works in real-time mode, not in a stand-alone VI, using LabVIEW 8.2 and Real-Time 8.2.

  • How to get the most current file based on date and time stamp using SSIS?

    Hello,
Let us assume that files get copied into a specific directory. We need to pick up a file and load its data. Can you guys let me know how to get the most current file based on date and time stamp using SSIS?
    Thanks
    thx regards dinesh vv

Hi Simon,
I executed this script and it is giving an error:
/*
   Microsoft SQL Server Integration Services Script Task
   Write scripts using Microsoft Visual C# 2008.
   The ScriptMain is the entry point class of the script.
*/
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;

namespace ST_9a6d985a04b249c2addd766b58fee890.csproj
{
    [System.AddIn.AddIn("ScriptMain", Version = "1.0", Publisher = "", Description = "")]
    public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
    {
        #region VSTA generated code
        enum ScriptResults
        {
            Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
            Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
        };
        #endregion

        /*
           The execution engine calls this method when the task executes.
           To access the object model, use the Dts property. Connections, variables,
           events, and logging features are available as members of the Dts property.
           To reference a variable, call Dts.Variables["MyCaseSensitiveVariableName"].Value;
           to post a log entry, call Dts.Log("This is my log text", 999, null).
           Before returning from this method, set the value of Dts.TaskResult to
           indicate success or failure.
        */
        public void Main()
        {
            // Walk the folder named by User::FolderName and remember the file
            // with the most recent creation time in User::LastFile.
            string[] files = System.IO.Directory.GetFiles(Dts.Variables["User::FolderName"].Value.ToString());
            System.IO.FileInfo finf;
            DateTime currentDate = new DateTime();
            string lastFile = string.Empty;

            foreach (string f in files)
            {
                finf = new System.IO.FileInfo(f);
                if (finf.CreationTime >= currentDate)
                {
                    currentDate = finf.CreationTime;
                    lastFile = f;
                }
            }

            Dts.Variables["User::LastFile"].Value = lastFile;
            Dts.TaskResult = (int)ScriptResults.Success;
        }
    }
}
    thx regards dinesh vv

• Regarding reading the data from files without using Streams

Hi to all of you,
I have a problem where I have to read data from files without using any streams.
Please guide me on how to do this, if possible with an example.
Thanks & Regards
    M.Ramakrishna

Simply put, you can't.
But why do you need to?

  • Export table data in a flat file without using FL

    Hi,
I am looking for options to export table data into a flat file without using FL (File Layout), i.e., by using an App Engine only.
Please share your experience if you have done anything like this.
    Thanks

A simple way to export any record (table/view) to a CSV file is to create a rowset and loop through all record fields, as in the example code below:
    Local Rowset &RS;
    Local Record &Rec;
    Local File &MYFILE;
    Local string &FileName, &strRecName, &Line, &Seperator, &Value;
    Local number &numRow, &numField;
    &FileName = "c:\temp\test.csv";
    &strRecName = "PSOPRDEFN";
    &Seperator = ";";
    &RS = CreateRowset(@("Record." | &strRecName));
    &RS.Fill();
    &MYFILE = GetFile(&FileName, "W", %FilePath_Absolute);
    If &MYFILE.IsOpen Then
       For &numRow = 1 To &RS.ActiveRowCount
          &Rec = &RS(&numRow).GetRecord(@("RECORD." | &strRecName));
          For &numField = 1 To &Rec.FieldCount
             &Value = String(&Rec.GetField(&numField).Value);
             If &numField = 1 Then
                &Line = &Value;
             Else
                &Line = &Line | &Seperator | &Value;
             End-If;
          End-For;
          &MYFILE.WriteLine(&Line);
       End-For;
    End-If;
&MYFILE.Close();
You can of course create an application class for calling this piece of code generically.
    Hope it helps.
    Note:
    Do not come complaining to me on performance issues ;)

  • How to find out each Cell having Data or Not in Excel File by Using WDJ

    Hi Friends,
    I have one doubt on WDJ.
I have to upload an Excel file. On clicking the Upload button, the Excel file's data moves to one BAPI. This much I have done. But my requirement is: if there is any empty cell in the Excel file, the file should not be uploaded and an error message "Please upload a correct Excel file" should be displayed.
How can I find out whether each cell in the Excel file has data or not by using WDJ? Please tell me.
I am using this code to upload the Excel file:
InputStream text = null;
int temp = 0;
try
{
    File file = new File(wdContext.currentContextElement().getResource().getResourceName().toString());
    FileOutputStream op = new FileOutputStream(file);
    if (wdContext.currentContextElement().getResource() != null)
    {
        text = wdContext.currentContextElement().getResource().read(false);
        // Copy the uploaded resource byte by byte into the local file.
        while ((temp = text.read()) != -1)
        {
            op.write(temp);
        }
        op.flush();
        op.close();
        path = file.getAbsolutePath();
    }
}
catch (Exception ex)
{
    ex.printStackTrace();
}
But my requirement is: if the Excel file has any empty cell, that file should not be uploaded. How do I do this?
    Regards
    Vijay Kalluri

Hi my friend,
I would like to share some Apache POI APIs that I use when I have to read Excel files in Web Dynpro.
JAR = poi-3.2-FINAL-20081019.jar
Some example (a sketch: the workbook, sheet and row have to be opened before reading a cell, and checking each cell for null/empty is what lets you reject a file with empty cells; 'path' is the uploaded file's path from your code):

POIFSFileSystem fs;
HSSFWorkbook wb;
HSSFSheet sheet;
HSSFCell cell;
String myMexican_ValueFromExcelis = "";
try {
    fs = new POIFSFileSystem(new FileInputStream(path));
    wb = new HSSFWorkbook(fs);
    sheet = wb.getSheetAt(0);
    // select the cell "y" (here: first cell of the first row)
    cell = sheet.getRow(0).getCell(0);
    // a null or empty cell is how you detect an incomplete file
    if (cell != null && cell.toString().trim().length() > 0) {
        myMexican_ValueFromExcelis = cell.toString();
    }
} catch (Exception e) {
    e.printStackTrace();
}
Regards
