Export dump file and log file names with sysdate in a script

Hi to All,
Can anybody help me get the logical backup (export) dump file name and log file name to contain sysdate, so that each run can be uniquely identified?
Regards
DXB_DBA

On Windows it gets a bit hairy, as there really is no clean and nice way of doing it. There are a couple of options.
1. If you can rely on the date format not changing, you can use a static substring expression. For example, the following might work with a Finnish locale: echo %date:~3,2%%date:~6,2%%date:~9,4%. Similarly, when you know the date format you can tokenize the output of 'date /t' and discard the tokens you don't want.
2. You can set the date format to your liking and then just use %date% in your script.
3. You can run a "SELECT to_char(sysdate,..." into a file and then read that file in your script.
4. Simon Sheppard also has a solution you could use as a basis. I have a slight issue with the approach, but that could just be me.
5. Use gnuwin32 or similar ;)
Also note that the %date% env var is set automatically from w2k onwards, so some of the solutions might not work with older versions. A small sketch of option 3 follows below.
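For reference, a minimal sketch of option 3 in a batch script (the connect string, schema and paths are placeholders, not from the original question): ask the database for the date so the stamp never depends on the Windows locale, then use it in the dump and log file names.
@echo off
rem Build a tiny helper script that prints the date stamp (option 3 above).
echo set pages 0 heading off feedback off > getdate.sql
echo select to_char(sysdate,'YYYYMMDD') from dual; >> getdate.sql
echo exit >> getdate.sql
rem Capture the stamp and use it in the export file names.
for /f %%d in ('sqlplus -s scott/tiger@orcl @getdate.sql') do set STAMP=%%d
exp scott/tiger@orcl owner=scott file=D:\exports\exp_scott_%STAMP%.dmp log=D:\exports\exp_scott_%STAMP%.log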

Similar Messages

  • System Center 2012 R2 install: SQL server Data file and log file

    This might be a dumb question, but I can't find the answer anywhere.
    I'm installing a new instance of System Center 2012 R2 on a new server, and I'm stuck on the SQL Server data file section. Every time I put in a path, it says that the path does not exist. Am I supposed to be creating some sort of SQL Server data file and log file before this installation? I didn't get this prompt when installing System Center 2012 SP1 or when I upgraded from System Center 2012 SP1 to System Center 2012 R2.
    My SQL is on a different server.
    Thank you in advance.

    Have you reviewed the setup.log?
    On a side note, why would you put the database file on the same drive as the OS? That defeats the whole purpose of having a remote SQL Server. Why use a remote SQL Server in the first place?
    Jason | http://blog.configmgrftw.com

  • Sql server data file and log file

    Hello experts,
    What is the best way to store data files and log files in a two-node cluster environment? I have an active/passive cluster with Windows Server 2008 R2 Enterprise and SQL Server 2008 R2. I am new to the environment and I noticed that all system and user databases, including their data and log files, are stored on one drive. Just curious, what is the best practice in this kind of scenario? Thank you as always for your help.

    Make sure you have a valid, tested backup strategy for both system and user databases.
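    To see where every database's data and log files currently live before deciding what to move, a standard catalog query (nothing environment-specific assumed) is:
    SELECT DB_NAME(database_id) AS database_name,
           type_desc,
           physical_name
    FROM sys.master_files
    ORDER BY database_name, type_desc;
    As a general rule of thumb, data and log files go on separate drives so that a single disk failure or I/O bottleneck does not hit both at once.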
    Best Regards, Uri Dimant SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/

  • How to design SQL server data file and log file growth

    How should I design SQL DB data file and log file growth on SQL Server 2012?
    If my data file is 10 GB in size and my log file is 5 GB in size,
    what should the autogrowth size be in MB (not in %)? Based on what should we determine the ideal file autogrowth size?

    It's very difficult to give a definitive answer on this. The best principle is to size your database correctly in advance so that you never have to autogrow; of course, in reality that isn't always practical.
    The setting you use is really dictated by the expected growth in your files. Given that the size is relatively small, why not set it to 1 GB on the data file(s) and 512 MB on the log file? The important thing is to monitor it on an ongoing basis to see if that's the appropriate amount; see the ALTER DATABASE sketch below.
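    A minimal sketch of setting fixed-size autogrowth in MB rather than percent (the database and logical file names below are placeholders, not from your environment):
    ALTER DATABASE [MyDb] MODIFY FILE (NAME = MyDb_data, FILEGROWTH = 1024MB);
    ALTER DATABASE [MyDb] MODIFY FILE (NAME = MyDb_log,  FILEGROWTH = 512MB);
    The logical file names can be looked up in sys.database_files for the database in question.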
    One thing you should do is enable instant file initialization by granting the service account the Perform Volume Maintenance Tasks right in group policy. This will allow the data files to grow quickly when required; details here:
    https://technet.microsoft.com/en-us/library/ms175935%28v=sql.105%29.aspx?f=255&MSPPError=-2147217396
    Also, it is possible to query the default trace to find autogrowth events; if you wanted, you could write an alert/SQL job based on this:
    -- Autogrowth events recorded in the default trace (EventClass 92 = data file grow, 93 = log file grow)
    SELECT
        [DatabaseName],
        [FileName],
        [SPID],
        [Duration],
        [StartTime],
        [EndTime],
        CASE [EventClass]
            WHEN 92 THEN 'Data'
            WHEN 93 THEN 'Log'
        END AS [FileType]
    FROM sys.fn_trace_gettable('c:\path\to\trace.trc', DEFAULT)
    WHERE EventClass IN (92, 93)
    hope that helps

  • CREATE DATABASE with data file and log file in query pane

    Hi everyone, 
    After I ran the below code I got the following error message. Can someone help me fix this?
    Thanks
    CREATE DATABASE project
    ON
    ( NAME = 'project_dat',
      FILENAME = 'C:\project.mdf',
      SIZE = 10,
      MAXSIZE = 100,
      FILEGROWTH = 5 )
    LOG ON
    ( NAME = 'project_log',
      FILENAME = 'C:\project.ldf',
      SIZE = 40,
      MAXSIZE = 100,
      FILEGROWTH = 10 );
    Msg 5123, Level 16, State 1, Line 1
    CREATE FILE encountered operating system error 5(Access is denied.) while attempting to open or create the physical file 'C:\project.mdf'.
    Msg 1802, Level 16, State 4, Line 1
    CREATE DATABASE failed. Some file names listed could not be created. Check related errors.
    skilo

    Hello,
    Please go through the steps below, from the support site:
    Note: The instance of SQL Server Enterprise Manager that is included with SQL Server 7.0 does not support setting the default data directory and the default log directory. However, you can register your instance of SQL Server 7.0 in the instance of SQL Server Enterprise Manager that is included with SQL Server 2000, and then follow these steps to set the default data directory and the default log directory for your instance of SQL Server 7.0.
    1. Click Start, point to Programs, point to Microsoft SQL Server, and then click Enterprise Manager.
    2. In SQL Server Enterprise Manager, right-click your instance of SQL Server, and then click Properties.
    3. In the SQL Server Properties (Configure) - <Instance Name> dialog box, click the Database Settings tab.
    4. In the New database default location section, type a valid folder path in the Default data directory box and in the Default log directory box.
    5. Click OK.
    6. Stop your instance of SQL Server, and then restart your instance of SQL Server.
    Ahsan Kabir Please remember to click Mark as Answer and Vote as Helpful on posts that help you. This can be beneficial to other community members reading the thread. http://www.aktechforum.blogspot.com/
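    For the original error: Msg 5123 with operating system error 5 (Access is denied) usually means the SQL Server service account is not allowed to create files in the root of C:\. Pointing FILENAME at a folder the service account can write to, such as the instance's DATA directory, normally avoids it (the path below is an assumption; substitute your instance's actual data folder):
    CREATE DATABASE project
    ON
    ( NAME = 'project_dat',
      FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL\DATA\project.mdf',
      SIZE = 10, MAXSIZE = 100, FILEGROWTH = 5 )
    LOG ON
    ( NAME = 'project_log',
      FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL\DATA\project.ldf',
      SIZE = 40, MAXSIZE = 100, FILEGROWTH = 10 );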

  • Shell Script to grep Job File name and Log File name from crontab -l

    Hello,
    I am new to shell scripting. I need to write a shell script where I can grep the name of the job file, i.e. the .sh file, and the log file from crontab -l.
    #51 18 * * * /home/oracle/refresh/refresh_ug634.sh > /home/oracle/refresh/refresh_ug634.sh.log 2>&1
    #40 17 * * * /home/oracle/refresh/refresh_ux634.sh > /home/oracle/refresh/refresh_ux634.sh.log 2>&1
    In crontab -l there are many jobs; I need to grep the job name, like 'refresh_ug634.sh', and the corresponding log name, like 'refresh_ug634.sh.log'.
    I am thinking of making a universal script that can grep the job name, log name and hostname for one server.
    Then, suppose I modify the refresh_ug634.sh script: it would call that universal script and echo those values when the script gets executed.
    Please can anyone help.
    All I need to do is have a footer in all the scripts running in crontab on one server:
    job file name
    log file name
    hostname
    Please suggest a better solution if there is one. Thanks.

    957704 wrote:
    I need help how to grep that information from crontab -l
    Please can you provide some insight how to grep that shell script name from list of crontab -l jobs
    crontab -l > cron.log -- exporting the contents to a file
    cat cron.log | grep something -- need some commands to grep that info
    You are missing the point. This forum is for discussion of SQL and PL/SQL questions. What does your question have to do with SQL or PL/SQL?
    It's like you just walked into a hardware store and asked where they keep the fresh produce.
    I will point out one thing about your question. You are assuming every entry in the crontab has exactly the same format. Consider this crontab:
    #=========================================================================
    # NOTE:  If this is on a clustered environment, all changes to this crontab
    #         must be replicated on all other nodes of the cluster!
    # minute        (0 thru 59)
    # hour          (0 thru 23)
    # day-of-month  (1 thru 31)
    # month         (1 thru 12)
    # weekday       (0 thru 6, sunday thru saturday)
    # command
    #=========================================================================
    00 01 1-2 * 1,3,5,7 /u01/scripts/myscript01  5 orcl  dev
    00 04 * * * /u01/scripts/myscript02 hr 365 >/u01/logs/myscript2.lis
    00 6 * * * /u01/scripts/myscript03  >/u01/logs/myscript3.lis
    The variations are endless.
    When you get to an appropriate forum (this one is not it), it will be helpful to explain your business requirement, not just your proposed technical solution.
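    That said, for the simple 'script > log' layout shown in the original post, a minimal sketch might look like this (a sketch only; it assumes every relevant entry redirects output with '>', so entries in other formats, like the crontab above, need extra handling):
    #!/bin/sh
    # List crontab entries, skip comments, and pull out the script path
    # (the first field after the 5 schedule fields that ends in .sh) and
    # the path that follows the '>' redirection.
    crontab -l | grep -v '^#' | while read -r line; do
      job=$(echo "$line" | awk '{ for (i = 6; i <= NF; i++) if ($i ~ /\.sh$/) { print $i; exit } }')
      log=$(echo "$line" | awk -F'>' '{ print $2 }' | awk '{ print $1 }')
      [ -n "$job" ] && echo "host=$(hostname) job=$job log=$log"
    done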

  • EXPDP generates new dmp file and reports "file already exists" error

    Hello everyone,
    Hope you all had a wonderful holiday. I got some problems with datapump expdp 10.2.0.4. It would be appreciated if you could provide some advice. Thanks in advance.
    I newly created a 10.2.0.4 database. The database can startup and be connected via Toad without problem. I can also use impdp to import some data to the new database. But when I'm trying to use expdp to export a schema from the database, I got the following errors:
    expdp parfile=expdp_scott_mfp1.par
    Export: Release 10.2.0.4.0 - 64bit Production on Monday, 26 December, 2011 22:10:49
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, Real Application Clusters, Data Mining and Real Application Testing options
    ORA-39001: invalid argument value
    ORA-39000: bad dump file specification
    ORA-31641: unable to create dump file "/u02/exports/mfp1/expdp_scott_mfp1_12262011.dmp"
    ORA-27038: created file already exists
    Additional information: 1
    Every time I run the expdp, it just creates the dmp file (expdp_scott_mfp1_12262011.dmp) specified in the parfile under the EXPORT directory and reports "file already exists " error.
    Your advice is highly appreciated.
    Thanks.
    Edited by: 904668 on Dec 27, 2011 8:47 AM

    I think I found the problem: I used the same file name for the dump file and the log file. How stupid of me. Sorry for bothering you. Thanks, and happy new year!
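    For anyone hitting the same thing, a sketch of a corrected parfile (the directory object name and schema below are assumptions, not from the original post); the point is simply that DUMPFILE and LOGFILE must not resolve to the same file:
    DIRECTORY=EXPORT_DIR
    SCHEMAS=SCOTT
    DUMPFILE=expdp_scott_mfp1_12262011.dmp
    LOGFILE=expdp_scott_mfp1_12262011.log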

  • Process Flow ignores name and location for Control- and Log-Files

    Hi!
    Our OWB Version is 10.1.0.3.0 - DB Version 9.2.0.7.0 - OWF Version 2.6.2
    Clients and server are running on Windows. Database contains target schemas as well as OWB Design and Runtime, plus OWF repositories. The source files to load reside on the same server as the database.
    I have for example a SQL*Loader Mapping MAP_TEXT which loads one flat file "text.dat" into a table stg_text.
    The mapping MAP_TEXT is well configured and runs perfectly, i.e. the control file "text.ctl" is generated to location LOC_CTL, the flat file "text.dat" is read from another location LOC_DATA, the bad file "text.bad" is written to LOC_BAD and the log file "text.log" is placed into LOC_LOG. All locations are registered in the runtime repository.
    When I integrate this mapping into a Workflow process PF_TEXT, only LOC_DATA and LOC_BAD are used. After deploying PF_TEXT, I execute it and find that the control and log files are placed into the directory <OWB_HOME>\owb\temp and get generic names <Mapping Name>.ctl and <Mapping Name>.log (in this case MAP_TEXT.ctl and MAP_TEXT.log).
    How can I get OWB to execute the process flow using the locations configured for the mapping inside it?
    Does anyone have a helpful idea?
    Thx,
    Johann.

    I didn't expect to be the only one to encounter this misbehaviour of OWB.
    Meanwhile I found out what the problem is and had to accept that it is what it is!
    There is no solution for it until the Paris release.
    Bug no. 3099551 on Oracle MetaLink addresses this issue.
    Regards,
    Johann Lodina.

  • Dynamic bad and Log file names!!

    Hi
    I have a scenario where the data files are passed as a parameter to the mapping, and the mapping has a pre-mapping procedure which changes the file name in the external table definition. This works fine. The further issue I have is that I would like to change the bad file, log file and discard file names through this procedure as well. In other words, for each incoming data file I would like to create a bad, log and discard file. So how do I accomplish this in the present procedure, which does the following ALTER TABLE statement:
    execute immediate 'alter table '||p_table_name||' location('''
    ||p_file_name||''')';
    Any suggestion would be greatly appreciated
    Thanks
    Balaji
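    One possible direction for the question above (a sketch only; the directory name EXT_DIR and the record/field definitions are assumptions, and since replacing ACCESS PARAMETERS restates the whole clause, the table's real definitions must be repeated):
    ALTER TABLE stg_example ACCESS PARAMETERS (
      RECORDS DELIMITED BY NEWLINE
      BADFILE     EXT_DIR:'incoming_20111226.bad'
      LOGFILE     EXT_DIR:'incoming_20111226.log'
      DISCARDFILE EXT_DIR:'incoming_20111226.dsc'
      FIELDS TERMINATED BY ','
    );
    In the pre-mapping procedure this statement would be built from p_file_name and run with EXECUTE IMMEDIATE, just like the existing LOCATION change.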

  • Tmp file locations and Log file locations

     

    I have been having a real headache too trying to get WebLogic to put all its
    log files and temporary files in directories that I specify. It seems that
    WebLogic has a mind of its own as files get created all over the place.
    Trying to configure these really basic settings has proved extremely
    awkward. Why is it such a nightmare to do?
    "Scott Jones" <[email protected]> wrote in message
    news:3af0179d$[email protected]..
    OK, I changed the relative path for the log files.
    1. I am still getting app-startip.log
    app0000.tlog
    in the root directory and not in the ./logs directory. Any other
    settings?
    2. I still do not know how to redirect the tmp_ejbdomain.port directory.
    Any suggestions?
    Scott
    "Sanjeev Chopra" <[email protected]> wrote in message
    news:3aef0a42$[email protected]..
    "Scott Jones" <[email protected]> wrote in message
    news:3aef05be$[email protected]..
    I have a domain configured and running with two applications. WLS 6 is placing the following logs for each application at the same directory level as the config directory. It is also creating the tmp_ejb directory at the same level.
    1. How do I tell WLS 6 to place log files in a different directory?
    In the Admin Console: modify the property Server -> Configuration -> Logging -> FileName.
    In config.xml: the 'FileName' attribute can be set to an absolute path OR a path relative to Server.RootDirectory:
    <Server EnabledForDomainLog="true" ListenAddress="localhost"
            ListenPort="7701" Name="managed"
            StdoutDebugEnabled="true" ThreadPoolSize="15">
        <Log FileCount="10" FileMinSize="50" FileName="managed.log"
             Name="managed" NumberOfFilesLimited="true"
             RotationType="bySize"/>
    </Server>
    2. How do I tell WLS 6 to place tmp_ejb directories in a different directory?
    Thanks,
    Scott

  • Steps to move Data and Log file for clustered SQL Server

    Hi guys,
    We have an active/passive SQL 2008 R2 cluster environment.
    I am looking for steps to move the data and log files of user databases and system databases for a SQL Server clustered instance.
    Currently the data and log files reside on the same drive for both user and system databases.
    Thanks
    Please Mark As Answer if it is helpful. \\Aim To Inspire Rather to Teach A.Shah

    Try the below link
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/468de435-3432-45c2-a50b-23519cd2686e/moving-the-system-databases-in-a-sql-cluster?forum=sqldisasterrecovery
    -Prashanth
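    For a user database, the usual pattern is roughly the sketch below (the database name, logical file name and path are placeholders; on a clustered instance the target drive must already be a dependency of the SQL Server resource, and system databases need the extra steps described in the link above):
    ALTER DATABASE SalesDB MODIFY FILE
        (NAME = SalesDB_log, FILENAME = N'L:\SQLLogs\SalesDB_log.ldf');
    ALTER DATABASE SalesDB SET OFFLINE WITH ROLLBACK IMMEDIATE;
    -- copy/move the physical .ldf file to L:\SQLLogs\ at the OS level, then:
    ALTER DATABASE SalesDB SET ONLINE;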

  • Printing messages in Log File and Output File using Dbms_output.put_line

    Hi,
    I have a requirement to print messages to the log file and output file using dbms_output.put_line instead of the fnd_file.put_line API.
    Please let me know how I can achieve this.
    I tried using a function to print messages and calling that function in my main package wherever there is fnd_file.put_line, but this approach is not what the business wants.
    So please let me know how I can achieve this functionality.
    Regards
    Sandy

    What is the requirement that doesn't allow you using fnd_file.put_line?
    Please see the following links.
    https://forums.oracle.com/forums/search.jspa?threadID=&q=Dbms_output.put_line+AND+Log+AND+messages&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    https://forums.oracle.com/forums/search.jspa?threadID=&q=%22dbms_output.put_line+%22+AND+concurrent&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    Thanks,
    Hussein

  • JWS : Properties and Log file locations?

    We have a swing app that used log4j to log to a file. It also uses Apache commons PropertyConfiguration to allow the user to setup and save properties to a file.
    This all works great when running the app locally. The app creates the log file and properties file in the folder where the user launches the app from.
    Now that I'm trying to launch the app via JWS we should have the log file and properties file in some "known" location so that it doesn't matter if they launch the app from a browser, or from a desktop icon, or from a menu - the app will always look for the file in the same location.
    Any thoughts on where this location should be? For now we are just releasing to Windows users but eventually we will need to be able to launch on Mac and Linux as well. Is there some convention as to where such files should reside when using JWS?
    Thanks in advance!

    If it is signed, it can use a subdirectory of System.getProperty("user.home").
    You may want to look at PersistenceService.get(...).getXXXStream() that may overcome sandbox restrictions.
    Edited by: baftos on May 3, 2012 1:16 PM
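    A minimal sketch of the user.home suggestion above (the ".myapp" folder name is an assumption): resolve a per-user directory once and point both the log4j file appender and the commons-configuration properties file at it, so the location is the same regardless of how the app is launched.
    import java.io.File;

    public class AppDirs {
        /** Per-user application directory, created on first use. */
        public static File appDir() {
            File dir = new File(System.getProperty("user.home"), ".myapp");
            dir.mkdirs();
            return dir;
        }

        public static void main(String[] args) {
            System.out.println(new File(appDir(), "app.log"));        // log4j file appender target
            System.out.println(new File(appDir(), "app.properties")); // commons-configuration file
        }
    }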

  • Question regarding alert log file and trace files

    What should the alert log file size be, and when should it be deleted? For how many days should user trace files be kept?
    Also, will someone please tell me the importance of these files?
    Thanks

    This may help: http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14231/manproc.htm#sthref729
    There are a few discussions on it here:
    Re: Alert Log File
    alert log file contents viewing
    Re: how to read alert log file? is there any tool available?

  • Diff between Log files and Trace files

    Hi,
    What are the exact differences between log files and trace files?
    Relevant informative URLs/reference materials would be highly appreciated.
    Thanks
    Sekhar

    Hi,
    Go through these,
    http://help.sap.com/saphelp_nw04/helpdata/en/d1/7b1e40777cdd5fe10000000a155106/frameset.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/0bf8c890-0201-0010-8b8c-fc93fa1f4795
    Hope it clarifies
    Regards
    Srinivasan T
