Working Linux command to grep date range in a log file

Linux Gurus,
Could you please help me with a command to show only the lines in a log file that fall within a given date range, probably using grep?
Our server logs use timestamps in the following format: <Jun 23, 2013 12:45:02 AM UTC>
Regards,
Varun

Perhaps you can do the following:
Go to Google.
Type "working Linux command to grep date range in a log file"
See what results you might get from that search. (I did, and got more than 600,000 search results.)
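That said, for the record, here is one way to do it. A minimal sketch, assuming each relevant line begins with the timestamp exactly as shown (adjust the leading match if your lines carry a prefix before the "<"), and with server.log standing in for the actual log file:
START=20130620   # inclusive, YYYYMMDD
END=20130625     # inclusive, YYYYMMDD
awk -v start="$START" -v end="$END" '
BEGIN {
    n = split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m, " ")
    for (i = 1; i <= n; i++) mon[m[i]] = sprintf("%02d", i)
}
/^</ {
    # "<Jun 23, 2013 12:45:02 AM UTC> message" splits into $1="<Jun", $2="23,", $3="2013"
    month = substr($1, 2)
    day = $2; sub(/,/, "", day)
    key = $3 mon[month] sprintf("%02d", day)   # e.g. 20130623
    if (key >= start && key <= end) print
}' server.log
Lines without a leading timestamp (stack traces, wrapped messages) are skipped by this sketch; to keep them, remember the last computed key and print any line while that key is in range.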

Similar Messages

  • Got 'bea.jolt.ServiceException: Data conversion failed; check log file'

    Got "CHECK APPSERVER LOGS. THE SITE BOOTED WITH INTERNAL DEFAULT SETTINGS, BECAUSE OF: bea.jolt.ServiceException: Data conversion failed; check log file"
    when I started PIA and the login screen appeared.

    What version of PeopleTools are you using?  Have you looked at E-AS: Error "Cannot find or open field table. Maybe FIELDTBLS32 is not set properly" (Doc ID 660607.1) yet?

  • How to see data for particular date from a alert log file

    Hi Experts,
    I would like to know how I can see data for a particular date from alert_db.log in a Unix environment. I'm using Oracle 9i on Unix.
    Right now I'm using tail -500 alert_db.log > alert.txt and then viewing the whole thing, but is there an easier way to see a particular date or time?
    Thanks
    Shaan

    Hi Jaffar,
    Here I have to pass the exact date and time. Is there any way to see records for, let's say, Nov 23 2007? Because when I used this:
    tail -500 alert_sid.log | grep " Nov 23 2007" > alert_date.txt
    it's not working. Here is the sample log file:
    Mon Nov 26 21:42:43 2007
    Thread 1 advanced to log sequence 138
    Current log# 3 seq# 138 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo3.log
    Mon Nov 26 21:42:43 2007
    ARCH: Evaluating archive log 1 thread 1 sequence 137
    Mon Nov 26 21:42:43 2007
    ARC1: Evaluating archive log 1 thread 1 sequence 137
    ARC1: Unable to archive log 1 thread 1 sequence 137
    Log actively being archived by another process
    Mon Nov 26 21:42:43 2007
    ARCH: Beginning to archive log 1 thread 1 sequence 137
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_137.dbf'
    ARCH: Completed archiving log 1 thread 1 sequence 137
    Mon Nov 26 21:42:44 2007
    Thread 1 advanced to log sequence 139
    Current log# 2 seq# 139 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo2.log
    Mon Nov 26 21:42:44 2007
    ARC0: Evaluating archive log 3 thread 1 sequence 138
    ARC0: Beginning to archive log 3 thread 1 sequence 138
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_138.dbf'
    Mon Nov 26 21:42:44 2007
    ARCH: Evaluating archive log 3 thread 1 sequence 138
    ARCH: Unable to archive log 3 thread 1 sequence 138
    Log actively being archived by another process
    Mon Nov 26 21:42:45 2007
    ARC0: Completed archiving log 3 thread 1 sequence 138
    Mon Nov 26 21:45:12 2007
    Starting control autobackup
    Mon Nov 26 21:45:56 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0033'
    handle 'c-2861328927-20071126-01'
    Clearing standby activation ID 2873610446 (0xab47d0ce)
    The primary database controlfile was created using the
    'MAXLOGFILES 5' clause.
    The resulting standby controlfile will not have enough
    available logfile entries to support an adequate number
    of standby redo logfiles. Consider re-creating the
    primary controlfile using 'MAXLOGFILES 8' (or larger).
    Use the following SQL commands on the standby database to create
    standby redo logfiles that match the primary database:
    ALTER DATABASE ADD STANDBY LOGFILE 'srl1.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl2.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl3.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl4.f' SIZE 10485760;
    Tue Nov 27 21:23:50 2007
    Starting control autobackup
    Tue Nov 27 21:30:49 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0280'
    handle 'c-2861328927-20071127-00'
    Tue Nov 27 21:30:57 2007
    ARC1: Evaluating archive log 2 thread 1 sequence 139
    ARC1: Beginning to archive log 2 thread 1 sequence 139
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_139.dbf'
    Tue Nov 27 21:30:57 2007
    Thread 1 advanced to log sequence 140
    Current log# 1 seq# 140 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo1.log
    Tue Nov 27 21:30:57 2007
    ARCH: Evaluating archive log 2 thread 1 sequence 139
    ARCH: Unable to archive log 2 thread 1 sequence 139
    Log actively being archived by another process
    Tue Nov 27 21:30:58 2007
    ARC1: Completed archiving log 2 thread 1 sequence 139
    Tue Nov 27 21:30:58 2007
    Thread 1 advanced to log sequence 141
    Current log# 3 seq# 141 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo3.log
    Tue Nov 27 21:30:58 2007
    ARCH: Evaluating archive log 1 thread 1 sequence 140
    ARCH: Beginning to archive log 1 thread 1 sequence 140
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_140.dbf'
    Tue Nov 27 21:30:58 2007
    ARC1: Evaluating archive log 1 thread 1 sequence 140
    ARC1: Unable to archive log 1 thread 1 sequence 140
    Log actively being archived by another process
    Tue Nov 27 21:30:58 2007
    ARCH: Completed archiving log 1 thread 1 sequence 140
    Tue Nov 27 21:33:16 2007
    Starting control autobackup
    Tue Nov 27 21:34:29 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0205'
    handle 'c-2861328927-20071127-01'
    Clearing standby activation ID 2873610446 (0xab47d0ce)
    The primary database controlfile was created using the
    'MAXLOGFILES 5' clause.
    The resulting standby controlfile will not have enough
    available logfile entries to support an adequate number
    of standby redo logfiles. Consider re-creating the
    primary controlfile using 'MAXLOGFILES 8' (or larger).
    Use the following SQL commands on the standby database to create
    standby redo logfiles that match the primary database:
    ALTER DATABASE ADD STANDBY LOGFILE 'srl1.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl2.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl3.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl4.f' SIZE 10485760;
    Wed Nov 28 21:43:31 2007
    Starting control autobackup
    Wed Nov 28 21:43:59 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0202'
    handle 'c-2861328927-20071128-00'
    Wed Nov 28 21:44:08 2007
    Thread 1 advanced to log sequence 142
    Current log# 2 seq# 142 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo2.log
    Wed Nov 28 21:44:08 2007
    ARCH: Evaluating archive log 3 thread 1 sequence 141
    ARCH: Beginning to archive log 3 thread 1 sequence 141
    Wed Nov 28 21:44:08 2007
    ARC1: Evaluating archive log 3 thread 1 sequence 141
    ARC1: Unable to archive log 3 thread 1 sequence 141
    Log actively being archived by another process
    Wed Nov 28 21:44:08 2007
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_141.dbf'
    Wed Nov 28 21:44:08 2007
    ARC0: Evaluating archive log 3 thread 1 sequence 141
    ARC0: Unable to archive log 3 thread 1 sequence 141
    Log actively being archived by another process
    Wed Nov 28 21:44:08 2007
    ARCH: Completed archiving log 3 thread 1 sequence 141
    Wed Nov 28 21:44:09 2007
    Thread 1 advanced to log sequence 143
    Current log# 1 seq# 143 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo1.log
    Wed Nov 28 21:44:09 2007
    ARCH: Evaluating archive log 2 thread 1 sequence 142
    ARCH: Beginning to archive log 2 thread 1 sequence 142
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_142.dbf'
    Wed Nov 28 21:44:09 2007
    ARC0: Evaluating archive log 2 thread 1 sequence 142
    ARC0: Unable to archive log 2 thread 1 sequence 142
    Log actively being archived by another process
    Wed Nov 28 21:44:09 2007
    ARCH: Completed archiving log 2 thread 1 sequence 142
    Wed Nov 28 21:44:36 2007
    Starting control autobackup
    Wed Nov 28 21:45:00 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0202'
    handle 'c-2861328927-20071128-01'
    Clearing standby activation ID 2873610446 (0xab47d0ce)
    The primary database controlfile was created using the
    'MAXLOGFILES 5' clause.
    The resulting standby controlfile will not have enough
    available logfile entries to support an adequate number
    of standby redo logfiles. Consider re-creating the
    primary controlfile using 'MAXLOGFILES 8' (or larger).
    Use the following SQL commands on the standby database to create
    standby redo logfiles that match the primary database:
    ALTER DATABASE ADD STANDBY LOGFILE 'srl1.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl2.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl3.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl4.f' SIZE 10485760;
    Thu Nov 29 21:36:44 2007
    Starting control autobackup
    Thu Nov 29 21:42:53 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0206'
    handle 'c-2861328927-20071129-00'
    Thu Nov 29 21:43:01 2007
    Thread 1 advanced to log sequence 144
    Current log# 3 seq# 144 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo3.log
    Thu Nov 29 21:43:01 2007
    ARCH: Evaluating archive log 1 thread 1 sequence 143
    ARCH: Beginning to archive log 1 thread 1 sequence 143
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_143.dbf'
    Thu Nov 29 21:43:01 2007
    ARC1: Evaluating archive log 1 thread 1 sequence 143
    ARC1: Unable to archive log 1 thread 1 sequence 143
    Log actively being archived by another process
    Thu Nov 29 21:43:02 2007
    ARCH: Completed archiving log 1 thread 1 sequence 143
    Thu Nov 29 21:43:03 2007
    Thread 1 advanced to log sequence 145
    Current log# 2 seq# 145 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo2.log
    Thu Nov 29 21:43:03 2007
    ARCH: Evaluating archive log 3 thread 1 sequence 144
    ARCH: Beginning to archive log 3 thread 1 sequence 144
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_144.dbf'
    Thu Nov 29 21:43:03 2007
    ARC0: Evaluating archive log 3 thread 1 sequence 144
    ARC0: Unable to archive log 3 thread 1 sequence 144
    Log actively being archived by another process
    Thu Nov 29 21:43:03 2007
    ARCH: Completed archiving log 3 thread 1 sequence 144
    Thu Nov 29 21:49:00 2007
    Starting control autobackup
    Thu Nov 29 21:50:14 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0280'
    handle 'c-2861328927-20071129-01'
    Thanks
    Shaan
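    The literal grep fails because in the alert log the year comes after the time of day ("Mon Nov 26 21:42:43 2007"), so the string "Nov 23 2007" never occurs on a single line, and even a matching grep would only show the timestamp line, not the messages under it. Here is a small awk sketch that keys off the timestamp lines and prints everything belonging to the chosen day(s); it assumes the timestamp lines look exactly like the ones in the sample and that alert_sid.log is the file to scan:
    awk -v start="20071123" -v end="20071123" '
    BEGIN {
        n = split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m, " ")
        for (i = 1; i <= n; i++) mon[m[i]] = sprintf("%02d", i)
    }
    /^(Mon|Tue|Wed|Thu|Fri|Sat|Sun) [A-Z][a-z][a-z] +[0-9]+ [0-9:]+ [0-9][0-9][0-9][0-9]$/ {
        cur = $5 mon[$2] sprintf("%02d", $3)   # e.g. 20071126
    }
    cur >= start && cur <= end
    ' alert_sid.log
    Each message line is printed while the most recently seen timestamp is inside the range, so there is no need to guess a tail -500 window; widen start/end to cover several days.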

  • How to add a date suffix to the log file name

    In Windows, I want to run certain commands and save the output to a log file every day. How do I add a suffix to the log file name so I can tell which log file is for which day?
    e.g. cmd >> logfile.date

    AZ wrote:
    In Windows, I want to run certain commands and save the output to a log file every day. How do I add a suffix to the log file name so I can tell which log file is for which day?
    e.g. cmd >> logfile.date
    My best friend's name is "google"; refer to this [url | http://stackoverflow.com/questions/203090/how-to-get-current-datetime-on-windows-command-line-in-a-suitable-format-for-usi]
    This is what I did:
    1) Created a dummy file on the C: drive.
    2) Copy-pasted the lines below; you can play around more with the format:
    set _my_datetime=%date%_%time%
    set _my_datetime=%_my_datetime: =_%
    set _my_datetime=%_my_datetime::=%
    set _my_datetime=%_my_datetime:/=_%
    set _my_datetime=%_my_datetime:.=_%
    3) Renamed the file from DOS:
    ren some.txt dummy_file_%_my_datetime%.txt
    4) Here is the output:
    C:\>dir
    dummy_file_Mon_09_20_2010_161347_21.txt
    Most of the code I copied from the above URL; you can tweak it a bit based on your requirement and format.
    Regards
    Learner

  • Dates appear different in log file vs. debug page for Deferred Task

    I added a deferred task and in the log file, the date appeared correctly as
    Mon Dec 15 16:34:11 PST 2008
    But when I viewed the user in the debug page, the date appeared as
    <Date>2008-12-16T00:34:11.430Z</Date>
    I called an external java class that returns a Date object:
    <Action id='0' application='com.waveset.session.WorkflowServices'>
    <Argument name='op' value='addDeferredTask'/>
    <Argument name='date'>
    <invoke name='addWeekDays' class='MyDateUtil'/>
    </Argument>
    </Action>
    Do you have any ideas?

    IDM commonly stores dates in a Java or JDBC date format (which is what your debug date is) but often formats the date differently for log files and for web pages. It's annoying if you're trying to line two different outputs up.
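    They are in fact the same instant: PST is UTC-8, so Mon Dec 15 16:34:11 PST 2008 and 2008-12-16T00:34:11.430Z differ only in rendering (plus the milliseconds, which the log format drops). A quick way to confirm that from a shell with GNU date (the -d date-string parsing is a GNU coreutils feature):
    TZ=UTC date -d 'Dec 15 16:34:11 PST 2008' '+%Y-%m-%dT%H:%M:%SZ'
    # should print 2008-12-16T00:34:11Z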

  • Data Synchronisation - cannot see log files

    Hi,
    when I run the data synchronisation process it completes normally, but when I try to view source log / destination log / data synchronisation log they all give me the same message in the output file =>
    "Unable to download file because of an error or the file wasn't found. Please contact your system administrator."
    The version is 11.1.2.1.0.83
    Our server OS is windows 2008 r2.
    This feels like a security issue on a directory. Can anyone tell me which directory I need to look at, and what rights the middleware needs in order to view these files?
    with thanks,
    Robert.

    I think the logs will be generated under <MIDDLEWARE_HOME>\user_projects\<instancename>\tmp
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Date format inconsistent in log files

    Hi All,
    I have a cluster spread across 4 machines (4 different physical boxes).
    On three of the machines the log format is
    <17/1/2010>
    while on the fourth it is
    <Jan 17 >
    We require the log format in the form <17/1/2010>.
    Any help or suggestions?
    Thanks.

    Usually, this is due to a localization difference in the JVMs. When WLS boots, the .log file should show you some detailed information about the locale it is using; make sure that is consistent across the machines.
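    If locale is the culprit: the JVM's default locale is derived from the OS environment, so comparing the OS locale settings on each box is a quick first check, and pinning the locale explicitly on the JVM command line removes the dependency. A sketch; the use of JAVA_OPTIONS and the en/GB values are illustrative assumptions about your start scripts, not a confirmed WLS setting:
    # Compare across the four machines; a difference here usually explains the differing log format
    locale
    # Pin the locale for the server JVM wherever the start script assembles its Java options
    export JAVA_OPTIONS="$JAVA_OPTIONS -Duser.language=en -Duser.country=GB"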

  • Search by date range

    This should be simple, but I cannot get it to work. I am trying to construct a page where the user enters a start date and an end date, and we display the results.
    I am using :
    SELECT *
    FROM AVRmonth
    WHERE ExpiryDate BETWEEN #varStart# AND #varEnd#
    In the Variables editor I have:
    varStart % Request("startFld")
    varEnd % Request("endFld")
    This fails to work; DW gives the error "Syntax error in query expression 'ExpiryDate BETWEEN #%# AND #%#'".
    If I change the default values to a date, e.g. #1/1/04, DW gives no error but the query does not work.
    If I use no variables but just use :
    SELECT *
    FROM AVRmonth
    WHERE ExpiryDate BETWEEN #01/01/04# AND #01/06/05#
    It works & displays results from that date range.
    I think I have tried all combinations of using # ' to
    surround variable names but still no joy.
    I am sure this is something blindingly obvious but I cannot
    see it.
    Can anyone please help...........
    This is using MS Access 2000 database

    You could use this; all you need to change is the MM_NAMEOFYOURCONNECTION_STRING (look for the Dim MM_YOURCONNECTION_STRING in the Connections folder). Add the <!--#include file="Connections/YOURCONN.asp" --> at the top of your page. You can build this recordset from the advanced button in the recordset creation wizard; there is a good tutorial on Adobe's site, and you can also search for 'dreamweaver multiple parameter query' for other tutorials:
    http://www.adobe.com/support/ultradev/building/multiple_param_search/multiple_param_search 02.html
    <%
    Dim rsFriday__MMstart
    rsFriday__MMstart = "1"
    If (Request.Form("strtFld") <> "") Then
      rsFriday__MMstart = Request.Form("strtFld")
    End If
    %>
    <%
    Dim rsFriday__MMend
    rsFriday__MMend = "1"
    If (Request.Form("endFld") <> "") Then
      rsFriday__MMend = Request.Form("endFld")
    End If
    %>
    <%
    Dim rsFriday
    Dim rsFriday_cmd
    Dim rsFriday_numRows
    Set rsFriday_cmd = Server.CreateObject("ADODB.Command")
    rsFriday_cmd.ActiveConnection = MM_NAMEOFYOURCONNECTION_STRING
    rsFriday_cmd.CommandText = "SELECT * FROM AVRmonth WHERE ExpiryDate BETWEEN ? AND ?"
    rsFriday_cmd.Prepared = true
    rsFriday_cmd.Parameters.Append rsFriday_cmd.CreateParameter("param1", 135, 1, -1, rsFriday__MMstart) ' 135 = adDBTimeStamp
    rsFriday_cmd.Parameters.Append rsFriday_cmd.CreateParameter("param2", 135, 1, -1, rsFriday__MMend)   ' 135 = adDBTimeStamp
    Set rsFriday = rsFriday_cmd.Execute
    rsFriday_numRows = 0
    %>
    Put this at the end of the page, after the </body> and </html> tags:
    <%
    rsFriday.Close()
    Set rsFriday = Nothing
    %>

  • Problem with customer exit variable on date range

    Hi All,
    I have a customer exit variable on a date range. On the selection screen it has to show the week range as a default (05/21/2009 to 05/27/2009).
    Earlier it was working fine, but from yesterday onwards it is not working properly: the default date range is no longer displayed on the selection screen.
    What could be the problem?
    Thanks in advance

    Hi Ashish,
    I checked everything you mentioned earlier. Everything is fine.
    One more thing: I have a routine at the InfoPackage level. Until the day before yesterday it was working fine; since yesterday it is not.
    E.g. BUDAT should pick up the data for the week based on the routine, but since yesterday it is not picking up the data, even though the data for the week is in the DataSource.
    What could be the problem? I debugged the code and it works fine.
    Sekhar

  • How to delete the data in archived log files

    Hi,
    How can I delete the entries in archived log files, and what is the disadvantage of deleting archived log entries?

    There is no documented way to delete data stored in archived log files: you can only remove the archived log files if needed.
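    If you do need to remove them, the usual way is through RMAN rather than deleting the files at the OS level, so the control file and any recovery catalog stay in sync. A sketch only; the seven-day window is just an example, and you must be sure the logs are no longer needed for recovery or a standby before deleting:
    rman target / <<'EOF'
    CROSSCHECK ARCHIVELOG ALL;
    DELETE NOPROMPT ARCHIVELOG ALL COMPLETED BEFORE 'SYSDATE-7';
    EOF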

  • Corrupt log file, but how does db keep working?

    We recently had a fairly devastating outage involving a hard drive failure, but are a little mystified about the mechanics of what went on with berkeleydb which I hope someone here can clear up.
    A hard drive running a production instance failed because of a disk error, and we had to do a hard reboot to get the system to come back up and right itself (we are running RedHat Enterprise). We actually had three production environments running on that machine, and two came back just fine, but in one, we would get this during recovery:
    BDBStorage> Running recovery.
    BerkeleyDB> : Log file corrupt at LSN: [4906][8294478]
    BerkeleyDB> : PANIC: Invalid argument
    BerkeleyDB> : /usr/local/BerkeleyDB.4.8/lib/libdb_java-4.8.so(__os_stack+0x20) [0x2c23af2380]
    BerkeleyDB> : /usr/local/BerkeleyDB.4.8/lib/libdb_java-4.8.so(__os_abort+0x15) [0x2c23aee9c9]
    BerkeleyDB> : /usr/local/BerkeleyDB.4.8/lib/libdb_java-4.8.so(__env_panic+0xef) [0x2c23a796f9]
    BerkeleyDB> : /usr/local/BerkeleyDB.4.8/lib/libdb_java-4.8.so(__env_attach_regions+0x788) [0x2c23aae82c]
    BerkeleyDB> : /usr/local/BerkeleyDB.4.8/lib/libdb_java-4.8.so(__env_open+0x130) [0x2c23aad1e7]
    BerkeleyDB> : /usr/local/BerkeleyDB.4.8/lib/libdb_java-4.8.so(__env_open_pp+0x2e7) [0x2c23aad0af]
    BerkeleyDB> : /usr/local/BerkeleyDB.4.8/lib/libdb_java-4.8.so [0x2c23949dc7]
    BerkeleyDB> : /usr/local/BerkeleyDB.4.8/lib/libdb_java-4.8.so(Java_com_sleepycat_db_internal_db_1javaJNI_DbEnv_1open+0xbc) [0x2c239526ea]
    BerkeleyDB> : [0x2a99596e77]
    We thought, well, perhaps this is related to the disk error, it corrupted a log file and then died. Luckily (or so we thought) we diligently do backups twice a day, and keep a week's worth around. These are made using the standard backup procedure described in the developer's guide, and whenever we've had to restore them, they have been just fine (we've been using our basic setup for something like 9 years now). However, as we retrieved backup after backup, going back three or four days, they all had similar errors, always starting with [4096]. Then we noticed an odd log file, numbered with 4096, which sat around in our logs directory ever since it was created. Eventually we found a good backup, but the customer lost several days' worth of work.
    My question here is, how could a log file be corrupted for days and days but not be noticed, say during a checkpoint (which we run every minute or so)? Doesn't a checkpoint itself basically scan the logs, and shouldn't that have hit the corrupt part not long after it was written? The system was running without incident, getting fairly heavy use, so it really mystifies me as to how that issue could be sitting around for days and days like that.
    For now all we can promise the customer is that we will automatically restore every backup as soon as it's made, and if something like this happens, we immediately try a graceful shutdown, and if that doesn't come back up, we automatically go back to the 12-hour-old backup. And perhaps we should be doing that anyway, but still, I would like to understand what happened here. Any ideas?

    Please note, I don't want to make it sound like I'm somehow blaming berkeleydb for the outage-- we realize in hindsight there were better things to do than go back to an old backup, but the customer wanted an immediate answer, even if it was suboptimal. I just feel like I am missing something major about how the system works.
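    One inexpensive way to notice this kind of log damage before a restart forces recovery is to read the log back periodically with db_printlog, the stock Berkeley DB utility that walks the log records and should fail on a record it cannot parse. A sketch, assuming the 4.8 utilities are on the PATH and with /path/to/env standing in for the environment home:
    # e.g. from cron: dump the log to /dev/null; a non-zero exit means a record could not be read
    if ! db_printlog -h /path/to/env > /dev/null 2> printlog.err; then
        echo "Berkeley DB log scan failed; investigate before the next restart" >&2
        cat printlog.err >&2
    fi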

  • What's the command to dump archived log file?

    Could someone share it if you remember? I forgot the syntax. Thanks a lot.

    Are you referring to extracting data from the archived log files? If so, refer to the Using LogMiner documentation.
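    For reference, the basic LogMiner flow looks roughly like this (a sketch run from the shell; the archived log path is borrowed from the alert-log sample earlier on this page, and the ROWNUM limit is only there to keep the output small):
    sqlplus / as sysdba <<'EOF'
    -- Register the archived log and start LogMiner using the online catalog
    EXEC DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME => '/oracle/NEWDB/admin/arch/1_137.dbf', OPTIONS => DBMS_LOGMNR.NEW);
    EXEC DBMS_LOGMNR.START_LOGMNR(OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
    -- The mined contents: who changed what, and the SQL to redo it
    SELECT username, timestamp, sql_redo FROM V$LOGMNR_CONTENTS WHERE ROWNUM <= 20;
    EXEC DBMS_LOGMNR.END_LOGMNR;
    EOF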

  • FMLA standard hours infotype 2001 not editable when working with date range

    Hi All,
    The issue described below arises at the Juncture between FMLA Workbench and Absence Infotype IT2001.
    When we try to tie a FMLA Request to Absences the process takes us to the Infotype PA2001 screen wherein we would like the Absence Hours Field to be Editable so that we could enter Hours (less than full day).
    The process works fine when the FMLA absence is for a single day, i.e. if an employee availed a FMLA leave on a day (whether full day or part of a day) the Absence Hours field (PA2001-STDAZ) is editable and the user can enter any number of hours of his choice.
    But the process hits a glitch when the FMLA absence spans a period of time (i.e. an employee took FMLA leave over a period in which each day may or may not be a full-day absence). In this scenario it would be best if the system allowed the user to enter Absence Hours, so the user could record the exact number of hours. Unfortunately, when an absence spans multiple days, the Infotype 2001 screen dynamically makes the Absence Hours field (PA2001-STDAZ) uneditable (grayed out), and the system forcibly enters hours equal to the sum of the work schedule hours for each working day covered by the period for which we are trying to enter FMLA Absence Hours.
    I have used the USER EXIT and the BADI route and tried to make the screen table editable, but with no luck.
    Wondering, if anyone of you could help me out of this glitch.
    Thanks in Advance.

    Hi,
    I don't think this is possible.
    You cannot make the Hours field editable for a date range.
    I don't think it is feasible to enter absence hours on a per-day basis given a date range.
    Better to think about creating a custom infotype to enter absence hours.
    (Design would be: based on the date range entered, say 01.01.2011 to 05.01.2011, you would get five input-enabled boxes, generated dynamically, for the five days to enter the absence hours; this is not possible in the standard infotype.)
    Regards,
    Srini.

  • Forum search doesn't seem to work for Date Range 'ALL'

    Hi there,
    Forum search doesn't seem to work for the Date Range option 'ALL'.
    For a given search criterion, I get a few results when the date range is 'last year', but for the same criterion, when the date range is 'ALL', no results are shown (not even the ones shown earlier for the 'last year' selection).
    regards,
    AJ

    Can you please delete my few duplicate replies in [CJ20n|Re: Long Text at Activity Level in CJ20N] thread?:-)
    Cheers,
    Amit.

  • Date range stops working when I add record selection criteria

    I have a simple report, using only nine fields, from four tables, plus two date parameter fields that I use to set a date range:
    SELECT "Job"."Job", "Job_Operation"."Vendor", "Customer"."Customer", "Job"."Part_Number", "Delivery"."Promised_Date", "Job_Operation"."Status", "Job_Operation"."Sched_End", "Job_Operation"."Sched_Start", "Job_Operation"."Operation_Service"
    FROM   ("TECH"."dbo"."Delivery" "Delivery" INNER JOIN ("TECH"."dbo"."Job_Operation" "Job_Operation" INNER JOIN "TECH"."dbo"."Job" "Job" ON "Job_Operation"."Job"="Job"."Job") ON "Delivery"."Job"="Job"."Job") INNER JOIN "TECH"."dbo"."Customer" "Customer" ON "Job"."Customer"="Customer"."Customer"
    WHERE  (("Job_Operation"."Sched_End">={ts '2013-08-05 00:00:00'} AND "Job_Operation"."Sched_End"<{ts '2013-08-08 00:00:01'}) AND "Job_Operation"."Status"='O' OR "Job_Operation"."Status"='S' AND "Job_Operation"."Operation_Service"='150-170 SS' OR ("Job_Operation"."Operation_Service"='150-170 ST' OR "Job_Operation"."Operation_Service"='60-180' OR "Job_Operation"."Operation_Service"='180-200 SS' OR "Job_Operation"."Operation_Service"='180-200 ST' OR "Job_Operation"."Operation_Service"='200-220 ST' OR "Job_Operation"."Operation_Service"='F-1.1923'))
    ORDER BY "Job"."Job"
    When my record selection formula is
    {Job_Operation.Sched_End} IN {?StartDate} TO {?EndDate}
    AND
    {Job_Operation.Status} = 'O' OR {Job_Operation.Status} = 'S'
    the date range works.
    However, when my record selection formula is
    {Job_Operation.Sched_End} IN {?StartDate} TO {?EndDate}
    AND
    {Job_Operation.Status} = 'O' OR {Job_Operation.Status} = 'S'
    AND
    {Job_Operation.Operation_Service} = '150-170 SS' OR
    {Job_Operation.Operation_Service} = '150-170 ST' OR
    {Job_Operation.Operation_Service} = '60-180' OR
    {Job_Operation.Operation_Service} = '180-200 SS' OR
    {Job_Operation.Operation_Service} = '180-200 ST' OR
    {Job_Operation.Operation_Service} = '200-220 ST' OR
    {Job_Operation.Operation_Service} = 'F-1.1923'
    the date range doesn't work. Instead, the report returns records with all kinds of {Job_Operation.Sched_End} dates.
    What could be the reason?
    Thanks,
    Matteo

    Hi Matteo,
    Try this as the selection formula. In Crystal, AND binds more tightly than OR, so each group of OR'ed conditions needs its own parentheses; otherwise the date range only applies to the first alternative:
    {Job_Operation.Sched_End} IN {?StartDate} TO {?EndDate}
    AND
    ({Job_Operation.Status} = 'O' OR {Job_Operation.Status} = 'S')
    AND
    ({Job_Operation.Operation_Service} = '150-170 SS' OR
    {Job_Operation.Operation_Service} = '150-170 ST' OR
    {Job_Operation.Operation_Service} = '60-180' OR
    {Job_Operation.Operation_Service} = '180-200 SS' OR
    {Job_Operation.Operation_Service} = '180-200 ST' OR
    {Job_Operation.Operation_Service} = '200-220 ST' OR
    {Job_Operation.Operation_Service} = 'F-1.1923')
    -Abhilash
