How to do a WHERE clause with dates

Hello!
I'm new to Oracle and I'm wondering how to select records like this: WHERE (YEAR(SOME_DATE) > YEAR(NOW()) - 3).
YEAR in T-SQL returns an int, so I can do the math pretty easily. How would one do this with TO_CHAR? Do I have to cast to an int, or what?
Thanks!

Hi,
So, when you do this in 2009, you want to know if dt is in 2006 or later, and when you do the same thing in 2010, you'll want to know if dt is in 2007 or later; is that it?
Date arithmetic in Oracle was designed for counting days. For dealing with larger units, ADD_MONTHS is the best thing.
SYSDATE is today's date
ADD_MONTHS (SYSDATE, -36) is exactly 3 years ago
TRUNC (ADD_MONTHS (SYSDATE, -36), 'YYYY') is the beginning of the year 3 years ago, so
WHERE   dt >= TRUNC (ADD_MONTHS (SYSDATE, -36), 'YYYY') will be TRUE if dt is any time in 2006 or later, given that it is currently 2009.
By not applying any functions or date arithmetic to dt, we allow the optimizer to use an index on it.
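Putting that together, a minimal sketch (some_table and some_date are placeholder names for your own table and column):
SELECT  *
FROM    some_table
WHERE   some_date >= TRUNC (ADD_MONTHS (SYSDATE, -36), 'YYYY');
-- Equivalent in intent, but applying a function to the column prevents index use:
-- WHERE   EXTRACT (YEAR FROM some_date) >= EXTRACT (YEAR FROM SYSDATE) - 3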

Similar Messages

  • How to declare a bind variable with 'date' data type in command prompt?

    how to declare a bind variable with 'date' data type in command prompt?
    sql>variable q date;
    when I execute it, it shows a list of datatypes

    Hi,
    As Lokanath said, there are no DATE bind variables.
    You can use a VARCHAR2 bind variable, and convert it to a DATE in your SQL statement.
    If you're using SQL*Plus, consider a substitution variable. It won't be as efficient as a bind variable, but it will be as convenient.
    For example:
    DEFINE  first_entry_date = "DATE '2010-06-20'"
    SELECT   ...
    WHERE   entry_date >= &first_entry_date
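    If you do want a bind variable, a minimal sketch using a VARCHAR2 bind and TO_DATE (the table name orders is only an illustration):
    VARIABLE entry_date_str VARCHAR2(10)
    EXEC :entry_date_str := '2010-06-20'
    SELECT  *
    FROM    orders
    WHERE   entry_date >= TO_DATE (:entry_date_str, 'YYYY-MM-DD');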

  • How to transport/move a table with data from development to Test to Production

    Hi,
    How do we transport/move a table with data from Development to Test to Production? Exporting/importing a Delivery Unit moves only the structure, not the data.
    Reg
    Sri

    Hi Sri,
    You cannot transport data via the transport route in HANA; you can only transport code changes/structure via a DU. For data movement, you either have to do an export/import from a flat file or set up replication from a source system into HANA.
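    For the flat-file route, one possible sketch of loading a CSV that has been copied to the target host (the file path, schema and table names here are assumptions):
    IMPORT FROM CSV FILE '/usr/sap/trans/data/MY_TABLE.csv' INTO "MYSCHEMA"."MY_TABLE"
       WITH RECORD DELIMITED BY '\n'
            FIELD DELIMITED BY ',';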
    Thanks Much,
    Abhishek

  • On a Power Mac G5, how to take an automatic screenshot with date

    On a Power Mac G5, how can I take an automatic screenshot with the date?
    Please help me

    If you are standing by and triggering the capture from the keyboard, you can be sure to include the open Date & Time menu bar item in a full-screen shot by using Cmd-Shift-3, or outline just the area you want, if not full screen, by using Cmd-Shift-4. (The Command key may also have an Apple icon on it.) Those are the key shortcuts for full and partial screen captures.
    For a more automated method, perhaps try software that can run an AppleScript, such as Automator; then write a script, or find one you can modify, to do the task and test it.
    There may be other methods, but identifying your intent would come first, to narrow down the method.
    Good luck & happy computing!

  • How do I figure where is the data in a materialized view coming from

    Hi: when I run select NAME, OWNER from dba_mview_refresh_times, I see a number of materialized views. How do I find more details about one of these views, i.e., where the data is coming from and from which fields? The source table, which is in another database, changed, but the materialized view in my database has not. I want to confirm where the data in this view comes from.
    TIA
    Ravi

    SQL> select * from dict where table_name like 'ALL%MVIEW%';
    TABLE_NAME                     COMMENTS
    ALL_BASE_TABLE_MVIEWS          All materialized views with log(s) in the database that the user can see
    ALL_MVIEWS                     All materialized views in the database
    ALL_MVIEW_AGGREGATES           Description of the materialized view aggregates accessible to the user
    ALL_MVIEW_ANALYSIS             Description of the materialized views accessible to the user
    ALL_MVIEW_COMMENTS             Comments on materialized views accessible to the user
    ALL_MVIEW_DETAIL_PARTITION     Freshness information of all PCT materialized views in the database
    ALL_MVIEW_DETAIL_RELATIONS     Description of the materialized view detail tables accessible to the user
    ALL_MVIEW_DETAIL_SUBPARTITION  Freshness information of all PCT materialized views in the database
    ALL_MVIEW_JOINS                Description of a join between two columns in the WHERE clause of a materialized view accessible to the user
    ALL_MVIEW_KEYS                 Description of the columns that appear in the GROUP BY list of a materialized view accessible to the user
    ALL_MVIEW_LOGS                 All materialized view logs in the database that the user can see
    ALL_MVIEW_REFRESH_TIMES        Materialized views and their last refresh times for each master table that the user can look at
    ALL_REGISTERED_MVIEWS          Remote materialized views of local tables that the user can see
    13 rows selected.
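    To see where a particular materialized view gets its data, a minimal sketch (the owner and mview names are placeholders): ALL_MVIEWS holds the defining query, and ALL_MVIEW_DETAIL_RELATIONS lists the detail (master) tables it reads from.
    SELECT query
    FROM   all_mviews
    WHERE  owner = 'SCOTT' AND mview_name = 'MY_MVIEW';

    SELECT detailobj_owner, detailobj_name, detailobj_type
    FROM   all_mview_detail_relations
    WHERE  owner = 'SCOTT' AND mview_name = 'MY_MVIEW';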

  • How to debug DAQ VIs (problem with data)

    I am trying to debug (and understand) the attached VI.  I’m not asking anyone to analyze/debug it, since it is far more than an example; I am new to these forums and feel like asking such a thing would be inappropriate.
    I would like advice on how to debug it.  I am also going to describe my problem to see if the cause may be obvious to someone.
    The outputs of the "Angle Calc" sub-vi are fed into a state machine.  The state machine actuates outputs to move a motor and measures changing angles.  It first moves up/down, then left/right.  The attached VI is called by a top-level VI and the front panel is not usually open.  For some reason, when the front panel is opened, it behaves differently when it is called.  The behavior only changes for the right/left movements.
    Here's where I'm hoping someone can point out some better debugging methods for me.
    I have been using the Express Write to LVM File VI and the Convert to Dynamic Data function to write acquired data to text files for analysis after the acquisition has completed.  Are there better methods recommended for doing this?  I wish I could run through the code while recording the execution and then step through a playback of it!
     As for the specific problem I am having...
    In the "Right Find Stall" case, the signal wired from the "Angle Calc" VI (array size 100) is connected to a Max/Min Array function.  When I log the data coming from the Min/Max function, I see an alternation between valid data and 0 (i.e. 12.2, 0, 13.3, 0, 14.0, 0, 14.5, 0...).  When I log the data being fed into the Min/Max function I do not see 0s. 
    A guess of mine is that the array is empty and therefore no data is logged but Min/Max returns a 0 (not sure why this would happen).  I also have no idea why this would ALWAYS happen with the front panel opened.  I have seen the effect with the front panel closed, but never to the same degree; usually just a seemingly random zero or two but not a pattern of every other point...
    Thanks all,
    Dave
    Attachments:
    Functional_Test.vi ‏2710 KB

    Angle_calc receives array controls in and passes array indicators out.  It performs a trig function and a bit of multiplication. 
    I'll check out the Write to Spreadsheet File VI, haven't used it in the past.  If it will allow me to write text out that'll be enough reason for me to stop using the Express VI.
    I've used Execution Highlighting and stepping before, but with the data acquisition, by the time the application makes it around to the next read, the DAQ card has sampled a ton of data and the data coming in is no longer useful (a different state should be active by then).
    I do think the problem is related to the computer slowing down with the front panel open. I've logged the 'backlog' number of the Read function and found that it is greater than 0 much more often when the VI's front panel is open, and I've seen a relationship between the 0s that I discussed earlier and the backlog going above 0 (usually the scan after a non-0 backlog seems to be a problem).
    I need to familiarize myself with the DAQ basics, the buffer, backlog, and relation to sampling rate, etc.
    In a simple VI I just wrote, I noticed that if I increase the sampling rate high enough, data being acquired into a chart cuts out on a regular basis; instead of a smooth line, I see small dashes or dots of data with spaces in between. I don't understand this: I would expect data to be lost, but not "no data" to be read in. I think understanding what is happening here would be helpful.

  • How to update a table AUTOMATICALLY with data from a .txt file??

    HI ,
    I want to upload data that is in a notepad (.txt) file and create a table from it. The file stores a string of data, and new data arrives in the .txt file every second.
    (The data is actually written to the notepad file when a program is executed in the Processing.js software.)
    Is it possible to do the following using APEX ? ?
    1. This table has to be updated automatically once every 2 minutes (I want this completely automatic, no human intervention -- no update buttons).
    2. A report is to be created in an APEX application on this table, and it should reflect the updates to the table every 2 minutes.
    Is it possible? Please give your ideas. I want to do this soon for my engineering project :) Any help from you is appreciated. :)
    Edited by: user13301695 on 28-Mar-2011 10:09

    And how do you expect your database server to access a local file on your machine?
    Is the file accessible from outside your machine, say inside a web server folder, so that some DB process can poll the file?
    Or is your DB server on the same machine where you have the text file?
    You will have to figure out the file access part before automating away the user interaction or even auto-refreshing.
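    If the file does end up somewhere the database server can read, one possible sketch (the directory path, file, column and table names are all assumptions) is an external table over the file plus a DBMS_SCHEDULER job that refreshes a regular table every 2 minutes; the APEX report region can then be set to auto-refresh on the same interval.
    CREATE DIRECTORY readings_dir AS '/u01/app/data';        -- hypothetical server-side path

    CREATE TABLE readings_ext (
      reading_time  VARCHAR2(30),
      reading_value NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY readings_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('readings.txt')
    );

    CREATE TABLE readings AS SELECT * FROM readings_ext WHERE 1 = 0;

    BEGIN
      DBMS_SCHEDULER.CREATE_JOB (
        job_name        => 'REFRESH_READINGS',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN DELETE FROM readings; INSERT INTO readings SELECT * FROM readings_ext; COMMIT; END;',
        repeat_interval => 'FREQ=MINUTELY; INTERVAL=2',
        enabled         => TRUE);
    END;
    /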

  • How to check the integer value with Date Column

    Hi Friends,
    I have a filter called 'Days'. I need to show the data based on the Days filter (Example : 2 Days).
    Example Query:
    Select * from Tb1
    Where EndDate(Value as '03/05/2013' > Days ( value as 2)
    How to handle the above scenario.
    Thanks in Advance....
    Regards,
    LuckyAbdul

    What does that mean? How can you compare a date to a day count? Or is the integer value an offset, i.e., say, 2 days from today?
    If yes, you can use something like the illustration below:
    DECLARE @DayOffset int
    SET @DayOffset = 2

    SELECT *
    FROM   YourTable
    -- DATEDIFF(dd, 0, GETDATE()) counts whole days since the base date, so the DATEADD
    -- expression below evaluates to midnight, (@DayOffset + 1) days after today
    WHERE  EndDate >= DATEADD(dd, DATEDIFF(dd, 0, GETDATE()), @DayOffset + 1)
    refer
    http://visakhm.blogspot.in/2010/01/some-quick-tips-for-date-formating.html
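    On SQL Server 2008 or later, a sketch of a more readable variant using the date type for the midnight boundary (YourTable is a placeholder, and the exact offset semantics are an assumption to adapt to your need):
    DECLARE @DayOffset int = 2

    SELECT *
    FROM   YourTable
    WHERE  EndDate >= DATEADD(dd, @DayOffset, CAST(GETDATE() AS date))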
    Please clarify with an example what you're expecting if it's different from the above.
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • Can I use a timestamp value in a WHERE condition with a date field?

    HI,
    In a WHERE condition, the table field value is of type timestamp and the select-option's value is of type date.
    How do I select the data?

    Define a range r_dt for the timestamp field.
    Initialize the range with the values below.
    r_dt-sign   = 'I'.
    r_dt-option = 'CP'.
    CONCATENATE r_date-low  '*' INTO r_dt-low.
    CONCATENATE r_date-high '*' INTO r_dt-high.
    APPEND r_dt.
    Use r_dt in your SELECT query's WHERE clause.
    Note - r_date is the select-option for your date field.
    Regards
    Anurag

  • How to pre-populate ADF form with data?

    Hi Guys,
    Just wondering how you go about generating a pre-populated ADF form. Is it something that is done through the ADF designer and business objects, or do you have to create the database tables separately and bind them into ADF and the workflow? Specifically, I want this so a user is presented with several drop-down lists when the initiator human task is run.
    Thanks,
    Ross

    Hi Ross
    1. Yes, in the very first place, we added all the data for the lookups manually in the database. It's our own custom schema. We are planning to provide some simple admin screens in a different web app to maintain these lookup values. Note that this is a separate, simple ADF app with one EJB layer, and the EJB layer calls this database to get/update the values.
    2. ALSO, we have some lookup values that we get from a totally different system/application, such as Oracle EBS Suite. For this, we already have a custom schema created. In this custom schema, we have some stored procedures created by the EBS database developers (they are more familiar with EBS). Within our EJB layer, we use the JPA architecture, call these stored procedures, get all the lookup values, and show them on the TaskDetails page. These lookup values are maintained in EBS itself, so we do not have any custom admin screens within our SOA/BPM project.
    In conclusion, for our SOA/BPM applications we have two databases. The first database is for the SOA components: it is where we run the RCU command, and it has all the SOA_INFRA, MDS, ORABAM schemas, etc. The second database is for our custom schema specific to our application. Here we have our own lookup tables, custom tables, and any custom stored procedures that connect to the external EBS. From the WebLogic console, we just create a datasource for this second database and use it in the JPA layer (persistence.xml file). For the first database, you already know that at domain creation itself, it will create all the datasources.
    Thanks
    Ravi Jegga

  • How to sync TCP/IP command with data

    Hello. I am trying to capture TCP/IP data and write it to a file. I am using the TCP Communicator example as the base. My goal is to send a command and write the host's response to the file. A few challenges: I can send multiple commands, but I only want to write to the file after one specific command, and only once. That is the reason I have some logic in the send loop to enable the write block. My problem is that by the time the data comes in, my enable has changed value, so I only write the data that was on the bus from the previous command. Or I keep writing to the file over and over.
    What is the best way to synchronize sending a command and waiting for its result? Should I just use a case structure, since I do not need to listen to the host all the time, only after I have sent a command?
    Labview 8.2
    Win7
    Thank you
    Attachments:
    TCPIP.PNG ‏71 KB

    Do you have control over the data on both ends of the connection? If so, create a simple messaging protocol that will help you keep things in sync and give you the ability to recognize the commands. The simplest protocol would be to define your data packets to contain a message header and data. The message header would contain a message identifier and the message data length. The header should be fixed length. This allows you to easily read the header and then the data. The data will be whatever you need for each message type. The message length in the header will let you know how much data to read. Using this simple protocol, the message reading becomes very simple. Since each message contains an identifier, you can easily choose how to process the message and what actions to take. If you need to match responses with specific commands, you could put a command sequence number in the message header. This would allow you to match the response with the specific command. You would need to include the command sequence number in the response.
    If you don't have control over the other side of the connection you will need to get the message format the device uses. That will dictate how to process the messages.
    Also, if your system is always a command/response system, you do not have to have the read and write in separate tasks. You can send the command and wait for the response immediately afterward. You would not send the next command until you get a response. If you know you won't get a response for a specific command, you would skip the read.
    Mark Yedinak
    "Does anyone know where the love of God goes when the waves turn the minutes to hours?"
    Wreck of the Edmund Fitzgerald - Gordon Lightfoot

  • How to fill the table UD_SAPHR with data from SAP HR Connector

    Hi, I've searched the whole forum but didn't find how to map fields from the connector form to any field in OIM. I tried with entity adapters but failed to reach any result. The problem is that I can't even put the data coming in with the connector into the table of the connector form (UD_SAPHR).
    I will be grateful for any help. Below is a cut of the log. In a few words: how do I map, for example, the field "Department=40000260" to a field in the USR table or the UD_SAPHR table (or any other, so as to have this information in OIM)?
    OIM-9101Upd5, OS: Oracle Enterprise Linux 4 x64, SAP Connector: SAP-ER90460, Database: Oracle 10g x64, Application server: SOA (Oracle Application Server)
    INFO,01 Jul 2009 14:08:11,504,[XL_INTG.SAPHRMS],getDetailsForRecon(): *****************************
    INFO,01 Jul 2009 14:08:11,505,[XL_INTG.SAPHRMS],getDetailsForRecon(): 00000017
    INFO,01 Jul 2009 14:08:11,505,[XL_INTG.SAPHRMS],reconcileEmp(): SAPUserRecon for SAP : SAP HRMS IT Resource:EmployeeId=00000017
    DEBUG,01 Jul 2009 14:08:11,505,[XL_INTG.SAPHRMS],SAPUserRecon:returnDate(): Employee Status***3
    DEBUG,01 Jul 2009 14:08:11,505,[XL_INTG.SAPHRMS],EmpStatus***3
    INFO,01 Jul 2009 14:08:11,505,[XL_INTG.SAPHRMS],getUserId(): Recon key is 00000017
    INFO,01 Jul 2009 14:08:11,505,[XL_INTG.SAPHRMS],getUserId(): Hash Key is USR_UDF_PERSONNEL_NUMBER
    INFO,01 Jul 2009 14:08:11,505,[XL_INTG.SAPHRMS],getUserId(): Get UserId of user with reconKey: 00000017
    INFO,01 Jul 2009 14:08:11,506,[XL_INTG.SAPHRMS],getUserId():hmFilter: {USR_UDF_PERSONNEL_NUMBER=00000017}
    INFO,01 Jul 2009 14:08:11,540,[XL_INTG.SAPHRMS],Result set rs contains : Thor.API.tcMetaDataSet@790a37
    INFO,01 Jul 2009 14:08:11,540,[XL_INTG.SAPHRMS],getUserId(): Xellerate RS Count*** 0
    DEBUG,01 Jul 2009 14:08:11,541,[XL_INTG.SAPHRMS],getUserId(): hmReturn ***{Create=0, UserLinked=0, userId=, UserKey=, XelUser=, Exist=0, User=}
    INFO,01 Jul 2009 14:08:11,541,[XL_INTG.SAPHRMS],getUserId(): Recon key is 00000017
    INFO,01 Jul 2009 14:08:11,541,[XL_INTG.SAPHRMS],getUserId(): Hash Key is USR_UDF_PERSONNEL_NUMBER
    INFO,01 Jul 2009 14:08:11,541,[XL_INTG.SAPHRMS],getUserId(): Get UserId of user with reconKey: 00000017
    INFO,01 Jul 2009 14:08:11,541,[XL_INTG.SAPHRMS],getUserId():hmFilter: {USR_UDF_PERSONNEL_NUMBER=00000017}
    INFO,01 Jul 2009 14:08:11,570,[XL_INTG.SAPHRMS],Result set rs contains : Thor.API.tcMetaDataSet@282f10
    INFO,01 Jul 2009 14:08:11,570,[XL_INTG.SAPHRMS],getUserId(): Xellerate RS Count*** 0
    DEBUG,01 Jul 2009 14:08:11,570,[XL_INTG.SAPHRMS],getUserId(): hmReturn ***{Create=0, UserLinked=0, userId=, UserKey=, XelUser=, Exist=0, User=}
    DEBUG,01 Jul 2009 14:08:11,570,[XL_INTG.SAPHRMS],sEmployeeUserID***
    DEBUG,01 Jul 2009 14:08:11,570,[XL_INTG.SAPHRMS],UserKey***
    DEBUG,01 Jul 2009 14:08:11,570,[XL_INTG.SAPHRMS],EmpStatus fetch from target System is 3
    DEBUG,01 Jul 2009 14:08:11,570,[XL_INTG.SAPHRMS],EmpStatus Mentioned in Task scheduler is 3
    INFO,01 Jul 2009 14:08:11,570,[XL_INTG.SAPHRMS],reconcileEmp():sUser get from Resource Object Table ***
    INFO,01 Jul 2009 14:08:11,570,[XL_INTG.SAPHRMS],reconcileEmp():sXelUser get from Xellerate Table***
    el Number=00000017, LinkedUserID=, Xellerate Type=End-User Administrator, First Name=п⌠п╣пҐпҐп╟пЄп╦п╧, Last Name=п■п╟п╡я▀пЄп╬п╡, Role=Consultant, Middle Name=FromHRMS=1}╣п╡п╦я┤, Userr
    INFO,01 Jul 2009 14:08:11,571,[XL_INTG.SAPHRMS],Reconciliation of the user entered
    DEBUG,01 Jul 2009 14:08:11,571,[XL_INTG.SAPHRMS],before create HR user reconciliation event***{PostalCode=, EndDate=9999/12/31 12:00:00 MSK, EmailAddress=, SSN=, State=, District=, TelephoneNumber=, Manager=, StartDate=1888/12/01 12:00:00 MSK, City=, EmplUserId=, UserLinked=0, UserLocked=0, FirstName=п⌠п╣пҐпҐп╟пЄп╦onnel Number=00000017, EmployeeId=00000017, Country=RU, Title=Mr, Department=40000260, MiddleName=п░пҐпЄя─п╣п╣п╡п╦я┤, LastName=п■п╟п╡я▀пЄп╬п╡}
    INFO,01 Jul 2009 14:08:11,711,[XL_INTG.SAPHRMS],disableEnableXelUser(): Get UserKey of user with reconKey: 00000017 for SAP SAP HRMS IT Resource
    DEBUG,01 Jul 2009 14:08:11,742,[XL_INTG.SAPHRMS],disableEnableXelUser(): Xellerate RS Count*** 0
    INFO,01 Jul 2009 14:08:11,742,[XL_INTG.SAPHRMS],reconcileEmp(): User Reconciliation for Create & Modify User is completed.
    INFO,01 Jul 2009 14:08:11,742,[XL_INTG.SAPHRMS],reconcileEmp() **** DONE RECONCILIATION

    That is my problem. I get only those fields that are in the Xellerate user standard form, and I also need the fields "Department" and "City".
    The connector gets this information from SAP (as can be seen in the log) but doesn't save it anywhere inside OIM, even though such fields exist in the UD_SAPHR table (and the corresponding form); the table remains empty after reconciliation.
    DEBUG,26 Jun 2009 18:01:41,746,[XL_INTG.SAPHRMS],before create XL user reconciliation event***{Email=, Organization=Xellerate Users, User ID=00000072, Personnel Number=00000072, LinkedUserID=, Xellerate Type=End-User Administrator, First Name=п°пҐп╣, Last Name=п п╬п╪п©п╣пҐя│п╦я─я┐п╧п╨п╟, Role=Consultant, Middle NamomHRMS=1}
    INFO,26 Jun 2009 18:01:41,746,[XL_INTG.SAPHRMS],Reconciliation of the user entered
    DEBUG,26 Jun 2009 18:01:41,746,[XL_INTG.SAPHRMS],before create HR user reconciliation event***{PostalCode=, EndDate=9999/12/31 12:00:00 MSK, EmailAddress=, SSN=, State=, District=, TelephoneNumber=, Manager=, StartDate=1970/01/01 12:00:00 MSK, City=, EmplUserId=, UserLinked=0, UserLocked=0, FirstName=п°пҐп╣, Person Number=00000072, EmployeeId=00000072, Country=RU, Title=Mr, Department=40000240, MiddleName=п·я┌п©я┐я│п╨, LastName=п п╬п╪п©п╣пҐя│п╦я─я┐п╧п╨п╟}
    Edited by: user6645106 on 03.07.2009 10:12

  • How to: create a Login page with data tags

    Hi, how could I create a JSP login page using the data tags, and how do I associate it with the other JSP pages that should be displayed when the correct password is entered?

    http://technet.oracle.com:89/ubb/Forum2/HTML/006025.html

  • Decode in WHERE clause with date comparison

    Hi,
    I'm using the query below. Can we compare dates in a DECODE function in the WHERE clause? If yes, how?
    SELECT PPA.START_DATE,
    PPA.END_DATE
    FROM pa_project_assignments PPA,
    PA_CONTROL_ITEMS PCI
    WHERE DECODE(PPA.START_DATE < PCI.ATTRIBUTE1,PCI.ATTRIBUTE1,PPA.START_DATE) BETWEEN (PCI.ATTRIBUTE1) AND (PCI.DATE_REQUIRED)
    AND PPA.END_DATE BETWEEN (PCI.ATTRIBUTE1) AND( PCI.DATE_REQUIRED)
    AND PPA.PROJECT_ID=PCI.PROJECT_ID

    This works.
    SELECT *
    FROM   emp
    WHERE  CASE WHEN hiredate < SYSDATE
                THEN hiredate
                ELSE SYSDATE
           END BETWEEN SYSDATE - 10000 AND SYSDATE
    In your case:
    SELECT PPA.START_DATE,
           PPA.END_DATE
    FROM   pa_project_assignments PPA,
           PA_CONTROL_ITEMS PCI
    WHERE  CASE WHEN PPA.START_DATE < PCI.ATTRIBUTE1
                THEN PCI.ATTRIBUTE1
                ELSE PPA.START_DATE
           END BETWEEN PCI.ATTRIBUTE1 AND PCI.DATE_REQUIRED
    AND    PPA.END_DATE BETWEEN PCI.ATTRIBUTE1 AND PCI.DATE_REQUIRED
    AND    PPA.PROJECT_ID = PCI.PROJECT_ID
    Cheers!!!
    Bhushan

  • How to load an XML schema with Data Integrator?

    Post Author: Kris Van Belle
    CA Forum: Data Integration
    Does someone have experience with loading data from a regular table into an XML schema?
    What exactly are the steps to take? The DI user manual does not provide that much information...

    Post Author: bhofmans
    CA Forum: Data Integration
    Hi Kris,
    In the Designer.pdf there is a chapter called 'Nested Data' with more information, but you can also check this website, which has some detailed instructions and examples on how to build a nested relational data model (NRDM).
    http://www.consulting-accounting.com/time/servlet/ShowPage?COMPANYID=43&ELEMENTID=161
    (Thanks to Werner Daehn for putting this together).
