Data Guard - Failback query

Dear Gurus,
Can anyone please explain the failback procedure after a failover in Data Guard? I have gone through the Oracle Data Guard Concepts guide, but it is not explained anywhere.
Thanks,
Prasanna.

First, I am not a guru.
1. At the time of failover, the primary database (PDB) goes down abnormally for some reason and the standby database (SDB) becomes the primary.
2. Until we rectify the issue on the PDB, transactions take place against the SDB.
3. Now I have identified and resolved the problem on the PDB and can bring it up again. Please confirm once more: was a switchover or a failover performed?
If a switchover, you can simply turn standby --> primary and primary --> standby.
If a failover, once the standby database has committed to the failover it acts as the primary database; I think you then need to rebuild the standby database.
Check http://vnull.pcnet.com.pl/dl/oracle/937wp.pdf
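For completeness, here is a minimal sketch of both paths (assuming 10gR2/11g and SQL*Plus as SYS; the names and SCN are illustrative). Reinstating the failed primary as a standby only works if Flashback Database was enabled on it before the failover; otherwise you rebuild the standby from a backup of the new primary.
-- Switchover: on the current primary
ALTER DATABASE COMMIT TO SWITCHOVER TO PHYSICAL STANDBY WITH SESSION SHUTDOWN;
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
-- Switchover: on the current standby
ALTER DATABASE COMMIT TO SWITCHOVER TO PRIMARY WITH SESSION SHUTDOWN;
ALTER DATABASE OPEN;
-- then start redo apply on the new standby (the old primary)
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;
-- Failback after a FAILOVER: reinstate the old primary as a standby
-- On the new primary, find the SCN at which it became primary:
SELECT TO_CHAR(STANDBY_BECAME_PRIMARY_SCN) FROM V$DATABASE;
-- On the old primary, flash back to that SCN and convert it:
STARTUP MOUNT;
FLASHBACK DATABASE TO SCN 1234567;   -- illustrative value; use the SCN from the query above
ALTER DATABASE CONVERT TO PHYSICAL STANDBY;
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;
Once the reinstated standby has caught up, a normal switchover (as above) moves production back to it.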
Thanks
Edited by: CKPT on Nov 15, 2010 4:18 PM

Similar Messages

  • Data Guard Log Query

    Hi,
    We are on Oracle 11.1.0.6 with Data Guard implemented; the OS is SPARC, 64-bit.
    We are transferring logs from the primary to the secondary using Data Guard, and we have found a gap on the secondary DB.
    On further analysis we came to know that one of the log files was not generated in the primary database.
    To overcome this issue, apart from recreating the standby database, do we have any other solution/option?
    Thanks
    Kkukreja

    Hi CKPT,
    please find the response below:
    from the primary:
    SQL> select thread#,max(sequence#) from v$archived_log group by thread#;
       THREAD# MAX(SEQUENCE#)
             1          39136
    from the secondary:
    SQL> select * from v$archive_gap;
       THREAD# LOW_SEQUENCE# HIGH_SEQUENCE#
             1         37007          37007
    SQL> select thread#,max(sequence#) from v$archived_log where applied='YES' group by thread#;
       THREAD# MAX(SEQUENCE#)
             1          37006
    I rechecked; the archive logs were generated on the primary, but it seems someone has removed them.
    We have not configured RMAN. As the archived log is no longer present on the primary, how would we restore it?
    Regards
    Kkukreja
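    One common way out of this, rather than recreating the standby, is to roll it forward with an SCN-based incremental backup taken on the primary; RMAN does not need any prior configuration for this. A minimal sketch (assuming 11g; the SCN and the /tmp paths are illustrative):
    -- On the standby (SQL*Plus): note the SCN it has reached
    SELECT CURRENT_SCN FROM V$DATABASE;
    -- On the primary (RMAN): back up everything changed since that SCN
    BACKUP INCREMENTAL FROM SCN 1234567 DATABASE FORMAT '/tmp/stby_roll_%U';
    BACKUP CURRENT CONTROLFILE FOR STANDBY FORMAT '/tmp/stby_ctl.bkp';
    -- Copy the pieces to the standby host, stop redo apply, then in RMAN on the mounted standby:
    CATALOG START WITH '/tmp/stby_roll';
    RECOVER DATABASE NOREDO;
    -- Optionally restore the fresh standby controlfile, then restart redo apply (SQL*Plus):
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;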

  • Issue with Data Guard Broker

    Hi friends,
    I have been using Data Guard for several months now. The primary and physical standby are working fine.
    Primary database ---> SID = primary, and physical standby database ---> SID = standby.
    My OS is Windows 2003 Server and the Oracle version is 11gR1.
    Now I am planning to use DGMGRL, so the steps I performed were:
    1) Make sure Data Guard is configured, both the primary and physical standby databases are in place, redo transport and redo apply have already been configured, and both databases are in sync.
    2) Make sure both databases are up with an SPFILE only.
    ALTER SYSTEM SET DG_BROKER_START=TRUE SCOPE=BOTH; on both databases
    Now, after setting ORACLE_SID=primary:
    DGMGRL>
    create configuration 'sai'
    as primary database is 'primary'
    connect identifier is 'primary';
    Error: ORA-16625: cannot reach the database ...
    Note: the archive logs are transferred to the standby properly, so redo transport and apply are perfectly fine; only DGMGRL is not working.
    My listener file on both servers (sid=primary and sid=standby):
    SID_LIST_LISTENER =
      (SID_LIST =
        (SID_DESC =
          (GLOBAL_DBNAME = primary_dgmgrl)
          (ORACLE_HOME = C:\app\Administrator\product\11.1.0\db_1)
          (SID_NAME = primary)
        )
      )
    Let me know if you have any idea or if you need any more info.
    Thanks guys.

    Guys, any idea about my query? Kindly help.
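    For what it's worth, ORA-16625 usually means the broker cannot reach the database through the connect identifier it was given, so the listener and TNS setup are the first things to check. A minimal sketch of the pieces that normally have to line up (names follow the poster's setup; everything else here is illustrative):
    listener.ora on each server needs a static entry whose GLOBAL_DBNAME is <db_unique_name>_DGMGRL:
    SID_LIST_LISTENER =
      (SID_LIST =
        (SID_DESC =
          (GLOBAL_DBNAME = primary_DGMGRL)
          (ORACLE_HOME = C:\app\Administrator\product\11.1.0\db_1)
          (SID_NAME = primary)
        )
      )
    Reload the listener, confirm that tnsping primary and tnsping standby work from BOTH servers, and then in DGMGRL (connected as SYS with a password, not "/"):
    CONNECT sys/password@primary
    CREATE CONFIGURATION 'sai' AS PRIMARY DATABASE IS 'primary' CONNECT IDENTIFIER IS 'primary';
    ADD DATABASE 'standby' AS CONNECT IDENTIFIER IS 'standby' MAINTAINED AS PHYSICAL;
    ENABLE CONFIGURATION;
    SHOW CONFIGURATION;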

  • Concept about the query SCN in Data Guard in Oracle 11g

    What are the detailed concepts behind the 'query SCN' in Data Guard in Oracle 11g? I read about it in the Oracle Data Guard 11g Handbook, but it was not clear to me.

    It's the highest SCN to which data is synchronised between the primary and the standby such that the standby is guaranteed to be read-consistent. It's therefore not possible to query an active standby for data modified after that SCN. At any given time, there might be transactions committed on the primary but not yet shipped to or applied on the standby; you're not allowed to see those on the standby. Not sure how much more detailed you want to get, really.
    Edited by: Catfive Lander on Jan 15, 2013 4:15 AM
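    A quick way to see this in practice (a small sketch, assuming 11g with the standby open read-only and redo apply running):
    -- On the primary:
    SELECT CURRENT_SCN FROM V$DATABASE;
    -- On the Active Data Guard standby: this is (roughly) the SCN up to which queries are consistent,
    -- and it trails the primary's value until the outstanding redo is applied
    SELECT CURRENT_SCN FROM V$DATABASE;
    -- How far behind the standby is, in time terms:
    SELECT NAME, VALUE FROM V$DATAGUARD_STATS WHERE NAME IN ('apply lag', 'transport lag');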

  • Data Guard setup

    Hi Gurus,
    As to learn datagaurd, i am planning to use same server where my primary database and physical standby database reside.
    As document , i have created one database, made it force logging and then made it archive . Changed the parameter files and also added stand by redo logs. Since i am using 2 online redo log groups, created 3 stand by redo log groups .. (Maximum No of log files per thread +1)*Maximun no of thread.
    From alert log i found there is 1 redo thread and hence added 3. Changed primary database to use spfile from Pfile.I made database shutdown and copied files to different directory (all redo+data+standby files Here is confusion which all need to be copied ??????). For standby database, created control file from primary database and moved to different directory (made change in pfile in standby databse to point to this control file before start). In primary database when i queried on v$standby_log , status is still unassgined . There is no active file in primary database after switch log. I started stand by database using all files from primary database. When i did switch log , nothing is getting updated in status of v$standby_log. How it can be done ?
    Also query...
    Do i need to create 3 more standby log files at stand by database after standby database creation ??/
    Please suggest me simple and best method to do the same
    Thanks & Regards
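    On the two specific questions: UNASSIGNED in V$STANDBY_LOG on the primary is normal, because standby redo logs are only written to by the database that is receiving redo (i.e. while it runs in the standby role); and yes, the standby needs its own standby redo log groups. A minimal sketch (file names and the 50M size are illustrative; standby redo logs should be the same size as the online redo logs):
    -- Run on the standby (and optionally on the primary, ready for a future role change)
    ALTER DATABASE ADD STANDBY LOGFILE GROUP 4 ('/u01/oradata/stby/srl04.log') SIZE 50M;
    ALTER DATABASE ADD STANDBY LOGFILE GROUP 5 ('/u01/oradata/stby/srl05.log') SIZE 50M;
    ALTER DATABASE ADD STANDBY LOGFILE GROUP 6 ('/u01/oradata/stby/srl06.log') SIZE 50M;
    -- After a log switch on the primary, one group on the standby should show ACTIVE
    SELECT GROUP#, THREAD#, SEQUENCE#, STATUS FROM V$STANDBY_LOG;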

    Hi Nitin,
    1>
    log_archive_dest     string
    log_archive_dest_1   string   LOCATION=/data/datacut/SAMDBCUT/archredo/ VALID_FOR=(ALL_LOGFILES,ALL_ROLES) DB_UNIQUE_NAME=SAMDBCUT
    log_archive_dest_10  string
    log_archive_dest_2   string   SERVICE=INVDBCUT LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=INVDBCUT
    2>
    FACILITY SEVERITY DEST_ID MESSAGE_NUM ERROR_CODE CAL TIMESTAMP MESSAGE
    Log Transport Services Informational 0 1 0 NO 31-DEC-09 ARC0: Archival started
    Log Transport Services Informational 0 2 0 NO 31-DEC-09 ARC1: Archival started
    Log Transport Services Informational 0 3 0 NO 31-DEC-09 ARC2: Archival started
    Log Transport Services Informational 0 4 0 NO 31-DEC-09 ARC3: Archival started
    Log Transport Services Informational 0 5 0 NO 31-DEC-09 ARC4: Archival started
    Log Transport Services Informational 0 6 0 NO 31-DEC-09 ARC5: Archival started
    Log Transport Services Informational 0 7 0 NO 31-DEC-09 ARC6: Archival started
    Log Transport Services Informational 0 8 0 NO 31-DEC-09 ARC7: Archival started
    Log Transport Services Informational 0 9 0 NO 31-DEC-09 ARC8: Archival started
    Log Transport Services Informational 0 10 0 NO 31-DEC-09 ARC9: Archival started
    Log Transport Services Informational 0 11 0 NO 31-DEC-09 ARCa: Archival started
    Log Transport Services Informational 0 12 0 NO 31-DEC-09 ARCb: Archival started
    Log Transport Services Informational 0 13 0 NO 31-DEC-09 ARCc: Archival started
    Log Transport Services Informational 0 14 0 NO 31-DEC-09 ARCd: Archival started
    Log Transport Services Informational 0 15 0 NO 31-DEC-09 ARCe: Archival started
    Log Transport Services Informational 0 16 0 NO 31-DEC-09 ARCf: Archival started
    Log Transport Services Informational 0 17 0 NO 31-DEC-09 ARCg: Archival started
    Log Transport Services Informational 0 18 0 NO 31-DEC-09 ARCh: Archival started
    Log Transport Services Informational 0 19 0 NO 31-DEC-09 ARCi: Archival started
    Log Transport Services Informational 0 20 0 NO 31-DEC-09 ARCj: Archival started
    Log Transport Services Informational 0 21 0 NO 31-DEC-09 ARCk: Archival started
    Log Transport Services Informational 0 22 0 NO 31-DEC-09 ARCl: Archival started
    Log Transport Services Informational 0 23 0 NO 31-DEC-09 ARCm: Archival started
    Log Transport Services Informational 0 24 0 NO 31-DEC-09 ARCn: Archival started
    Log Transport Services Informational 0 25 0 NO 31-DEC-09 ARCo: Archival started
    Log Transport Services Informational 0 26 0 NO 31-DEC-09 ARCp: Archival started
    Log Transport Services Informational 0 27 0 NO 31-DEC-09 ARCq: Archival started
    Log Transport Services Informational 0 28 0 NO 31-DEC-09 ARCr: Archival started
    Log Transport Services Informational 0 29 0 NO 31-DEC-09 ARCs: Archival started
    Log Transport Services Informational 0 30 0 NO 31-DEC-09 ARCt: Archival started
    Log Transport Services Informational 0 31 0 NO 31-DEC-09 ARC7: Becoming the 'no FAL' ARCH
    Log Transport Services Informational 0 32 0 NO 31-DEC-09 ARC7: Becoming the 'no SRL' ARCH
    Log Transport Services Informational 0 33 0 NO 31-DEC-09 ARCs: Becoming the heartbeat ARCH
    Log Transport Services Error 0 34 12505 YES 31-DEC-09 Error 12505 received logging on to the standby
    Fetch Archive Log Error 2 35 12505 YES 31-DEC-09 FAL[server, ARC1]: Error 12505 creating remote archivelog file 'INVDBCUT'
    35 rows selected.
    3> Yes, I am able to connect using SQL*Plus.
    4> Alert log of the primary:
    Error 12505 received logging on to the standby
    Thu Dec 31 10:23:18 2009
    Errors in file /var/oracle/admin/SAMDBCUT/bdump/samdbcut_arc1_14072.trc:
    ORA-12505: TNS:listener does not currently know of SID given in connect descriptor
    FAL[server, ARC1]: Error 12505 creating remote archivelog file 'INVDBCUT'
    FAL[server, ARC1]: FAL archive failed, see trace file.
    Thu Dec 31 10:23:18 2009
    Errors in file /var/oracle/admin/SAMDBCUT/bdump/samdbcut_arc1_14072.trc:
    ORA-16055: FAL request rejected
    ARCH: FAL archive failed. Archiver continuing
    Thu Dec 31 10:23:18 2009
    ORACLE Instance SAMDBCUT - Archival Error. Archiver continuing.
    Thu Dec 31 10:23:18 2009
    Successfully onlined Undo Tablespace 1.
    Thu Dec 31 10:23:18 2009
    SMON: enabling tx recovery
    Thu Dec 31 10:23:18 2009
    Database Characterset is AL32UTF8
    replication_dependency_tracking turned off (no async multimaster replication found)
    Starting background process QMNC
    QMNC started with pid=44, OS id=14135
    Thu Dec 31 10:23:19 2009
    Completed: ALTER DATABASE OPEN
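    The ORA-12505 in the alert log means the listener on the standby host does not know the SID that the SERVICE=INVDBCUT destination resolves to. A sketch of the usual fix (the ORACLE_HOME path is an assumption; adjust it to the standby's environment), added to listener.ora on the INVDBCUT host:
    SID_LIST_LISTENER =
      (SID_LIST =
        (SID_DESC =
          (GLOBAL_DBNAME = INVDBCUT)
          (ORACLE_HOME = /u01/app/oracle/product/10.2.0/db_1)
          (SID_NAME = INVDBCUT)
        )
      )
    Then reload the listener (lsnrctl reload) and, from the primary host, verify with tnsping INVDBCUT and sqlplus sys@INVDBCUT as sysdba before retrying log shipping. Alternatively, the TNS alias on the primary can be changed to use SERVICE_NAME instead of SID so that dynamic registration is enough.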

  • FailBack URL

    Hi,
    I have Exchange Server 2010.
    OWA: mail.mydomain.com
    I am installing Exchange 2013 in my organization, and I changed my DNS to point mail.mydomain.com to my Exchange 2013 server.
    Some of my users complain that they cannot access OWA after this change.
    I read about failback and that it can solve my issue.
    How can I configure failback?
    Should this be applied on Exchange 2010 or Exchange 2013?
    What should the name for this failback be?
    I need an example, please.
    Thanks
    MCP MCSA MCSE MCT MCTS CCNA

    It's really an Exchange 2010 thing, not an Exchange 2013 one, though the parameter still exists in 2013.
    In other words, you probably have no need to set it, and in your case FailbackURL would not have solved anything.
    More here:
    http://blogs.technet.com/b/exchange/archive/2010/11/22/robert-s-rules-of-exchange-namespace-planning.aspx
    FailbackURL Names
    In Exchange 2010 RTM (pre-SP1, that is), there's a situation that can happen during a datacenter failure scenario. During the "fail back" scenario, when services are being relocated back to the primary datacenter, in some circumstances an OWA client will get a redirect that sends it to the server it just connected to, causing an outage for that user. For instance, let's use the names we just defined above and assume that the datacenter that hosts mail.robertsrules.ms has failed. We moved all services over to the LFH datacenter, including making some DNS changes to point users trying to find mail.robertsrules.ms to the VIP that is actually maillfh.robertsrules.ms.
    When we go to "fail back", we reactivate the services in the HSV datacenter, redirect DNS and everything seems to work as expected. But one thing we can't control from the servers is the client browser cache. Browsers can maintain their own DNS cache that is separate from the operating system's. Internet Explorer (IE) has a cache that acts similar to the DNS TTL (Time To Live) value (note that you can manage this setting on your clients - see KB 263558). If you haven't made these changes, IE will hang on to an IP address for a given URL for 30 minutes before it asks the OS to re-query DNS for a new IP address. You can clear this cache by restarting IE, but that's a poor client experience.
    This means that even if we set the TTL on the DNS record for mail.robertsrules.ms to 5 minutes, we could have up to 25 additional minutes where a client requests information from what it thinks is mail.robertsrules.ms but is really maillfh.robertsrules.ms. After you move the databases back over to the other datacenter, maillfh.robertsrules.ms replies back to the client: "Hey, welcome back. I see that your mailbox is in the other Active Directory site, and because the CAS in that site have an ExternalURL defined, please use that URL (which happens to be mail.robertsrules.ms) and try to log in." The user takes this redirect (the user is prompted and manually accepts it, meaning there is no "single sign on" redirect here) and tries to reconnect to "mail.robertsrules.ms", which asks the user to re-authenticate. The problem is that the IE cache is still pointing at the IP address of "maillfh.robertsrules.ms". So the client gets stuck in a loop for up to 25 minutes with OWA just prompting for credentials over and over again. Not great, especially if that user happens to be the person who signs your paycheck.
    In Exchange 2010 SP1 (which we'll be deploying at Robert's Rules), we added the FailbackURL parameter to the OWA virtual directory (we'll set this using the Set-OwaVirtualDirectory cmdlet). When OWA detects that the client is in this looping condition, it will redirect it to the FailbackURL. Thus, we have a cleaner fail back scenario, which provides a better availability stance for our clients.
    To implement this, we need our FailbackURL values to be included in our list of Subject Alternative Names (SANs). For our environment, I've chosen some equally obvious names: FailbackHSV.robertsrules.ms and FailbackLFH.robertsrules.ms.
    Please note: my posts are provided "AS IS" without warranty of any kind, either expressed or implied.

  • Error while running a query-Input for variable 'Posting Period is invalid

    Hi All,
    NOTE: This error only crops up when I enter 12 in the posting period variable selection. If I enter any other value from 1-11, I do not get any errors. Any ideas why this might be happening?
    I am getting the following error when I try to run a query: "Input for variable 'Posting Period (Single entry, mandatory)' is invalid". On clicking further into this error, the message displayed is as follows:
    Diagnosis
    Variable Posting Period (Single Value Entry, Mandatory) is used as a lower limit (X) and an upper limit () in an interval selection. This limit has the value #.
    System Response
    Procedure
    Enter a different value for variable Posting Period (Single Value Entry, Mandatory). If the value of the other limit is determined by another variable, you can change its value also.
    Procedure for System Administration

    OK.
    Well, if the variable is not used in any interval selection, then I would say "something happened to it".
    I would make a copy of the query and run it to check if I get the same problem with period 12.
       -> If not, something is wrong in the original query (you can proceed as below, if changes to the original are permitted).
    If so, then try removing the variable completely from the query and hardcode the restriction to 12.
       -> If the problem still persists, I would have to do some thinking.
    If the problem is gone, then add the variable again and check.
       -> If the problem is back, then the variable "is sick". The only quick thing to do is to build an identical variable and use that one.
    If the problem also happens with the new variable, then it's time to share this experience with someone else and consider raising an OSS message.
    Good luck!
    Jacob
    P.S.: which fiscal year variant are you using?
    Edited by: Jacob Jansen on Jan 25, 2010 8:36 PM

  • Logical database in ad hoc query

    Hello All,
    Can anyone tell me what the logical database is in an ad hoc query?

    Hi
    When you create a query, you have to select an InfoSet. An InfoSet can be considered the source from which data is populated in the query fields.
    InfoSets are created in transaction SQ02.
    There are four methods through which an InfoSet can become a source of data:
    1. Table join (by joining two or more tables from the Data Dictionary)
         Example: joining tables PA0001 and PA0006 on PERNR to get one resultant dataset
    2. Direct read of a basis table (e.g. PA0001 as the source of data for the InfoSet)
    3. Logical database (a pre-written program by SAP that extracts data from clusters and tables while taking care of authorizations and validity periods)
         Examples: logical databases PNP, PNPCE (Concurrent Employment), PCH (the LDB for Personnel Development data)
         Custom logical databases can be created in transaction SE36.
    4. Data retrieval by a program (custom code written by ABAP developers which collects and processes data). This program has a corresponding structure in the Data Dictionary, and the fields of this structure are used in the query.
    Reward Points, if helpful.
    Regards
    Waseem Imran

  • Query help on Goods Receipt Query with AP Invoice

    Looking for a little help on a query. I would like to list all the goods receipts for a given date range and then display the AP invoice information (if it has been copied to an AP invoice). I think my problem is in my WHERE clause; I plagiarized an SAP query that shows GR and AP from a PO as a starting point. SBO 2005 SP01. Any help would be greatly appreciated. Thanks
    SELECT distinct 'GR',
    D0.DocStatus,
    D0.DocNum ,
    D0.DocDate,
    D0.DocDueDate,
    D0.DocTotal,
    'AP',
    I0.DocStatus,
    I0.DocNum ,
    I0.DocDate,
    I0.DocDueDate,
    I0.DocTotal,
    I0.PaidToDate
    FROM
    ((OPDN  D0 inner Join PDN1 D1 on D0.DocEntry = D1.DocEntry)
    full outer join
    (OPCH I0 inner join PCH1 I1 on I0.DocEntry = I1.DocEntry)
    on (I1.BaseType=20 AND D1.DocEntry = I1.BaseEntry AND D1.LineNum=I1.BaseLine))
    WHERE
    (D1.BaseType=22 AND D1.DocDate>='[%0]' AND D1.DocDate<='[%1]')
    OR (I1.BaseType=20 AND I1.BaseEntry IN
    (SELECT Distinct DocEntry
    FROM PDN1 WHERE BaseType=22 AND DocDate>='[%0]' AND DocDate<='[%1]'))

    Hi Dalen,
    I believe it is because of the condition
    (D1.BaseType=22 AND D1.DocDate>='%0' AND D1.DocDate<='%1')
    OR (I1.BaseType=20 AND I1.BaseEntry IN
    (SELECT Distinct DocEntry FROM PDN1 WHERE PDN1.BaseType=22 AND DocDate>='%0' AND DocDate<='%1'))
    Try changing it to
    D1.BaseType=22 OR D1.DocDate>='%0' AND D1.DocDate<='%1'
    PDN1.BaseType=22 OR DocDate>='%0' AND DocDate<='%1'))
    and let's see what the result would be. Let's have some fun with troubleshooting and see what the difference in the result is.
    Thank you
    Bishal
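    If the goal is simply "every goods receipt in the date range, plus the AP invoice when one exists", another option is to drive from the GR side with LEFT OUTER JOINs, so the date filter cannot collapse the outer join. A sketch against the same tables (untested; DISTINCT is kept because the join is at line level):
    SELECT DISTINCT 'GR', D0.DocStatus, D0.DocNum, D0.DocDate, D0.DocDueDate, D0.DocTotal,
           'AP', I0.DocStatus, I0.DocNum, I0.DocDate, I0.DocDueDate, I0.DocTotal, I0.PaidToDate
    FROM OPDN D0
    INNER JOIN PDN1 D1 ON D0.DocEntry = D1.DocEntry
    LEFT OUTER JOIN PCH1 I1 ON I1.BaseType = 20
                           AND I1.BaseEntry = D1.DocEntry
                           AND I1.BaseLine = D1.LineNum
    LEFT OUTER JOIN OPCH I0 ON I0.DocEntry = I1.DocEntry
    WHERE D1.BaseType = 22        -- drop this line if receipts not based on a PO should appear too
      AND D0.DocDate >= '[%0]'
      AND D0.DocDate <= '[%1]'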

  • Can you check for data in one table or another but not both in one query?

    I have a situation where I need to link two tables together, but the data may be in another (archive) table, or different records may be in both, and I want the latest record from either table:
    ACCOUNT
    AccountID     Name   
    123               John Doe
    124               Jane Donaldson           
    125               Harold Douglas    
    MARKETER_ACCOUNT
    Key     AccountID     Marketer    StartDate     EndDate
    1001     123               10526          8/3/2008     9/27/2009
    1017     123               10987          9/28/2009     12/31/4712    (high date ~ which means currently with this marketer)
    1023     124               10541          12/03/2010     12/31/4712
    ARCHIVE
    Key     AccountID     Marketer    StartDate     EndDate
    1015     124               10526          8/3/2008     12/02/2010
    1033     125               10987         01/01/2011     01/31/2012  
    So my query needs to return the following:
    123     John Doe                        10526     8/3/2008     9/27/2009
    124     Jane Donaldson             10541     12/03/2010     12/31/4712     (this is the later of the two records for this account between archive and marketer_account tables)
    125     Harold Douglas               10987          01/01/2011     01/31/2012     (he is only in archive, so get this record)
    I'm unsure how to proceed in one query.  Note that I am reading in possibly multiple accounts at a time and returning a collection back to .net
    open CURSOR_ACCT
              select AccountID
              from
                     ACCOUNT A,
                     MARKETER_ACCOUNT M,
                     ARCHIVE R
               where A.AccountID = nvl((select max(M.EndDate) from Marketer_account M2
                                                    where M2.AccountID = A.AccountID),
                                                      (select max(R.EndDate) from Archive R2
                                                    where R2.AccountID = A.AccountID)
                   and upper(A.Name) like parameter || '%'
    <can you do a NVL like this?   probably not...   I want to be able to get the MAX record for that account off the MarketerACcount table OR the max record for that account off the Archive table, but not both>
    (parameter could be "DO", so I return all names starting with DO...)

    If I understand your description correctly, I would assume that for John Doe we would expect the second row from marketer_account ("high date ~ which means currently with this marketer"). Here is a solution with analytic functions:
    drop table account;
    drop table marketer_account;
    drop table marketer_account_archive;
    create table account (
        id number
      , name varchar2(20)
    );
    insert into account values (123, 'John Doe');
    insert into account values (124, 'Jane Donaldson');
    insert into account values (125, 'Harold Douglas');
    create table marketer_account (
        key number
      , AccountId number
      , MktKey number
      , FromDt date
      , ToDate date
    );
    insert into marketer_account values (1001, 123, 10526, to_date('03.08.2008', 'dd.mm.yyyy'), to_date('27.09.2009', 'dd.mm.yyyy'));
    insert into marketer_account values (1017, 123, 10987, to_date('28.09.2009', 'dd.mm.yyyy'), to_date('31.12.4712', 'dd.mm.yyyy'));
    insert into marketer_account values (1023, 124, 10541, to_date('03.12.2010', 'dd.mm.yyyy'), to_date('31.12.4712', 'dd.mm.yyyy'));
    create table marketer_account_archive (
        key number
      , AccountId number
      , MktKey number
      , FromDt date
      , ToDate date
    );
    insert into marketer_account_archive values (1015, 124, 10526, to_date('03.08.2008', 'dd.mm.yyyy'), to_date('02.12.2010', 'dd.mm.yyyy'));
    insert into marketer_account_archive values (1033, 125, 10987, to_date('01.01.2011', 'dd.mm.yyyy'), to_date('31.01.2012', 'dd.mm.yyyy'));
    select key, AccountId, MktKey, FromDt, ToDate
         , max(FromDt) over(partition by AccountId) max_FromDt
      from marketer_account
    union all
    select key, AccountId, MktKey, FromDt, ToDate
         , max(FromDt) over(partition by AccountId) max_FromDt
      from marketer_account_archive;
    with
    basedata as (
    select key, AccountId, MktKey, FromDt, ToDate
      from marketer_account
    union all
    select key, AccountId, MktKey, FromDt, ToDate
      from marketer_account_archive
    ),
    basedata_with_max_intervals as (
    select key, AccountId, MktKey, FromDt, ToDate
         , row_number() over(partition by AccountId order by FromDt desc) FromDt_Rank
      from basedata
    ),
    filtered_basedata as (
    select key, AccountId, MktKey, FromDt, ToDate from basedata_with_max_intervals where FromDt_Rank = 1
    )
    select a.id
         , a.name
         , b.MktKey
         , b.FromDt
         , b.ToDate
      from account a
      join filtered_basedata b
        on (a.id = b.AccountId);
    ID NAME                     MKTKEY FROMDT     TODATE
    123 John Doe                  10987 28.09.2009 31.12.4712
    124 Jane Donaldson            10541 03.12.2010 31.12.4712
    125 Harold Douglas            10987 01.01.2011 31.01.2012
    If your tables are big it could be necessary to do the filtering (according to your condition) in an early step (the first CTE).
    Regards
    Martin

  • Query help : Query to get values SYSDATE-1 18:00 hrs to SYSDATE 08:00 hrs

    Hi Team
    I want the SQL query to get the data for the following comparison:
    "Order Created" is a DATE column, and I want to find all the values from (SYSDATE-1) 18:00 hours to SYSDATE 08:00 hours,
    i.e.
    (SYSDATE-1) 18:00:00 < Order.Created < SYSDATE 08:00:00.
    Regards

    Hi, Rohit,
    942281 wrote:
    If i want the data in the below way i.e.
    from (SYSDATE-1) 18:00 hours to SYSDATE 17:59 hours ---> (SYSDATE-1) 18:00:00 < Order.Created < SYSDATE 07:59:00.
    If you want to include rows from exactly 18:00:00 yesterday (but no earlier), and exclude rows from exactly 08:00:00 today (or later), then use:
    WHERE   ord_dtl.submit_dt  >= TRUNC (SYSDATE) - (6 / 24)
    AND     ord_dtl.submit_dt  <  TRUNC (SYSDATE) + (8 / 24)
    So can i use the below format:
    ord_dtl.submit_dt BETWEEN trunc(sysdate)-(6/24) and trunc(sysdate)+(7.59/24)
    Please suggest.
    .59 hours is .59 * 60 * 60 = 2124 seconds (or .59 * 60 = 35.4 minutes), so the last time included in the range above is 07:35:24, not 07:59:59.
    If you really, really want to use BETWEEN (which includes both end points), then you could do it with date arithmetic:
    WHERE   ord_dtl.submit_dt  BETWEEN  TRUNC (SYSDATE) - (6 / 24)
                      AND         TRUNC (SYSDATE) + (8 / 24)
                                             - (1 / (24 * 60 * 60))
    but it would be simpler and less error-prone to use INTERVALs, as Karthick suggested earlier:
    WHERE   ord_dtl.submit_dt  BETWEEN  TRUNC (SYSDATE) - INTERVAL '6' HOUR
                      AND         TRUNC (SYSDATE) + INTERVAL '8' HOUR
                                             - INTERVAL '1' SECOND
    Edited by: Frank Kulash on Apr 17, 2013 9:36 AM
    Edited by: Frank Kulash on Apr 17, 2013 11:56 AM
    Changed "- (8 /24)" to "+ (8 /24)" in first code fragment (after Blushadown, below)

  • Query help, subtract two query parts

    Hi,
    I am a beginner in PL/SQL and have a problem I couldn't solve:
    Table (op_list):
    Item     -     Amount -     Status
    Item1     -     10     -     in
    Item2     -     12     -     in
    Item3     -     7     -     in
    Item1     -     2     -     out
    Item2     -     3     -     out
    Item1     -     1     -     dmg
    Item3     -     3     -     out
    Item1     -     2     -     out
    Item2     -     5     -     out
    Item2     -     2     -     in
    Item3     -     1     -     exp
    I would like the query to return this result (subtract the amounts of 'out'/'dmg'/'exp' from 'in'):
    Item - Amount left
    Item1     -     5
    Item2     -     6
    Item3 -     3
    I wrote code that returns the sum of all incoming items and the sum of all out/dmg/exp items, but I couldn't work out how to subtract one part of the query from the other. Or maybe there is a better way. I am also worried about what happens if an item has only 'in' rows and no 'out'/'dmg'/'exp' rows.
    select item.name, sum(op_list.item_amount)
    from op_list
    inner join item
    on op_list.item = item.item_id
    where op_list.status = 'in'
    group by item.name
    union
    select item.name, sum(op_list.item_amount)
    from op_list
    inner join item
    on op_list.item = item.item_id
    where op_list.status = 'out'
    or op_list.status = 'dmg'
    or op_list.status = 'exp'
    group by item.name
    Return:
    Item1     -     10      [10 in]
    Item1     -     5     [2+1+2]
    Item2     -     14     [12+2]
    Item3     -     7
    Item3     -     4     [3+1]
    Thanks in advance

    Hi,
    We can also use simple inline views to get what we need.
    select a.item,a.amount-b.amount Balance from
    (select item,sum(amount) Amount from op_list
    where status = 'in'
    group by item) a,
    (select item,sum(amount) Amount from op_list
    where status in ('out','dmg','exp')
    group by item) b
    where
    a.item=b.item
    order by item;
    ITEM       BALANCE
    Item1            5
    Item2            6
    Item3            3
    Regards,
    Prazy
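    Another option, in case it helps: a single pass with conditional aggregation avoids joining the two subqueries, and it also keeps items that only have 'in' rows (the inline-view join above would drop those). A sketch, assuming the only statuses are 'in', 'out', 'dmg' and 'exp':
    select item.name,
           sum(case when op_list.status = 'in'
                    then op_list.item_amount
                    else -op_list.item_amount
               end) as amount_left
    from op_list
    inner join item
            on op_list.item = item.item_id
    group by item.name
    order by item.name;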

  • Query help: query to return column that represents multiple rows

    I have a table with a name and location column. The same name can occur multiple times with any arbitrary location, i.e. duplicates are allowed.
    I need a query to find all names that occur in both of two separate locations.
    For example,
    bob usa
    bob mexico
    dot mexico
    dot europe
    hal usa
    hal europe
    sal usa
    sal mexico
    The query in question, if given the locations usa and mexico, would return bob and sal.
    Thanks for any help or advice,
    -=beeky

    How about this?
    SELECT  NAME
    FROM    <LOCATIONS_TABLE>
    WHERE   LOCATION IN ('usa','mexico')
    GROUP BY NAME
    HAVING COUNT(DISTINCT LOCATION) >= 2
    Results:
    SQL> WITH person_locations AS
      2  (
      3          SELECT 'bob' AS NAME, 'USA' AS LOCATION FROM DUAL UNION ALL
      4          SELECT 'bob' AS NAME, 'Mexico' AS LOCATION FROM DUAL UNION ALL
      5          SELECT 'dot' AS NAME, 'Mexico' AS LOCATION FROM DUAL UNION ALL
      6          SELECT 'dot' AS NAME, 'Europe' AS LOCATION FROM DUAL UNION ALL
      7          SELECT 'hal' AS NAME, 'USA' AS LOCATION FROM DUAL UNION ALL
      8          SELECT 'hal' AS NAME, 'Europe' AS LOCATION FROM DUAL UNION ALL
      9          SELECT 'sal' AS NAME, 'USA' AS LOCATION FROM DUAL UNION ALL
    10          SELECT 'sal' AS NAME, 'Mexico' AS LOCATION FROM DUAL
    11  )
    12  SELECT  NAME
    13  FROM    person_locations
    14  WHERE   LOCATION IN ('USA','Mexico')
    15  GROUP BY NAME
    16  HAVING COUNT(DISTINCT LOCATION) >= 2
    17  /
    NAM
    bob
    sal
    HTH!
    Edited by: Centinul on Oct 15, 2009 2:25 PM
    Added sample results.

  • QUERY HELP!!! trying to create a query

    I'm creating a summary report.
    I have a table with sale dates.
    For example, I have a table tab_1 and a column saleDate as follows:
    saleDat
    1923
    1936
    1945
    2003
    2005
    saleDate contains years, and there are some missing years where no sale
    was made.
    My report has to display years starting from the earliest year,
    so I have to create a query that starts with 1923,
    but the problem is that I have to include years that are not in the table.
    For example, I have to display the year 1924, which is not in the table,
    so that part of the report has to look like:
    1923 blah blah summary.........
    1924 "
    1925
    1926
    2005
    2006
    up to the current year (2006 may not be in the table, but I have to display it).
    I just need to know the query that can return all the years starting from
    the earliest saleDate up to the current year.
    Thanks in advance

    Please write the query in the following form:
    SELECT a.year  -- plus the other columns from your table
    FROM (SELECT (:start_num + rownum) year
    FROM all_tab_columns
    WHERE :start_num + rownum <= :end_num) a,
    tab_1 b
    WHERE a.year = b.saleDat(+);
    Note:
    1) If your start year and end year are 1923 and 2006, then input as below:
    :start_num = 1922
    :end_num = 2006
    2) Since some of the years (1924, etc.) may not be there in your table, you may need to use NVL to print proper indicators.
    3) If you have more than one record in tab_1 for a particular year, then group them by year first and then use it.
    Hope this helps.
    - Saumen.
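    On Oracle you can also generate the year list without depending on ALL_TAB_COLUMNS having enough rows, using CONNECT BY LEVEL against DUAL. A sketch, assuming saleDat holds the year as a number and the report should run from 1923 up to the current year:
    SELECT y.year, b.saleDat   -- add the other summary columns from tab_1 here
    FROM  (SELECT 1922 + LEVEL AS year
           FROM   dual
           CONNECT BY 1922 + LEVEL <= EXTRACT(YEAR FROM SYSDATE)) y
    LEFT OUTER JOIN tab_1 b
           ON b.saleDat = y.year
    ORDER BY y.year;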

  • IF statement in Query

    Hi
    I have a query/recordset that will be looking at 12,000 rows in a database and 10 different variables and potential filters chosen by end users.
    Should I put 10 wildcard / URL WHERE conditions in my recordset query, or should I put IF statements in my query?
    i.e.
    If URL colname exists then ", and BetType = 'xzz'".
    Is that the most efficient way to run my queries?
    I will be running about 10 recordsets on my page, all looking at these URL variables, so it will be a busy page.
    If that is the answer, can someone please tell me how to insert the IF statement - I've tried all sorts but it won't work.
    $colname_Recordset4 = "%";
    if (isset($_GET['colname'])) {
      $colname_Recordset4 = $_GET['colname'];
    }
    mysql_select_db($database_racing_analysis, $racing_analysis);
    $query_Recordset4 = sprintf("SELECT BetType, sum(if(season='2006-2007', Bet, 0)) AS '2006-2007',  sum(if(season='2007-2008', Bet, 0)) AS '2007-2008',  sum(if(season='2008-2009', Bet, 0)) AS '2008-2009' FROM dataextract WHERE BetType Like %s and TrackID = 1 and Distance = 1000 and Class = 1 GROUP BY BetType", GetSQLValueString($colname_Recordset4, "text"));
    $Recordset4 = mysql_query($query_Recordset4, $racing_analysis) or die(mysql_error());
    $row_Recordset4 = mysql_fetch_assoc($Recordset4);
    $totalRows_Recordset4 = mysql_num_rows($Recordset4);
    Hope someone can help.
    Simon

    That part of the query cross-tabs my data into three columns - it was not intended to confuse the issue. I'd have the same problem with a basic query.
    So here is a very basic version:
    I want an IF statement to go around "WHERE Distance = 1000" and "AND Class = 1".
    That, I believe, will reduce the load on the MySQL server, as it wouldn't be running a bunch of wildcard queries and would only apply each filter if the corresponding URL parameter is delivered.
    I just don't get how to put the IF statements in the PHP.
    thanks
    $maxRows_Recordset3 = 5;
    $pageNum_Recordset3 = 0;
    if (isset($_GET['pageNum_Recordset3'])) {
      $pageNum_Recordset3 = $_GET['pageNum_Recordset3'];
    }
    $startRow_Recordset3 = $pageNum_Recordset3 * $maxRows_Recordset3;
    mysql_select_db($database_racing_analysis, $racing_analysis);
    $query_Recordset3 = "SELECT * FROM dataextract WHERE Distance = 1000 AND Class = 1";
    $query_limit_Recordset3 = sprintf("%s LIMIT %d, %d", $query_Recordset3, $startRow_Recordset3, $maxRows_Recordset3);
    $Recordset3 = mysql_query($query_limit_Recordset3, $racing_analysis) or die(mysql_error());
    $row_Recordset3 = mysql_fetch_assoc($Recordset3);
    if (isset($_GET['totalRows_Recordset3'])) {
      $totalRows_Recordset3 = $_GET['totalRows_Recordset3'];
    } else {
      $all_Recordset3 = mysql_query($query_Recordset3);
      $totalRows_Recordset3 = mysql_num_rows($all_Recordset3);
    }
    $totalPages_Recordset3 = ceil($totalRows_Recordset3/$maxRows_Recordset3)-1;
