Filtering data using dates and SQL

Post Author: Ivanbennett
CA Forum: Data Connectivity and SQL
Hi all
been struggling with this one all morning, could do with a little help
I am using CR XI rel 2
I have 3 tables
Table one - AUDIT_LOG: PositionId, DateTime, StatCode
Table two - POSITION: PositionId, SiteID
Table three - SITE: SiteID, Name, Town, PostCode
I would like to have a user type in a start date and an end date, and then the report will return records from the POSITION table where the PositionId does not appear in the AUDIT_LOG for that period. I can establish who has no entry across the entire table, but I now want a snapshot for a period typed in by the user.
This is the SQL used when I added a command from the Database Expert:
SELECT DISTINCT POSITION.ID, POSITION.SITE_ID, SITE.NAME, AUDIT_LOG.DATETIME
FROM SITE INNER JOIN (POSITION LEFT OUTER JOIN AUDIT_LOG ON POSITION.ID = AUDIT_LOG.POSITION_ID) ON SITE.ID = POSITION.SITE_ID
WHERE AUDIT_LOG.POSITION_ID IS NULL
Current Output (10/09/2007):

ID     SITE_ID      NAME
4      AB120002     Andy Arms
103    AB120002     Andy Arms
3      AB120002     Andy Arms
104    AB120002     Andy Arms
2      120001       Charter Court
101    120001       Charter Court
102    120001       Charter Court
60     129999       Charter Court Test Site
7      200005       Forte Jester
48     123456789    here
Any help appreciated

Post Author: foghat
CA Forum: Data Connectivity and SQL
you need to create 2 command parameters: start_date_from and start_date_to, then add to your where clause:
and datetime >= {?start_date_from}
and datetime <= {?start_date_to}
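Putting those parameters together with the original command gives something like the sketch below. One caveat: because the query is an anti-join (the WHERE ... IS NULL test), the date window probably needs to sit in the JOIN condition rather than the WHERE clause, otherwise the unmatched rows (whose DATETIME is null) would be filtered out too. This is only a sketch against the table and column names quoted above:

SELECT DISTINCT POSITION.ID, POSITION.SITE_ID, SITE.NAME
FROM SITE
INNER JOIN POSITION ON SITE.ID = POSITION.SITE_ID
LEFT OUTER JOIN AUDIT_LOG
    ON POSITION.ID = AUDIT_LOG.POSITION_ID
    -- date window inside the join so positions with no entries in range still survive
    AND AUDIT_LOG.DATETIME >= {?start_date_from}
    AND AUDIT_LOG.DATETIME <= {?start_date_to}
WHERE AUDIT_LOG.POSITION_ID IS NULL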

Similar Messages

  • Error message when importing data using Import and export wizard

    Getting the below error message when importing data using the Import and Export Wizard:
    Error 0xc0202009: Data Flow Task 1: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
    Messages
    Error 0xc0202009: Data Flow Task 1: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80004005  Description: "Could not allocate a new page for database REPORTING' because of insufficient disk space in filegroup 'PRIMARY'.
    Create the necessary space by dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.".
    (SQL Server Import and Export Wizard)
    Error 0xc0209029: Data Flow Task 1: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "Destination - Buyer_.Inputs[Destination Input]" failed because error code 0xC020907B occurred, and the error row disposition on "Destination
    - Buyer_First_Qtr.Inputs[Destination Input]" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
    (SQL Server Import and Export Wizard)
    Error 0xc0047022: Data Flow Task 1: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "Destination - Buyer" (28) failed with error code 0xC0209029 while processing input "Destination Input" (41). The
    identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information
    about the failure.
    (SQL Server Import and Export Wizard)
    Error 0xc02020c4: Data Flow Task 1: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
    (SQL Server Import and Export Wizard)
    Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on Source - Buyer_First_Qtr returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine called PrimeOutput().
    The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
    (SQL Server Import and Export Wizard)
    Smash126

    Hi Smash126,
    Based on the error message "Could not allocate a new page for database 'REPORTING' because of insufficient disk space in filegroup 'PRIMARY'. Create the necessary space by dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup", we can see that the issue is caused by insufficient disk space in filegroup 'PRIMARY' for the 'REPORTING' database.
    To fix this issue, we can add additional files to the filegroup by adding a new file to the PRIMARY filegroup on the Files page, or set Autogrowth on for existing files in the filegroup to provide the necessary space.
    The following document about Add Data or Log Files to a Database is for your reference:
    http://msdn.microsoft.com/en-us/library/ms189253.aspx
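    For reference, a minimal T-SQL sketch of both options; the logical file names and path below are examples and would need to match your server:

    -- Option 1: add a new data file to the PRIMARY filegroup (path and name are examples)
    ALTER DATABASE REPORTING
    ADD FILE (
        NAME = Reporting_Data2,
        FILENAME = 'D:\SQLData\Reporting_Data2.ndf',
        SIZE = 1024MB,
        FILEGROWTH = 256MB
    ) TO FILEGROUP [PRIMARY];

    -- Option 2: turn autogrowth on for an existing file (logical name is an example)
    ALTER DATABASE REPORTING
    MODIFY FILE (NAME = Reporting_Data, FILEGROWTH = 256MB);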
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Upload XML data using XSQL and HTTP Post ?

    Upload XML data using XSQL and HTTP POST: is that possible?
    An xsql contains an <xsql:insert-request table="aTable">
    The XML data file follows the ROWSET/ROW paradigm.
    What is the HTML form to upload the xml file to the XSQL ?
    I tried:
    <form action="myXSQL.xsql" method="POST" ENCTYPE="multipart/form-data">
    XML data file to upload: <input type="file">
    <input type="submit">
    </form>
    But the answer of myXSQL is:
    <xsql-status action="xsql:insert-request" result="No posted document to process" />
    Where is the problem ?
    Thank you.

    Hello,
    You are posting your XML file as a parameter therefore you should use the <xsql:insert-params/> tag, not the <xsql:insert-request/>. The insert-request can only handle data not posted via a parameter.
    Usage:
    <form action="myXSQL.xsql" method="GET" ENCTYPE="multipart/form-data">
    XML data file to upload: <input type="file" name="myXML">
    <input type="submit">
    </form>
    in combination with
    <xsql>
    <xsql:insert-params name="myXML" table="your table"/>
    </xsql>
    Two remarks:
    I was not able to successfully POST the form; the answer was <xsql-status action="xsql:insert-request" result="No posted document to process" />. With GET it was successful.
    Second, if you use MS Internet Explorer 5 or higher, you could post the XML directly (not as a parameter) using an ActiveX object.
    Regards,
    Harm Verschuren

  • HT5594 Why is cellular data used even though I have turned off all Location Services?

    Why, in spite of turning off all Location Services, is data used again when I turn on cellular data?
    In Cellular >> System Services >> Mapping Services:
    how can I turn off Mapping Services completely?
    And why does it use data until I turn it off?

    Go to Settings > Cellular > Use Cellular Data For, and turn Maps off. If you are not using Maps for navigation, it's not using data, though.

  • Error while updating data using session and call transaction method

    Hi all,
    I have to update data using the MM01 transaction from a flat file to the database. I have used both the session method and the call transaction method to do that. In both methods the data is transferred from the internal tables to the screens, but while updating the data, that is, on clicking the ok-code at the end of the transaction, I get a dialogue box stating:
    SAP EXPRESS DOCUMENT "UPDATE WAS TERMINATED" RECEIVED FROM AUTHOR "SAP".
    Please tell me where the problem lies and the solution for it.
    Thanks and regards.

    hi,
    Check your recording; check whether you saved your material number in the recording or not.
    Record the transaction MM01 once again.
           MATNR LIKE RMMG1-MATNR,
           MBRSH LIKE RMMG1-MBRSH,
           MTART LIKE RMMG1-MTART,
           MAKTX LIKE MAKT-MAKTX,
           MEINS LIKE MARA-MEINS,
           MATKL LIKE MARA-MATKL,
           BISMT LIKE MARA-BISMT,
           EXTWG LIKE MARA-EXTWG,
    These are the fields which you have to take in the internal table.
    This is the record which I took in my flat file. Use FILETYPE as 'ASC' and HASFIELDSEPARATOR as 'X'.
    SUDHU-6     R     ROH     MATSUDHU     "     001     7890     AA
    I did the same, but I didn't get any error.

  • Problem in Creating new row & inserting data using CreateInsert and Commit

    Hello All,
    I have created a page with a few input text fields and I want to insert the data into a database table. I have created an Application Module and I am using the CreateInsert and Commit operations, but there is one problem.
    At first it created a row in the database; after that it is not creating new rows, instead it is updating the same row with the new values.
    In the bindings of my jspx page I have created two bindings for actions: (1) CreateInsert for the VO of that Application Module, (2) the Commit operation of that Application Module.
    Here is the code snippet of my application:
    BindingContainer bindings = getBindings();
    OperationBinding operationBinding = bindings.getOperationBinding("CreateInsert");
    Object result = operationBinding.execute();
    if (!operationBinding.getErrors().isEmpty()) {
        return null;
    }
    OperationBinding operationBinding1 = bindings.getOperationBinding("Commit");
    Object result1 = operationBinding1.execute();
    if (!operationBinding1.getErrors().isEmpty()) {
        return null;
    }
    I have tried Execute+Commit and Insert+Commit as well; in every case it updates the same row and does not insert a new row.
    Is there anything I am missing?
    Please help.

    hi user,
    I don't know why you are trying this with code; ADF provides zero lines of code, with wonderful drag-and-drop functionality provided by the framework.
    When you double-click the button, code like the following is registered in your bean:
        public String cb6_action() {
            BindingContainer bindings = getBindings();
            OperationBinding operationBinding = bindings.getOperationBinding("CreateInsert");
            Object result = operationBinding.execute();
            if (!operationBinding.getErrors().isEmpty()) {
                return null;
            }
            return null;
        }
        public String cb8_action() {
            BindingContainer bindings = getBindings();
            OperationBinding operationBinding = bindings.getOperationBinding("Commit");
            Object result = operationBinding.execute();
            if (!operationBinding.getErrors().isEmpty()) {
                return null;
            }
            return null;
        }
        public String cb7_action() {
            BindingContainer bindings = getBindings();
            OperationBinding operationBinding = bindings.getOperationBinding("Delete");
            Object result = operationBinding.execute();
            if (!operationBinding.getErrors().isEmpty()) {
                return null;
            }
            return null;
        }
        public String cb14_action() {
            BindingContainer bindings = getBindings();
            OperationBinding operationBinding =
                bindings.getOperationBinding("Delete4");   // slightly different here: after deleting you usually commit
            OperationBinding operationBinding1 =
                bindings.getOperationBinding("Commit");    // so here the commit operation
            Object result = operationBinding.execute();
            Object result1 = operationBinding1.execute();
            if (!operationBinding.getErrors().isEmpty()) {
                return null;
            }
            if (!operationBinding1.getErrors().isEmpty()) {
                // add error handling here
                return null;
            }
            return null;
        }
    If I have not understood correctly, please give some more explanation.

  • Not able to Import data using "clear and replace"

    Hi,
    If I import data using the data admin package "Import" with "Merge" as the 'method for importing', the process runs without problems.
    If I change the 'method for importing' to "Clear and Replace", the process fails. See message:
    TOTAL STEPS  2
    1. Convert Data:         completed  in 3 sec.
    2. Load and Process:     Failed  in 1 sec.
    3. Import:               completed  in 1 sec.
    [Selection]
    FILE=\UHRENHOLT\LEGAL_DATALOAD\DataManager\DataFiles\\Axapta_Load.txt
    TRANSFORMATION=\UHRENHOLT\LEGAL_DATALOAD\DataManager\TransformationFiles\\Axapta_Load.xls
    CLEARDATA= Yes
    RUNLOGIC= Yes
    CHECKLCK= Yes
    [Messages]
    Key cannot be null.
    Parameter name: key
    I'm using the standard data admin package (and thereby the values 0 and 1). For some reason the value 1 is not accepted.
    Any suggestions?
    /Lars

    Hi,
    The "Replace & clear..." feature during data import depends on Work Status. So to use this functionality, you need to setup Work Status under your application. Notice that you need to setup Work Status even if you aren't selecting the option to check Work Status when running the package.
    Hope this will help you.
    Kind Regards,
    Patrick

  • How to get EKBE-BUDAT (GR Date) using data of BSEG

    Hi,
    My requirement is to get the GR date from EKBE, which is in the field BUDAT.
    My report already has BSEG data; using that I want to get EKBE-BUDAT.
    One of the functional people suggested this:
    SELECT lfbnr
           lfpos
           lfgja
      FROM ekbe
      INTO TABLE it_ekbe_temp
      WHERE ebeln = bseg-ebeln
        AND ebelp = bseg-ebelp
        AND belnr = bseg-belnr
        AND buzei = bseg-buzei.
    Once we get these 3 fields, query EKBE again to get the GR date BUDAT:
    SELECT a~ebeln
           a~ebelp
           a~budat
           a~lfbnr
           a~lfpos
           a~lfgja
      INTO TABLE it_ekbe
      FROM ekbe AS a
      INNER JOIN bseg AS b
         ON b~ebeln = a~ebeln
        AND b~ebelp = a~ebelp
      FOR ALL ENTRIES IN it_ekbe_temp
      WHERE a~gjahr = it_ekbe_temp-lfgja
        AND a~belnr = it_ekbe_temp-lfbnr
        AND a~buzei = it_ekbe_temp-lfpos.
    Can anyone suggest me how to get the GR date from EKBE using BSEG data.

    Hi Mayank,
    You can get there by hitting the MSEG table first to get the required key information, and then hitting EKBE to get BUDAT.
    Pass EBELN and EBELP to MSEG and get the key info. ...
    Hope this helps.
    Thanks,
    Amresh

  • Issue with importing data using data pump

    Hi Guys,
    Need your expertise here. I have just exported a table using the following Data Pump command. Please note that I used %U to split the export into chunks of 2G files.
    expdp "'/ as sysdba'" dumpfile=DP_TABLES:PT_CONTROL_PS_083110_%U.dmp logfile=DP_TABLES:PT_CONTROL_PS_083110.log tables=(PT_CONTROL.pipeline_session) filesize=2G job_name=pt_ps_0831_1
    The above command produced the following files
    -rw-r----- 1 oracle oinstall 2.0G Aug 31 15:04 PT_CONTROL_PS_083110_01.dmp
    -rw-r----- 1 oracle oinstall 2.0G Aug 31 15:05 PT_CONTROL_PS_083110_02.dmp
    -rw-r----- 1 oracle oinstall 2.0G Aug 31 15:06 PT_CONTROL_PS_083110_03.dmp
    -rw-r----- 1 oracle oinstall 394M Aug 31 15:06 PT_CONTROL_PS_083110_04.dmp
    -rw-r--r-- 1 oracle oinstall 2.1K Aug 31 15:06 PT_CONTROL_PS_083110.log
    So far things are good.
    Now when I import the data using the command below, it truncates the table but does not import any data. The last line says: Job "SYS"."PT_PS_IMP_0831_1" completed with 1 error(s) at 15:14:57.
    impdp "'/ as sysdba'" dumpfile=DP_TABLES:PT_CONTROL_PS_083110_%U.dmp logfile=DP_TABLES:PT_CONTROL_PS_083110_IMP.log Tables=(PT_CONTROL.pipeline_session) TABLE_EXISTS_ACTION=Truncate job_name=PT_ps_imp_0831_1
    Import: Release 10.2.0.3.0 - Production on Tuesday, 31 August, 2010 15:14:53
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Release 10.2.0.3.0 - Production
    Master table "SYS"."AT_PS_IMP_0831_1" successfully loaded/unloaded
    Starting "SYS"."AT_PS_IMP_0831_1": '/******** AS SYSDBA' dumpfile=DP_TABLES:PT_CONTROL_PS_083110_*%U*.dmp logfile=DP_TABLES:PT_CONTROL_PS_083110_IMP.log Tables=(PT_CONTROL.pipeline_session) TABLE_EXISTS_ACTION=Truncate job_name=AT_ps_imp_0831_1
    Processing object type TABLE_EXPORT/TABLE/TABLE
    ORA-39153: Table "PT_CONTROL"."PIPELINE_SESSION" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/TRIGGER
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "SYS"."AT_PS_IMP_0831_1" completed with 1 error(s) at 15:14:57
    I suspect that it has something to do with the %U in the impdp command. Has anyone encountered this kind of situation before? What should my import command be? I just want to confirm I am using the right import command.
    Thanks
    --MM
    Edited by: UserMM on Aug 31, 2010 3:11 PM

    I also looked into the alert log but didn't find anything about the error there. Any opinion?
    --MM

  • Using TOAD and SQL Developer to compare db objects in schemas in databases

    Hi All,
    My primary objective is to compare objects in schemas in two different databases, find the differences, execute DDLs in the database where objects are missing, and keep the schemas in the two databases in sync.
    So I need to compare schemas across databases. Which tool would be the most helpful and user-friendly for comparing the database objects existing in schemas in two different databases?
    I'd also like a list of pros and cons between Toad and SQL Developer for comparing schemas.
    Could you please also help me with the navigation in SQL Developer to compare schemas:
    Connect to Source
    Connect to Target
    Compare schemas with different object types
    Find out differences
    Generate DDL's for the missing objects or for the objects in difference report
    Run them in the missing instance (Source/Target)
    Make sure both are in sync.
    Thanks All

    Hi,
    Most DBA-type tools have this kind of functionality. Personally I use plsqldeveloper, but it is very similar to most other tools on the market. SQL Developer seems to rely on you having the Change Management Pack licence, which you likely don't have and which would be quite a large cost to roll out.
    I'd compare plsqldeveloper/toad/sqlnavigator etc., and maybe the tools from Redgate, which look pretty good, though I haven't used them.
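    If a quick first pass without any schema-compare tool is acceptable, the data dictionary can flag missing objects directly; a minimal sketch, assuming a database link named target_db and a schema called MYSCHEMA (both hypothetical):

    -- objects present in the local schema but missing from the remote one
    SELECT object_type, object_name
      FROM all_objects
     WHERE owner = 'MYSCHEMA'
    MINUS
    SELECT object_type, object_name
      FROM all_objects@target_db
     WHERE owner = 'MYSCHEMA';

    This only reports name-level differences; generating the DDL for the missing objects would still need DBMS_METADATA.GET_DDL or one of the tools above.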
    Regards,
    Harry

  • Import data using Excel and Oracle SQL Developer - I need some help

    Dear friends,
    I'm using Oracle SQL Developer 1.5.1 and I need to import an Excel file into a database table. If I try to import the file in XLS format, the fields and headers are correctly shown and separated, but when I press the "Next" button it simply doesn't do anything; it just stays stopped.
    I did some searching in this forum and it seems that SQL Developer has bugs when you try to import a file in XLS format. Is this correct?
    If I save the same file in CSV format and try to import it, it goes ahead, but SQL Developer does not separate the fields and headers correctly. It combines all the CSV fields into one, let's say Column0, with fields like 1;TEST;01/01/2000 below the same Column0.
    Saving the file in CSV format is not a problem and takes very little time, but I don't know how to make SQL Developer import it correctly. Could somebody please help me? Thanks in advance.
    Best regards,
    Franklin

    Hello K,
    yes, you're right. I found the following topic after posting this question here.
    Re: After update to 1.5.1, import from Excel fails
    I downloaded version 1.5.0 and I'll use it whenever I import data from Excel.
    Thanks,
    Franklin

  • Return the end of week date using month and week number

    Hi all,
    I am trying to find the date of the end of the week using a given month (201401, 201402... i.e. 2014 is the year and 01, 02 are the months) and a week number (01, 02, 03...) for the year. For example, if my month is 201402 and week_number is 06, then it should return the week date as 2014-02-08. Can you please help me write a SQL statement for this scenario?
    Thanks in advance,
    Nikhil

    The current month is irrelevant:
    with dt as (
       select dateadd(day, (6-1)*7, '2014-01-01') as xwk
    )
    select dateadd(day, 7 - datepart(weekday, dt.xwk), xwk)
    from dt;
    Change the "6" in the WITH statement to the week of interest and 2014 to the year of interest...

  • How to cleanse the Arabic-General and Address data using Data Services 3.1

    I am working on a UAE project (SAP customer & vendor master) data migration. The main address and customer tables are built in English and Arabic equally.
    I am able to read the Arabic data, but I have no clue how to cleanse or modify that data.
    Is it possible to handle Arabic data in BusinessObjects Data Services XI 3.1?
    Is it possible to use the EMEA address directories to cleanse or standardize the Arabic data?
    Please help me out.
    Thanks in advance.

    Dear All,
    Does anyone have any input on the question above? Please advise.
    Vamshi, I am also looking for advice on your questions about the Arabic versions.
    Best Regards

  • Not able to fetch updated data using jdbc and oracle 10g

    Whenever I update the data and then fetch the same record, I am not able to get the fresh, updated data; the old record is fetched every time. But when I check in the database, the record has been updated successfully. Even if I fire the query a second time 10 seconds later using Thread.sleep, the problem persists.
    Please help me out!
    The implementation is stuck! :(

    "Well its ok"
    What is OK? Your stuff working now?
    "i m doing the same thing"
    Apparently not.
    "please go thru the below code fragment:
    code for Update:
    DataAccessBean1 partsHistoryDataAccessBean = new DataAccessBean1(context);
    partsHistoryDataAccessBean.setData(data);
    partsHistoryDataAccessBean.update();
    After Updating i m forwading it to the other servlet"
    Forwarding what? Why do you need another servlet? Just do the query, put the new data into the response object, and return.
    "According to your assumption i should get the updated data on the other screen but"
    Nope, you don't understand what I'm saying.

  • When I acquire data using USB6008 and NI_DAQmx Base in VC++, Shall I start and stop the task in ONTimer function?

    void Ctest2Dlg::OnTimer(UINT nIDEvent)
    {
        DAQmxErrChk(DAQmxBaseStartTask(taskHandle));
        DAQmxErrChk(DAQmxBaseReadAnalogF64(taskHandle, pointsToRead, timeout, DAQmx_Val_GroupByScanNumber, data, 100, &pointsRead, NULL));
        DAQmxBaseStopTask(taskHandle);
    }
    If I do this, the program halts after about half a minute. But if I do not start and stop the task in the OnTimer function, the program does not halt, yet the data acquired is not correct.
    Please help me! Thanks

    This is a duplicate post.  Please post any new replies to the linked forum.
    Samantha
    National Instruments
    Applications Engineer
