Select data from BKPF for a date range

Hi,
I want to select data from BKPF depending upon the date range given in the selection screen.
SELECTION-SCREEN : BEGIN OF BLOCK B1 WITH FRAME TITLE TEXT-001.
" Selection Criteria
SELECT-OPTIONS : S_DATE FOR SY-DATUM OBLIGATORY.
SELECTION-SCREEN : END OF BLOCK B1.
How do I do that? Please help.
Thank You,
SB.

Hi SB,
     Declare the select-option on the BKPF date field and use it in the WHERE clause (a secondary index on that date field helps the selection), i.e.:
TABLES : BKPF.
SELECTION-SCREEN : BEGIN OF BLOCK B1 WITH FRAME TITLE TEXT-001.
" Selection Criteria
SELECT-OPTIONS : S_DATE FOR BKPF-BLDAT OBLIGATORY.
SELECTION-SCREEN : END OF BLOCK B1.
    SELECT * FROM BKPF WHERE BLDAT IN s_date.
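If you need the result in an internal table afterwards, here is a minimal sketch along the same lines (the field list and the names ty_bkpf/lt_bkpf are only illustrative; use BUDAT instead of BLDAT if you want to select by posting date rather than document date):

TYPES: BEGIN OF ty_bkpf,
         bukrs TYPE bkpf-bukrs,   " company code
         belnr TYPE bkpf-belnr,   " document number
         gjahr TYPE bkpf-gjahr,   " fiscal year
         bldat TYPE bkpf-bldat,   " document date
       END OF ty_bkpf.
DATA: lt_bkpf TYPE STANDARD TABLE OF ty_bkpf.

START-OF-SELECTION.
  " Read only the needed fields for the documents in the requested date range
  SELECT bukrs belnr gjahr bldat
    FROM bkpf
    INTO TABLE lt_bkpf
    WHERE bldat IN s_date.
  IF sy-subrc <> 0.
    MESSAGE 'No documents found for the given date range' TYPE 'S'.
  ENDIF.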
Regards,
Santosh P

Similar Messages

  • How to display the date from the calendar for more than one date-format item

    I have one form which contains many date-format items. I want to show the calendar and take the date from the calendar.
    I have successfully shown the calendar and taken the date from the calendar for one date item.
    My problem is that I want to do the same for all date-format items in the form.
    Can anyone help me?
    Thanks

    Maybe you could provide more information.
    What calendar? Which item works? Which others don't?
    Is the issue in the different format masks?
    Please elaborate more on your situation.
    Francois

    It is not an issue of different format masks.
    I created a block named jbeanblock. In this block the item is bean, and bean has one trigger, WHEN-CUSTOM-ITEM-EVENT.
    The code of this trigger is:
    DECLARE
      eventName      varchar2(30) := :system.custom_item_event;
      eventValues    ParamList;
      eventValueType number;
      LC$Date        varchar2(256);
      LC$Day         varchar2(256);
      LC$Month       varchar2(256);
      LC$Year        varchar2(256);
      v_date         date;
    BEGIN
      IF (eventName = 'CALENDAR_EVENT') THEN
        eventValues := get_parameter_list(:system.custom_item_event_parameters);
        get_parameter_attr(eventValues, 'CALENDAR_EVENT_DATE',  eventValueType, LC$Date);
        get_parameter_attr(eventValues, 'CALENDAR_EVENT_DAY',   eventValueType, LC$Day);
        get_parameter_attr(eventValues, 'CALENDAR_EVENT_MONTH', eventValueType, LC$Month);
        get_parameter_attr(eventValues, 'CALENDAR_EVENT_YEAR',  eventValueType, LC$Year);
        Clear_Message;
        select to_date(LC$Day || '/' || LC$Month || '/' || LC$Year, 'dd/mm/yyyy') into v_date from dual;
        :ds_employee.hiredate := v_date;
        synchronize;
      END IF;
    END;
    hiredate is an item of the ds_employee block. hiredate has the trigger WHEN-NEW-ITEM-INSTANCE, whose code is:
    Set_Custom_Property('JBEANBLOK.BEAN', 1, 'SHOW_CALENDAR', '50,50');
    ds_employee is my table name, and it has two date items: one is hiredate and the other is date_of_birth.
    For hiredate it is working. If I want to do the same thing for date_of_birth, how can I do that?

  • Gurus...Need help....extract data from BKPF header table and BSEG line item

    Gurus,
    I have to write the logic to fetch data from BKPF and BSEG. I need help on how I can do that.
    I have to get BUKRS, BELNR, GJAHR and LDGRP from BKPF for a given date and company code. For all these documents I then have to get the line items from BSEG if the LDGRP is I1 or SPACE.
    If the LDGRP is not I1 or SPACE, then I have to fetch the records from BSEG_ADD and then generate an ALV report with all the data, including the data that was fetched from BKPF.
    So it will be a combined ALV report that displays header as well as line item data together.
    Can you please help me with the code? I am not sure how everything can go together in one internal table, because only once it is in one table can an ALV list be generated.
    Cheers:
    Sam

    Hi Sam, this may be something similar.
    Use this program. I got it from a source, and we added a small conditional check which looks up the document numbers from BSEG in BKPF and checks whether the BSEG output falls within the posting date range specified in the initial selection.
    Just so you know, the output is kind of messed up, so you will have to play with it in Excel to extract the document numbers, if that is what you want.
    ============================
    * PROGRAM....... ZFI_BSEG_DOWNLOAD
    * TITLE......... Download BSEG
    * PROGRAM TYPE.. Download
    *======================================================================
    * GENERAL DOCUMENTATION AND COMMENTS
    * <...>
    *======================================================================
    * ASSOCIATED PROGRAMS
    * <Program>..... <Description>
    *======================================================================
    * CHANGE HISTORY
    * Date       By         Ticket     Description
    REPORT zfi_bseg_download.
    TABLES: bseg, bkpf.
    TYPES: BEGIN OF ty_output,
    line(6000) TYPE c,
    END OF ty_output.
    TYPES: ty_tab_output TYPE TABLE OF ty_output,
    ty_tab_nametab TYPE TABLE OF x031l.
    CONSTANTS: c_delimiter(04) TYPE c VALUE '"%%"',
    c_records TYPE i VALUE 10000.
    * SELECTION-SCREEN
    SELECT-OPTIONS: p_bukrs FOR bseg-bukrs,
    p_belnr FOR bseg-belnr,
    p_buzei FOR bseg-buzei,
    p_gjahr FOR bseg-gjahr,
    p_budat for bkpf-budat.
    SELECTION-SCREEN SKIP.
    PARAMETERS: p_file LIKE rlgrap-filename OBLIGATORY.
    SELECTION-SCREEN SKIP.
    PARAMETERS: p_append AS CHECKBOX DEFAULT 'X'.
    * START-OF-SELECTION
    START-OF-SELECTION.
    PERFORM get_records.
    *& Form get_records
    FORM get_records.
    DATA: l_cursor TYPE cursor,
    lt_bseg TYPE TABLE OF bseg,
    ls_bseg LIKE LINE OF lt_bseg,
    lt_output TYPE ty_tab_output,
    ls_output LIKE LINE OF lt_output,
    lt_nametab TYPE ty_tab_nametab,
    ls_nametab LIKE LINE OF lt_nametab,
    l_field(30) TYPE c,
    l_output(50) TYPE c,
    l_date(10) TYPE c,
    l_len TYPE i.
    FIELD-SYMBOLS: <field>.
    IF p_append NE space.
    OPEN DATASET p_file FOR APPENDING IN TEXT MODE.
    ELSE.
    OPEN DATASET p_file FOR OUTPUT IN TEXT MODE.
    ENDIF.
    * Retrieve BSEG fieldnames and data types
    PERFORM get_fields CHANGING lt_nametab.
    OPEN CURSOR l_cursor FOR
    SELECT * FROM bseg
    WHERE bukrs IN p_bukrs
    AND belnr IN p_belnr
    AND buzei IN p_buzei
    AND gjahr IN p_gjahr.
    * Write out fieldnames
    IF p_append IS INITIAL.
    LOOP AT lt_nametab INTO ls_nametab.
    CONCATENATE ls_output ls_nametab-fieldname
    INTO ls_output SEPARATED BY c_delimiter.
    ENDLOOP.
    IF ls_output+0(4) = c_delimiter.
    SHIFT ls_output LEFT BY 4 PLACES.
    ENDIF.
    l_len = strlen( ls_output ).
    TRANSFER ls_output TO p_file LENGTH l_len.
    ENDIF.
    * Process BSEG records
    DO.
    CLEAR lt_bseg.
    FETCH NEXT CURSOR l_cursor
    INTO TABLE lt_bseg
    PACKAGE SIZE c_records.
    IF sy-subrc <> 0.
    EXIT.
    ENDIF.
    LOOP AT lt_bseg INTO ls_bseg.
    * Keep only items whose header lies in the selected posting date range
    SELECT SINGLE * FROM bkpf
    WHERE bukrs = ls_bseg-bukrs
    AND belnr = ls_bseg-belnr
    AND gjahr = ls_bseg-gjahr
    AND budat IN p_budat.
    IF sy-subrc <> 0.
    CONTINUE.
    ENDIF.
    CLEAR ls_output.
    * Process individual fields of BSEG record
    LOOP AT lt_nametab INTO ls_nametab.
    CONCATENATE 'LS_BSEG-' ls_nametab-fieldname INTO l_field.
    ASSIGN (l_field) TO <field>.
    CLEAR l_output.
    * Process by field data types
    CASE ls_nametab-exid.
    WHEN 'C' OR 'N' OR 'I'.
    * Character, Numeric & Integer
    l_output = <field>.
    WHEN 'D'.
    * Dates
    WRITE <field> TO l_date DD/MM/YYYY.
    l_output = l_date.
    WHEN 'P'.
    * Packed decimals
    WRITE <field> TO l_output.
    WHEN OTHERS.
    MESSAGE a000(zs) WITH 'Data type error - ' ls_nametab-exid.
    ENDCASE.
    SHIFT l_output LEFT DELETING LEADING space.
    CONCATENATE ls_output l_output
    INTO ls_output SEPARATED BY c_delimiter.
    ENDLOOP.
    IF ls_output+0(4) = c_delimiter.
    SHIFT ls_output LEFT BY 4 PLACES.
    ENDIF.
    l_len = strlen( ls_output ).
    TRANSFER ls_output TO p_file LENGTH l_len.
    ENDLOOP.
    IF sy-subrc = 0.
    ENDIF.
    ENDDO.
    CLOSE CURSOR l_cursor.
    CLOSE DATASET p_file.
    ENDFORM. " get_records
    *& Form get_fields
    FORM get_fields CHANGING pt_nametab TYPE ty_tab_nametab.
    CALL FUNCTION 'RFC_GET_NAMETAB'
    EXPORTING
    tabname = 'BSEG'
    TABLES
    nametab = pt_nametab
    EXCEPTIONS
    table_not_active = 1
    OTHERS = 2.
    IF sy-subrc <> 0.
    ENDIF.
    ENDFORM. " get_fields
    hope this helps.
    cheers,
    Hema.
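    Coming back to Sam's original requirement (BKPF header fields plus the matching BSEG items in one internal table for an ALV), a rough, untested sketch of the usual FOR ALL ENTRIES pattern could look like the one below. The select-options s_bukrs/s_budat, the output fields and the structure names are assumptions, and the BSEG_ADD branch for LDGRP not equal to I1 or SPACE would follow the same pattern.
    TYPES: BEGIN OF ty_out,                " one flat line: header + item
             bukrs TYPE bkpf-bukrs,
             belnr TYPE bkpf-belnr,
             gjahr TYPE bkpf-gjahr,
             ldgrp TYPE bkpf-ldgrp,
             buzei TYPE bseg-buzei,
             hkont TYPE bseg-hkont,
             dmbtr TYPE bseg-dmbtr,
           END OF ty_out.
    DATA: lt_bkpf TYPE STANDARD TABLE OF bkpf,
          lt_bseg TYPE STANDARD TABLE OF bseg,
          lt_out  TYPE STANDARD TABLE OF ty_out,
          ls_out  TYPE ty_out.
    FIELD-SYMBOLS: <ls_bkpf> TYPE bkpf,
                   <ls_bseg> TYPE bseg.
    " Header documents for the given company code / posting date range (assumed select-options)
    SELECT * FROM bkpf INTO TABLE lt_bkpf
      WHERE bukrs IN s_bukrs
        AND budat IN s_budat.
    " Matching line items (guard against an empty driver table)
    IF lt_bkpf IS NOT INITIAL.
      SELECT * FROM bseg INTO TABLE lt_bseg
        FOR ALL ENTRIES IN lt_bkpf
        WHERE bukrs = lt_bkpf-bukrs
          AND belnr = lt_bkpf-belnr
          AND gjahr = lt_bkpf-gjahr.
    ENDIF.
    " Flatten header and item data into one table for the ALV
    LOOP AT lt_bseg ASSIGNING <ls_bseg>.
      READ TABLE lt_bkpf ASSIGNING <ls_bkpf>
           WITH KEY bukrs = <ls_bseg>-bukrs
                    belnr = <ls_bseg>-belnr
                    gjahr = <ls_bseg>-gjahr.
      CHECK sy-subrc = 0.
      MOVE-CORRESPONDING <ls_bkpf> TO ls_out.
      MOVE-CORRESPONDING <ls_bseg> TO ls_out.
      APPEND ls_out TO lt_out.
    ENDLOOP.
    lt_out can then be handed to REUSE_ALV_GRID_DISPLAY or cl_salv_table=>factory for the combined ALV output.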

  • How to - Extract data from Cloud For Customer into SAP HANA system

    Hello Community,
    I have a requirement to extract the existing data from Cloud for Customer into a separate SAP HANA box.
    Is it possible to achieve this? If yes, please guide me.
    Awaiting a quick response.
    Regards
    Kumar

    Hi Kumar,
    In addition to what Thierry mentioned, you could also use the C4C integration via standard Operational Data Provisioning (ODP) interfaces. This integration was actually built for SAP BW and allows you to access any C4C data sources. From my perspective you can also build upon that for a native SAP HANA integration. Please also have a look at this guide: How To... load SAP Business Suite data into SAP... | SAP HANA.
    Besides that question, let me also add the following: SAP Cloud for Customer has been running on SAP HANA since Nov. 2013. You may also use the powerful built-in analytics within C4C for analyzing data and for any of your reporting demands. If your report should consider external data as well, you can combine the existing C4C data source with an external, so-called Cloud Data Source. More information is published in the C4C Analytics Guide: http://help.sap.com/saphelp_sapcloudforcustomer/en/PDF/EN-3.pdf.
    I hope this helps...
    Best regards,
    Sven

  • BPC10 - Data manager package for dimension  data export and import

    Dear BPC Experts,
    I need your help.
    I am trying to set up a data manager package for the first time to export dimension master data from one application and import it into another application (both have the same properties).
    I created a test data manager package from Organize > Add Package, with the process chain /CPMB/EXPORT_MD_TO_FILE, and clicked Add.
    In the Advance tab of each task there is some script logic already populated. Please find attached the details of the script logic written under each of the tasks, such as MD_Source, Convert and Target.
    I have not made any changes to the script inside the tasks.
    But when I run the package, I select the dimension 'Entity'; in the second prompt it asks for a transformation file, and the system automatically adds the file ... \ROOT\WEBFOLDERS\COLPAL\FINANCE\DATAMANAGER\TRANSFORMATIONFILES\Import.xls
    I have not changed anything there.
    In the next prompt it asks for an output file, and it won't allow me to enter the file name.
    Not sure how to proceed further.
    I would be grateful if someone could guide me from experience on how to set up a simple data manager package for master data export from a dimension. Should I update the transformation file in the script for the import file and the output file in the Advance tab? How and what transformation file should be created and linked to the data manager package for export / import?
    What are the steps to be executed to run the package for exporting master data from a dimension and importing it into another application?
    Thanks in advance for your guidance.
    Thanks and Regards,
    Ramanuj
    =====================================================================================================
    Details of the tasks
    Task : APPL_MD-SOURCE
    (DIMENSIONMEMBER, %DIMENSIONMEMBERS%, "Please select dimension", "Please select members", %DIMS%)
    (TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    (OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
    (RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
    (%TEMPNO1%,%INCREASENO%)
    (%TEMPNO2%,%INCREASENO%)
    (/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
    (/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
    (/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
    (/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    (/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
    (/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
    (/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
    (/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
    (/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
    (/CPMB/FILE_TARGET,FULLFILENAME,%FILE%))
    (/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%))
    Task : EXPORT_MD_CONVERT
    (DIMENSIONMEMBER, %DIMENSIONMEMBERS%, "Please select dimension", "Please select members", %DIMS%)
    (TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    (OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
    (RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
    (%TEMPNO1%,%INCREASENO%)
    (%TEMPNO2%,%INCREASENO%)
    (/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
    (/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
    (/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
    (/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    (/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
    (/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
    (/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
    (/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
    (/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
    (/CPMB/FILE_TARGET,FULLFILENAME,%FILE%))
    (/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%))
    Task : FILE_TARGET
    (DIMENSIONMEMBER, %DIMENSIONMEMBERS%, "Please select dimension", "Please select members", %DIMS%)
    (TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    (OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
    (RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
    (%TEMPNO1%,%INCREASENO%)
    (%TEMPNO2%,%INCREASENO%)
    (/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
    (/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
    (/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
    (/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    (/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
    (/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
    (/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
    (/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
    (/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
    (/CPMB/FILE_TARGET,FULLFILENAME,%FILE%))
    (/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%))
    ================================================================================

    1. Perhaps you want to consider a system copy to a "virtual system" for UAT?
    2. Changes in QAS (as with PROD as well) will give you the delta. They should ideally be clean... You need to check the source system.
    Another option is to generate the profiles in the target system. But for that your config has to be squeaky clean and in sync, including very well maintained and synced SU24 data.
    Cheers,
    Julius

  • Power View in SharePoint Server - The data extension type for a data source is not valid

    Hi All,
    All of a sudden I am getting the following error when trying to create a Power View report using a shared report data source (there is no error when testing the connection):
    "The current action cannot be completed. The data extension type for a data source
    'http://dev/Shared Ducuments/Sales.rsds' is not valid for this operation"
    I already have a data source (I created it after creating my site collection a week ago), and when I use that source to create a Power View report there is no error. But I get the above error when I create another, similar data source and use it to create a Power View report.
    Please help me to resolve the error.
    Thanks

    I am going nuts! I had selected 'Analysis Services' instead of 'Microsoft BI Semantic Model for Power View'.

  • Verizon is using the Ellipsis tablets to steal data from their customers. The only way to stop data from registering as cellular data is to pull the SIM card. This occurs with wifi enabled and cellular data disabled. If you take your tablet to a Verizon

    Verizon is using the Ellipsis tablets to steal data from their customers. The only way to stop data from registering as cellular data is to pull the SIM card. This occurs with wifi enabled and cellular data disabled. If you take your tablet to a Verizon store they will upgrade your data plan to cover the overage and credit the upgrade. You will then have to remember to downgrade your plan or continue to pay for more data. Verizon, how would you feel if I walked into one of your stores and started filling my pockets with merchandise? If caught, can I just give the product back and say "oops, sorry"?

    Today, my FCC complaint hit the same person working on the BBB complaint. Jimmie has been very nice and seems willing to work on this problem. We have been able to come to an agreement: I paid the purchase price for the phone, and he returned my upgrade and unlimited data plan. This is what would have occurred if Verizon had given me correct information to begin with. I am happy with this result. He also brought up quite a few instances concerning the handling of my transfer and upgrade that did not follow proper procedure. I am also confident that I would not have resolved this without complaining to the BBB and/or FCC. Verizon had no interest in solving the problem, nor did they show any inclination to keep a 20-year client. Even though this last CSR was very polite and helpful, his sole job is to respond to formal federal and state complaints. He is required by law to address every complaint and report to the reporting agency the agreed-upon results - good or bad. Again, I suggest: if you are not getting proper customer service, complain to someone outside of Verizon. Jimmie had not received any of the complaints registered with Verizon directly, and I still have not had any contact with any other management representative that I was told would call.

  • Dump while loading data from POS DM to data source

    Hello experts
    I'm loading data from POS DM to data source 0RT_PA_TRAN_CONTROL. When the amount of data is more than 1,800,000 records I get a dump with the message "Unable to fulfil request for 8004 bytes of memory space."
    How can I divide the data into smaller packets? Or can you suggest another solution?
    I have already checked the ztta/max_memreq_MB parameter, and it is set to the maximum.

    Check the parameter ztta/roll_area in RZ11; changes to this parameter would affect all the datasources.
    The maximum recommended size of a data package is 100,000 records. If a delta load is big, divide the package into smaller ones using the parameter MAXDPAKS.
    Check note 1292059 for more info.
    Also check below notes
    1102641 Consult: Dump during insert in PSA table due to incorrect values
    1163359 Load methods using SMQS or SAPI-controlled to transfer to BW
    1160555 Determination of maximum data packages in a delta request

  • Error 100 File contains erroneous data. Normally for user data files.

    Hello.  We have a LabVIEW program that reads in test steps and data in order to execute a test sequence.  Recently we had to apply Retina and Gold Disk security patches in accordance with DOD security policies.  Now we are getting the following error:
    Error 100
    LabVIEW: File contains erroneous data.  Normally for user data files.
    We have not changed the code or files that the program is reading in.  My guess would be it is some sort of permission issue.  However, we have given the user modify permission to the entire C drive, and still get this error.  Does anyone have any ideas on what could be causing it?  Thanks!

    Do you have any backup copies of the files, by any chance? Is it possible the files were modified somehow (perhaps something extra was added when the new security measures were implemented)?
    How is the file being accessed? Is it occurring on the local machine, or are the files accessed from a remote location?
    Caleb Harris
    National Instruments | Mechanical Engineer | http://www.ni.com/support

  • Error when loading data from ODS to master data

    Hi experts, when I load the data from the ODS to master data, there is an error message:
    " InfoSource 8ZPA_0022 is not defined in the source system.
         Message no. R3005
    Diagnosis
         The InfoSource 8ZPA_0022 specified in the data request, is not defined
         in the source system.
    System response
         The data transfer is terminated.
    Procedure
         In the Administrator Workbench of the Business Information Warehouse,
         update the metadata for this source system, and delete the InfoPackages
         belonging to InfoSources that no longer existing ."
    But the InfoSource 8ZPA_0022 is not the problem. What can I do? Thanks.

    Hi,
    as suggested by the message, did you update the meta data by doing a replication of datasources for the source system 'myself'? If not, do so and reactivate the communication structure. Then reload.
    regards
    Siggi

  • Data element creation for custom 'data element orgchart'

    Hi all,
    I'd like to create my own Data element orgchart, using self-developed function modules to achieve more flexibility in outputting data. Are there any recommendations or examples for the in/out interface of such modules?
    Regards,
    Sergey Aksenov

    Hi Sergey,
    What are you trying to achieve here? It sounds like you want to create new structures or use FMs to replace the data retrieval for the existing structures. This cannot be done in OrgChart, at least not through the AdminConsole.
    The structure is called in one call, as well as the data in the nodes. However, for each data source (e.g. NakisaRFC) used in a node there will be a call. For example, if you show position data from HRP1000 and employee data from PA0001 in a node then there will be 2 calls (which calls the data for all nodes at once). You can use an FM to do one call, but this would be in the view and not in the structure. The details panel makes multiple calls to retrieve data, so you can write an FM to call all data in one call rather than many.
    You should refer to page 58 of the OrgChart Admin Guide for more information on custom RFCs.
    Best regards,
    Luke

  • Select data from table depending on date range

    I have a first table with the following data; this is the calendar for a year.
    I have a 2nd table with user-created periods from the 1st table.
    I want to select the acctstartdate values from the 1st table which are not within the 2nd table's periods.
    I want to select acctstartdate for Jan to Aug only. I don't want to select acctstartdate for Sept or Oct-Dec.
    Same with acctenddate: I want to select Jan-Aug only.
    How do I do this?
    h2007

    Do you mean this?
    SELECT *
    FROM Table1 t1
    WHERE NOT EXISTS (SELECT 1
                      FROM Table2
                      WHERE ACCTYRID = t1.ACCTYRID
                      AND ACCTYR = t1.ACCTYR
                      AND (t1.ACCTSTARTDATE BETWEEN ACCTSTARTDATE AND ACCTENDDATE
                           OR t1.ACCTENDDATE BETWEEN ACCTSTARTDATE AND ACCTENDDATE))
    Visakh

  • Please help: How to select fields and data from user_table for each table name

    Please help with a query that selects the code, meaning and in_use columns for each table in user_tables that has "CODED" as part of the table name.
    user_tables has some 800 tables that contain CODED in the table name.
    Description of the tables:
    Name Null? Type
    SHORT_NAME NOT NULL VARCHAR2(20)
    CODE NOT NULL VARCHAR2(4)
    MEANING NOT NULL VARCHAR2(240)
    IN_USE VARCHAR2(1)
    NOTES VARCHAR2(2000)
    UNITS NOT NULL VARCHAR2(1)
    AMOUNT NOT NULL VARCHAR2(3)
    CONVERTED VARCHAR2(1)
    RUN_NAME VARCHAR2(30)
    All the tables have the code, meaning and in_use fields.
    O/P format :
    TABLE_NAME CODE MEANING IN_USE
    Please help.

    Not 100% sure what you want. If you want to see all the tables that have all three of those columns, then you could do something like:
    SELECT table_name, 'CODE', 'MEANING', 'IN_USE'
    FROM user_tab_columns
    WHERE column_name = 'CODE' and
          table_name like '%CODED%'
    INTERSECT
    SELECT table_name, 'CODE', 'MEANING', 'IN_USE'
    FROM user_tab_columns
    WHERE column_name = 'MEANING' and
          table_name like '%CODED%'
    INTERSECT
    SELECT table_name, 'CODE', 'MEANING', 'IN_USE'
    FROM user_tab_columns
    WHERE column_name = 'IN_USE' and
          table_name like '%CODED%'
    If you want to select those three columns from each of the tables, then you could do something like this.
    Create a command file called, for example, makesel.sql that looks like:
    SET PAGES 0 lines 500 trimspool on feedback off;
    spool sel.sql;
    prompt spool selout.txt;
    SELECT 'SELECT '''||table_name||''', code, meaning, in_use FROM '||
           table_name||';'
    FROM (SELECT table_name
          FROM user_tab_columns
          WHERE column_name = 'CODE' and
                table_name like '%CODED%'
          INTERSECT
          SELECT table_name
          FROM user_tab_columns
          WHERE column_name = 'MEANING' and
                table_name like '%CODED%'
          INTERSECT
          SELECT table_name
          FROM user_tab_columns
    WHERE column_name = 'IN_USE' and
                table_name like '%CODED%');
    prompt 'spool off;'
    spool off;
    @sel.sql
    At the sqlplus prompt, run the file using @makesel.sql. This will create another file called sel.sql containing the commands to select those three columns from each table that has all three columns; after the new file is created, it then runs that file (@sel.sql). The output will be spooled to a file called selout.txt.
    HTH
    John

  • How to Cache Data  from database for java mapping ?

    Hi
    I have a scenario where I have to dynamically query a huge table in some other database from Java mapping code.
    Instead of making a new database trip every time, is there a mechanism by which I can cache the entire table contents in XI first and then use this cache for looking up data in my Java mapping?
    Any other alternative that would give the best performance is also welcome.
    Please Suggest
    regards
    Nilesh Taunk.

    Hi Nilesh,
    I am not sure if you can actually cache the table in XI. You will have to look it up directly from your database every time your mapping executes.
    To perform DB lookup during mapping very efficiently, I would suggest that you take a look at this blog,
    /people/siva.maranani/blog/2005/08/23/lookup146s-in-xi-made-simpler
    Also, instead of doing the DB lookup in the mapping, you can use your JDBC adapter as a sender and collect the information you want from your DB.
    If you are using JDBC as a sender, your JDBC adapter will poll your database and select the rows that satisfy your select query. There is another field in the JDBC adapter that is very important, and that is the update statement. Once your JDBC adapter executes your select query and selects rows from the database, you might not want those rows to be selected again; in this case, you can use the update statement to update the database.
    http://help.sap.com/saphelp_nw04/helpdata/en/7e/5df96381ec72468a00815dd80f8b63/content.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/2e/96fd3f2d14e869e10000000a155106/content.htm
    The choice between a JDBC sender adapter , and the DB lookup will have to be on the basis of your requirements.
    Regards,
    Bhavesh

  • Sql query - Selecting last recorded values for each date in specified period

    Hello,
    Can someone please help me with my problem.
    I'm trying to get the last recorded balance for each day for a specific box (1 or 2) in a specified period of days from an MS Access database using the ADOTool.
    I'm trying to get that information with an SQL query, but so far unsuccessfully...
    My table looks like this:
    Table name: TestTable
    Date Time Location Box Balance
    20.10.2014. 06:00:00 1 1 345
    20.10.2014. 12:00:00 1 1 7356
    20.10.2014. 18:45:00 1 1 5678
    20.10.2014. 23:54:00 1 1 9845
    20.10.2014. 06:00:02 1 2 35
    20.10.2014. 12:00:04 1 2 756
    20.10.2014. 18:45:06 1 2 578
    20.10.2014. 23:54:10 1 2 845
    21.10.2014. 06:00:00 1 1 34
    21.10.2014. 12:05:03 1 1 5789
    21.10.2014. 15:00:34 1 1 1237
    21.10.2014. 06:00:00 1 2 374
    21.10.2014. 12:05:03 1 2 54789
    21.10.2014. 15:00:34 1 2 13237
    22.10.2014. 06:00:00 1 1 8562
    22.10.2014. 10:00:00 1 1 1234
    22.10.2014. 17:03:45 1 1 3415
    22.10.2014. 22:00:00 1 1 6742
    22.10.2014. 06:00:05 1 2 562
    22.10.2014. 10:00:16 1 2 123
    22.10.2014. 17:03:50 1 2 415
    22.10.2014. 22:00:10 1 2 642
    23.10.2014. 06:00:00 1 1 9876
    23.10.2014. 09:13:00 1 1 223
    23.10.2014. 13:50:17 1 1 7768
    23.10.2014. 19:47:40 1 1 3456
    23.10.2014. 21:30:00 1 1 789
    23.10.2014. 23:57:12 1 1 25
    23.10.2014. 06:00:07 1 2 976
    23.10.2014. 09:13:45 1 2 223
    23.10.2014. 13:50:40 1 2 78
    23.10.2014. 19:47:55 1 2 346
    23.10.2014. 21:30:03 1 2 89
    23.10.2014. 23:57:18 1 2 25
    24.10.2014. 06:00:55 1 1 346
    24.10.2014. 12:30:22 1 1 8329
    24.10.2014. 23:50:19 1 1 2225
    24.10.2014. 06:01:00 1 2 3546
    24.10.2014. 12:30:26 1 2 89
    24.10.2014. 23:51:10 1 2 25
    Let's say the period is 21.10.2014. - 23.10.2014. and I want to get the last recorded balance for box 1 for each day. The result should look like this:
    Date Time Location Box Balance
    21.10.2014. 15:00:34 1 1 1237
    22.10.2014. 22:00:00 1 1 6742
    23.10.2014. 23:57:12 1 1 25
    So far I've managed to write a query that gives me the balance for ONLY ONE date (the date with the highest time in the whole table), but I need the balance for EVERY date in the specified period.
    My incorrect code (I didn't manage to implement "BETWEEN" for the dates):
    SELECT TestTable.[Date], TestTable.[Time], TestTable.[Location], TestTable.[Box], TestTable.[Balance]
    FROM TestTable
    WHERE Time=(SELECT MAX(Time)
    FROM TestTable
    WHERE Location=1 AND Box=1 );
    Tnx!

    Use a For loop: run the following query once per day, keeping the day (here 24 in the query below) as a variable that goes from 1 to 28/29/30/31 depending on the month.
    SELECT TOP 1 TestTable.[Date], TestTable.[Time], TestTable.[Location], TestTable.[Box], TestTable.[Balance]
    FROM TestTable
    WHERE Time = (SELECT MAX(Time) FROM TestTable
                  WHERE Location=1 AND Box=1 AND [Date] = "2014-10-24")
    AND [Date] = "2014-10-24";
    PBP (CLAD)
    Labview 6.1 - 2014
