Flat File to IDOC - no data in the generated IDOC

Hi All,
Our scenario is Flat File to IDOC.
The scenario runs without errors and an IDOC is generated, but there is no data in that IDOC.
For details, see the attachment.
Thanks in advance,
Vishnu

Hi Rajesh,
I can see the payload in the RWB from the sender communication channel, like below:
<?xml version="1.0" encoding="utf-8" ?>
<ns:MT_File xmlns:ns="http://mouritech.com/file2idocsender">
<MT_File>
<Value>1</Value>  
<Reason>test</Reason>  
</MT_File>
</ns:MT_File>
If I test the mapping with this payload, I get a pop-up with the message below and the mapping does not complete.
Could you please let me know what I have to do now?
Pop Up: 
        The processing instruction target matching "[xX][mM][lL]" is not allowed.
See error logs for details    

Similar Messages

  • Getting data into an applet

    I am creating an applet to draw graphs.
    I have only made standalone applets before.
    My data will come from a database.
    The user will have the data selected for them: when they open the applet page, the data to be drawn will already have been selected.
    I.e. I only need to pass data across in the initialisation stage of the applet.
    What is the standard way to get data into an applet?
    By giving a file path, by direct contact with the DB, by direct contact with a servlet, or by embedding the data in the HTML page the applet is contained in?
    (The data will come from the same server as the applet.)
    Maybe someone can point me to a tutorial on this?

    I am currently developing some graph applets, too. At first I implemented direct database access via JDBC, but found out that this isn't suitable, because you can get into serious trouble with customers who don't want to open database TCP ports in their internal firewall system, which, from a security point of view, is a good decision on their part.
    So I gathered information about other methods to get data across the network. RMI needs its own ports, just like JDBC. Direct access to port 80 would be a good option, since one could use Java's network capabilities, but this needs lots of coding on both ends (server and client applet), so it takes time and is error prone.
    Using SOAP leads to fat applets and so I gave XML-RPC a try. In my case I have an XML-RPC server written in PHP (using the PEAR module see: http://pear.php.net ) and in my applet I use the XML-RPC library from the Apache Group (see: http://ws.apache.org/xmlrpc/ ). This works just fine!
    Pros:
    - small library code in applet (less than 200 KB)
    - only needs one network port; the standard is 80 (HTTP), which is available almost everywhere
    - applet code is independent of the server code and architecture, as almost every language has an XML-RPC module (unlike RMI, which is a Java-only solution)
    - free of charge & open source
    - reliable
    Cons:
    - being based on XML, it can slow things down a bit when lots of data has to be sent over a slow network link, but that is more of a general problem in network application development
    - unlike SOAP/RMI, you cannot transmit or access whole objects, only data of primitive types (int, boolean, string, ...), so you have to wrap and unwrap your data
    By now I am very pleased with the decision I made earlier this year. :)

  • Populating the data into idoc

    Hi
    Could you please tell me the ways in which we can populate data into an IDoc? I can think of two ways:
    1. By writing a report program and executing it
    2. By using the change pointer concept
    Are there any other ways in which we can populate data into IDocs?
    thanks
    Kumar

    Hi,
    Other options are:
    1. Creation through NAST message control
    2. Creation through workflow
    (A minimal sketch of the report-program approach follows below.)
    aRs
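    For the report-program route, the usual pattern is to build an EDIDC control record plus EDIDD data records and hand them to the standard function module MASTER_IDOC_DISTRIBUTE. This is only a sketch: the IDoc type ZMYIDOC01, message type ZMYMSG, segment Z1SEG1 and partner RECEIVER_LS are placeholders (not objects from this thread), and the partner profile must exist in WE20.
    REPORT zpopulate_idoc_sketch.
    DATA: ls_control TYPE edidc,            " IDoc control record
          lt_comm    TYPE TABLE OF edidc,   " control records of the created communication IDocs
          lt_data    TYPE TABLE OF edidd,   " IDoc data records
          ls_data    TYPE edidd.
    DATA: BEGIN OF ls_z1seg1,               " placeholder segment structure
            value  TYPE c LENGTH 10,
            reason TYPE c LENGTH 40,
          END OF ls_z1seg1.
    " Control record: basic type, message type and receiver (all placeholders)
    ls_control-idoctp = 'ZMYIDOC01'.
    ls_control-mestyp = 'ZMYMSG'.
    ls_control-rcvprt = 'LS'.
    ls_control-rcvprn = 'RECEIVER_LS'.
    " Fill one data segment; the flat move works because the segment structure is character-like
    ls_z1seg1-value  = '1'.
    ls_z1seg1-reason = 'test'.
    ls_data-segnam = 'Z1SEG1'.
    ls_data-sdata  = ls_z1seg1.
    APPEND ls_data TO lt_data.
    " Create and distribute the IDoc
    CALL FUNCTION 'MASTER_IDOC_DISTRIBUTE'
      EXPORTING
        master_idoc_control            = ls_control
      TABLES
        communication_idoc_control     = lt_comm
        master_idoc_data               = lt_data
      EXCEPTIONS
        error_in_idoc_control          = 1
        error_writing_idoc_status      = 2
        error_in_idoc_data             = 3
        sending_logical_system_unknown = 4
        OTHERS                         = 5.
    IF sy-subrc = 0.
      COMMIT WORK.  " the IDoc is handed over for dispatch only after the commit
    ENDIF.
    Change pointers and NAST output ultimately hand their IDocs to the same distribution layer, so the partner profile and port setup matter for every option listed above.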

  • Hi, I purchased a 2-DVD digital set. I can't just download it straight to my iPad. They said I have to download it to iTunes, then I can transfer it to the iPad. I'm not seeing how to click the files from my email and get them into the iTunes account... ugh.

    Hi, I purchased a 2-DVD digital set. I can't just download it straight to my iPad. They said I have to download it to iTunes, then I can transfer it to the iPad. I'm not seeing how to click the files from my email and get them into the iTunes account... ugh.
    I do not have a Mac home PC, just a regular PC.

    I had the same problem after I gave my old iPad to my parents and tried to install Netflix. This is what you have to do: open iTunes on your computer, the one you sync your iPad to. Then go to the iTunes Store, search for the Netflix app and download it. After you download it, if your iPad is set to download new purchases it may start downloading on your iPad; if so, tap and hold to delete the app (because it is trying to install the new version on the iPad). Next, go to the App Store on your iPad and find Netflix; it should say Install, since you already purchased it on the computer. Tap to install, and it will say the version is not compatible and offer to download a previous version. Click that and it will install the older version. One more thing: if and when you sync to your computer again, it will say something like "Unable to install Netflix on your iPad". Just click the box to never remind you again; it is trying to sync the newer Netflix app to your iPad, which doesn't work, so it displays the message. The old app will remain on the iPad. Hope this helps, good luck.

  • Not getting data into 3rd int table

    Hi all,
             Here I have data in 2 internal tables. I want to store this data in a 3rd internal table; I am selecting it with an inner join, but I am not getting any data into the 3rd one.
    Please check my logic.
      REPORT  ZEXCHANGE_RETES                         .
    TABLES : tcurr,           " Exchange Rates
             /msg/rabr.       " Account (Posting Headers)
    DATA : l_date type datum.
    TYPES : begin of t_tcurr,
            kurst like tcurr-kurst,  " Exchange Rate type
            fcurr like tcurr-fcurr,   " From Currrency
            gdatu like tcurr-gdatu,   " Date as of which
        end of t_tcurr.
    TYPES : begin of t_rabr,
            OW_WHGNR like /msg/rabr-OW_WHGNR,
            bil_dat like /msg/rabr-bil_dat,
            abrnr like /msg/rabr-abrnr,
           end of t_rabr.
    TYPES : begin of t_output,
            kurst like tcurr-kurst,
            fcurr like tcurr-fcurr,
            gdatu like tcurr-gdatu,
            OW_WHGNR like /msg/rabr-OW_WHGNR,
            bil_dat like /msg/rabr-bil_dat,
            abrnr like /msg/rabr-abrnr,
           end of t_output.
    DATA : it_output TYPE STANDARD TABLE OF t_output WITH HEADER LINE,
            wa_output TYPE t_output.
    DATA : it_rabr TYPE STANDARD TABLE OF t_rabr WITH HEADER LINE,
            wa_rabr TYPE t_rabr.
    DATA : it_tcurr TYPE STANDARD TABLE OF t_tcurr WITH HEADER LINE,
            wa_tcurr TYPE t_tcurr.
    * getting data into 1st itab
    SELECT kurst fcurr gdatu
              from tcurr into table it_tcurr
              where kurst EQ 'M'.
              SORT it_tcurr by  fcurr GDATU DESCENDING.
              delete adjacent duplicates from it_tcurr comparing fcurr.
    * getting data into 2nd itab
       SELECT * FROM /msg/rabr into CORRESPONDING FIELDS OF TABLE it_rabr.
        SORT it_rabr BY OW_WHGNR bil_dat abrnr.
    * getting data into 3rd itab
    SELECT t~kurst
            t~fcurr
            t~gdatu
            r~OW_WHGNR
            r~bil_dat
            r~abrnr
            FROM tcurr as t INNER JOIN
            /msg/rabr as r on t~fcurr EQ r~OW_WHGNR into table it_output
            WHERE r~abrnr BETWEEN '00000000000000800251' AND '00000000000000800300'
              AND r~bil_dat < wa_tcurr-gdatu.  " note: wa_tcurr is never filled before this SELECT
    * printing output
    LOOP at it_output into wa_output.
    WRITE: /10 wa_output-kurst,
             15 wa_output-fcurr,
             25 wa_output-gdatu,
             50 wa_output-OW_WHGNR,
             60 wa_output-bil_dat,
             80 wa_output-abrnr.
    ENDLOOP.
    Here I am not getting any data into the 3rd itab.
      Thanks & Regards,
    sudharsan.

    Hi,
    The SELECT statement is the most fundamental way of retrieving data from SAP database tables in an ABAP program.
    Try filling the 3rd internal table with a LOOP ... ENDLOOP instead (a version using your actual table names follows below).
    LOOP AT t_tcurr.
      READ TABLE t_rabr WITH KEY field1 = t_tcurr-field1.
      IF sy-subrc = 0.
        MOVE t_tcurr-field1 TO itab_final-field1.
        MOVE t_tcurr-field2 TO itab_final-field2.
        MOVE t_rabr-field3  TO itab_final-field3.
        MOVE t_rabr-field4  TO itab_final-field4.
        APPEND itab_final.
      ENDIF.
    ENDLOOP.
    Hope this helps you.
    Regards,
    Ruthra
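    Applied to the tables actually declared in the question (it_tcurr, it_rabr, it_output and their work areas), Ruthra's pattern would look roughly like this. It is only a sketch: the join condition fcurr = OW_WHGNR is taken from the original SELECT, and READ TABLE picks only the first matching /msg/rabr row per currency, so a LOOP AT it_rabr WHERE ow_whgnr = wa_tcurr-fcurr would be needed if several rows can match.
    LOOP AT it_tcurr INTO wa_tcurr.
      " find a billing row whose currency matches the exchange-rate row
      READ TABLE it_rabr INTO wa_rabr WITH KEY ow_whgnr = wa_tcurr-fcurr.
      IF sy-subrc = 0.
        wa_output-kurst    = wa_tcurr-kurst.
        wa_output-fcurr    = wa_tcurr-fcurr.
        wa_output-gdatu    = wa_tcurr-gdatu.
        wa_output-ow_whgnr = wa_rabr-ow_whgnr.
        wa_output-bil_dat  = wa_rabr-bil_dat.
        wa_output-abrnr    = wa_rabr-abrnr.
        APPEND wa_output TO it_output.
      ENDIF.
    ENDLOOP.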

  • Flat file (csv) to oracle database - get dictionary data from oracle table

    Hello,
    I need to develop the following scenario. I have a flat CSV file with some data, for example:
    City;Address;Name
    In the Oracle schema there is a table where the data from the imported CSV should be written. Also, in the same schema, there is a table named CityDictionary. After ODI loads each row from the CSV, it should translate the text representation of City to the numeric ID obtained from the CityDictionary table, and this numeric value should be placed in the destination table instead of the direct text value from the CSV.
    What is the simplest way to accomplish this task? Can you provide any tips?

    You can achieve this easily with a single ODI interface. The flat file and the CityDictionary tables are your sources. Identify the field on the CityDictionary table and the corresponding field on the flat file that will be used to join the data sets (e.g. the city name). Create a join between the sources using these fields. Because you're using a flat file as one of your sources, the join logic will have to be performed on either the staging or the target.
    The target is your destination table, and you should map the ID from the CityDictionary table to the appropriate field in the target, as well as any other required fields from the flat file.
    This type of interface is a fairly typical method of populating a normalized table.
    Alternatively, you can use a lookup - see the following blog for an example. In your case, the flat file will be your source and the CityDictionary table will be used for the lookup.
    http://www.odigurus.com/2012/02/lookup-transformation-using-odi.html
    Edited by: _Phil on Oct 1, 2012 11:52 PM
    Edited by: _Phil on Oct 1, 2012 11:57 PM

  • Function module to get data into internal table from Excel file sheets

    Hi,
    I have to upload customers from an Excel file.
    We are downloading the customer data as Excel file sheets:
    customer data in one sheet, tax data in another sheet of the same Excel file, and customer master credit data in a further sheet of the same file.
    So I have 3-4 sheets in one Excel file.
    Now my requirement is to get the data from the Excel file into an internal table.
    Is there any function module for this?
    Thanks & Regards

    Here is the idea, with an example of how you can upload data from an Excel file into an internal table. I am not sure whether you can read data from different sheets of the same Excel file; I think this is not possible (try it).
    Upload the data into an internal table as described below:
      DATA: L_MAX_COL_NB TYPE I.
      DATA: l_file_name LIKE RLGRAP-FILENAME.
    * Just to be sure this is the correct type for the FM.
      l_file_name = P_FILE_NAME.
      L_MAX_COL_NB = 58.  "Maximum nb of colums that the FM can read.
      CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
           EXPORTING
                FILENAME                = l_file_name
                I_BEGIN_COL             = 1
                I_BEGIN_ROW             = 2
                I_END_COL               = L_MAX_COL_NB
                I_END_ROW               = 9999
           TABLES
                INTERN                  = PT_EXCEL
           EXCEPTIONS
                INCONSISTENT_PARAMETERS = 1
                UPLOAD_OLE              = 2
                OTHERS                  = 3.
      IF SY-SUBRC <> 0.
        " handle the upload error here (e.g. issue a message)
      ENDIF.
    Now you should move the data into your own itab. The function module returns one internal table containing an entry for every row and column.
    Define the structure of the upload file in SE11 (Data Dictionary), then read the field catalog of this structure. In the code that I am sending you, I insert an empty line into the internal table and assign this line to a field symbol; that way, changing the work area also changes the line of the itab. Probably you could use the statement APPEND INITIAL LINE TO (your_table_name) ASSIGNING <your_field_symbol> instead (see the sketch after this example), but the example was written on an old SAP version.
      FIELD-SYMBOLS:
                   <F_REC> LIKE WA_UPLOAD_FILE,      "work area of the upload file
                     <F_FIELD> TYPE ANY.
      DATA: COLUMN_INT TYPE I,
            C_FIELDNAME(30) TYPE C.
      PERFORM GET_FIELDCATOLG TABLES FIELDCAT
                               USING 'ZECO_CHARALAMBOUS_FILE'.
      LOOP AT PT_EXCEL.
        AT NEW ROW.
          ASSIGN WA_UPLOAD_FILE TO <F_REC>.
        ENDAT.
        COLUMN_INT = PT_EXCEL-COL.
        READ TABLE FIELDCAT INTO WA_FIELDCAT INDEX COLUMN_INT.
        CONCATENATE '<F_REC>-' WA_FIELDCAT-FIELDNAME INTO C_FIELDNAME.
        ASSIGN (C_FIELDNAME) TO <F_FIELD>.
        <F_FIELD> = PT_EXCEL-VALUE.
        AT END OF ROW.
          APPEND WA_UPLOAD_FILE TO GT_UPLOAD_FILE.
          CLEAR WA_UPLOAD_FILE.
        ENDAT.
      ENDLOOP.
    With Regards
    George
    Edited by: giorgos michaelaris on Mar 4, 2010 3:44 PM
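    On newer releases, the APPEND INITIAL LINE ... ASSIGNING statement mentioned above lets you skip the CONCATENATE/ASSIGN-by-name step. A minimal sketch reusing the names from George's example (PT_EXCEL and GT_UPLOAD_FILE are assumed to be declared as there); it also assumes the column order in the sheet matches the component order of the DDIC structure.
    FIELD-SYMBOLS: <fs_rec>   LIKE LINE OF gt_upload_file,
                   <fs_field> TYPE any.
    DATA: lv_col TYPE i.
    LOOP AT pt_excel.
      AT NEW row.
        " create the target line directly in the table and let <fs_rec> point to it
        APPEND INITIAL LINE TO gt_upload_file ASSIGNING <fs_rec>.
      ENDAT.
      " map the spreadsheet column number to the component at the same position
      lv_col = pt_excel-col.
      ASSIGN COMPONENT lv_col OF STRUCTURE <fs_rec> TO <fs_field>.
      IF sy-subrc = 0.
        <fs_field> = pt_excel-value.
      ENDIF.
    ENDLOOP.
    The helper variable lv_col only makes sure the numeric-character column index from the cell table is passed to ASSIGN COMPONENT as a number.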

  • How to dump a table to a flat file, if the table has an NVARCHAR data type field

    I need to dump the table to a text file. The table has an NVARCHAR data type field, and a simple select * from table_name does not work. What do I have to do? Do I need to convert NVARCHAR to VARCHAR, and if so, how?
    Thanks,
    Oleg


  • Problem in getting data into database with standard direct input program

    HI All,
    I am having a problem where records are not updated in MM01 or MM02 with the standard direct input program. I have data in an internal table, and from that table I am trying to upload it into the database using the background job MRP_MATERIAL_MASTER_DATA_LOAD.
    When I execute my program, it shows the message that the job has started. Then I go into SM37 and check the job status; there I see that the job completed successfully.
    But if I go to MM03 to check whether the materials were updated or created, I cannot find the material numbers from the internal table.
    So if anyone can help me, it will be great.
    Thanks in Advance
    Venkat N

    Hi Anil,
    Thanks for your answer, but i am facing problem is i have material no and denominator and Actual UOM and nominator field values in the flat file.....
    by using RMDATIND direct input program with MRP_MATERIAL_UPLOAD as job name for background job while uploading data into database.
    here i am not getting data in to database, but when i execute the job in sm37 it is showing that message job processing successfully completed...this is my status..
    if u can help me in this it will be gr8ful..
    Thanks,
    Venkat N

  • How to get data into the mySQL database?

    First some background.
    I have a website that has outgrown its designed dimensions and is a huge burden to maintain. See PPBM5 Benchmark
    There is a lot of maintenance work involved, so I'm investigating a PHP/MySQL approach to ease the burden and to add functionality to the site. With the current Excel-based structure and over 420 entries, it is cumbersome for me to maintain, and also hard for users to find what they need.
    A MySQL based dynamic structure is a lot easier and offers vastly more selection capabilities, like selecting only records that meet specific criteria.
    Data submission is done with a form that contains most of the relevant data, but the drawback is that the people submitting their data are often not technically inclined and give wrong answers due to a lack of understanding or typos. The test results are attached in one or two separate .txt files, but often the submitters have not read the instructions correctly or did something wrong, so these attached .txt files cannot be trusted automatically; they have to be checked before inclusion.
    These were my initial thoughts:
    1. Data collection:
    To avoid spending all our energy and time on correcting typos, getting missing data and correcting errors, I am investigating the use of CPU-Z in Ghost mode to create a .txt or .html file that contains all the relevant hardware info we need and even more. It gives all the info we currently have, but adds data like number of memory sticks, DDR timings, stock clock speed and BCLK setting, video card info and VRAM size, etc.
    To see what I mean, run CPU-Z, go to the About tab and press the Save Report button and look at the results.
    This can all be done without user intervention in an automatic way, but  maybe I need to add an Auto-It file to the test to make it all run as  desired.
    If this works and I'm able to extract the relevant data from the created  file and can insert it into the database, we may be in business for the  next version of PPBM5.5 or PPBM6. It does require a modification to the instructions, making them a lot  easier, because there is less data to fill out.
    2. Data submission:
    The submission form can be simplified if  the CPU-Z data can be used. We have to create an automatic way to attach  the created .html file from CPU-Z to the submission form and we have to  streamline the Output.txt and Output-MPE.txt files to be more easily included in the 'form.lib.php' file. It  currently is manual labor and very time consuming.
    3. Adding to Database:
    I have to find a way to create database  records from the Gmail forms I receive. All incoming mail messages need  to be checked on relevancy and if relevant, need to be added  automatically to the database and then offered for approval before final inclusion in the database. Data included in the database  will then include submission date and time, Email address,  IP address  used, plus links to the files submitted and available on the website.
    4. Publication of the database:
    After approval of new records from step  3, all updates will be automatically applied to the database and  accessible for users. I do not yet intend to introduce a user account ,  requesting login before all functionality is accessible. Too much trouble and administration.
    Queries should be possible on things like CPU (check box), so include  17-920, i7-930, i7-950 but exclude i7-980X and i7-990X, Size of memory  (check box), Overclocked (boolean, yes, no), SSD as OS disk, and similar  options.
    The biggest problem is to keep the color grading and statistical indicators (Top, D9, Q3, Med, Q1 and D1) intact on dynamically generated queries. Say you make a query that results in 20 observations; this should show the related colors and legends. The next query results in 48 observations, and of course the color grading and legends need to reflect that. The question in my mind: does the RPI remain constant, independent of the query, or does it need to be recalculated on the basis of the query?
    Next thing is to allow a user to select a specific observation and by  simply clicking on it be shown, in a separate window (detail page) or  accordion, all the CPU-Z related information about the hardware.
    The graphs, Top-20 and MPE Gains, need to be dynamically adjusted, based on the query used.
    5. Ideally, external links:
    In an ideal situation, one could link the  CPU-Z data to external price databases, looking up current prices for  CPU, memory, video card, disks, raid controller, etc. to get instant  BFTB charts, based on the query made. But that is the next step.
    Situation now:
    I have a MySQL database that is easily updated with the new submissions: simply create a .CSV file from the submitted forms and import that into the database. The bulk of the initial work is done. Lots remains to be done, as you can see above, but that is for a later time.
    Question:
    I have this table, that needs to be filled with data in the submitted and attached files. Mr. X submitted his data and can be uniquely identified by his "Ref_ID". He attached one or two files in .TXT format with the relevant test data. These files are stored on the server with a concatenated name:
    "Ref_ID","-","filename"
    Say his Ref-ID is: 20110204-6cf5 and his submitted file is called: Output(99).txt then the file can be found on the server as
    20110204-6cf5-Output(99).txt
    I need to be able to open that comma-delimited file, whose contents may look like this: "439","1036","819","531", and insert these contents into the relevant record and fields.
    Graphically, the attached diagram shows what I want to achieve.
    This being my first exposure to PHP/MySQL, you can imagine I'm not clear on how to go from here.
    An added complication is that I actually have 5 numbers to insert per record, and two fields, Total Score and RPI, should be calculated fields. I haven't yet figured out how to handle calculated fields; maybe only in the PHP/HTML code and not in the database.
    I hope someone can help me.

    You do have a very complex-looking site and may need several tables in MySQL to handle all that data. If you are new to PHP/MySQL, I would suggest taking a look at this tutorial; it will help get you started in understanding how to $_GET info from a database and also how to $_POST data to a database. I am no expert, just learning myself, and I found this very helpful. This is the link: http://www.adobe.com/devnet/dreamweaver/articles/first_dynamic_site_pt1.html
    There are also many tutorials on YouTube to help build a CMS (content management system). I would suggest the following:
    http://www.youtube.com/user/phpacademy
    http://www.youtube.com/user/betterphp
    http://www.youtube.com/user/flashbuilding
    And many more on my channel here
    http://www.youtube.com/user/Whisperingonthewind
    A CMS makes it easier to maintain, add, edit and delete content.
    I have also recently bought a book by David Powers, "Training from the Source", which is very helpful.
    Anyway hope you get it sorted.

  • Getting Data INTO XML

    I'm trying to create a class to build an XML file from data that I'm setting inside the class. I'm using a ContentHandler to put the data into a SAXParser. Now I need to use a transformer to put the data into XML. My problem is, it's calling for an InputSource as a SAXSource and I'm not sure how to get the data from the ContentHandler to an InputSource. Can anyone help me with this? I've tried the tutorial and it wasn't much help.
    Linda Shaffer

    You are sending a stream of SAX events to your ContentHandler? By which I mean, you are calling its startElement() method and so on? If so, then here's some code that I use to transform that virtual XML via XSLT:
    // uses javax.xml.transform.*, javax.xml.transform.sax.*, javax.xml.transform.stream.*,
    // javax.xml.parsers.* and org.xml.sax.*
    SAXTransformerFactory factory = (SAXTransformerFactory) TransformerFactory.newInstance();
    TransformerHandler handler = factory.newTransformerHandler(new StreamSource("your.xslt")); // a TransformerHandler using that XSLT ("your.xslt" is a placeholder path)
    handler.setResult(new StreamResult(whereverYourOutputGoes));
    SAXParserFactory spf = SAXParserFactory.newInstance();
    XMLReader reader = spf.newSAXParser().getXMLReader();
    reader.setContentHandler(handler);
    reader.setProperty("http://xml.org/sax/properties/lexical-handler", handler);
    reader.setFeature("http://xml.org/sax/features/namespaces", true);
    reader.setFeature("http://xml.org/sax/features/namespace-prefixes", false);
    That's all. The TransformerHandler object is the one on which you should be calling startDocument() and so on; as soon as you call endDocument(), the transformation and output will take place.
    If you aren't transforming, just outputting, then use an identity TransformerHandler.

  • Reading file from ftp server and importing data into table

    Hi experts,
    Well, basically I have text files with different layouts that have been uploaded to an FTP server. Now I have to write a procedure to fetch those files, read them and insert the data into a table... can that be done?
    Your help would be greatly appreciated.
    Thanks

    declare
    file1 UTL_FILE.FILE_TYPE;
    filename varchar2(1000) := 'GTECHFILES';
    str long;
    begin
    file1 := UTL_FILE.FOPEN (filename,'agent_dump_csv.rep','r',32767);
    loop
    UTL_FILE.GET_LINE ( file1, str );
    --dbms_output.put_line('Value is :'||to_char(str));
    end loop;
    UTL_FILE.FCLOSE( file1 );
    exception
    when no_data_found then
    dbms_output.put_line('END OF FILE');
    UTL_FILE.FCLOSE( file1 ) ;
    when others then
    UTL_FILE.FCLOSE( file1 ) ;
    dbms_output.put_line('ERROR: '||sqlcode||':'||sqlerrm) ;
    end;
    I have managed to write this piece of code and all lines are being read. Now I need to insert the data into my table; the fields are separated by a `|` and I am still trying to figure out how to do that. Help...
    Edited by: Kevin CK on 17-Jan-2010 22:40

  • Just finished using iTunes, closed out and then tried to get back in. Got this message: "The iTunes library .itl file is locked, on a locked disk, or you do not have write permission for this file." How can I get back into iTunes?

    I just finished using iTunes, closed out and then tried to get back in. Got this message: "The iTunes library .itl file is locked, on a locked disk, or you do not have write permission for this file." How can I get back into iTunes?

    I actually figured it out... I had to go to the iTunes Library Extras.itdb file and give myself permission to have full control. THEN I could go and restore a previous version. Once I had done this, I got the same message for iTunes Library Genius.itdb. I did the same thing with it and voila!
    Hope this helps...
    SVT

  • Getting data into internal table wa_final

    Hi Guys
    I have changed the logic for this program. I have created 2 work areas and internal tables; now I need help to place the data into:
         wa_final-max_date = wa_data-idate.
             wa_final-min_date  = wa_data-idate.
             wa_final-max_km  = wa_data-recdv.
             wa_final-min_km  = wa_data-recdv.
             wa_final-max_hR   = wa_data-recdv.
             wa_final-min_hR   = wa_data-recdv.
             wa_final-max_lit  = wa_data-recdv.
             wa_final-min_lit  = wa_data-recdv.
             wa_final-t_max_min_km   = wa_data-recdv.  " min_km - max_km
             wa_final-t_max_min_hr  = wa_data-recdv.   " min_hr - max_hr
             wa_final-t_max_min_lit  = wa_data-recdv.  " min_lit - max_lit.
    So how can I put the logic in place for these values? Please correct my program. Looking at my program, can anyone give me some ideas?
    REPORT Z_FUEL_MONTHLY_QTY LINE-SIZE  260 LINE-COUNT 75
             NO STANDARD PAGE HEADING.
    TABLES : equi,
             equz,
             imptt,
             imrg,
             eqkt,
             iloa.
    * Type Declaration
    *DATA: BEGIN OF ty_equi occurs 0,
    *     equnr type equi-equnr,
    *     END OF ty_equi.
    *DATA: BEGIN of ty_eqkt occurs 0,
    *     equnr type eqkt-equnr,
    *     eqktx type eqkt-eqktx,
    *     END OF ty_eqkt.
    *DATA: BEGIN of ty_iloa occurs 0,
    *     iloan type iloa-iloan,
    *     eqfnr type iloa-eqfnr,
    *     END OF ty_iloa.
    *DATA: BEGIN of ty_imptt occurs 0,
    *     mpobj type imptt-mpobj,
    *     END of ty_imptt.
    *DATA: BEGIN of ty_imrg occurs 0,
    *     idate type imrg-idate,
    *     recdv type imrg-recdv,
    *     recdu type imrg-recdu,
    *     END of ty_imrg.
    TYPES:  BEGIN OF ty_data  ,
             equnr      type equnr,         " Euipment no
             eqktx      type eqkt-eqktx,    " Equipment Text
             eqfnr       type iloa-eqfnr,     " Equipment Sort field
             idate      type imrg-idate,    " Measuring Date
             recdu      type imrg-recdu,    " Unit of measuring ='KM','L','H'
             recdv      type imrg-recdv,    " Counter reading data
           END OF ty_data.
    TYPES: BEGIN OF ty_final,
             equnr           type equnr,            "  Equipment no
             eqktx           type eqkt-eqktx,       "  Equipment Text
             eqfnr           type iloa-eqfnr,       "  Equipment Sort field
             min_date        type imrg-idate,       "  Min Date
             min_km          type p decimals 2,     "  Min km
             max_km          type p decimals 2,     "  Max km
             t_max_min_km    type i,                "  Total min_km-max_km
             max_date        type imrg-idate,       "  Max Date
             min_hr          type imrg-recdv,       "  Min hr
             max_hr          type imrg-recdv,       "  Max hr
             t_max_min_hr    type i,                "  Total min_hr-max_hr
             min_lit         type imrg-recdv,       "  Min lit
             max_lit         type imrg-recdv,       "  Max lit
             t_max_min_lit   type i,                "  Total min_lit-max_lit
             fuel_con        type p decimals 2,     "  Total_hrs / t_max_min_hr
             km_l            type p decimals 2,     "  km / L
             lit_per_hr      type i           ,     "  fuel comsumed / t_max_min_hr
           END OF ty_final.
    DATA: i_data TYPE TABLE OF ty_data, " internal table
    wa_data TYPE ty_data, " work area
    i_final TYPE TABLE OF ty_final, " internal table
    wa_final TYPE ty_final. " work area
    DATA :  max_date TYPE d,
             min_date TYPE d,
             max_km TYPE p DECIMALS 2,
             min_km TYPE p DECIMALS 2,
             max_hr TYPE p DECIMALS 2,
             min_hr TYPE p DECIMALS 2,
             max_lit TYPE p DECIMALS 2,
             min_lit TYPE p DECIMALS 2,
             t_max_min_km  TYPE p DECIMALS 2,
             t_max_min_hr TYPE p DECIMALS 2,
             t_max_min_lit TYPE p DECIMALS 2.
    SELECTION-SCREEN BEGIN OF BLOCK blk WITH FRAME.
    SELECTION-SCREEN BEGIN OF BLOCK blk1 WITH FRAME TITLE text-001.
    SELECT-OPTIONS: p_equnr FOR equi-equnr, "no-extension no intervals,
                    p_idate FOR imrg-idate.  "NO-EXTENSION NO INTERVALS OBLIGATORY,
                    "p_recdu FOR imrg-recdu NO-EXTENSION NO INTERVALS ."default 'M3'" OBLIGATORY.
    SELECTION-SCREEN END OF BLOCK blk1.
    SELECTION-SCREEN BEGIN OF BLOCK blk2 WITH FRAME TITLE text-002.
    SELECTION-SCREEN END OF BLOCK blk2.
    SELECTION-SCREEN END OF BLOCK blk.
    TOP-OF-PAGE.
      FORMAT INTENSIFIED ON.
      WRITE:/1(40) ' INVESTMENT LIMITED  '.
      WRITE:/50(40) ' FUEL CONSUMPTION REPORT ' CENTERED   ,
              2 'Page', sy-pagno.
      FORMAT INTENSIFIED OFF.
      WRITE:/50(40) '----------------------------------------' CENTERED.
      FORMAT INTENSIFIED ON.
      WRITE:/2 sy-datum COLOR 3, sy-uzeit .
      "WRITE:/1 S903-SPMON ."p_yearf.
      ULINE.
      "CENTERED.
      write: /2 'Date From     :'.
      write: /2 'Equipment No  :'.
      write: /2 'Unit          :'.
      SKIP.
      ULINE.
      WRITE:/1 sy-vline,
        2   'EQUIP NO',              10 sy-vline,
        11  'NAME',                  40 sy-vline,
        41  'SORT',                  60 sy-vline,
        61  'MIN DATE',              74 sy-vline,
        75  'MAX DATE',              87 sy-vline,
        88  'MIN KM',                100 sy-vline,
        101  'MAX KM' ,              113 sy-vline,
        114 'TOTALK',                126 sy-vline,
        127  'MIN HR',               139 sy-vline,
        140 'MAX HR',                152 sy-vline,
        153 'TOTALH' ,               167 sy-vline,
        168 'MIN LIT',               180 sy-vline,
        181 'MAX LIT',               193 sy-vline,
        194 'TOTALL',                206 sy-vline,
        207 'FUEL CON',              219 sy-vline,
        220 'KM L',                  232 sy-vline,
        233 'LIT PER KM',            246 sy-vline.
      FORMAT COLOR 3 ON.
      ULINE.
    END-OF-PAGE.
    START-OF-SELECTION.
    select a~equnr d~eqktx f~eqfnr e~idate e~recdu e~recdv
    into table i_data
    from equi AS a
    inner join equz as b
    on a~equnr = b~equnr
    inner join iloa as f
    on b~iloan = f~iloan
    inner join imptt as c
    on a~objnr = c~mpobj
    inner join eqkt as d
    on a~equnr = d~equnr
    inner join imrg as e
    on e~point = c~point
    where a~equnr in p_equnr
    and
    e~idate in p_idate.
    loop  at i_data into wa_data.
    CLEAR: wa_final.
      READ TABLE i_final into wa_final
               with key equnr = wa_data-equnr.
        if sy-subrc EQ 0.
          PERFORM prepare_final_rec USING 'M'. " Modify Existing Record
         ELSE.
          PERFORM prepare_final_rec USING 'A'. " Append New Record.
        ENDIF.
        ENDLOOP.
        LOOP AT i_final into wa_final.
        WRITE:/1 sy-vline,
    2  wa_final-equnr                                                 , 10 sy-vline,
    11 wa_final-eqktx                                                 , 40 sy-vline,
    41 wa_final-eqfnr                                                 , 60 sy-vline,
    61 wa_final-min_date                                              , 74 sy-vline,
    75 wa_final-max_date                                              , 87 sy-vline,
    88 wa_final-min_km EXPONENT 0 DECIMALS 2 LEFT-JUSTIFIED           , 100 sy-vline,
    101 wa_final-max_km EXPONENT 0 DECIMALS 2 LEFT-JUSTIFIED          , 113 sy-vline,
    114 wa_final-t_max_min_km EXPONENT 0 DECIMALS 2 LEFT-JUSTIFIED    , 126 sy-vline,
    127 wa_final-min_hr EXPONENT 0 DECIMALS 2 LEFT-JUSTIFIED          , 139 sy-vline,
    140 wa_final-max_hr EXPONENT 0 DECIMALS 2 LEFT-JUSTIFIED          , 152 sy-vline,
    153 wa_final-t_max_min_hr EXPONENT 0 DECIMALS 2  LEFT-JUSTIFIED   , 167 sy-vline,
    168 wa_final-min_lit EXPONENT 0 DECIMALS 2 LEFT-JUSTIFIED         , 180 sy-vline,
    181 wa_final-max_lit EXPONENT 0 DECIMALS 2 LEFT-JUSTIFIED         , 193 sy-vline,
    194 wa_final-t_max_min_lit EXPONENT 0 DECIMALS 2 LEFT-JUSTIFIED   , 206 sy-vline,
    207 wa_final-fuel_con EXPONENT 0 DECIMALS 2 LEFT-JUSTIFIED        , 219 sy-vline,
    220 wa_final-km_l EXPONENT 0 DECIMALS 2 LEFT-JUSTIFIED            , 232 sy-vline,
    233 wa_final-lit_per_hr EXPONENT 0 DECIMALS 2 LEFT-JUSTIFIED      , 246 sy-vline.
    ULINE.
    endloop.
    FORM prepare_final_rec  USING    p_mode TYPE char1.
    SORT i_data BY equnr idate descending .
            if wa_data-recdu = 'KM'.
            max_km = wa_data-recdv.
            min_km = wa_data-recdv.
            endif.
            if wa_data-recdu ='H'.
            max_hr = wa_data-recdv.
            min_hr = wa_data-recdv.
            endif.
            if wa_data-recdu ='L'.
            max_lit = wa_data-recdv.
            min_lit = wa_data-recdv.
           endif.
       at new equnr.
           read table i_final into wa_final index sy-tabix.
           write:/ wa_final-equnr, wa_final-eqktx ,wa_final-eqfnr ,wa_final-idate ,
           'Min KM',min_km EXPONENT 0 DECIMALS 2 color 7 ,
            'Min H',min_hr EXPONENT 0 DECIMALS 2 color 7 ,
             'Min L',min_lit EXPONENT 0 DECIMALS 2 color 7.
       endat.
    *at end of equnr.
           read table i_data into wa_data index sy-tabix.
           write:/ wa_final-equnr, wa_final-eqktx ,wa_final-eqfnr ,wa_final-idate ,
           'Max KM', max_km EXPONENT 0 DECIMALS 2 color 7,
           'Max H', max_hr EXPONENT 0 DECIMALS 2 color 7,
           'Max L', max_lit EXPONENT 0 DECIMALS 2 color 7.
    *endat.
             wa_final-max_date = wa_data-idate.
             wa_final-min_date  = wa_data-idate.
             wa_final-max_km  = wa_data-recdv.
             wa_final-min_km  = wa_data-recdv.
             wa_final-max_hR   = wa_data-recdv.
             wa_final-min_hR   = wa_data-recdv.
             wa_final-max_lit  = wa_data-recdv.
             wa_final-min_lit  = wa_data-recdv.
             wa_final-t_max_min_km   = wa_data-recdv.  " min_km - max_km
             wa_final-t_max_min_hr  = wa_data-recdv.   " min_hr - max_hr
             wa_final-t_max_min_lit  = wa_data-recdv.  " min_lit - max_lit.
      IF p_mode = 'A'.
        wa_final-equnr = wa_data-equnr.
        wa_final-eqktx = wa_data-eqktx.
        wa_final-eqfnr = wa_data-eqfnr.
        APPEND wa_final TO i_final.
      ELSE.
        MODIFY i_final FROM wa_final
          TRANSPORTING
              max_date
              min_date
              max_km
              min_km
              max_hr
              min_hr
              max_lit
              min_lit
              t_max_min_km
              t_max_min_hr
              where equnr = wa_data-equnr.
      ENDIF.
    ENDFORM.                    " PREPARE_FINAL_REC
    regards;

    Hi
    Thanks. Where should I use the LOOP and ENDLOOP?
    I want the individual data to go into wa_final, but I am confused about how to start getting
    the min date, max date, min km value, max km value, total (min - max),
    min_hr, max_hr, total difference, min_lit, max_lit and the totals as in the program, and how the data will end up in wa_final according to those values.
    regards
    Piroz
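    One possible shape for that logic, sticking to the structures already declared in the program (i_data/wa_data and i_final/wa_final): sort i_data by equipment, keep running minima and maxima inside the loop, and append one wa_final line per equnr at the control-level change. This is only a sketch that replaces the prepare_final_rec form; the initial-value checks are simplified, and the ratio fields (fuel_con, km_l, lit_per_hr) would still have to be derived afterwards.
    SORT i_data BY equnr recdu idate.
    LOOP AT i_data INTO wa_data.
      AT NEW equnr.
        " first record of this equipment: start a fresh output line
        CLEAR wa_final.
      ENDAT.
      wa_final-equnr = wa_data-equnr.
      wa_final-eqktx = wa_data-eqktx.
      wa_final-eqfnr = wa_data-eqfnr.
      " running min/max of the measurement date
      IF wa_final-min_date IS INITIAL OR wa_data-idate < wa_final-min_date.
        wa_final-min_date = wa_data-idate.
      ENDIF.
      IF wa_data-idate > wa_final-max_date.
        wa_final-max_date = wa_data-idate.
      ENDIF.
      " running min/max of the counter reading per unit
      CASE wa_data-recdu.
        WHEN 'KM'.
          IF wa_final-min_km IS INITIAL OR wa_data-recdv < wa_final-min_km.
            wa_final-min_km = wa_data-recdv.
          ENDIF.
          IF wa_data-recdv > wa_final-max_km.
            wa_final-max_km = wa_data-recdv.
          ENDIF.
        WHEN 'H'.
          IF wa_final-min_hr IS INITIAL OR wa_data-recdv < wa_final-min_hr.
            wa_final-min_hr = wa_data-recdv.
          ENDIF.
          IF wa_data-recdv > wa_final-max_hr.
            wa_final-max_hr = wa_data-recdv.
          ENDIF.
        WHEN 'L'.
          IF wa_final-min_lit IS INITIAL OR wa_data-recdv < wa_final-min_lit.
            wa_final-min_lit = wa_data-recdv.
          ENDIF.
          IF wa_data-recdv > wa_final-max_lit.
            wa_final-max_lit = wa_data-recdv.
          ENDIF.
      ENDCASE.
      AT END OF equnr.
        " one output line per equipment: the totals are max - min
        wa_final-t_max_min_km  = wa_final-max_km  - wa_final-min_km.
        wa_final-t_max_min_hr  = wa_final-max_hr  - wa_final-min_hr.
        wa_final-t_max_min_lit = wa_final-max_lit - wa_final-min_lit.
        APPEND wa_final TO i_final.
      ENDAT.
    ENDLOOP.
    The WRITE loop at the end of the report can then stay as it is, since it already loops over i_final and prints wa_final.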

  • How to create a rich dynamic input and output PDF to insert data into, and get data from, a database (MSSQL)

    HI ,
    I want to use Adobe LiveCycle Designer and Adobe LiveCycle Workbench features to create a dynamic PDF form which allows me to store data in a database (MSSQL Server) and can also give me an output PDF form with all the information that was filled in by the user in the input form. Both input and output forms must be dynamic. I am stuck in this process and need expert advice on the complete, optimal Adobe process flow.
    Regards
    Ritesh Grover

    HI
    Go to the layout of your screen and double-click on the table control fields; there you can see the properties/attributes of the screen or table control fields.
    Assign a group GRP1 to all the fields in the table control.
    In PBO:
    if ok_code = 'INPUT'.
    LOOP AT SCREEN.
    IF screen-group1 = 'GRP1'.
    screen-input = 1.
    modify screen.
    endif.
    ENDLOOP.
    elseif ok_code = 'OUTPUT'.
    LOOP AT SCREEN.
    if screen-group1 = 'GRP1'.
    screen-input = 0.
    modify screen.
    endif.
    endloop.
    endif.
    Regards
    Ramchander Rao.K
    Edited by: Ramchander Krishnamraju on Aug 8, 2009 5:27 AM
