Fragmentation of table data in the DB file

Hi,
I am working on an OLTP system that resides on an Oracle 10g database (database name ORCL). It has a schema called OLTP containing about 50 tables, and all of these tables are stored physically in a single database file, OLTP.dbf, to which the tablespace OLTP points. A number of front-end applications insert, update and delete rows in these 50 tables. Taking the case of inserts, I believe the data of the individual tables is not stored together by table inside OLTP.dbf. I am facing a lot of performance issues and would like to reorganise (defragment) the data table by table, so my questions are:
1) Is there a way to defragment the data in the physical dbf file, so that I could run the reorganisation over the weekend when the applications are not under heavy load? This should ultimately help me with the performance issues.
2) Is there any way to physically see how the data is laid out in the dbf file? With that information I could solve a lot of my performance issues.
Many thanks
Rahul
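
As a starting point for question 2, here is a minimal sketch of how the layout can be inspected from the data dictionary, assuming access to the DBA views and the OLTP schema and datafile described above:
SELECT e.segment_name,
       e.extent_id,
       e.block_id,
       e.blocks
FROM   dba_extents e
       JOIN dba_data_files f ON f.file_id = e.file_id
WHERE  e.owner = 'OLTP'                 -- schema name taken from the description above
AND    f.file_name LIKE '%OLTP.dbf'     -- the single datafile of tablespace OLTP
ORDER  BY e.segment_name, e.block_id;
If the BLOCK_ID ranges of one table are interleaved with those of other tables, its extents are scattered through the file; contiguous ranges mean the extents sit next to one another.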

Thanks BCV for the link it is very informative.
However, I am not concerned about whether some blocks are under-utilised (the link actually suggests a manual way of bringing the data together, and I could wrap it in a PL/SQL procedure with a cursor that reads the names of the tables I care about from a driver table). My concern is to bring the blocks reserved for each table together. Since inserts happen on these tables daily, I believe the blocks allocated to a table end up scattered throughout the dbf file, and because they are scattered, every full table scan of a particular table needs more disk IO than it would if the table's blocks were contiguous (in other words, I want to defragment the table data).
PS: I am trying to clear up a few doubts; maybe I am not understanding everything in the link. It would be great if someone with experience of this kind of activity could throw more light on the topic.
Many thanks
Rahul
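
One possible approach to the weekend reorganisation is ALTER TABLE ... MOVE followed by index rebuilds. The statements below are only a sketch with made-up object names, not a tested script for this system:
ALTER TABLE oltp.orders MOVE;         -- hypothetical table; rewrites it into freshly allocated extents in the same tablespace
ALTER INDEX oltp.orders_pk REBUILD;   -- hypothetical index; indexes are left UNUSABLE by a move and must be rebuilt
EXEC DBMS_STATS.GATHER_TABLE_STATS('OLTP', 'ORDERS');   -- refresh optimizer statistics afterwards
MOVE blocks DML on the table while it runs, which is why it fits the low-load weekend window mentioned in the question.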

Similar Messages

  • Is it possible to view/get the table data from a dump file

    I have dmp files generated using expdp on Oracle 11g:
    expdp_schemas_18MAY2013_1.dmp
    expdp_schemas_18MAY2013_2.dmp
    expdp_schemas_18MAY2013_3.dmp
    Can I use the parameter file given below to get the table data into the SQL file, or is impdp the only option to load the table data into the database?
    vi test1.par
    USERID="/ as sysdba"
    DIRECTORY=DATA
    dumpfile=expdp_schemas_18MAY2013%S.dmp
    SCHEMAS=USER1,USER2
    LOGFILE=user_dump_data.log
    SQLFILE=user_dump_data.sql
    and impdp parfile=test1.par.

    Hi,
    To explain more about my situation.
    The target database has the dump files, whereas the source database (cloud) doesn't have access to the target database.
    However, I can request the target DB DBA team to run the par files and provide me the output as a SQL file, which I can then take and run in my source database.
    I got the metadata the same way, but is there any way I could get the data from the dump files on the target DB without actually accessing it? My question might sound weird, but please let me know.
    <
    1. export from the source into a dumpfile. You can do this on the source database and then copy the file over to your local database or you can do this from a local database if you have a database link defined on the local database that points to the source database.  In the second case, your dumpfile will be created on your local database.
    >
    How can I access the dump using the database link?
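    A hedged sketch of the database-link route quoted above: impdp can pull data straight from the source database over a link with the NETWORK_LINK parameter, in which case no dump file is involved at all. The link name below is only a placeholder and the other parameters would need to fit your environment:
    # source_db_link is a placeholder for a database link you would create yourself
    USERID="/ as sysdba"
    NETWORK_LINK=source_db_link
    SCHEMAS=USER1,USER2
    LOGFILE=user_netimport.log
    With a parameter file like this, impdp reads the tables from the remote database through the link and loads them directly, so there is no dump file to copy or access on either side.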

  • How can we export table data to a CSV file??

    Hi,
    I have the following requirement: initially the business agreed on exporting the table data to an Excel file, but now they would like to export the table data to a CSV file, which is not supported by the af:exportCollectionActionListener component.
    When I open the exported CSV file, I can see the exported data surrounded by HTML tags. Hence the issue.
    Does someone have a solution for this, i.e. how can we export the table data to CSV format so that it works just like exporting the data to an Excel sheet?
    For your reference, here is the code I have used to export the table data:
    <f:facet name="menus">
      <af:menu text="Menu" id="m1">
        <af:commandMenuItem text="Print" id="cmi1">
          <af:exportCollectionActionListener exportedId="t1"
                                             title="CommunicationDistributionList"
                                             filename="CommunicationDistributionList"
                                             type="excelHTML"/> <!-- I tried removing the value of this attribute; with no value it did not work at all. -->
        </af:commandMenuItem>
      </af:menu>
    </f:facet>
    Thanks & Regards,
    Kiran Konjeti

    Hi Alex,
    I have already visited that POST and it works only in 10g. Not in 11g.
    I got the solution for this. The solution is :
    Use the following code in jsff
    ==================
    <af:commandButton text="Export Data" id="ctb1">
      <af:fileDownloadActionListener contentType="text/csv; charset=utf-8"
                                     filename="test.csv"
                                     method="#{pageFlowScope.pageFlowScopeDemoAppMB.test}"/>
    </af:commandButton>
    OR
    <af:commandButton text="Export Data" id="ctb1">
      <af:fileDownloadActionListener contentType="application/vnd.ms-excel; charset=utf-8"
                                     filename="test.csv"
                                     method="#{pageFlowScope.pageFlowScopeDemoAppMB.test}"/>
    </af:commandButton>
    And place this code in ManagedBean
    ======================
    public void test(FacesContext facesContext, OutputStream outputStream) throws IOException {
        DCBindingContainer dcBindings = (DCBindingContainer) BindingContext.getCurrent().getCurrentBindingsEntry();
        DCIteratorBinding itrBinding = (DCIteratorBinding) dcBindings.get("fetchDataIterator");
        Row[] tableRows = itrBinding.getAllRowsInRange();
        // preparing column headers
        PrintWriter out = new PrintWriter(outputStream);
        out.print(" ID");
        out.print(",");
        out.print("Name");
        out.print(",");
        out.print("Designation");
        out.print(",");
        out.print("Salary");
        out.println();
        // preparing column data
        for (Row row : tableRows) {
            DCDataRow dataRow = (DCDataRow) row;
            DataLoaderDTO dto = (DataLoaderDTO) dataRow.getDataProvider();
            out.print(dto.getId());
            out.print(",");
            out.print(dto.getName());
            out.print(",");
            out.print(dto.getDesgntn());
            out.print(",");
            out.print(dto.getSalary());
            out.println();
        }
        out.flush();
        out.close();
    }
    And apply the following browser settings (*OPTIONAL*), only in case the file is being blocked by IE:
    ==================================================================
    http://ais-ss.usc.edu/helpdoc/main/browser/bris004b.html
    This resolves the implementation of exporting table data to a CSV file in 11g.
    Thanks & Regards,
    Kiran Konjeti

  • ABAP Code for Backup the entire table data in the application server

    Hello Friends,
    I have to back up the entire table data to the application server and also be able to restore the data if needed.
    This should be a dynamic program that works for any table, based on the table name given, with the backup stored on the application server. I have developed a program for this, but it has problems with quantity and amount fields: they are not written correctly at the application server level.
    Any suggestions on this?
    Below is the program for this.
    Thanks,
    Ster.
    * Report  YWMM_TABLE_DUMP                                             *
    REPORT ywmm_table_dump .
    TABLES :
            dd03l.
    * Type spool declaration
    TYPE-POOLS:
            abap, slis.
    DATA: i_table_data1  TYPE REF TO data.
    DATA : it_dd03l LIKE dd03l OCCURS 0 WITH HEADER LINE.
    *DATA : gt_fieldcat TYPE lvc_s_fcat.
    DATA : i_fcat      TYPE STANDARD TABLE OF lvc_s_fcat,
           l_dr_line         TYPE   REF TO data,
           l_v_as4vers       TYPE as4vers.
    FIELD-SYMBOLS: <f_table_data1>     TYPE STANDARD TABLE,
                   <f_wa_table_data1>  TYPE ANY.
    SELECTION-SCREEN: BEGIN OF BLOCK bl1 WITH FRAME TITLE text-001.
    PARAMETERS: rb_copy RADIOBUTTON GROUP map DEFAULT 'X',
                rb_rest RADIOBUTTON GROUP map.
    SELECTION-SCREEN: END   OF BLOCK bl1.
    SELECTION-SCREEN: BEGIN OF BLOCK bl2 WITH FRAME TITLE text-002.
    PARAMETERS: p_table  TYPE tabname OBLIGATORY,
                p_plfld TYPE dd03l-fieldname.
    SELECTION-SCREEN SKIP 1.
    PARAMETERS: p_bkfile TYPE localfile OBLIGATORY.
    SELECTION-SCREEN: END   OF BLOCK bl2.
    PERFORM get_data.
    IF rb_copy = 'X'.
      PERFORM backup.
    ELSEIF rb_rest = 'X'.
      PERFORM database_update.
    ENDIF.
    *&      Form  get_data
    FORM get_data.
      CLEAR   i_fcat.
      REFRESH i_fcat.
      CALL FUNCTION 'LVC_FIELDCATALOG_MERGE'
           EXPORTING
                i_structure_name = p_table  " Table Name
           CHANGING
                ct_fieldcat      = i_fcat
           EXCEPTIONS
                OTHERS           = 1.
      CALL METHOD cl_alv_table_create=>create_dynamic_table
      EXPORTING
        it_fieldcatalog = i_fcat
      IMPORTING
        ep_table        = i_table_data1.
      IF sy-subrc = 0.
        ASSIGN i_table_data1->* TO <f_table_data1>.
      ELSE.
        WRITE: 'Error creating internal table'.
      ENDIF.
      IF rb_copy = 'X'.
        SELECT  * FROM (p_table) INTO CORRESPONDING FIELDS OF
                  TABLE <f_table_data1> UP TO 20 ROWS.
      ELSEIF rb_rest = 'X'.
        CREATE DATA l_dr_line LIKE LINE OF <f_table_data1>.
        ASSIGN l_dr_line->* TO <f_wa_table_data1>.
    *Get Data from Application Server
    * Opening the dataset P_BKFILE given in the selection screen
        TRANSLATE p_bkfile TO LOWER CASE.
        OPEN DATASET p_bkfile FOR INPUT IN TEXT MODE." ENCODING DEFAULT.
        IF sy-subrc NE 0.
    *    MESSAGE:
        ELSE.
          DO.
    * Reading the file from application server
            READ DATASET p_bkfile INTO <f_wa_table_data1>.
            IF sy-subrc = 0.
              APPEND <f_wa_table_data1> TO <f_table_data1>.
            ELSE.
              EXIT.
            ENDIF.
          ENDDO.
    * Closing the dataset
          CLOSE DATASET p_bkfile.
        ENDIF.
      ENDIF.
    ENDFORM.                    " get_data
    *&      Form  backup
    *       text
    *  -->  p1        text
    *  <--  p2        text
    FORM backup.
      TRANSLATE p_bkfile TO LOWER CASE.
      OPEN DATASET p_bkfile FOR OUTPUT IN TEXT MODE.
      IF sy-subrc NE 0.
        WRITE: text-017.
        STOP.
      ELSE.
        LOOP AT <f_table_data1> ASSIGNING <f_wa_table_data1>.
          TRANSFER <f_wa_table_data1> TO p_bkfile.
        ENDLOOP.
      ENDIF.
      CLOSE DATASET p_bkfile.
    ENDFORM.                    " backup
    *&      Form  database_update
    FORM database_update.
      DATA : i_mara_u TYPE STANDARD TABLE OF mara WITH HEADER LINE,
             i_ekpo_u TYPE STANDARD TABLE OF ekpo WITH HEADER LINE,
             i_eban_u TYPE STANDARD TABLE OF eban WITH HEADER LINE,
             i_resb_u TYPE STANDARD TABLE OF resb WITH HEADER LINE,
             i_plpo_u TYPE STANDARD TABLE OF plpo WITH HEADER LINE,
             i_stpo_u TYPE STANDARD TABLE OF stpo WITH HEADER LINE,
             i_vbap_u TYPE STANDARD TABLE OF vbap WITH HEADER LINE,
             i_vbrp_u TYPE STANDARD TABLE OF vbrp WITH HEADER LINE,
             i_lips_u TYPE STANDARD TABLE OF lips WITH HEADER LINE,
             i_afvc_u TYPE STANDARD TABLE OF afvc WITH HEADER LINE,
             i_asmd_u TYPE STANDARD TABLE OF asmd WITH HEADER LINE,
    *           i_cooi_u TYPE STANDARD TABLE OF cooi WITH HEADER LINE,
             i_qmel_u TYPE STANDARD TABLE OF qmel WITH HEADER LINE,
             i_cooi_u TYPE STANDARD TABLE OF cooi WITH HEADER LINE,
             i_esll_u TYPE STANDARD TABLE OF esll WITH HEADER LINE,
             i_t165_u  TYPE STANDARD TABLE OF t165 WITH HEADER LINE,
             i_t165e_u TYPE STANDARD TABLE OF t165e WITH HEADER LINE,
             i_twpko_u TYPE STANDARD TABLE OF twpko WITH HEADER LINE,
             i_tpext_u TYPE STANDARD TABLE OF tpext WITH HEADER LINE,
             i_ce4mxpa_u TYPE STANDARD TABLE OF ce4mxpa WITH HEADER LINE,
             i_ce4mxpa_acct_u TYPE STANDARD TABLE OF ce4mxpa_acct WITH
                                                             HEADER LINE,
             i_zaim_u  TYPE STANDARD TABLE OF zaim WITH HEADER LINE,
             i_s012_d TYPE STANDARD TABLE OF s012 WITH HEADER LINE,
             i_s012_i TYPE STANDARD TABLE OF s012 WITH HEADER LINE,
             i_dummy  TYPE STANDARD TABLE OF mara.
      CASE p_table.
        WHEN 'MARA'.
    *     Non-Key
          PERFORM move_to_table USING   <f_table_data1>
                                CHANGING i_mara_u[]
                                         i_mara_u.
          PERFORM update_table USING i_mara_u[].
      ENDCASE.
    ENDFORM.                    " database_update
    *&      Form  move_to_mara
    FORM move_to_table USING    p_tab_from TYPE STANDARD TABLE
                       CHANGING p_tab_to   TYPE STANDARD TABLE
                                p_w_table.
      DATA:  l_wa_fcat TYPE lvc_s_fcat.
      FIELD-SYMBOLS: <f_field_from> TYPE ANY,
                     <f_field_to>   TYPE ANY.
      LOOP AT p_tab_from ASSIGNING <f_wa_table_data1>.
        LOOP AT i_fcat INTO l_wa_fcat.
          ASSIGN COMPONENT l_wa_fcat-fieldname
         OF STRUCTURE <f_wa_table_data1> TO <f_field_from>.
          ASSIGN COMPONENT l_wa_fcat-fieldname
         OF STRUCTURE p_w_table TO <f_field_to>.
          <f_field_to> = <f_field_from>.
        ENDLOOP.
        APPEND p_w_table TO p_tab_to.
      ENDLOOP.
    ENDFORM.                    " move_to_mara
    *&      Form  update_table
    FORM update_table  USING p_table_update TYPE STANDARD TABLE.
      SELECT SINGLE *
        FROM dd03l
       WHERE fieldname = p_plfld
         AND tabname   = p_table
         AND keyflag   <> 'X'
         AND as4local = 'A'
         AND   as4vers = l_v_as4vers
         AND   ( comptype = 'E' OR comptype = space ).
      IF sy-subrc = 0.
    *   Do update
        IF NOT p_table_update IS INITIAL.
          UPDATE (p_table) FROM TABLE p_table_update.
          IF sy-subrc = 0.
            COMMIT WORK.
          ELSE.
            ROLLBACK WORK.
            WRITE: text-003.
            STOP.
          ENDIF.
        ENDIF.
      ELSE.
    *delete and insert.
        IF NOT p_table_update IS INITIAL.
    *      DELETE (p_table).
          IF sy-subrc = 0.
            INSERT (p_table) FROM TABLE p_table_update.
            IF sy-subrc = 0.
              COMMIT WORK.
            ELSE.
              ROLLBACK WORK.
              WRITE: text-018.
              STOP.
            ENDIF.
          ELSE.
            ROLLBACK WORK.
            WRITE: text-018.
            STOP.
          ENDIF.
        ENDIF.
      ENDIF.
    ENDFORM.                    " update_table
    Edited by: Julius Bussche on Jul 18, 2008 1:43 PM
    Please use a meaningful subject title!

    ARS,
    I am struggling a bit to get this.
    There is a syntax error:
    Field "FIELDS_INT-TYPE" is unknown. It is neither in one of the specified tables nor defined by a "DATA" statement.
    Also, you have asked me to move the data to a different table. What is that table and how do I build it?
        LOOP AT <f_table_data1> ASSIGNING <f_wa_table_data1>.
          LOOP AT i_fcat INTO l_fcat.
            IF l_fcat-inttype EQ 'P'.
              ASSIGN COMPONENT l_fcat-fieldname
                  OF STRUCTURE <f_wa_table_data1> TO <f_field>
                  TYPE     fields_int-type
                  DECIMALS fields_int-decimals.
            ELSE.
              ASSIGN COMPONENT l_fcat-fieldname
                  OF STRUCTURE <f_wa_table_data1> TO <f_field>
                  TYPE     fields_int-type.
            ENDIF.
            " Move <f_field> to a new table and use this table for download
          ENDLOOP.
          TRANSFER <f_wa_table_data1> TO p_bkfile.
        ENDLOOP.
    Ster

  • Exporting Table data to delimited txt file

    I am trying to export table data into a delimited text file. I am running the following code from Tom Kyte's website on the server. When I run the procedure (after creating the function), it does not create any .dat file for me.
    I have also checked the parameter utl_file_dir in the database, and it is set to the correct path. Is there anything that I am missing?
    any suggestions/inputs would help.
    create or replace function dump_csv( p_query     in varchar2,
                                         p_separator in varchar2 default ',',
                                         p_dir       in varchar2,
                                         p_filename  in varchar2 )
    return number
    AUTHID CURRENT_USER
    is
        l_output      utl_file.file_type;
        l_theCursor   integer default dbms_sql.open_cursor;
        l_columnValue varchar2(2000);
        l_status      integer;
        l_colCnt      number default 0;
        l_separator   varchar2(10) default '';
        l_cnt         number default 0;
    begin
        l_output := utl_file.fopen( p_dir, p_filename, 'w' );
        dbms_sql.parse( l_theCursor, p_query, dbms_sql.native );
        -- describe up to 255 columns; ORA-01007 signals that there are no more columns
        for i in 1 .. 255 loop
            begin
                dbms_sql.define_column( l_theCursor, i, l_columnValue, 2000 );
                l_colCnt := i;
            exception
                when others then
                    if ( sqlcode = -1007 ) then
                        exit;
                    else
                        raise;
                    end if;
            end;
        end loop;
        dbms_sql.define_column( l_theCursor, 1, l_columnValue, 2000 );
        l_status := dbms_sql.execute( l_theCursor );
        loop
            exit when ( dbms_sql.fetch_rows( l_theCursor ) <= 0 );
            l_separator := '';
            for i in 1 .. l_colCnt loop
                dbms_sql.column_value( l_theCursor, i, l_columnValue );
                utl_file.put( l_output, l_separator || l_columnValue );
                l_separator := p_separator;
            end loop;
            utl_file.new_line( l_output );
            l_cnt := l_cnt + 1;
        end loop;
        dbms_sql.close_cursor( l_theCursor );
        utl_file.fclose( l_output );
        return l_cnt;
    end dump_csv;
    /
    create or replace procedure test_dump_csv
    as
        l_rows number;
    begin
        l_rows := dump_csv( 'select * from doctype where rownum < 25',
                            ',', '/tmp', 'test.dat' );
    end;
    /

    Hi,
    It works for me...
    SQL> select dump_csv('select * from dba_users',',','/tmp','teste.txt') from dual;
    DUMP_CSV('SELECT*FROMDBA_USERS',',','/TMP','TESTE.TXT')
                                                       149
    SQL> show parameter utl_file_dir
    NAME                                 TYPE        VALUE
    utl_file_dir                         string      /tmp
    oracle@icaro:/tmp> ls -l teste.txt
    -rw-r--r--  1 rps users 394353 2006-07-31 17:40 teste.txt
    Just for information:
    I'm using SUSE LINUX 10 and Oracle 10g 10.1.0.2
    Cheers
    Message was edited by:
    Legatti
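    A side note, sketched with a made-up grantee: instead of relying on the utl_file_dir parameter, which is static and needs an instance restart to change, UTL_FILE also accepts the name of a directory object, so the same function can be pointed at a directory created like this:
    CREATE OR REPLACE DIRECTORY dump_dir AS '/tmp';
    GRANT READ, WRITE ON DIRECTORY dump_dir TO scott;   -- 'scott' is only a placeholder user
    -- then call the function with the directory name instead of the OS path:
    -- select dump_csv('select * from dual', ',', 'DUMP_DIR', 'test.dat') from dual;
    UTL_FILE.FOPEN accepts either a path listed in utl_file_dir or a directory object name, so dump_csv itself does not need to change.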

  • Exporting table data to a CSV file

    hello everyone,
    I need help on exporting table data to a CSV file.
    I have a div element containing the table, which displays some data.
    There is an "Export" button just above the table which, when clicked, should export the current data in the table to a .csv file.
    Can this be done in JSP, or do I need to use a servlet?
    Also, how can I submit the table data only (as my web page contains other data as well)?
    Can anyone provide some sort of sample code?
    thanks in advance!!

    Oops, I just forgot:
    when the user clicks on the export button, I want to greet him with a "Save As"
    popup, so that he can specify the filename and location to save the file.
    thanks!!

  • How to  fetch the relational  data from the xml file registered in xdb

    Hi,
    I have to register the XML file into the XDB repository, and then I have to fetch the data of the XML file as a relational structure through a SELECT statement.
    I used the code below to register the XML file in XDB:
    DECLARE
      v_return BOOLEAN;
    BEGIN
      v_return := DBMS_XDB.CREATERESOURCE(
                    abspath => '/public/demo/xml/db_objects.xml',
                    data    => BFILENAME('XML_DIR', 'db_objects.xml')
                  );
      COMMIT;
    END;
    /
    Now I have to fetch the values in the XML file as relational data.
    Is that possible?
    Can anyone help me?
    Regards,
    suresh.

    When you transform your XML data to an XMLTYPE, you can do something like this, for example:
    select
    extractvalue(value(p),'/XMLRecord/Session_Id') session_id,
    extractvalue(value(p),'/XMLRecord/StatementId') StatementId,
    extractvalue(value(p),'/XMLRecord/EntryId') EntryId
    from
    table(xmlsequence(extract(xmltype('
    <XMLdemo>
    <FormatModifiers><FormatModifier>UTFEncoding</FormatModifier></FormatModifiers>
    <XMLRecord>
    <Session_Id>117715</Session_Id>
    <StatementId>6</StatementId>
    <EntryId>1</EntryId>
    </XMLRecord>
    </XMLdemo>
    '),'/XMLdemo/*'))) p
    where extractvalue(value(p),'/XMLRecord/Session_Id') is not null;
    For this sample I've put readable XML in plain text and converted it to XMLTYPE, so you can run it on your own database.
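    A hedged alternative, assuming 10gR2 or later: the same relational projection can be written with XMLTABLE, which is usually easier to read than extract/xmlsequence. Using the same inline sample document:
    select x.session_id, x.statement_id, x.entry_id
    from   xmltable('/XMLdemo/XMLRecord'
                    passing xmltype('
    <XMLdemo>
    <FormatModifiers><FormatModifier>UTFEncoding</FormatModifier></FormatModifiers>
    <XMLRecord>
    <Session_Id>117715</Session_Id>
    <StatementId>6</StatementId>
    <EntryId>1</EntryId>
    </XMLRecord>
    </XMLdemo>
    ')
                    columns session_id   number path 'Session_Id',
                            statement_id number path 'StatementId',
                            entry_id     number path 'EntryId') x;
    The same COLUMNS clause works when the XMLTYPE comes from the registered resource rather than an inline literal.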

  • I have a VI and an attched .txt data file. Now I want to read the data from the .txt file and display it as an array in the front panel. But the result is not right. Any help?

    I have a VI and an attched .txt data file. Now I want to read the data from the .txt file and display it as an array in the front panel. But the result is not right. Any help?
    Attachments:
    try2.txt ‏2 KB
    read_array.vi ‏21 KB

    The problem is in the delimiters in your text file. By default, Read From Spreadsheet File.vi expects a tab delimited file. You can specify a delimiter (like a space), but Read From Spreadsheet File.vi has a problem with repeated delimiters: if you specify a single space as a delimiter and Read From Spreadsheet File.vi finds two spaces back-to-back, it stops reading that line. Your file (as I got it from your earlier post) is delimited by 4 spaces.
    Here are some of your choices to fix your problem.
    1. Change the source file to a tab delimited file. Your VI will then run as is.
    2. Change the source file to be delimited by a single space (rather than 4), then wire a string constant containing one space to the delimiter input of Read From Spreadsheet File.vi.
    3. Wire a string constant containing 4 spaces to the delimiter input of Read From Spreadsheet File.vi. Then your text file will run as is.
    Depending on where your text file comes from (see more comments below), I'd vote for choice 1: a tab delimited text file. It's the most common text output of spreadsheet programs.
    Comments for choices 1 and 2: Where does the text file come from? Is it automatically generated or manually generated? Will it be generated multiple times or just once? If it's manually generated or generated just once, you can use any text editor to change 4 spaces to a tab or to a single space. Note: if you want to change it to a tab delimited file, you can't enter a tab directly into a box in the search & replace dialog of many programs like notepad, but you can do a cut and paste. Before you start your search and replace (just in the text window of the editor), press tab. A tab character will be entered. Press Shift-LeftArrow (not Backspace) to highlight the tab character. Press Ctrl-X to cut the tab character. Start your search and replace (Ctrl-H in notepad in Windows 2000). Click into the Find What box. Enter four spaces. Click into the Replace With box. Press Ctrl-V to paste the tab character. And another thing: older versions of notepad don't have search and replace. Use any editor or word processor that does.

  • Create a Purchase order using the BAPI using the data in the XML file.

    Hello Gurus,
    Here is the scenario; can anyone help me with how to proceed, explaining the procedure?
    Create a Purchase order using the BAPI using the data in the XML file.
    comprehensive explanations are appreciated.
    thanks in advance.

    hi,
      first use fm "bapi_po_create".
      then use fm "BAPI_ACC_GL_POSTING_POST"
    The demo environment was made with real business scenario in mind, but following subjects need to be addressed in a live implementation:
    •     No exception and error handling is implemented, except for order rejection (e.g. partial delivery);
    •     In Navision, both XML Ports and the XML DOM have been used to integrate with SAP XI, because XML Ports have some drawbacks regarding namespaces in XML documents (mandatory in SAP XI);
    •     A minimum of SAP and Navision customization is required to implement this solution. (e.g. user exit in SAP, Navision XML DOM).

  • Export table data in a flat file without using FL

    Hi,
    I am looking for options to export table data into a flat file without using FL (File Layout), i.e. by using an App Engine program only.
    Please share your experience if you have done anything like this.
    Thanks

    A simple way to export any record (table/view) to a CSV file is to create a rowset and loop through all record fields, as in the example code below:
    Local Rowset &RS;
    Local Record &Rec;
    Local File &MYFILE;
    Local string &FileName, &strRecName, &Line, &Seperator, &Value;
    Local number &numRow, &numField;
    &FileName = "c:\temp\test.csv";
    &strRecName = "PSOPRDEFN";
    &Seperator = ";";
    &RS = CreateRowset(@("Record." | &strRecName));
    &RS.Fill();
    &MYFILE = GetFile(&FileName, "W", %FilePath_Absolute);
    If &MYFILE.IsOpen Then
       For &numRow = 1 To &RS.ActiveRowCount
          &Rec = &RS(&numRow).GetRecord(@("RECORD." | &strRecName));
          For &numField = 1 To &Rec.FieldCount
             &Value = String(&Rec.GetField(&numField).Value);
             If &numField = 1 Then
                &Line = &Value;
             Else
                &Line = &Line | &Seperator | &Value;
             End-If;
          End-For;
          &MYFILE.WriteLine(&Line);
       End-For;
    End-If;
    &MYFILE.Close();
    You can of course create an application class to call this piece of code generically.
    Hope it helps.
    Note:
    Do not come complaining to me on performance issues ;)

  • How to read data from the excel file using java code.

    Hi to all,
    I am using the code below to get the data from the Excel file, but I can't get the corresponding data from the specific file. Can anyone give me the correct code to do that? I will be waiting for your useful reply.
    Thanks in advance.
    import java.io.*;
    import java.sql.*;
    public class sample {
        public static void main(String[] args) {
            Connection connection = null;
            try {
                Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
                Connection con = DriverManager.getConnection("jdbc:odbc:Mydsn", "", "");
                Statement st = con.createStatement();
                ResultSet rs = st.executeQuery("Select * from [Sheet1$]");
                System.out.println("sample:" + rs);
                ResultSetMetaData rsmd = rs.getMetaData();
                System.out.println("samplersd:" + rsmd);
                int numberOfColumns = rsmd.getColumnCount();
                System.out.println("numberOfColumns:" + numberOfColumns);
                while (rs.next()) {
                    System.out.println("sample1:" + rs);
                    for (int i = 1; i <= numberOfColumns; i++) {
                        if (i > 1) System.out.print(", ");
                        String columnValue = rs.getString(i);
                        System.out.print(columnValue);
                    }
                    System.out.println("");
                }
                st.close();
                con.close();
            } catch (Exception ex) {
                System.err.print("Exception: ");
                System.err.println(ex.getMessage());
            }
        }
    }

    1: What is the name of the excel sheet?
    2: What is printed in this program? null? anything? error?
    The Excel file name is "sample.xls". I set up the Excel file connectivity in my JDBC driver (DSN). In my program I am not giving the Excel file name, only the Excel sheet name, as follows:
    ResultSet rs = st.executeQuery( "Select * from [Sheet1$]" );
    The output of this program is given below.
    sample:sun.jdbc.odbc.JdbcOdbcResultSet@1b67f74
    samplersd:sun.jdbc.odbc.JdbcOdbcResultSetMetaData@530daa
    numberOfColumns:2

  • How can I show the last amend date of the .rpt file on my report?

    I want to show the last amendment date of the .rpt file on my report. 
    Modification date and time refer to the actual report rather than the .rpt file.
    Does anyone have any suggestions?  I am using version 11.2.
    Thanks,
    Anne-Marie

    annemarier,
    Have you tried the "Modification Date" or "Modification Time" fields located under the Special Fields section of the Field Explorer?
    I've never used them, but they seem like good candidates.
    Jason

  • How to edit the existing data in the XML file from java programming.

    Hi all
    I am able to create an XML file with the sample data below from Java.
    I need sample code on how to edit the existing data in the XML file.
    for example
    <?xml version="1.0"?>
       <mydata>
               <data1>
                         <key1>467</key1>
                        <name1>Paul</name1>
                        <id1>123</id1>
              </data1>
              <data2>
                         <key2>467</key2>
                        <name2>Paul</name2>
                        <id2>123</id2>
              </data2>
        </mydata>
    I am able to insert the data into the XML.
    Now I need sample code on how to modify, from Java, the data in the above XML file for the key2, name2 and id2 tags only. I want to keep the data in all the remaining tags the same; only key2, name2 and id2 should be modified from the Java code.
    Regards
    Sunil
    [points will be always rewardable]

    Hi,
    You need a parser to validate and read the XML file from Java code. For this you need xml4j.jar, which you can download from here:
    http://www.alphaworks.ibm.com/tech/xml4j
    Or you can use SAX (Simple API for XML).
    Some sample applications for this:
    http://www.java-tips.org/java-se-tips/javax.xml.parsers/how-to-read-xml-file-in-java.html
    http://www.developertutorials.com/tutorials/java/read-xml-file-in-java-050611/page1.html
    http://www.xml-training-guide.com/e-xml44.html
    Let me know if you need any other info.
    bvr

  • How to upload data in the pdf file to the R/3 with fileupload element ?....

    Hi Experts,
    I need to upload the data in the pdf file to the R/3 with fileupload element.
    But I am not able to get the correct data (it comes out as garbled characters) from the PDF file when I debug the program. However, I am able to get the data if it is in a txt or Excel file.
    Could you give me some hints on this problem?
    Thanks & Regards,
    Tao

    Hi, experts,
    The version of R/3 : ABAP: 10, BASIS:11.
    Best regards,
    tao

  • How do read the data of the excel file line by line to the waveform?

    Hello:
    I am a beginner, so I hope you can give me a small example. The data of the Excel file is given in the attachment.
    Looking forward to your replies!
    Solved!
    Go to Solution.
    Attachments:
    数据.xls ‏14 KB

    try this. best of luck
    Gaurav k
    CLD Certified !!!!!
    Do not forget to Mark solution and to give Kudo if problem is solved.
    Attachments:
    Read column.zip ‏26 KB
