How to Create a Table Creation Script from an Export File

Product: ORACLE SERVER
Date written: 2002-04-08
How to Create a Table Script from an Export File
================================================
PURPOSE
The following describes how to build a script that recreates tables from an export file.
Explanation
$ strings EXPDAT.DMP | grep 'CREATE TABLE' >> tables.sql
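A related sketch (assuming the dump was written by the classic exp utility, that the imp utility is available, and using a placeholder scott/tiger account) is to let imp itself write the DDL without loading any rows:

$ imp scott/tiger file=EXPDAT.DMP full=y indexfile=tables.sql
# tables.sql receives the CREATE TABLE statements as REM comments plus the
# CREATE INDEX statements; strip the REM prefixes to obtain runnable DDL.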
Reference Documents
--------------------

Similar Messages

  • BI 7 : Command to export a table structure of SAP R/3 into a script/text ?

    Hi All.
    Greetings.
    I am new to the SAP R/3 system and would appreciate some help.
    We are trying to pull data from SAP R/3 through Business Objects Data Services into Oracle.
    For now, we create a target Oracle table by looking at the table structure of SAP R/3 in SE11.
    In BODS, we then do the query transformation and use the Oracle target table created by us manually.
    This works absolutely fine.
    We would like to know the command by which we could export the table structure of any existing table
    in SAP R/3 into a script or text file,
    which we could use to create the same table structure in Oracle,
    rather than manually typing some 200 field names for each table.
    Can anyone advise on this.
    Thanks
    Indu

    Hello,
    The problem is caused by the spaces in your directory names:
    C:\SAP Dumps\Core Release SR1 Export_CD1_51019634/DB/ADA/DBSIZE.XML
    Replace the spaces with underscores and restart the installation from scratch.
    Cheers
    Bert

  • [CS3]How to export a table in InDesign to an excel file?

    Hi,
    Can anyone tell me how to script the process of exporting a table to an Excel file in InDesign CS3 using JavaScript?
    Thanks in advance.
    myRiaz

    Sorry, no JavaScript, but here are some lines from a localization tool that I made in VB. I simply loop through the rows and columns and use the Excel object model to fill the Excel cells.
    Of course you can read amended/localized Excel files back into InDesign the same way, provided that the tables match (number of rows and columns).
    Afterwards you could loop through the characters of each cell to apply the formatting that you want to keep in Excel, such as superscripts.
    Hope it helps you.
    ' Assumes myTable already references an InDesign table object;
    ' xlVAlignTop comes from the Excel type library (add a reference, or
    ' declare the constant yourself when using late binding).
    Set myExcel = CreateObject("Excel.Application")
    myExcel.Visible = True
    Set myTableBook = myExcel.Workbooks.Add
    Set myTableSheet = myTableBook.Worksheets.Item(1)
    myTableSheet.Columns.ColumnWidth = 35
    myTableSheet.Cells.VerticalAlignment = xlVAlignTop
    myTableSheet.Cells.WrapText = True
    ' Copy every InDesign cell into the matching Excel cell
    For R = 1 To myTable.Rows.Count
        Set myTableRow = myTable.Rows.Item(R)
        For C = 1 To myTableRow.Cells.Count
            Set myTableCell = myTableRow.Cells.Item(C)
            If Len(myTableCell.Contents) = 0 Then
                myTableSheet.Cells(R, C) = ""
            Else
                myTableSheet.Cells(R, C) = myTableCell.Contents
                ' Clean up InDesign special-character codes and line endings
                myTableSheet.Cells(R, C).Value = Replace(myTableSheet.Cells(R, C).Value, "1397058884", "—")
                myTableSheet.Cells(R, C).Value = Replace(myTableSheet.Cells(R, C).Value, Chr(13), Chr(10))
            End If
        Next C
    Next R
    good luck
    TonyT 

  • Export with Table Splitting : ORA-01115: IO error reading block from file

    Hello,
    We are performing the last dry run of our CU&UC conversion.
    We are now in the process of exporting the ECC6 system (Oracle 10.2.0.4.0, HP-UX ia64) using the sapinst feature "table splitting preparation".
    When doing so, we are facing critical errors:
    Creating file /export_uni/sapinst_splitting/ora_query3_tmp3_1.sql.
    ERROR 2010-08-11 10:27:28.881
    CJS-00084  SQL statement or script failed. DIAGNOSIS: Error message: ORA-12801: error signaled in parallel query server P002
    ORA-01115: IO error reading block from file 90 (block # 16640)
    ORA-27072: File I/O error
    HPUX-ia64 Error: 22: Invalid argument
    Additional information: 4
    Additional information: 16640
    Additional information: -1
    ORA-01115: IO error reading block from file 90 (block # 16640)
    ORA-27072: File I/O error
    HPUX-ia64 Error: 22: Invalid argument
    ORA-06512: at "SAPR3.TABLE_SPLITTER", line 775
    ORA-06512: at line 1
    I have therefore performed a dbverify; no corruption was reported.
    When trying to perform the export without table splitting, it works fine, but the processing time is extremely long, as you can imagine. Any help would be highly appreciated. Regards,
    Raoul

    Thank you Stefan,
    Our HP-UX release does indeed seem to be 11v3:
    [root@:/root]# uname -a
    HP-UX B.11.31 U ia64 2566039091 unlimited-user license
    I'll check the installation of the  patch and keep you informed
    Thank you
    Raoul
    Edited by: Raoul Shiro on Aug 11, 2010 11:57 AM

  • Taking Table script back up in text file

    Hi:
    One of my customers has Oracle 8 running on IBM AIX. I need to take a backup of their table scripts in the form of a text file. Could anyone please walk me through the
    commands on IBM AIX for the above?
    Thanks - Prabir.

    What version of Oracle are they running? "Oracle 8" could refer to anything from 8.0.3 to 8.1.7.
    What is a "table script backup"? The DDL to recreate their tables? Or something else?
    Does it really need to be a human readable text file? Or would a binary export file be sufficient?
    Justin
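    For the Oracle 8 on AIX case above (which predates DBMS_METADATA), one sketch, assuming a conventional export of the schema is acceptable and using placeholder credentials and schema names, is a rows-free export followed by the strings/grep technique from the top of this note:

    $ exp scott/tiger file=expdat.dmp owner=scott rows=n      # metadata only, no row data
    $ strings expdat.dmp | grep 'CREATE TABLE' > tables.sql   # pull the table DDL into a text file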

  • The export file from a calc script - naming and date/time stamp

    Here is a simple calc script to export data.
    2 questions:
    1. Is there an easy way to add a date/time stamp to the name of the data file so that it does not overwrite the older ones every time?
    2. I tried to specify the path so that it writes to another server: "\\mfldappp011\E:\Exp_AW.txt". It's not working. How should I specify the path?
    Fix (@Relative("Yeartotal",0),"Actual","Working",&ActualYear);
    Dataexport "file" "," "C:\Exp_AW.txt" "#MI";
    EndFix;
    Edited by: user9959627 on Sep 7, 2012 11:25 AM

    Probably easiest to call the MaxL script from a command-line script, then rename the exported file to include the time stamp and copy/move it to a location, as sketched below.
    Cheers
    John
    http://john-goodwin.blogspot.com/
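    A minimal sketch of such a wrapper, assuming a Unix host, the MaxL shell (essmsh) on the path, and placeholder names for the MaxL script, the export file, and a mounted destination share:

    #!/bin/sh
    # Run the MaxL script that executes the calc script containing DATAEXPORT
    essmsh export.msh
    # Stamp the exported file so that earlier runs are not overwritten
    stamp=`date +%Y%m%d_%H%M%S`
    mv /export/Exp_AW.txt /export/Exp_AW_$stamp.txt
    # Copy the stamped file to the other server (here a mounted share)
    cp /export/Exp_AW_$stamp.txt /mnt/mfldappp011/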

  • RE: How to Export the Table data Into PDF File  in ADF

    Hi Experts,
    I am using Jdeveloper 11.1.2.3.0
    I created an employee VO and dragged and dropped it as a table on a page. Now I need to export the table data into a PDF file.
    Please give me some suggestions regarding this scenario.
    With Regards,
    satish

    Hi guys,
    Are there any more answers to this question?
    Please find my jsff below:
    <?xml version='1.0' encoding='UTF-8'?>
    <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1" xmlns:af="http://xmlns.oracle.com/adf/faces/rich"
              xmlns:f="http://java.sun.com/jsf/core" xmlns:report="http://www.adfwithejb.blogspot.com">
      <af:panelGroupLayout layout="vertical" id="pgl2">
          <af:query id="qryId1" headerText="Service Tariff Mapping Details" disclosed="true"
                    value="#{bindings.findByTarifValidFromQuery.queryDescriptor}"
                    model="#{bindings.findByTarifValidFromQuery.queryModel}"
                    queryListener="#{reportWiseInvoiceBean.genericQueryListener}"
                    queryOperationListener="#{bindings.findByTarifValidFromQuery.processQueryOperation}"
                    resultComponentId="pc1::t2">
         <f:attribute name="queryExpression" value="bindings.findByTarifValidFromQuery.processQuery"/>
                          </af:query>
        <af:panelCollection id="pc1" styleClass="AFStretchWidth">
          <f:facet name="menus"/>
          <f:facet name="toolbar">
              <af:toolbar id="t1">
                 <af:menuBar id="pt_m1">
                <report:reportDeclarative ButtonName="ExportToExcel" ReportName="ServiceTariffMappingDetails"
                                          ReportType="PDF" TableId=":::pc1:t2" id="rd1" Pagination="true"/>
                <af:commandButton text="excel" id="cb1" binding="#{exportToExcelBean.exportID}">
                <af:setActionListener from="pt1:pgl1:pgl2:pc1:t2" to="#{viewScope['exporter.exportedId']}"/>
                <af:setActionListener from="border:1px solid #cccccc" to="#{viewScope['exporter.thStyle']}"/>
                <af:setActionListener from="border:1px solid #cccccc" to="#{viewScope['exporter.tdStyle']}"/>
                <af:fileDownloadActionListener method="#{exportToExcelBean.exportToExcel}" filename="Service TariffMapping.xls"
                                                 contentType="text/excel;chatset=UTF-8;"/>
                </af:commandButton>
                <af:commandMenuItem id="pt_cmi133" icon="/images/common/Excel-icon.png"
                                                shortDesc="ExportToExcel"
                                >
                                <af:exportCollectionActionListener exportedId="t2" type="excelHTML"
                                                                   title="Service Tariff Mapping"
                                                                   filename="Service Tariff Mapping.xls"/>
                            </af:commandMenuItem></af:menuBar>
              </af:toolbar>
          </f:facet>
          <f:facet name="statusbar"/>
          <af:table value="#{bindings.ServiceTariffMappingDtlsRVO1.collectionModel}" var="row"
                    rows="#{bindings.ServiceTariffMappingDtlsRVO1.rangeSize}"
                    emptyText="#{bindings.ServiceTariffMappingDtlsRVO1.viewable ? 'No data to display.' : 'Access Denied.'}"
                    fetchSize="#{bindings.ServiceTariffMappingDtlsRVO1.rangeSize}" rowBandingInterval="0"
                    filterModel="#{bindings.findByTarifValidFromQuery.queryDescriptor}"
                    queryListener="#{bindings.findByTarifValidFromQuery.processQuery}" filterVisible="true" varStatus="vs"
                    id="t2" columnStretching="last" binding="#{ServiceTariffMappBean.testTable}">
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.label}"
                       id="c1">
              <af:inputText value="#{row.bindings.NormalTariffCode.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.tooltip}" id="it1">
                <f:validator binding="#{row.bindings.NormalTariffCode.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.label}"
                       id="c2">
              <af:inputText value="#{row.bindings.ServiceCode.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.tooltip}" id="it2">
                <f:validator binding="#{row.bindings.ServiceCode.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.label}" id="c3">
              <f:facet name="filter">
                <af:inputDate value="#{vs.filterCriteria.TrfVldFrm}" id="id1">
                  <af:convertDateTime pattern="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.format}"/>
                </af:inputDate>
              </f:facet>
              <af:inputDate value="#{row.bindings.TrfVldFrm.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.displayWidth}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.tooltip}" id="id2">
                <f:validator binding="#{row.bindings.TrfVldFrm.validator}"/>
                <af:convertDateTime pattern="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.format}"/>
              </af:inputDate>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.label}"
                       id="c4">
              <af:inputText value="#{row.bindings.ServiceDesc.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.tooltip}" id="it3">
                <f:validator binding="#{row.bindings.ServiceDesc.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.label}" id="c5">
              <af:inputText value="#{row.bindings.OtTrfCode.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.tooltip}" id="it4">
                <f:validator binding="#{row.bindings.OtTrfCode.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.label}" id="c6">
              <af:inputText value="#{row.bindings.OtUnitRate.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.tooltip}" id="it5">
                <f:validator binding="#{row.bindings.OtUnitRate.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.label}" id="c7">
              <af:inputText value="#{row.bindings.NtUnitRate.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.tooltip}" id="it6">
                <f:validator binding="#{row.bindings.NtUnitRate.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.label}" id="c8">
              <af:inputText value="#{row.bindings.TrfGrt.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.tooltip}" id="it7">
                <f:validator binding="#{row.bindings.TrfGrt.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.label}"
                       id="c9">
              <af:inputText value="#{row.bindings.ChargePartyCode.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.tooltip}" id="it8">
                <f:validator binding="#{row.bindings.ChargePartyCode.validator}"/>
              </af:inputText>
            </af:column>
          </af:table>
        </af:panelCollection>
      </af:panelGroupLayout>
    </jsp:root>

  • Xml file to table script

    Hi all,
    I would be thankful and grateful if someone could help me create a table script.
    I have an XML file with PK and FK relationship.
    I also have a DTD file.
    I want to create tables in Oracle Database from XML or DTD file.
    Thanks in Advance
    Saaz

    You might want to take a look at registerSchema (http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_xmlsch.htm#sthref9696) with genTables set to TRUE. This would let you use the DTD to create a table structure within Oracle. This may or may not be what you are looking for but you didn't provide many details.
    That's 10.2 documentation so use the edition for your version of Oracle
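    A hedged sketch of that call follows: registerSchema expects an XML Schema rather than a DTD, so the DTD would first have to be converted to an XSD. The schema document, URL, and account below are placeholders, and GENTABLES => TRUE asks Oracle to generate the default tables; check the registerSchema overloads in the documentation for your release.

    $ sqlplus scott/tiger
    SQL> BEGIN
           DBMS_XMLSCHEMA.registerSchema(
             SCHEMAURL => 'mytables.xsd',
             SCHEMADOC => '<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
                             <xs:element name="EMP">
                               <xs:complexType><xs:sequence>
                                 <xs:element name="EMPNO" type="xs:integer"/>
                                 <xs:element name="ENAME" type="xs:string"/>
                               </xs:sequence></xs:complexType>
                             </xs:element>
                           </xs:schema>',
             GENTABLES => TRUE);
         END;
         /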

  • Exporting R3 tables into Flat Files for BPC

    Dear BPC experts,
    I understand currently the way for BPC to extract data from R3 is via flat files. I have the following queries:
    1) What exactly are the T codes and steps required to export R3 tables into flat files (without going through the OpenHub in BI7)? Can this process be automated?
    2) Is Data Manager of BPC equivalent to SSIS (Integration Services) of SQL Server?
    Please advise. Thanks!!
    SJ

    Hi Soong Jeng,
    I would take a look at the existing BI Extractors for the answer to Q1, I am working on finishing up a HTG regarding this. Look for it very soon.
    Here is the code to dump out data from a BI Extractor directly from ERP.
    You need dev permissions in your ERP system and access to an app server folder. ...Good Luck
    *& Report  Z_EXTRACTOR_TO_FILE                                         *
    report  z_extractor_to_file                     .
    type-pools:
      rsaot.
    parameters:
      p_osrce type roosource-oltpsource,
      p_filnm type rlgrap-filename,
      p_maxsz type rsiodynp4-maxsize default 100,
      p_maxfc type rsiodynp4-calls default 10,
      p_updmd type rsiodynp4-updmode default 'F'.
    data:
      l_lines_read type sy-tabix,
      ls_select type rsselect,
      lt_select type table of rsselect,
      ls_field type rsfieldsel,
      lt_field type table of rsfieldsel,
      l_quiet type rois-genflag value 'X',
      l_readonly type rsiodynp4-readonly value 'X',
      l_desc_type type char1 value 'M',
      lr_data type ref to data,
      lt_oltpsource type rsaot_s_osource,
      l_req_no type rsiodynp4-requnr,
      l_debugmode type rsiodynp4-debugmode,
      l_genmode type rois-genflag,
      l_columns type i,
      l_temp_char type char40,
      l_filename    like rlgrap-filename,
      wa_x030l      like x030l,
      tb_dfies      type standard table of dfies,
      wa_dfies      type dfies,
      begin of tb_flditab occurs 0,
    *   field description
        fldname(40)  type c,
      end of tb_flditab,
      ls_flditab like line of tb_flditab,
      l_file type string.
    field-symbols:
      <lt_data> type standard table,
      <ls_data> type any,
      <ls_field> type any.
    call function 'RSA1_SINGLE_OLTPSOURCE_GET'
      exporting
        i_oltpsource   = p_osrce
      importing
        e_s_oltpsource = lt_oltpsource
      exceptions
        no_authority   = 1
        not_exist      = 2
        inconsistent   = 3
        others         = 4.
    if sy-subrc <> 0.
    * ERROR
    endif.
    create data lr_data type standard table of (lt_oltpsource-exstruct).
    assign lr_data->* to <lt_data>.
    call function 'RSFH_GET_DATA_SIMPLE'
      exporting
        i_requnr                     = l_req_no
        i_osource                    = p_osrce
        i_maxsize                    = p_maxsz
        i_maxfetch                   = p_maxfc
        i_updmode                    = p_updmd
        i_debugmode                  = l_debugmode
        i_abapmemory                 = l_genmode
        i_quiet                      = l_quiet
        i_read_only                  = l_readonly
      importing
        e_lines_read                 = l_lines_read
      tables
        i_t_select                   = lt_select
        i_t_field                    = lt_field
        e_t_data                     = <lt_data>
      exceptions
        generation_error             = 1
        interface_table_error        = 2
        metadata_error               = 3
        error_passed_to_mess_handler = 4
        no_authority                 = 5
        others                       = 6.
    * get table/structure field info
    call function 'GET_FIELDTAB'
      exporting
        langu               = sy-langu
        only                = space
        tabname             = lt_oltpsource-exstruct
        withtext            = 'X'
      importing
        header              = wa_x030l
      tables
        fieldtab            = tb_dfies
      exceptions
        internal_error      = 01
        no_texts_found      = 02
        table_has_no_fields = 03
        table_not_activ     = 04.
    * check result
    case sy-subrc.
      when 0.
    *      copy fieldnames
        loop at tb_dfies into wa_dfies.
          case l_desc_type.
            when 'F'.
              tb_flditab-fldname = wa_dfies-fieldname.
            when 'S'.
              tb_flditab-fldname = wa_dfies-scrtext_s.
            when 'M'.
              tb_flditab-fldname = wa_dfies-scrtext_m.
            when 'L'.
              tb_flditab-fldname = wa_dfies-scrtext_l.
            when others.
    *         use fieldname
              tb_flditab-fldname = wa_dfies-fieldname.
          endcase.
          append tb_flditab.
    *        clear variables
          clear: wa_dfies.
        endloop.
      when others.
        message id sy-msgid type sy-msgty number sy-msgno
        with  sy-subrc raising error_get_dictionary_info.
    endcase.
    describe table tb_flditab lines l_columns.
    " MOVE DATA TO THE APPLICATION SERVER
    open dataset p_filnm for output in text mode encoding utf-8
      with windows linefeed.
    data i type i.
    loop at <lt_data> assigning <ls_data>.
      loop at tb_flditab into ls_flditab.
        i = sy-tabix.
        assign component i of structure <ls_data> to <ls_field>.
        l_temp_char = <ls_field>.
        if i eq 1.
          l_file = l_temp_char.
        else.
          concatenate l_file ',' l_temp_char  into l_file.
        endif.
      endloop.
      transfer l_file to p_filnm.
      clear l_file.
    endloop.
    close dataset p_filnm.
    Cheers,
    Scott
    Edited by: Jeffrey Holdeman on May 25, 2010 4:44 PM
    Added  markup to improve readability
    Edited by: Jeffrey Holdeman on May 25, 2010 4:47 PM

  • Export Oracle tables and metadata to file for importation into another Db

    Hi all,
    In relation to past posts I'm wondering if a tool exists that allows one to export chosen table(s), along with associated metadata (hopefully automatic incorporation of associated triggers etc) to an export file and then add (import) this data to another Oracle Database?
    Looking at remote application development on specific data on a 'standalone' PC running Oracle XE at home.

    Hi,
    well, the tool for this is Data Pump.
    You can set a 'VERSION' parameter for compatibility.
    Check this out
    http://download-uk.oracle.com/docs/cd/B14117_01/server.101/b10825/toc.htm
    It can help you. :)
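    A minimal sketch along those lines, assuming Data Pump is available on the source, placeholder credentials and table names, a directory object named DATA_PUMP_DIR on both databases, and a target XE release lower than the source (which is what the VERSION parameter compensates for):

    # On the source database: export the chosen tables together with their metadata
    $ expdp scott/tiger directory=DATA_PUMP_DIR dumpfile=mytabs.dmp tables=EMP,DEPT version=10.2
    # Copy mytabs.dmp to the XE machine's directory path, then import it there
    $ impdp scott/tiger directory=DATA_PUMP_DIR dumpfile=mytabs.dmp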

  • I am using CS6. When exporting layers to files in Scripts, I get an error message Error 519: Server

    I am using CS6. When exporting layers to files in Scripts, I get an error message Error 519: Server Interface error. 'No component returned from CreateWidget'

    Hi. You’ve posted your question in a forum that is for beginners trying to learn the basics of Photoshop.  I'm moving your question to the Photoshop General Discussion forum for specialized attention to your situation.

  • R3load export of  table REPOSRC with lob col - error ora-1555 and ora-22924

    Hello,
    I have tried to export data from our production system for a system copy and subsequent upgrade test. During the export, the R3load job reported an error in table REPOSRC, which has the LOB column DATA. I have pasted below the conversation in which I asked SAP for help; they said it comes under consulting support. The problem is in 2 rows of the table.
    But I would like to know: if I delete these 2 rows and then copy them from our development system to the production system at the Oracle level, will there be any problem with the upgrade or the operation of these programs, and will it have any license complications if I do it?
    Regards
    Ramakrishna Reddy
    __________________________ SAP SUPPORT COnveration_____________________________________________________
    Hello,
    we have are performing Data Export for System copy of our Production
    system, during the export, R3load Job gave error as
    R3LOAD Log----
    Compiled Aug 16 2008 04:47:59
    /sapmnt/DB1/exe/R3load -datacodepage 1100 -
    e /dataexport/syscopy/SAPSSEXC.cmd -l /dataexport/syscopy/SAPSSEXC.log -stop_on_error
    (DB) INFO: connected to DB
    (DB) INFO: DbSlControl(DBSL_CMD_NLS_CHARACTERSET_GET): WE8DEC
    (DB) INFO: Export without hintfile
    (NT) Error: TPRI_PAR: normal NameTab from 20090828184449 younger than
    alternate NameTab from 20030211191957!
    (SYSLOG) INFO: k CQF :
    TPRI_PAR&20030211191957&20090828184449& rscpgdio 47
    (CNV) WARNING: conversion from 8600 to 1100 not possible
    (GSI) INFO: dbname = "DB120050205010209
    (GSI) INFO: vname = "ORACLE "
    (GSI) INFO: hostname
    = "dbttsap "
    (GSI) INFO: sysname = "AIX"
    (GSI) INFO: nodename = "dbttsap"
    (GSI) INFO: release = "2"
    (GSI) INFO: version = "5"
    (GSI) INFO: machine = "00C8793E4C00"
    (GSI) INFO: instno = "0020111547"
    (DBC) Info: No commits during lob export
    DbSl Trace: OCI-call 'OCILobRead' failed: rc = 1555
    DbSl Trace: ORA-1555 occurred when reading from a LOB
    (EXP) ERROR: DbSlLobGetPiece failed
    rc = 99, table "REPOSRC"
    (SQL error 1555)
    error message returned by DbSl:
    ORA-01555: snapshot too old: rollback segment number with name "" too
    small
    ORA-22924: snapshot too old
    (DB) INFO: disconnected from DB
    /sapmnt/DB1/exe/R3load: job finished with 1 error(s)
    /sapmnt/DB1/exe/R3load: END OF LOG: 20100816104734
    END of R3LOAD Log----
    Then, as per note 500340, I changed the PCTVERSION of LOB column DATA of
    table REPOSRC to 30 (see the sketch at the end of this thread), but I still get the error.
    I have also added more space to PSAPUNDO and PSAPTEMP; still the same
    error.
    Then I ran the export as:
    exp SAPDB1/sap file=REPOSRC.dmp log=REPOSRC.log tables=REPOSRC
    exp log----
    dbttsap:oradb1 5> exp SAPDB1/sap file=REPOSRC.dmp log=REPOSRC.log
    tables=REPOSRC
    Export: Release 9.2.0.8.0 - Production on Mon Aug 16 13:40:27 2010
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    Connected to: Oracle9i Enterprise Edition Release 9.2.0.8.0 - 64bit
    Production
    With the Partitioning option
    JServer Release 9.2.0.8.0 - Production
    Export done in WE8DEC character set and UTF8 NCHAR character set
    About to export specified tables via Conventional Path ...
    . . exporting table REPOSRC
    EXP-00056: ORACLE error 1555 encountered
    ORA-01555: snapshot too old: rollback segment number with name "" too
    small
    ORA-22924: snapshot too old
    Export terminated successfully with warnings.
    SQL> select table_name, segment_name, cache, nvl(to_char
    (pctversion),'NULL') pctversion, nvl(to_char(retention),'NULL')
    retention from dba_lobs where
    table_name = 'REPOSRC';
    TABLE_NAME | SEGMENT_NAME |CACHE | PCTVERSION | RETENTION
    REPOSRC SYS_LOB0000014507C00034$$ NO 30 21600
    please help to solve this problem.
    Regards
    Ramakrishna Reddy
    Dear customer,
    Thank you very much for contacting us at SAP global support.
    Regarding your issue would you please attach your ORACLE alert log and
    trace file to this message?
    Thanks and regards.
    Hello,
    Thanks for helping,
    I attached the alert log file. I have gone through it, but I could
    not find the corresponding ORA-01555 for table REPOSRC.
    Regards
    Ramakrishna Reddy
    +66 85835-4272
    Dear customer,
    I have found some previous issues with the same symptom as your
    system. I think this symptom is described in note 983230.
    As you can see, this symptom is mainly caused by Oracle bug 5212539, and
    it should be fixed in 9.2.0.8, which is exactly your version. However, even when
    the fix for 5212539 is implemented, only the occurrence of new corruptions is
    avoided; the already existing ones will stay in the system regardless of the patch.
    The reason why metalink 452341.1 was created is bug 5212539, since this
    is the most common software caused lob corruption in recent times.
    Basically any system that was running without a patch for bug 5212539 at some time in the past could be potentially affected by the problem.
    In order to be sure about bug 5212539 can you please verify whether the
    affected lob really is a NOCACHE lob? You can do this as described in
    mentioned note #983230. If yes, then there are basically only two
    options left:
    -> You apply a backup to the system that does not contain these
    corruptions.
    -> In case a good backup is not available, it would be possible to
    rebuild the table including the lob segment with possible data loss . Since this is beyond the scope of support, this would have to be
    done via remote consulting.
    Any further question, please contact us freely.
    Thanks and regards.
    Hello,
    Thanks for the Help and support,
    I have gone through note 983230 and MetaLink 452341.1,
    and I ran the script and found that there are 2 corrupted rows in
    the table REPOSRC. These rows belong to the standard SAP programs
    MABADRFENTRIES & SAPFGARC.
    To reconfirm, I tried to display them in our development system
    and production system: the development system shows the source code in
    SE38, but in the production system it goes to the short dump DBIF_REPO_SQL_ERROR.
    So is it possible to delete these 2 rows and update them ourselves from our
    development system at the Oracle level? Will it have any impact on SAP
    operation or upgrades in the future?
    Regards
    Ramakrishna Reddy

    Hello, we have solved the problem.
    To help someone with the same error, what we have done is:
    1.- Wait until all the processes have finished and the export has stopped.
    2.- Start up SAP.
    3.- In SE14, look up the tables and create the tables in the database.
    4.- Stop SAP.
    5.- Retry the export (if you did all the steps with sapinst still running and the dialogue window on the screen) or start sapinst again with the option "continue with the old options".
    Regards to all.
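    A sketch of the PCTVERSION change mentioned earlier in this thread (per note 500340), assuming the SAP schema owner is SAPDB1 as in the exp command shown above; run it as the schema owner or a DBA:

    $ sqlplus SAPDB1/sap
    SQL> -- Reserve more space in the LOB segment for old versions of LOB pages,
    SQL> -- which consistent reads of the LOB depend on
    SQL> ALTER TABLE REPOSRC MODIFY LOB (DATA) (PCTVERSION 30);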

  • Am exporting html table containing images data into pdf and after exporting images are not displaying in pdf document.

    Hi all,
    I am trying to export an HTML table which contains images into PDF through JavaScript, but after downloading the PDF file I am unable to see the images. Is this a problem with plugins or something else? Can anyone help me out with this?
    Thanks in advance.

    Another option would be to call
    window.print(); and save the output as a PDF.
    Please 'propose as answer' if it helped you, also 'vote helpful' if you like this reply.

  • Override Export File Name

    Hi Everyone,
    I have a question which I cannot see being asked in the forum yet.
    I'd like to override the standard behaviour of exporting files to the target system by reloading a previously exported file if the POV has not been validated since the last target data load. This is to speed up batch loading (we load 1500 POVs every night to Essbase). As such, I need to overwrite the names of the export files, as they are not unique and file names may be used more than once for different POVs.
    Is there any way to do this in an action script? I can see that strFile occurs as a parameter in the function calls for BefExportToDat, ActExport etc. I was not able to find an API function for this purpose yet, and manipulating tPOVPartition.PartLastExpFile for the location did not do the trick, as this entry is overridden in the background after the export.
    Thanks in advance.
    Regards,
    Hauke

    Wayne,
    Thank you for your answer. You led me in the right direction. Obviously, something went horribly wrong when I tested the manipulated export script for the first time. This caused FDM to overwrite export files during the batch and made me believe that the file names were not unique across POVs.
    For anyone interested, FDM is keeping the export filenames unique for all POVs as long as historical export data files exist in the Outbox folder. When you clear the Outbox folder, file names are reused in different POVs during export. You may query the last file which was exported for any POV in the tDataArchive table in FDM.
    I am now querying this table instead of tLogActivity: I take the file that was last exported for the respective POV (provided its filename has not since been reused by another POV due to clearing the Outbox) and move it to the filename that FDM creates before exporting. This speeds the export batch up to one fourth of the time of the original export behaviour, as the database is not queried for all these files.
    Best regards,
    Hauke

  • Sqlldr multiple files/tables

    Hello.
    I have sqlldr loading data from 5 different external files to 5 different tables working just fine. I'm using 5 control files, since each file has a different structure and different target table. So, I run Sqlldr 5 times using the proper control file.
    File1.dat ---> Table 1 using controlfile1.ctl
    File2.dat ---> Table 2 using controlfile2.ctl
    FileN.dat ---> Table N using controlfileN.ctl
    I'd like to run sqlldr only once, with only ONE control file which includes all 5 loads.
    Any ideas?
    Thanks.
    I'm trying to create a control file which includes several source files and load data into different tables.

    This isn't really a spatial question, you might have better luck getting an answer in a different topic area.
    I think the question you are going to get is... why do you have sqlldr files?
    Can't the source just provide an export file, or even a sql script to run? That's the way most people move data.
    Bryan
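    SQL*Loader takes a single control file per invocation, so short of merging the loads into one control file with WHEN clauses, a common workaround is a small wrapper that runs the five loads back to back. A minimal sketch, assuming placeholder credentials and the control file names used above:

    #!/bin/sh
    # Run the five SQL*Loader jobs one after another; stop on the first failure
    for n in 1 2 3 4 5
    do
        sqlldr userid=scott/tiger control=controlfile$n.ctl log=load$n.log || exit 1
    done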
